Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián
2009-01-01
This paper is an experimental study of the use of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time-based, hybrid, and event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each of the hybrid and pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are analyzed independently. The experimental study provides new evidence of the ability of event-based control strategies to minimize data exchange among the control agents (sensors, controllers, actuators) when error-free control of the process is not a hard requirement. PMID:22399975
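To make the sampling idea concrete, below is a minimal Python sketch of a send-on-delta event-triggered PI loop on a toy first-order tank model; the gains, the delta threshold, and the tank dynamics are illustrative assumptions, not values from the paper.

```python
def simulate_event_based_pi(kp=2.0, ki=0.5, delta=0.02, t_end=50.0, dt=0.01):
    """Send-on-delta event-triggered PI level control of a toy tank.

    The sensor 'transmits' only when the level has drifted more than
    `delta` from the last transmitted value; the PI controller updates
    its action only on those events, integrating over the actual time
    elapsed between events.
    """
    level, setpoint = 0.0, 1.0
    last_sent = None              # last level value sent to the controller
    integral, u = 0.0, 0.0
    t_last_event = 0.0
    events = 0
    for k in range(int(t_end / dt)):
        t = k * dt
        # Event condition: send-on-delta sampling at the sensor.
        if last_sent is None or abs(level - last_sent) > delta:
            error = setpoint - level
            integral += ki * error * (t - t_last_event)
            u = kp * error + integral
            last_sent, t_last_event = level, t
            events += 1
        # First-order tank dynamics: d(level)/dt = -a*level + b*u
        level += dt * (-0.1 * level + 0.1 * u)
    return level, events

final_level, n_events = simulate_event_based_pi()
print(f"final level {final_level:.3f} after {n_events} sensor events")
```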
An Event-Based Approach to Distributed Diagnosis of Continuous Systems
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon
2010-01-01
Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.
Safety design approach for external events in Japan sodium-cooled fast reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamano, H.; Kubo, S.; Tani, A.
2012-07-01
This paper describes a safety design approach for external events in the design study of the Japan sodium-cooled fast reactor. An emphasis is the introduction of a design extension external condition (DEEC). In addition to seismic design, other external events such as tsunami, strong wind, abnormal temperature, etc. were addressed in this study. From a wide variety of external events consisting of natural hazards and human-induced ones, a screening method was developed in terms of siting, consequence, and frequency to select representative events. Design approaches for these events were categorized on a probabilistic, statistical, or deterministic basis. External hazard conditions were considered mainly for DEECs. In the probabilistic approach, the DEECs of earthquake, tsunami and strong wind were defined as 1/10 of the exceedance probability of the external design bases. The other representative DEECs were defined based on statistical or deterministic approaches. (authors)
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT/Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In recent decades, various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
Event-based analysis of free-living behaviour.
Granat, Malcolm H
2012-11-01
The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact health and how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and the terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, depending on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis of the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group, showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis that we can more clearly understand how behaviour is related to health and how we can produce more relevant outcome measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Jiang, Huaiguang; Tan, Jin
This paper proposes an event-driven approach for reconfiguring distribution systems automatically. Specifically, an optimal synchrophasor sensor placement (OSSP) is used to reduce the number of synchrophasor sensors while keeping the whole system observable. Then, a wavelet-based event detection and location approach is used to detect and locate the event, which acts as a trigger for network reconfiguration. With the detected information, the system is then reconfigured using the hierarchical decentralized approach to seek the new optimal topology. In this manner, whenever an event happens the distribution network can be reconfigured automatically based on real-time information that is observable and detectable.
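As a rough illustration of the wavelet-detection step, the sketch below flags a disturbance in a sampled waveform by thresholding fine-scale wavelet detail coefficients with PyWavelets; the wavelet family, threshold, and synthetic signal are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
import pywt

def detect_event(signal, wavelet="db4", threshold=5.0):
    """Flag candidate event locations from level-1 wavelet detail coefficients.

    A sudden change (e.g. a fault or switching event) shows up as a transient
    whose energy concentrates in the fine-scale detail coefficients.
    """
    coeffs = pywt.wavedec(signal, wavelet, level=1)
    details = coeffs[-1]                          # finest-scale detail coefficients
    sigma = np.median(np.abs(details)) / 0.6745   # robust noise-level estimate
    hits = np.where(np.abs(details) > threshold * sigma)[0]
    return hits * 2          # each level-1 coefficient covers ~2 samples

# Synthetic 60 Hz waveform with a step disturbance at sample 600.
rng = np.random.default_rng(0)
t = np.arange(1024) / 1024
v = np.sin(2 * np.pi * 60 * t) + 0.01 * rng.standard_normal(1024)
v[600:] += 0.5
print("event near samples:", detect_event(v)[:5])
```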
Characterization of GM events by insert knowledge adapted re-sequencing approaches
Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing
2013-01-01
Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences employing the developed approaches. This provides an opportunity to identify and characterize also unknown GM events. PMID:24088728
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.
2006-01-01
Event-based prospective memory involves remembering to perform an action in response to a particular future event. Normal younger and older adults performed event-based prospective memory tasks in 2 experiments. The authors applied a formal multinomial processing tree model of prospective memory (Smith & Bayen, 2004) to disentangle age differences…
Multigroup Monte Carlo on GPUs: Comparison of history- and event-based algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, Steven P.; Slattery, Stuart R.; Evans, Thomas M.
2017-12-22
This article presents an investigation of the performance of different multigroup Monte Carlo transport algorithms on GPUs, with a discussion of both history-based and event-based approaches. Several algorithmic improvements are introduced for both approaches. By modifying the history-based algorithm that is traditionally favored in CPU-based MC codes to occasionally filter out dead particles to reduce thread divergence, performance exceeds that of either the pure history-based or event-based approaches. The impacts of several algorithmic choices are discussed, including performance studies on Kepler and Pascal generation NVIDIA GPUs for fixed source and eigenvalue calculations. Single-device performance equivalent to 20–40 CPU cores on the K40 GPU and 60–80 CPU cores on the P100 GPU is achieved. Finally, nearly perfect multi-device parallel weak scaling is demonstrated on more than 16,000 nodes of the Titan supercomputer.
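The dead-particle-filtering idea can be sketched in a few lines: advance all live histories in lock-step (as GPU lanes would) and periodically compact the arrays so that terminated particles stop occupying slots. This NumPy toy uses a one-group, 1-D absorption/leakage model of our own invention, not the paper's multigroup solver.

```python
import numpy as np

rng = np.random.default_rng(42)

def batched_history_mc(n_particles=100_000, sigma_t=1.0, absorb=0.3,
                       slab=5.0, filter_every=8):
    """History-based MC advanced in lock-step batches, with periodic
    dead-particle filtering (array compaction)."""
    x = np.zeros(n_particles)                   # particle positions
    mu = rng.choice([-1.0, 1.0], n_particles)   # flight directions
    alive = np.ones(n_particles, dtype=bool)
    leaked = 0
    step = 0
    while alive.any():
        # Sample distance to next collision for live particles only.
        dist = -np.log(rng.random(int(alive.sum()))) / sigma_t
        x[alive] += mu[alive] * dist
        # Kill particles that left the slab (leakage).
        out = alive & ((x < 0.0) | (x > slab))
        leaked += int(out.sum())
        alive &= ~out
        # Kill a fraction of colliding particles (absorption).
        absorbed = alive & (rng.random(x.size) < absorb)
        alive &= ~absorbed
        # Survivors scatter: resample direction.
        mu[alive] = rng.choice([-1.0, 1.0], int(alive.sum()))
        step += 1
        if step % filter_every == 0:
            # Dead-particle filtering: compact arrays to live entries only.
            x, mu, alive = x[alive], mu[alive], alive[alive]
    return leaked / n_particles

print(f"leakage probability ≈ {batched_history_mc():.3f}")
```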
Courses of action for effects based operations using evolutionary algorithms
NASA Astrophysics Data System (ADS)
Haider, Sajjad; Levis, Alexander H.
2006-05-01
This paper presents an Evolutionary Algorithms (EAs) based approach to identify effective courses of action (COAs) in Effects Based Operations. The approach uses Timed Influence Nets (TINs) as the underlying mathematical model to capture a dynamic uncertain situation. TINs provide a concise graph-theoretic probabilistic approach to specify the cause-and-effect relationships that exist among the variables of interest (actions, desired effects, and other uncertain events) in a problem domain. The purpose of building these TIN models is to identify and analyze several alternative courses of action. The current practice is to use trial-and-error techniques, which are not only labor intensive but also produce sub-optimal results and cannot model constraints among actionable events. The EA-based approach presented in this paper aims to overcome these limitations. The approach generates multiple COAs that are close to one another in terms of achieving the desired effect. The purpose of generating multiple COAs is to give several alternatives to a decision maker. Moreover, the alternative COAs can be generalized based on the relationships that exist among the actions and their execution timings. The approach also allows a system analyst to capture certain types of constraints among actionable events.
Aligning observed and modelled behaviour based on workflow decomposition
NASA Astrophysics Data System (ADS)
Wang, Lu; Du, YuYue; Liu, Wei
2017-09-01
When business processes are mostly supported by information systems, both the availability of event logs generated by these systems and the requirement for appropriate process models are increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique is proposed in this paper, based on a workflow decomposition method. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is then investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state-equation method from PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
On event-based optical flow detection
Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko
2015-01-01
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high-dynamic-range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection, ranging from gradient-based methods through plane-fitting to filter-based methods, and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion-related activations. PMID:25941470
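As a sketch of the plane-fitting idea: for a translating edge, events (x, y, t) lie approximately on a plane in space-time, and the spatial gradient of the fitted time surface yields the velocity. This toy least-squares version is our illustration of the class of methods, not the paper's detector.

```python
import numpy as np

def flow_from_events(events):
    """Estimate local optical flow by fitting a plane t = a*x + b*y + c
    to a small spatio-temporal neighbourhood of events.

    For a translating edge the event cloud lies on a plane; the spatial
    gradient (a, b) of the time surface gives the velocity
    v = (a, b) / (a^2 + b^2).
    """
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, _), *_ = np.linalg.lstsq(A, t, rcond=None)
    denom = a * a + b * b
    if denom < 1e-12:
        return np.zeros(2)           # texture-less / ambiguous region
    return np.array([a, b]) / denom  # pixels per second

# Synthetic edge moving at 20 px/s in +x: events arrive at t = x / 20.
rng = np.random.default_rng(0)
xs = np.repeat(np.arange(10.0), 5)
ys = np.tile(np.arange(5.0), 10)
ts = xs / 20.0 + 1e-4 * rng.standard_normal(xs.size)
print("estimated flow:", flow_from_events(np.column_stack([xs, ys, ts])))
```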
ERIC Educational Resources Information Center
McDonald, Sharyn; Ogden-Barnes, Stephen
2013-01-01
Service learning and problem-based learning (PBL) are distinct yet related educational approaches. When collaborative learning events encourage the application of PBL principles to real-world challenges faced by Not-For-Profit organizations (NFPs), these learning approaches become potentially synergistic. However, there is limited…
The 10-year Absolute Risk of Cardiovascular (CV) Events in Northern Iran: a Population Based Study
Motamed, Nima; Mardanshahi, Alireza; Saravi, Benyamin Mohseni; Siamian, Hasan; Maadi, Mansooreh; Zamani, Farhad
2015-01-01
Background: The present study was conducted to estimate 10-year cardiovascular disease (CVD) event risk using three instruments in northern Iran. Material and methods: Baseline data of 3201 participants aged 40-79 from a population-based cohort conducted in northern Iran were analyzed. The Framingham risk score (FRS), World Health Organization (WHO) risk prediction charts and the American College of Cardiology/American Heart Association (ACC/AHA) tool were applied to assess 10-year CVD event risk. The agreement between the risk assessment instruments was determined using kappa statistics. Results: Our study estimated that 53.5% of the male population aged 40-79 had a 10-year risk of CVD events ≥10% based on the ACC/AHA approach, 48.9% based on FRS and 11.8% based on the WHO risk charts. A 10-year risk ≥10% was estimated among 20.1% of women using the ACC/AHA approach, 11.9% using FRS and 5.7% using the WHO tool. The ACC/AHA and Framingham tools had the closest agreement in the estimation of a 10-year risk ≥10% in men (κ=0.7757), while the ACC/AHA and WHO approaches displayed the highest agreement (κ=0.6123) in women. Conclusion: Different estimations of 10-year CVD event risk were provided by the ACC/AHA, FRS and WHO approaches. PMID:26236160
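The kappa statistic used above for tool-to-tool agreement can be computed as below; this is a generic Cohen's kappa for binary "high-risk" classifications, with made-up example data rather than the study's records.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two binary risk classifications
    (1 = 10-year risk >= 10%, 0 = below threshold)."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                        # observed agreement
    p_yes = np.mean(r1) * np.mean(r2)             # chance both say "high risk"
    p_no = (1 - np.mean(r1)) * (1 - np.mean(r2))  # chance both say "low risk"
    pe = p_yes + p_no                             # chance agreement
    return (po - pe) / (1 - pe)

acc_aha = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # invented classifications
frs     = [1, 0, 0, 1, 0, 0, 1, 1, 0, 1]
print(f"kappa = {cohens_kappa(acc_aha, frs):.3f}")
```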
Client-Side Event Processing for Personalized Web Advertisement
NASA Astrophysics Data System (ADS)
Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad
The market for Web advertisement is continuously growing and, correspondingly, the number of approaches that can be used for realizing Web advertisement is increasing. However, current approaches fail to generate highly personalized ads for the current user visiting a particular piece of Web content. They mainly try to develop a profile based on the content of that Web page or on a long-term user profile, without taking into account the user's current preferences. We argue that by discovering a user's interest from his current Web behavior we can support the process of ad generation, especially the relevance of an ad for the user. In this paper we present the conceptual architecture and implementation of such an approach. The approach is based on the extraction of simple events from the user's interaction with a Web page and their combination in order to discover the user's interests. We use semantic technologies in order to build such an interpretation out of many simple events. We present results from preliminary evaluation studies. The main contribution of the paper is a very efficient, semantic-based client-side architecture for generating and combining Web events. The architecture ensures the agility of the whole advertisement system by processing complex events on the client. In general, this work contributes to the realization of new, event-driven applications for the (Semantic) Web.
Event-Based Robust Control for Uncertain Nonlinear Systems Using Adaptive Dynamic Programming.
Zhang, Qichao; Zhao, Dongbin; Wang, Ding
2018-01-01
In this paper, the robust control problem for a class of continuous-time nonlinear systems with unmatched uncertainties is investigated using an event-based control method. First, the robust control problem is transformed into a corresponding optimal control problem with an augmented control and an appropriate cost function. Under the event-based mechanism, we prove that the solution of the optimal control problem can asymptotically stabilize the uncertain system with an adaptive triggering condition. That is, the designed event-based controller is robust to the original uncertain system. Note that the event-based controller is updated only when the triggering condition is satisfied, which can save communication resources between the plant and the controller. Then, a single-network adaptive dynamic programming structure with an experience replay technique is constructed to approximate the optimal control policies. The stability of the closed-loop system with the event-based control policy and the augmented control policy is analyzed using the Lyapunov approach. Furthermore, we prove that the minimal intersample time is bounded by a nonzero positive constant, which excludes Zeno behavior during the learning process. Finally, two simulation examples are provided to demonstrate the effectiveness of the proposed control scheme.
Scalable and responsive event processing in the cloud
Suresh, Visalakshmi; Ezhilchelvan, Paul; Watson, Paul
2013-01-01
Event processing involves continuous evaluation of queries over streams of events. Response-time optimization is traditionally done over a fixed set of nodes and/or by using metrics measured at query-operator levels. Cloud computing makes it easy to acquire and release computing nodes as required. Leveraging this flexibility, we propose a novel, queueing-theory-based approach for meeting specified response-time targets against fluctuating event arrival rates by drawing only the necessary amount of computing resources from a cloud platform. In the proposed approach, the entire processing engine of a distinct query is modelled as an atomic unit for predicting response times. Several such units hosted on a single node are modelled as a multiple class M/G/1 system. These aspects eliminate intrusive, low-level performance measurements at run-time, and also offer portability and scalability. Using model-based predictions, cloud resources are efficiently used to meet response-time targets. The efficacy of the approach is demonstrated through cloud-based experiments. PMID:23230164
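A sketch of the kind of model-based sizing described above: using the single-class Pollaczek-Khinchine formula for an M/G/1 queue, predict per-node response time and grow the node count until a target is met. The paper models multiple query classes per node; this single-class simplification and all parameter values are our assumptions.

```python
def mg1_response_time(lam, es, es2):
    """Mean response time of an M/G/1 queue via Pollaczek-Khinchine:
    T = E[S] + lam * E[S^2] / (2 * (1 - rho)),  with rho = lam * E[S]."""
    rho = lam * es
    if rho >= 1.0:
        return float("inf")    # unstable: arrivals exceed service capacity
    return es + lam * es2 / (2.0 * (1.0 - rho))

def nodes_needed(total_rate, es, es2, target):
    """Smallest number of identical nodes (arrivals split evenly) whose
    per-node M/G/1 response time meets the target."""
    n = 1
    while mg1_response_time(total_rate / n, es, es2) > target:
        n += 1
    return n

# Query engine: mean service 2 ms, E[S^2] = 8e-6 s^2, 900 events/s total,
# 4 ms response-time target (all numbers illustrative).
print("nodes needed:", nodes_needed(total_rate=900.0, es=0.002,
                                    es2=8e-6, target=0.004))
```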
From IHE Audit Trails to XES Event Logs Facilitating Process Mining.
Paster, Ferdinand; Helm, Emmanuel
2015-01-01
Recently, business intelligence approaches like process mining have been applied to the healthcare domain. The goal of process mining is to gain process knowledge, check compliance and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE-based information systems.
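A minimal sketch of the transformation step, assuming simplified audit records (event name, patient ID as the case identifier, ISO timestamp) rather than the full ATNA audit message schema; the XES attribute keys concept:name and time:timestamp are the standard ones used by process mining tools such as ProM.

```python
import xml.etree.ElementTree as ET

def audit_to_xes(records):
    """Convert simplified ATNA-style audit records into a minimal XES log.

    Records are grouped by patient ID, yielding one XES trace per patient;
    each audit entry becomes one XES event.
    """
    log = ET.Element("log", {"xes.version": "1.0"})
    traces = {}
    for rec in records:
        case = rec["patient_id"]
        if case not in traces:
            traces[case] = ET.SubElement(log, "trace")
            ET.SubElement(traces[case], "string",
                          {"key": "concept:name", "value": case})
        ev = ET.SubElement(traces[case], "event")
        ET.SubElement(ev, "string", {"key": "concept:name",
                                     "value": rec["event_type"]})
        ET.SubElement(ev, "date", {"key": "time:timestamp",
                                   "value": rec["timestamp"]})
    return ET.tostring(log, encoding="unicode")

records = [  # hypothetical audit entries, not real ATNA payloads
    {"patient_id": "P-001", "event_type": "Patient Record Read",
     "timestamp": "2015-03-01T10:02:00+00:00"},
    {"patient_id": "P-001", "event_type": "Order Placed",
     "timestamp": "2015-03-01T10:07:30+00:00"},
]
print(audit_to_xes(records))
```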
Assessing performance and validating finite element simulations using probabilistic knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolin, Ronald M.; Rodriguez, E. A.
Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability each event causes failure, along with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
NASA Astrophysics Data System (ADS)
Li, Yangdong; Han, Zhen; Liao, Zhongping
2009-10-01
Spatiality, temporality, legality, accuracy and continuity are characteristics of cadastral information, and cadastral management demands that cadastral data be accurate, integrated and updated in a timely manner. An effective GIS management system is well suited to managing cadastral data, which are characterized by spatiality and temporality. Because no sound spatio-temporal data models have been adopted, however, the spatio-temporal characteristics of cadastral data are not well expressed in existing cadastral management systems. An event-version-based spatio-temporal modeling approach is first proposed from the perspective of events and versions. With its help, an event-version-based spatio-temporal cadastral data model is then built to represent spatio-temporal cadastral data. Finally, this model is used in the design and implementation of a spatio-temporal cadastral management system. The application of the system shows that the event-version-based spatio-temporal data model is very suitable for the representation and organization of cadastral data.
Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S
2009-09-01
The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents: food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework, the Key Events Dose-Response Framework (KEDRF), for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.
Complex Event Processing for Content-Based Text, Image, and Video Retrieval
2016-06-01
US Army Research Laboratory, report ARL-TR-7705, June 2016.
Event generators for address event representation transmitters
NASA Astrophysics Data System (ADS)
Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe
2005-06-01
Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (let's call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. There have been two main approaches published in the literature for implementing such "AER Generator" circuits. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event off the chip, the rest of the neurons in the array were frozen and could not transmit further events during this time window. This limited the maximum transmission speed. In order to improve this speed, Boahen proposed an improved 'burst mode' scheme, in which, after the row arbitration, a complete row of events is pipelined out of the array and arbitrated off the chip at higher speed. During this single-row event arbitration, the array is free to generate new events and communicate them to the row arbiter in a pipelined mode. This scheme significantly improves maximum event transmission speed, especially for high-traffic situations where speed is most critical. We have analyzed and studied this approach and have detected some shortcomings in the circuits reported by Boahen, which may produce erroneous behavior under some statistical conditions. The present paper proposes some improvements to overcome such situations. The improved "AER Generator" has been implemented in an AER transmitter system.
Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just
2018-04-03
Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When monitoring systems based on wireless sensor networks are deployed, the sensing and transmission configurations of sensor nodes may be adjusted to exploit the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, many people spontaneously post information in social media about events they are observing, and such information may be mined and processed for the detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and to assign sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events. PMID:29614060
A case for multi-model and multi-approach based event attribution: The 2015 European drought
NASA Astrophysics Data System (ADS)
Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle
2017-04-01
Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to, the event definition, the climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for individual GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with a low signal-to-noise ratio and high model dependency, such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
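The core factual-versus-counterfactual comparison reduces to a probability (risk) ratio, sketched below on synthetic ensembles standing in for GCM output; the drought index, threshold, and distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def probability_ratio(factual, counterfactual, threshold):
    """Probability ratio PR = p1 / p0 for exceeding an event threshold,
    estimated from factual and counterfactual ensembles. PR > 1 means
    human influence made the event more likely."""
    p1 = np.mean(np.asarray(factual) >= threshold)
    p0 = np.mean(np.asarray(counterfactual) >= threshold)
    return np.inf if p0 == 0 else p1 / p0

# Synthetic summer drought index (higher = drier), 500 members each.
factual = rng.normal(0.3, 1.0, 500)         # world with anthropogenic forcing
counterfactual = rng.normal(0.0, 1.0, 500)  # hypothetical world without it
print(f"PR = {probability_ratio(factual, counterfactual, threshold=2.0):.2f}")
```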
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2017-07-01
A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, usually captured by the notion of an agent or actor role. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology and has its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.
Behavior coordination of mobile robotics using supervisory control of fuzzy discrete event systems.
Jayasiri, Awantha; Mann, George K I; Gosine, Raymond G
2011-10-01
In order to incorporate the uncertainty and impreciseness present in real-world event-driven asynchronous systems, fuzzy discrete event systems (DESs) (FDESs) have been proposed as an extension to crisp DESs. In this paper, first, we propose an extension to the supervisory control theory of FDES by redefining fuzzy controllable and uncontrollable events. The proposed supervisor is capable of enabling feasible uncontrollable and controllable events with different possibilities. Then, the extended supervisory control framework of FDES is employed to model and control several navigational tasks of a mobile robot using the behavior-based approach. The robot has limited sensory capabilities, and the navigations have been performed in several unmodeled environments. The reactive and deliberative behaviors of the mobile robotic system are weighted through fuzzy uncontrollable and controllable events, respectively. By employing the proposed supervisory controller, a command-fusion-type behavior coordination is achieved. The observability of fuzzy events is incorporated to represent the sensory imprecision. As a systematic analysis of the system, a fuzzy-state-based controllability measure is introduced. The approach is implemented in both simulation and real time. A performance evaluation is performed to quantitatively estimate the validity of the proposed approach over its counterparts.
An object-based approach to weather analysis and its applications
NASA Astrophysics Data System (ADS)
Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew
2013-04-01
The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying and following the development of convective events over the course of their lifetime. Prerequisites of the object-based analysis are a high-resolution observational database and a tracking algorithm. A near-real-time radar- and satellite-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking identifies convective rain events in the dual composite and monitors their development over the course of their lifetime. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach for the validation of meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors which characterize the temporal development of the precipitation systems that constitute the objects. So-called proxies of the precipitation process are, e.g., the temporal change of the brightband, vertically extensive columns of enhanced differential reflectivity ZDR, or the cloud-top temperatures and heights identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro-)physical differences among rain events and relate to the precipitation yield. For example, analyses of the informative content of ZDR columns as precursors of storm evolution will be presented to demonstrate the use of such system-oriented predictors for nowcasting. Columns of differential reflectivity ZDR measured by polarimetric weather radars are prominent signatures associated with thunderstorm updrafts. Since greater vertical velocities can loft larger drops and water-coated ice particles to higher altitudes above the environmental freezing level, the integrated ZDR column above the freezing level increases with increasing updraft intensity. Validation of atmospheric models concerning precipitation representation or prediction is usually confined to comparisons of precipitation fields or their temporal and spatial statistics. A comparison of rain rates alone, however, does not immediately explain discrepancies between models and observations, because similar rain rates might be produced by different processes. Within the event-based approach for the validation of models, both observed and modeled rain events are analyzed by means of proxies of the precipitation process. Both sets of descriptors form the basis for model validation, since different leading descriptors, in a statistical sense, hint at process formulations potentially responsible for model failures.
Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhury, Indranil
2010-01-01
We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition, in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on the provided example scenarios, and discuss the issues faced and lessons learned in implementing the approach.
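A toy sketch of qualitative event-based fault isolation: each fault predicts the direction of measurement deviations, and the candidate set is pruned as deviation events arrive. Real signatures also encode deviation ordering and transient behavior; the faults and measurements below are invented.

```python
def isolate(fault_signatures, observed_events):
    """Prune the fault candidate set as qualitative deviation events arrive.

    Each signature maps a measurement to its predicted deviation direction
    ('+' above nominal, '-' below). A fault stays a candidate only while
    every observed deviation matches its prediction.
    """
    candidates = set(fault_signatures)
    for measurement, direction in observed_events:
        candidates = {f for f in candidates
                      if fault_signatures[f].get(measurement) == direction}
        print(f"after {measurement}{direction}: {sorted(candidates)}")
    return candidates

signatures = {  # hypothetical two-measurement fault signatures
    "valve_stuck":   {"flow": "-", "pressure": "+"},
    "pump_degraded": {"flow": "-", "pressure": "-"},
    "sensor_bias":   {"flow": "+", "pressure": "+"},
}
isolate(signatures, [("flow", "-"), ("pressure", "+")])
```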
Recognizing Bedside Events Using Thermal and Ultrasonic Readings
Asbjørn, Danielsen; Jim, Torresen
2017-01-01
Falls in homes of the elderly, in residential care facilities and in hospitals commonly occur in close proximity to the bed. Most approaches for recognizing falls use cameras, which challenge privacy, or sensor devices attached to the bed or the body to recognize bedside events and bedside falls. We use data collected from a ceiling mounted 80 × 60 thermal array combined with an ultrasonic sensor device. This approach makes it possible to monitor activity while preserving privacy in a non-intrusive manner. We evaluate three different approaches towards recognizing location and posture of an individual. Bedside events are recognized using a 10-second floating image rule/filter-based approach, recognizing bedside falls with 98.62% accuracy. Bed-entry and exit events are recognized with 98.66% and 96.73% accuracy, respectively. PMID:28598394
Command Center Training Tool (C2T2)
NASA Technical Reports Server (NTRS)
Jones, Phillip; Drucker, Nich; Mathews, Reejo; Stanton, Laura; Merkle, Ed
2012-01-01
This abstract presents the training approach taken to create a management-centered, experiential learning solution for the Virginia Port Authority's Port Command Center. The resulting tool, called the Command Center Training Tool (C2T2), follows a holistic approach integrated across the training management cycle and within a single environment. The approach allows a single training manager to progress from training design through execution and after-action review (AAR). The approach starts with modeling the training organization, identifying the organizational elements and their individual and collective performance requirements, including organization-specific performance scoring ontologies. Next, the developer specifies conditions: the problems and constructs that compose exercises and drive experiential learning. These conditions are defined by incidents, each denoting a single multimedia datum, and scenarios, which are stories told by incidents. To these layered, modular components, previously developed metadata is attached, including associated performance requirements. The components are then stored in a searchable library. An event developer can create a training event by searching the library based on metadata and then selecting and loading the resulting modular pieces. This loading process brings into the training event all the previously associated task and teamwork material as well as AAR preparation materials. The approach includes tools within an integrated management environment that place these materials at the fingertips of the event facilitator such that, in real time, the facilitator can track training-audience performance and modify the training event accordingly. The approach also supports the concentrated knowledge management requirements for rapid preparation of an extensive AAR. This approach supports the integrated training cycle and allows a management-based perspective and advanced tools, through which a complex, thorough training event can be developed.
Multilevel joint competing risk models
NASA Astrophysics Data System (ADS)
Karunarathna, G. H. S.; Sooriyarachchi, M. R.
2017-09-01
Joint modeling approaches are often encountered for different outcomes of competing-risk time-to-event and count data in many biomedical and epidemiological studies in the presence of cluster effects. Hospital length of stay (LOS) has been a widely used outcome measure of hospital utilization, serving as a benchmark for measuring multiple terminations such as discharge, transfer, death, and patients who have not completed the event of interest by the end of the follow-up period (censored) during hospitalization. Competing-risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple events. In this study, the concept of joint modeling has been applied to dengue epidemiology in Sri Lanka, 2006-2008, to assess the relationship between the different outcomes of LOS and the platelet count of dengue patients with a district cluster effect. Two key approaches have been applied to build the joint scenario. The first approach models each competing risk separately using a binary logistic model, treating all other events as censored under a multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing-risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results compared to fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).
A three-tiered approach for linking pharmacokinetic ...
The power of the adverse outcome pathway (AOP) framework arises from its utilization of pathway-based data to describe the initial interaction of a chemical with a molecular target (the molecular initiating event; MIE), followed by a progression through a series of key events that lead to an adverse outcome relevant for regulatory purposes. The AOP itself is not chemical specific, thus providing the biological context necessary for interpreting high-throughput (HT) toxicity screening results. Application of the AOP framework and HT predictions in ecological and human health risk assessment, however, requires the consideration of chemical-specific properties that influence external exposure doses and target tissue doses. To address this requirement, a three-tiered approach was developed to provide a workflow for connecting biology-based AOPs to biochemistry-based pharmacokinetic properties (absorption, distribution, metabolism, excretion; ADME), and then to chemical/human activity-based exposure pathways.
NASA Astrophysics Data System (ADS)
Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.
2015-12-01
Depth of event is an important criterion for seismic event screening at the International Data Center (IDC), CTBTO. However, a thorough determination of the event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have depth constrained to the surface. When the true origin depth is greater than that reasonable for a nuclear test (3 km based on existing observations), this may result in a heavier workload to manually distinguish between shallow and deep events. Also, the IDC depth criterion is not applicable to events with a small t(pP-P) travel-time difference, which is the case for a nuclear test. Since the shape of the first few seconds of signal of very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and theoretical seismograms can provide an estimate of the depth of the event, and so extend the screening process. We exercised this approach mostly with events at teleseismic and, partially, regional distances. We found that this approach can be very efficient for the seismic event screening process, with certain caveats related mostly to poorly defined crustal models at the source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified to handle complex source topography.
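A heavily simplified sketch of the correlation idea: build synthetics as a direct P pulse plus a sign-flipped depth phase pP delayed by roughly 2d/v_p (vertical ray, constant crustal velocity), then grid-search the depth maximizing the correlation with the observed waveform. Real processing uses full hudson96-style synthetics and an adjustable t* attenuation; everything here, including the velocity and the wavelet, is illustrative.

```python
import numpy as np

def best_depth(observed, wavelet, depths, dt, v_p=6.0):
    """Grid-search source depth by cross-correlating the observed P wavetrain
    with toy synthetics: direct P plus a surface-reflected pP arriving
    t(pP-P) ~ 2*d/v_p later with flipped polarity."""
    n = observed.size
    best = (None, -np.inf)
    for d in depths:
        delay = int(round(2.0 * d / v_p / dt))        # t(pP-P) in samples
        synth = np.zeros(n)
        synth[: wavelet.size] += wavelet              # direct P
        if delay + wavelet.size < n:
            synth[delay: delay + wavelet.size] -= wavelet  # depth phase pP
        cc = np.dot(observed, synth) / (np.linalg.norm(observed)
                                        * np.linalg.norm(synth))
        if cc > best[1]:
            best = (d, cc)
    return best

dt = 0.05
wavelet = np.sin(2 * np.pi * 1.0 * np.arange(0, 1.0, dt)) * np.hanning(20)
true = np.zeros(200)
true[:20] += wavelet
true[33:53] -= wavelet          # pP delayed ~1.65 s => depth ~5 km
depth, cc = best_depth(true, wavelet, depths=np.arange(1, 15.5, 0.5), dt=dt)
print(f"best depth ≈ {depth} km (cc={cc:.2f})")
```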
Discriminative Prediction of A-To-I RNA Editing Events from DNA Sequence
Sun, Jiangming; Singh, Pratibha; Bagge, Annika; Valtat, Bérengère; Vikman, Petter; Spégel, Peter; Mulder, Hindrik
2016-01-01
RNA editing is a post-transcriptional alteration of RNA sequences that, via insertions, deletions or base substitutions, can affect protein structure as well as RNA and protein expression. Recently, it has been suggested that RNA editing may be more frequent than previously thought. A great impediment, however, to a deeper understanding of this process is the enormous sequencing effort that needs to be undertaken to identify RNA editing events. Here, we describe an in silico approach, based on machine learning, that ameliorates this problem. Using 41-nucleotide-long DNA sequences, we show that novel A-to-I RNA editing events can be predicted from known A-to-I RNA editing events intra- and interspecies. The validity of the proposed method was verified in an independent experimental dataset. Using our approach, 203,202 putative A-to-I RNA editing events were predicted in the whole human genome. Of these, 9% were previously reported. The remaining sites require further validation, e.g., by targeted deep sequencing. In conclusion, the approach described here is a useful tool to identify potential A-to-I RNA editing events without the requirement of extensive RNA sequencing. PMID:27764195
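A sketch of the sequence-based prediction pipeline: one-hot encode 41-nt windows centred on a candidate adenosine and train a classifier. We use scikit-learn's random forest on random stand-in sequences purely to show the mechanics; the authors' actual model choice and training data differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot(seq):
    """One-hot encode a 41-nt window centred on the candidate A site."""
    x = np.zeros((len(seq), 4))
    for i, b in enumerate(seq):
        x[i, BASES[b]] = 1.0
    return x.ravel()

rng = np.random.default_rng(1)
def random_window():
    # Stand-in data: real training would use known A-to-I sites (positives)
    # and non-edited adenosines (negatives).
    s = rng.choice(list("ACGT"), 41)
    s[20] = "A"                     # centre position must be an adenosine
    return "".join(s)

pos = [random_window() for _ in range(200)]
neg = [random_window() for _ in range(200)]
X = np.array([one_hot(s) for s in pos + neg])
y = np.array([1] * 200 + [0] * 200)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("P(editing) for a new window:",
      clf.predict_proba([one_hot(random_window())])[0, 1])
```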
Secure access control and large scale robust representation for online multimedia event detection.
Liu, Changyu; Lu, Bin; Li, Huiling
2014-01-01
We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large-scale robust representation issue when integrating traditional event detection algorithms into an online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches.
Mortensen, Martin B; Afzal, Shoaib; Nordestgaard, Børge G; Falk, Erling
2015-12-22
Guidelines recommend initiating primary prevention for atherosclerotic cardiovascular disease (ASCVD) with statins based on absolute ASCVD risk assessment. Recently, alternative trial-based and hybrid approaches were suggested for determining statin treatment eligibility. This study compared these approaches in a direct head-to-head fashion in a contemporary population. The study used the CGPS (Copenhagen General Population Study), with 37,892 subjects aged 40 to 75 years recruited in 2003 to 2008, all free of ASCVD, diabetes, and statin use at baseline. Among the population studied, 42% were eligible for statin therapy according to the 2013 American College of Cardiology/American Heart Association (ACC/AHA) risk assessment and cholesterol treatment guidelines approach, versus 56% with the trial-based approach and 21% with the hybrid approach. Among these statin-eligible subjects, the ASCVD event rate per 1,000 person-years was 9.8, 6.8, and 11.2, respectively. The ACC/AHA-recommended absolute risk score was well calibrated around the 7.5% 10-year ASCVD risk treatment threshold and discriminated better than the trial-based or hybrid approaches. Compared with the ACC/AHA risk-based approach, the net reclassification index for statin therapy eligibility among 40- to 75-year-old subjects from the CGPS was -0.21 for the trial-based approach and -0.13 for the hybrid approach. The clinical performance of the ACC/AHA risk-based approach for primary prevention of ASCVD with statins was superior to the trial-based and hybrid approaches. Our results indicate that the ACC/AHA guidelines will prevent more ASCVD events than the trial-based and hybrid approaches, while treating fewer people compared with the trial-based approach.
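The net reclassification index reported above can be computed as sketched below for binary eligibility rules; the simulated risks and events are made up and only illustrate the calculation.

```python
import numpy as np

def nri(old_eligible, new_eligible, had_event):
    """Net reclassification index for statin-eligibility rules.

    'Eligible' is the positive classification: moving an eventual ASCVD
    case into the eligible group counts up; moving a non-case out of it
    also counts up. Inputs are boolean arrays.
    """
    old = np.asarray(old_eligible)
    new = np.asarray(new_eligible)
    ev = np.asarray(had_event)
    up, down = new & ~old, old & ~new
    nri_events = up[ev].mean() - down[ev].mean()
    nri_nonevents = down[~ev].mean() - up[~ev].mean()
    return nri_events + nri_nonevents

rng = np.random.default_rng(2)
risk = rng.random(10_000)                    # simulated 10-year risk scores
event = rng.random(10_000) < risk * 0.02     # higher risk -> more events
acc_aha = risk >= 0.50                       # reference rule (illustrative)
trial_based = risk >= 0.35                   # broader alternative rule
print(f"NRI of trial-based vs reference: {nri(acc_aha, trial_based, event):+.3f}")
```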
Improving Distributed Diagnosis Through Structural Model Decomposition
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew John; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon; Pulido, Belarmino
2011-01-01
Complex engineering systems require efficient fault diagnosis methodologies, but centralized approaches do not scale well, and this motivates the development of distributed solutions. This work presents an event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, by using the structural model decomposition capabilities provided by Possible Conflicts. We develop a distributed diagnosis algorithm that uses residuals computed by extending Possible Conflicts to build local event-based diagnosers based on global diagnosability analysis. The proposed approach is applied to a multitank system, and results demonstrate an improvement in the design of local diagnosers. Since local diagnosers use only a subset of the residuals, and use subsystem models to compute residuals (instead of the global system model), the local diagnosers are more efficient than previously developed distributed approaches.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption relates to the likelihood values of input events, and the second concerns the interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework for a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express the interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed.
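To illustrate the flavor of non-crisp likelihoods, here is a minimal sketch that propagates interval-valued basic-event probabilities through a toy two-level fault tree under an independence assumption; the dependency-coefficient machinery and the fuzzy membership functions of the paper are omitted, and all numbers are invented.

```python
def and_gate(p, q):
    """Interval probability of an AND gate with independent inputs."""
    return (p[0] * q[0], p[1] * q[1])

def or_gate(p, q):
    """Interval probability of an OR gate with independent inputs."""
    return (1 - (1 - p[0]) * (1 - q[0]), 1 - (1 - p[1]) * (1 - q[1]))

# Basic-event likelihoods elicited as intervals (partial ignorance)
# instead of crisp probabilities. Values are illustrative.
pump_fails = (0.01, 0.03)
valve_fails = (0.005, 0.02)
alarm_fails = (0.001, 0.004)

# Top event of a toy fault tree: (pump OR valve) AND alarm.
top = and_gate(or_gate(pump_fails, valve_fails), alarm_fails)
print(f"top-event probability in [{top[0]:.2e}, {top[1]:.2e}]")
```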
Continuous event monitoring via a Bayesian predictive approach.
Di, Jianing; Wang, Daniel; Brashear, H Robert; Dragalin, Vladimir; Krams, Michael
2016-01-01
In clinical trials, continuous monitoring of event incidence rate plays a critical role in making timely decisions affecting trial outcome. For example, continuous monitoring of adverse events protects the safety of trial participants, while continuous monitoring of efficacy events helps identify early signals of efficacy or futility. Because the endpoint of interest is often the event incidence associated with a given length of treatment duration (e.g., incidence proportion of an adverse event with 2 years of dosing), assessing the event proportion before reaching the intended treatment duration becomes challenging, especially when the event onset profile evolves over time with accumulated exposure. In particular, in the earlier part of the study, ignoring censored subjects may result in significant bias in estimating the cumulative event incidence rate. Such a problem is addressed using a predictive approach in the Bayesian framework. In the proposed approach, experts' prior knowledge about both the frequency and timing of the event occurrence is combined with observed data. More specifically, during any interim look, each event-free subject will be counted with a probability that is derived using prior knowledge. The proposed approach is particularly useful in early stage studies for signal detection based on limited information. But it can also be used as a tool for safety monitoring (e.g., data monitoring committee) during later stage trials. Application of the approach is illustrated using a case study where the incidence rate of an adverse event is continuously monitored during an Alzheimer's disease clinical trial. The performance of the proposed approach is also assessed and compared with other Bayesian and frequentist methods via simulation. Copyright © 2015 John Wiley & Sons, Ltd.
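A minimal numeric sketch of the predictive counting idea: each event-free subject at an interim look contributes the predictive probability of having the event by the end of the intended duration. The point prior and the exponential onset profile below are illustrative stand-ins for the elicited expert priors described in the paper.

```python
import numpy as np

def predictive_event_count(exposure_years, horizon, p_event, onset_rate):
    """Sum over event-free subjects of P(event by horizon | event-free now),
    assuming each subject eventually has the event with probability
    p_event and onset times follow an exponential(onset_rate) profile."""
    F = lambda t: 1.0 - np.exp(-onset_rate * np.asarray(t, dtype=float))
    t = np.asarray(exposure_years, dtype=float)
    num = p_event * (F(horizon) - F(t))   # event occurs in (t, horizon]
    den = 1.0 - p_event * F(t)            # subject is event-free at t
    return float(np.sum(num / den))

# three event-free subjects at 0.5, 1.0 and 1.5 years of a 2-year endpoint
print(predictive_event_count([0.5, 1.0, 1.5], 2.0, p_event=0.10, onset_rate=1.2))
```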
NASA Astrophysics Data System (ADS)
Ahmad, Kashif; Conci, Nicola; Boato, Giulia; De Natale, Francesco G. B.
2017-11-01
Over the last few years, a rapid growth has been witnessed in the number of digital photos produced per year. This growth poses challenges for the organization and management of multimedia collections, and one viable solution consists of arranging the media on the basis of the underlying events. However, album-level annotation and the presence of irrelevant pictures in photo collections make event-based organization of personal photo albums a challenging task. To tackle these challenges, in contrast to conventional approaches relying on supervised learning, we propose a pipeline for event recognition in personal photo collections based on a multiple instance learning (MIL) strategy. MIL is a modified form of supervised learning that fits well for applications with weakly labeled data. The experimental evaluation of the proposed approach is carried out on two large-scale datasets, a self-collected one and a benchmark one. On both, our approach significantly outperforms the existing state of the art.
NASA Astrophysics Data System (ADS)
Konapala, Goutam; Mishra, Ashok
2017-12-01
The quantification of spatio-temporal hydroclimatic extreme events is key to water resources planning, disaster mitigation, and preparing a climate-resilient society. However, quantifying these extreme events has always been a great challenge, one further compounded by climate variability and change. Recently, complex network theory has been applied in the earth science community to investigate spatial connections among hydrologic fluxes (e.g., rainfall and streamflow) in the water cycle, but applications of the theory to hydroclimatic extreme events remain limited. This article provides an overview of complex networks and extreme events, the event synchronization method, the construction of networks, their statistical significance, and the associated network evaluation metrics. For illustration, we apply the complex network approach to study the spatio-temporal evolution of droughts in the continental USA (CONUS). Different drought thresholds define different drought events, with different socio-economic implications, so it is of interest to explore the role of thresholds in the spatio-temporal evolution of drought through network analysis. In this study, the long-term (1900-2016) Palmer drought severity index (PDSI) was selected for spatio-temporal drought analysis using three network-based metrics (strength, direction, and distance). The results indicate that drought events propagate differently at different thresholds associated with the initiation of drought events. The direction metric indicates that onsets of mild drought events usually propagate in a more spatially clustered and uniform manner than onsets of moderate droughts. The distance metric shows that drought events propagate over longer distances in the western part of CONUS than in the eastern part. We believe that the network-aided metrics utilized in this study can be an important tool for advancing our knowledge of drought propagation as well as other hydroclimatic extreme events. Although drought propagation is investigated here using the network approach, process-based (physics-based) approaches are essential to further understand the dynamics of hydroclimatic extreme events.
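As a sketch of how one of the network metrics above can be computed, the snippet below derives node strength from a pairwise event-synchronization matrix; the thresholding rule and the input values are illustrative, not the paper's significance test.

```python
import numpy as np

def node_strength(sync, threshold):
    """Strength of each grid cell in an event-synchronization network:
    sum of the significant link weights attached to the cell."""
    adj = np.where(sync >= threshold, sync, 0.0)  # keep significant links
    np.fill_diagonal(adj, 0.0)                    # no self-links
    return adj.sum(axis=1)

sync = np.array([[1.0, 0.6, 0.1],
                 [0.6, 1.0, 0.4],
                 [0.1, 0.4, 1.0]])
print(node_strength(sync, threshold=0.3))  # -> [0.6, 1.0, 0.4]
```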
Not Another Quiz: An Approach to Engage Today's Students in Meaningful Current Events Discussions
ERIC Educational Resources Information Center
Wright, Leigh L.; Shemberger, Melony; Price, Elizabeth
2016-01-01
Journalism professors are concerned with how effectively students understand current news events and engage with mainstream news sources. This essay is based on a survey administered to students in a newswriting course and analyzes the kinds of current news that students followed in weekly assignments designed with a digital, interactive approach.…
Choi, Yun Ho; Yoo, Sung Jin
2018-06-01
This paper investigates the event-triggered decentralized adaptive tracking problem for a class of uncertain interconnected nonlinear systems with unexpected actuator failures. It is assumed that local control signals are transmitted to local actuators with time-varying faults whenever predefined conditions for triggering events are satisfied. In contrast to the existing control-input-based event-triggering strategy for adaptive control of uncertain nonlinear systems, this paper proposes a tracking-error-based event-triggering strategy within a decentralized adaptive fault-tolerant tracking framework. The proposed approach mitigates the drastic changes in control inputs that actuator faults cause under the existing triggering strategy. The stability of the proposed event-triggering control system is analyzed in the Lyapunov sense. Finally, simulation comparisons of the proposed and existing approaches show the effectiveness of the proposed theoretical result in the presence of actuator faults. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
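A minimal sketch of a tracking-error-based trigger of the kind described above; the relative-plus-absolute threshold form and the parameter names are illustrative assumptions, not the paper's exact condition.

```python
def should_transmit(error_now, error_at_last_event,
                    delta_abs=0.01, delta_rel=0.2):
    """Fire a triggering event (transmit the local control signal) when
    the tracking error has drifted sufficiently since the last event."""
    return abs(error_now - error_at_last_event) >= \
        delta_rel * abs(error_now) + delta_abs
```

Between events the actuator holds the last transmitted signal, which is what keeps the network traffic low.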
An analysis of the 2016 Hitomi breakup event
NASA Astrophysics Data System (ADS)
Flegel, Sven; Bennett, James; Lachut, Michael; Möckel, Marek; Smith, Craig
2017-04-01
The breakup of Hitomi (ASTRO-H) on 26 March 2016 is analysed. Debris from the fragmentation is used to estimate the time of the event by propagating backwards and estimating the close approach with the parent object. Based on this method, the breakup is predicted to have occurred at approximately 01:42 UTC on 26 March 2016. The Gaussian variation-of-parameters equations, based on the instantaneous orbits at the predicted time of the event, are solved to gain additional insight into the on-orbit position of Hitomi at the time of the event and to test an alternate approach to determining the event epoch and location. A conjunction analysis is carried out between Hitomi and all catalogued objects in orbit around the estimated time of the anomaly. Several debris objects have close approaches with Hitomi; however, there is no evidence that the breakup was caused by a catalogued object. Debris from both of the largest fragmentation events, the Iridium 33-Cosmos 2251 collision in 2009 and the intentional destruction of Fengyun 1C in 2007, is involved in close approaches with Hitomi, indicating the persistent threat these events pose to subsequent space missions. To quantify the magnitude of a potential conjunction, the fragmentation resulting from a collision with the debris is modelled using the EVOLVE-4 breakup model, with debris characteristics estimated from two-line element data. This analysis is indicative of the threat that mission planners face due to the growing debris population. The impact of the actual event on the environment is investigated based on the debris associated with Hitomi currently contained in the United States Strategic Command's catalogue. A look at the active missions in the orbital vicinity of Hitomi reveals that the Hubble Space Telescope is among the spacecraft that may be immediately affected by the new debris.
A Typological Approach to Translation of English and Chinese Motion Events
ERIC Educational Resources Information Center
Deng, Yu; Chen, Huifang
2012-01-01
English and Chinese are satellite-framed languages in which Manner is usually incorporated with Motion in the verb and Path is denoted by the satellite. Based on Talmy's theory of motion event and typology, the research probes into translation of English and Chinese motion events and finds that: (1) Translation of motion events in English and…
ERIC Educational Resources Information Center
McDermott, Kathleen B.; Szpunar, Karl K.; Christ, Shawn E.
2009-01-01
In designing experiments to investigate retrieval of event memory, researchers choose between utilizing laboratory-based methods (in which to-be-remembered materials are presented to participants) and autobiographical approaches (in which the to-be-remembered materials are events from the participant's pre-experimental life). In practice, most…
Masquerade Detection Using a Taxonomy-Based Multinomial Modeling Approach in UNIX Systems
2008-08-25
[Abstract fragment; only partially recoverable.] The approach relies primarily on the modeling of statistical features, such as the frequency of events, the duration of events, and the co-occurrence of multiple events; once these are identified, features representing such behavior are extracted while auditing the user's commands. [Figure 1: Taxonomy of Linux and Unix commands. Table fragment: Method / Hit Rate / False Positive Rate; one-class SVM using simple commands (frequency-based).] The best results are achieved when the features are extracted just from simple commands.
NASA Astrophysics Data System (ADS)
Keilis-Borok, V. I.; Soloviev, A.; Gabrielov, A.
2011-12-01
We describe a uniform approach to predicting different extreme events, also known as critical phenomena, disasters, or crises. The following types of such events are considered: strong earthquakes; economic recessions (their onset and termination); surges of unemployment; surges of crime; and electoral changes of the governing party. A uniform approach is possible due to a common feature of these events: each is generated by a hierarchical dissipative complex system. After coarse-graining, such systems exhibit regular behavior patterns, among which we look for "premonitory patterns" that signal the approach of an extreme event. We introduce a methodology, based on optimal control theory, that assists disaster management in choosing an optimal set of preparedness measures to undertake in response to a prediction. Predictions with their currently realistic (limited) accuracy do allow a considerable part of the damage to be prevented by a hierarchy of preparedness measures. The accuracy of a prediction should be known, but it need not be high.
On-Board Event-Based State Estimation for Trajectory Approaching and Tracking of a Vehicle
Martínez-Rey, Miguel; Espinosa, Felipe; Gardel, Alfredo; Santos, Carlos
2015-01-01
For the problem of pose estimation of an autonomous vehicle using networked external sensors, the processing capacity and battery consumption of these sensors, as well as the communication channel load should be optimized. Here, we report an event-based state estimator (EBSE) consisting of an unscented Kalman filter that uses a triggering mechanism based on the estimation error covariance matrix to request measurements from the external sensors. This EBSE generates the events of the estimator module on-board the vehicle and, thus, allows the sensors to remain in stand-by mode until an event is generated. The proposed algorithm requests a measurement every time the estimation distance root mean squared error (DRMS) value, obtained from the estimator's covariance matrix, exceeds a threshold value. This triggering threshold can be adapted to the vehicle's working conditions rendering the estimator even more efficient. An example of the use of the proposed EBSE is given, where the autonomous vehicle must approach and follow a reference trajectory. By making the threshold a function of the distance to the reference location, the estimator can halve the use of the sensors with a negligible deterioration in the performance of the approaching maneuver. PMID:26102489
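A minimal sketch of the triggering rule described above: the DRMS is read off the position block of the filter's estimation-error covariance, and a measurement is requested only when it exceeds a threshold that can depend on the distance to the reference. The index choices and the adaptation rule are illustrative.

```python
import numpy as np

def drms(P, pos_idx=(0, 1)):
    # distance root-mean-squared error from the position variances on
    # the diagonal of the estimation-error covariance matrix P
    return float(np.sqrt(sum(P[i, i] for i in pos_idx)))

def request_measurement(P, dist_to_reference, base_threshold=0.5):
    # adapt the threshold to the maneuver: demand tighter estimates
    # (more sensor use) as the vehicle nears the reference location
    threshold = base_threshold * max(dist_to_reference, 0.1)
    return drms(P) > threshold
```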
A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS)
Rigi, Amin
2018-01-01
In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution, and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials, and weights to compare the precision and response time of the proposed approach. The approach is validated using a conventional high-speed camera (1,000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled environment, which added high noise levels that significantly affected the results. Nevertheless, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential for the sensor to be used in manipulation applications, especially in dynamic environments. PMID:29364190
Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection
Liu, Changyu; Li, Huiling
2014-01-01
We developed an online multimedia event detection (MED) system. However, two issues arise when integrating traditional event detection algorithms into the online environment: secure access control and large-scale robust representation. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments conducted on the CloudSim simulation platform showed that the TPSAC model is suitable for access control in dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed event classification experiments on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches. PMID:25147840
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive material detection. Compared with commonly adopted detection methods based on statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times when analyzing spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the sequential Bayesian methodology was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator based on Monte Carlo sampling to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks
Zhang, Shukui; Chen, Hao; Zhu, Qiaoming
2014-01-01
Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it considers multiple properties that reflect an event's status, the composite event is more consistent with the objective world, which makes composite-event research more realistic. In this paper, we analyze the characteristics of composite events; we then propose a criterion to determine the area of a composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. To address the unreliability of some data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision-based composite event decision mechanism. When the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can effectively filter out faulty node data and reduce the influence of erroneous data on event determination. The fuzzy-decision-based composite event judgment mechanism retains the strengths of fuzzy-logic-based algorithms; moreover, it does not require the support of a huge rule base, and its computational complexity is small. Compared to the CollECT and CDS algorithms, the proposed algorithm improves detection accuracy and reduces traffic. PMID:25136690
A multidisciplinary approach to trace Asian dust storms from source to sink
NASA Astrophysics Data System (ADS)
Yan, Yan; Sun, Youbin; Ma, Long; Long, Xin
2015-03-01
Tracing the source of dust storms (DS) reaching mega-cities of northern China currently suffers from ambiguities among different approaches, including source-sink proxy comparison, air mass back-trajectory modeling, and satellite image monitoring. By integrating the advantages of all three methods, we present a multidisciplinary approach to trace the provenance of dust fall in Xi'an during the spring season (March to May) of 2012. We collected daily dust fall to calculate dust flux variation, and detected eight DS events with remarkably high flux values based on meteorological comparison and an extreme detection algorithm. By combining MODIS images and accompanying real-time air mass back trajectories, we attribute four of them to natural DS events and the other four to anthropogenic DS events, suggesting the importance of both natural and anthropogenic processes in supplying long-range transported dust. The primary sources of these DS events were constrained to three possible areas: the northern Chinese deserts, the Taklimakan desert, and the Gurbantunggut desert. Proxy comparisons based upon the quartz crystallinity index and oxygen isotopes further confirmed the source-to-sink linkage between the natural DS events in Xi'an and dust emissions from the northern Chinese deserts. The integration of geochemical and meteorological tracing approaches favors a dominant contribution of short-distance transport to modern dust fall on the Chinese Loess Plateau. Our study shows that the multidisciplinary approach could permit better source identification of modern dust and, applied properly, could help trace provenance fluctuations of geological dust deposits.
Forbes, David; Lewis, Virginia; Varker, Tracey; Phelps, Andrea; O'Donnell, Meaghan; Wade, Darryl J; Ruzek, Josef I; Watson, Patricia; Bryant, Richard A; Creamer, Mark
2011-01-01
International clinical practice guidelines for the management of psychological trauma recommend Psychological First Aid (PFA) as an early intervention for survivors of potentially traumatic events. These recommendations are consensus-based, and there is little published evidence assessing the effectiveness of PFA. This is not surprising given the nature of the intervention and the complicating factors involved in any evaluation of PFA. There is, nevertheless, an urgent need for stronger evidence evaluating its effectiveness. The current paper posits that the implementation and evaluation of PFA within high risk organizational settings is an ideal place to start. The paper provides a framework for a phasic approach to implementing PFA within such settings and presents a model for evaluating its effectiveness using a logic- or theory-based approach which considers both pre-event and post-event factors. Phases 1 and 2 of the PFA model are pre-event actions, and phases 3 and 4 are post-event actions. It is hoped that by using the Phased PFA model and evaluation method proposed in this paper, future researchers will begin to undertake the important task of building the evidence about the most effective approach to providing PFA in high risk organizational and community disaster settings.
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-08-01
SplitLab is a powerful and widely used tool for analysing seismological shear wave splitting from single-event measurements. In many cases, however, especially for temporary station deployments close to the noisy seaside or ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. To extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The ability to switch between different analysis approaches at any time allows the user the most flexible processing of individual multi-event splitting measurements for a single recording station. Beyond the functions provided by the plugin, no external program is needed for the multi-event analyses, since StackSplit runs within the existing MATLAB-based SplitLab structure. The effectiveness and use of this plugin are demonstrated with data from a long-running seismological recording station in Finland.
A semi-supervised learning framework for biomedical event extraction based on hidden topics.
Zhou, Deyu; Zhong, Dayou
2015-05-01
Scientists have devoted decades of effort to understanding protein interactions and RNA production. This information could strengthen current knowledge of drug reactions and of the development of certain diseases. Nevertheless, the lack of explicit structure in the life science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Biomedical event extraction, the automatic acquisition of knowledge about molecular events from research articles, has therefore attracted community-wide efforts recently. Most approaches are based on statistical models and require large-scale annotated corpora to precisely estimate model parameters, which are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning is therefore a feasible solution for biomedical event extraction and is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in them are used to describe the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system, and that the similarity between sentences can be precisely described by hidden topics together with the structures of the sentences. Copyright © 2015 Elsevier B.V. All rights reserved.
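A minimal sketch of a sentence distance combining hidden-topic and structural components, in the spirit of the framework above; the Jensen-Shannon choice, the linear mix, and alpha are illustrative assumptions, not the paper's exact measure.

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence between two topic distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0  # terms with a == 0 contribute nothing
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def sentence_distance(topics_u, topics_a, structural_dist, alpha=0.5):
    """Distance between an un-annotated and an annotated sentence as a
    mix of hidden-topic distance and structural distance (illustrative)."""
    return alpha * js_divergence(topics_u, topics_a) + (1 - alpha) * structural_dist
```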
Clinical risk management approach for long-duration space missions.
Gray, Gary W; Sargsyan, Ashot E; Davis, Jeffrey R
2010-12-01
In the process of crewmember evaluation and certification for long-duration orbital missions, the International Space Station (ISS) Multilateral Space Medicine Board (MSMB) encounters a surprisingly wide spectrum of clinical problems. Some of these conditions are identified within the ISS Medical Standards as requiring special consideration, or as falling outside the consensus Medical Standards promulgated for the ISS program. To assess the suitability for long-duration missions on ISS for individuals with medical problems that fall outside of standards or are otherwise of significant concern, the MSMB has developed a risk matrix approach to assess the risks to the individual, the mission, and the program. The goal of this risk assessment is to provide a more objective, evidence- and risk-based approach for aeromedical disposition. Using a 4 x 4 risk matrix, the probability of an event is plotted against the potential impact. Event probability is derived from a detailed review of clinical and aerospace literature, and based on the best available evidence. The event impact (consequences) is assessed and assigned within the matrix. The result has been a refinement of MSMB case assessment based on evidence-based data incorporated into a risk stratification process. This has encouraged an objective assessment of risk and, in some cases, has resulted in recertification of crewmembers with medical conditions which hitherto would likely have been disqualifying. This paper describes a risk matrix approach developed for MSMB disposition decisions. Such an approach promotes objective, evidence-based decision-making and is broadly applicable within the aerospace medicine community.
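A minimal sketch of a 4 x 4 probability-by-impact lookup of the kind described; the class labels and dispositions are invented for illustration and are not the MSMB's actual matrix.

```python
# rows: event probability class (1 = remote ... 4 = frequent)
# columns: event impact class (1 = minimal ... 4 = catastrophic)
RISK_MATRIX = [
    ["accept", "accept", "review", "review"],
    ["accept", "review", "review", "waiver"],
    ["review", "review", "waiver", "disqualify"],
    ["review", "waiver", "disqualify", "disqualify"],
]

def disposition(probability_class, impact_class):
    return RISK_MATRIX[probability_class - 1][impact_class - 1]

print(disposition(2, 3))  # -> "review"
```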
ERIC Educational Resources Information Center
Ball, B. Hunter; Brewer, Gene A.
2018-01-01
The present study implemented an individual differences approach in conjunction with response time (RT) variability and distribution modeling techniques to better characterize the cognitive control dynamics underlying ongoing task cost (i.e., slowing) and cue detection in event-based prospective memory (PM). Three experiments assessed the relation…
The comparison of various approach to evaluation erosion risks and design control erosion measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present there is one methodology in the Czech Republic for computing and comparing erosion risks, and it also contains a method for designing erosion control measures. It is based on the Universal Soil Loss Equation (USLE) and its result, the long-term average annual rate of erosion (G), and it is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many of the problems and damages stem from local episodic erosion events. The extent and impact of these events depend on local precipitation, the current crop phase, and soil conditions. Such erosion events can damage agricultural land, municipal property, and hydraulic structures even at locations that are in good condition from the viewpoint of the long-term average annual erosion rate. An alternative way to compute and compare erosion risks is the episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events have been recorded. The study area is simple agricultural land without barriers that could strongly influence water flow and sediment transport. For all methodologies, the computation of erosion risks was based on laboratory analyses of soil samples taken in the study area. Results of USLE, MUSLE, and the mathematical model EROSION-3D were compared, and the differences in the spatial distribution of the places with the highest soil erosion were examined and discussed. A further part presents the differences in erosion control measures designed on the basis of the different methodologies. The results show that the computed erosion risks vary with the methodology used; these variations can open a discussion about how to compute and evaluate erosion risks in areas of differing importance.
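For reference, the USLE underlying the Czech methodology estimates the long-term average annual soil loss as a product of empirical factors:

```latex
G = R \cdot K \cdot L \cdot S \cdot C \cdot P
```

where R is rainfall erosivity, K soil erodibility, L slope length, S slope steepness, C cover-management, and P the support-practice factor. MUSLE replaces the rainfall erosivity term with a runoff-based term, which is what allows individual erosion episodes, rather than long-term averages, to be evaluated.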
Accounting for Heaping in Retrospectively Reported Event Data – A Mixture-Model Approach
Bar, Haim Y.; Lillard, Dean R.
2012-01-01
When event data are retrospectively reported, more temporally distal events tend to get “heaped” on even multiples of reporting units. Heaping may introduce a type of attenuation bias because it causes researchers to mismatch time-varying right-hand side variables. We develop a model-based approach to estimate the extent of heaping in the data, and how it affects regression parameter estimates. We use smoking cessation data as a motivating example, but our method is general. It facilitates the use of retrospective data from the multitude of cross-sectional and longitudinal studies worldwide that collect and potentially could collect event data. PMID:22733577
An interdisciplinary approach for earthquake modelling and forecasting
NASA Astrophysics Data System (ADS)
Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.
2016-12-01
Earthquakes are among the most serious disasters and may cause heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries, so effective earthquake forecasting (of time, location, and magnitude) has become extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology over the past 30 years, we are now able to evaluate the performance of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory, and in most cases the precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the present status of the process is controlled by its own past events (self-exciting) and by external factors in the past (mutually exciting). In essence, the model is a time-varying Poisson process with conditional intensity λ(t), composed of the background rate, a self-exciting term (the information from past seismic events), and an external excitation term (the information from past non-seismic observations). This model shows a way to integrate catalog-based and non-catalog-based forecasts. Against this background, we are trying to develop a new earthquake forecast model that combines catalog-based and non-catalog-based approaches.
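Written out, the conditional intensity described above takes the generic Hawkes-type form (the kernels g and h are generic placeholders, not the authors' specific parameterization):

```latex
\lambda(t) = \mu + \sum_{i:\, t_i < t} g(t - t_i) + \sum_{j:\, \tau_j < t} h(t - \tau_j)
```

where \mu is the background rate, the first sum is the self-exciting contribution of past seismic events t_i, and the second sum is the external (mutually exciting) contribution of past non-seismic observations \tau_j.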
Theodoro, Daniel; Bausano, Brian; Lewis, Lawrence; Evanoff, Bradley; Kollef, Marin
2010-04-01
The safest site for central venous cannulation (CVC) remains debated. Many emergency physicians (EPs) advocate the ultrasound-guided internal jugular (USIJ) approach because of data supporting its efficiency. However, a number of physicians prefer, and are most comfortable with, the subclavian (SC) vein approach. The purpose of this study was to describe adverse event rates among operators using the USIJ approach, and the landmark SC vein approach without US. This was a prospective observational trial of patients undergoing CVC of the SC or internal jugular veins in the emergency department (ED). Physicians performing the procedures did not undergo standardized training in either technique. The primary outcome was a composite of adverse events defined as hematoma, arterial cannulation, pneumothorax, and failure to cannulate. Physicians recorded the anatomical site of cannulation, US assistance, indications, and acute complications. Variables of interest were collected from the pharmacy and ED record. Physician experience was based on a self-reported survey. The authors followed outcomes of central line insertion until device removal or patient discharge. Physicians attempted 236 USIJ and 132 SC cannulations on 333 patients. The overall adverse event rate was 22% with failure to cannulate being the most common. Adverse events occurred in 19% of USIJ attempts, compared to 29% of non-US-guided SC attempts. Among highly experienced operators, CVCs placed at the SC site resulted in more adverse events than those performed using USIJ (relative risk [RR] = 1.89, 95% confidence interval [CI] = 1.05 to 3.39). While limited by observational design, our results suggest that the USIJ technique may result in fewer adverse events compared to the landmark SC approach.
NASA Astrophysics Data System (ADS)
Bialas, James; Oommen, Thomas; Rebbapragada, Umaa; Levin, Eugene
2016-07-01
Object-based approaches in the segmentation and classification of remotely sensed images yield more promising results compared to pixel-based approaches. However, the development of an object-based approach presents challenges in terms of algorithm selection and parameter tuning. Subjective methods are often used, but yield less than optimal results. Objective methods are warranted, especially for rapid deployment in time-sensitive applications, such as earthquake damage assessment. Herein, we used a systematic approach in evaluating object-based image segmentation and machine learning algorithms for the classification of earthquake damage in remotely sensed imagery. We tested a variety of algorithms and parameters on post-event aerial imagery for the 2011 earthquake in Christchurch, New Zealand. Results were compared against manually selected test cases representing different classes. In doing so, we can evaluate the effectiveness of the segmentation and classification of different classes and compare different levels of multistep image segmentations. Our classifier is compared against recent pixel-based and object-based classification studies for postevent imagery of earthquake damage. Our results show an improvement against both pixel-based and object-based methods for classifying earthquake damage in high resolution, post-event imagery.
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance is playing an increasingly important role in social life. Real-time alerting of threatening events and searching for content of interest in large volumes of stored video footage require a human operator to pay full attention to the monitors for long periods, and this labor-intensive mode limits the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach compensates for background motion in real time; frame differencing detects the foreground; HOG-based classifiers classify foreground objects into people and cars; and mean-shift tracks the recognized objects. Events are detected based on predefined rules. The maturity of these algorithms guarantees the robustness of the framework, and the improved approach and easily checked rules enable the framework to work in real time. Future work is also discussed.
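A minimal sketch of the frame-differencing stage (after background-motion compensation), assuming OpenCV; the threshold and kernel size are illustrative.

```python
import cv2

def foreground_mask(prev_gray, curr_gray, thresh=25):
    """Detect foreground as large inter-frame intensity changes."""
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # morphological opening suppresses isolated noise pixels
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```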
Chen, Yen-Lin; Liang, Wen-Yew; Chiang, Chuan-Yen; Hsieh, Tung-Ju; Lee, Da-Cheng; Yuan, Shyan-Ming; Chang, Yang-Lang
2011-01-01
This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared lights captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noises. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise and errors during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the phase of blob tracking associates the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. The touch event recognition process can identify meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. PMID:22163990
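A minimal sketch of the bright-blob extraction and connected-component stage, assuming OpenCV; Otsu thresholding stands in for the paper's automatic multilevel histogram thresholding, and min_area is an invented noise filter.

```python
import cv2

def extract_touch_blobs(frame_gray, min_area=30):
    """Return (centroid, area) for each candidate touch blob."""
    _, mask = cv2.threshold(frame_gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    return [(tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
            for i in range(1, n)                 # label 0 is background
            if stats[i, cv2.CC_STAT_AREA] >= min_area]
```

The tracking stage would then associate these blobs across frames by spatial proximity to recognize moving, pressing, hovering, or clicking actions.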
Embedded security system for multi-modal surveillance in a railway carriage
NASA Astrophysics Data System (ADS)
Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry
2015-10-01
Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio event detection with intrusion detections from video processing. The audio analysis consists of modeling the normal ambience and detecting deviations from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent event detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts; a GMM is used to catch the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events are not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
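A minimal sketch of the unsupervised normal-ambience model described above, using scikit-learn's GaussianMixture as a stand-in for the clustered GMM modeling in the paper; the synthetic features and the 1st-percentile alarm level are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
ambience = rng.normal(0.0, 1.0, size=(5000, 13))   # stand-in MFCC frames
gmm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(ambience)

# flag frames whose log-likelihood falls below the quietest 1% of the
# training ambience as "unusual" audio events
alarm_level = np.percentile(gmm.score_samples(ambience), 1)
test = rng.normal(3.0, 1.0, size=(10, 13))         # deviant frames
print(gmm.score_samples(test) < alarm_level)       # mostly True
```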
Predicting performance and safety based on driver fatigue.
Mollicone, Daniel; Kan, Kevin; Mott, Chris; Bartels, Rachel; Bruneau, Steve; van Wollen, Matthew; Sparrow, Amy R; Van Dongen, Hans P A
2018-04-02
Fatigue causes decrements in vigilant attention and reaction time and is a major safety hazard in the trucking industry. There is a need to quantify the relationship between driver fatigue and safety in terms of operationally relevant measures. Hard-braking events are a suitable measure for this purpose as they are relatively easily observed and are correlated with collisions and near-crashes. We developed an analytic approach that predicts driver fatigue based on a biomathematical model and then estimates hard-braking events as a function of predicted fatigue, controlling for time of day to account for systematic variations in exposure (traffic density). The analysis used de-identified data from a previously published, naturalistic field study of 106 U.S. commercial motor vehicle (CMV) drivers. Data analyzed included drivers' official duty logs, sleep patterns measured around the clock using wrist actigraphy, and continuous recording of vehicle data to capture hard-braking events. The curve relating predicted fatigue to hard-braking events showed that the frequency of hard-braking events increased as predicted fatigue levels worsened. For each increment on the fatigue scale, the frequency of hard-braking events increased by 7.8%. The results provide proof of concept for a novel approach that predicts fatigue based on drivers' sleep patterns and estimates driving performance in terms of an operational metric related to safety. The approach can be translated to practice by CMV operators to achieve a fatigue risk profile specific to their own settings, in order to support data-driven decisions about fatigue countermeasures that cost-effectively deliver quantifiable operational benefits. Copyright © 2018 Elsevier Ltd. All rights reserved.
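In concrete terms, if the 7.8% increase per increment compounds multiplicatively (an assumption the abstract implies but does not state), the hard-braking rate at fatigue level k relative to level 0 is:

```python
rate_ratio = lambda k: 1.078 ** k
print(rate_ratio(4))  # ≈ 1.35: four increments up the fatigue scale
                      # correspond to roughly 35% more hard-braking events
```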
Forecast Based Financing for Managing Weather and Climate Risks to Reduce Potential Disaster Impacts
NASA Astrophysics Data System (ADS)
Arrighi, J.
2017-12-01
There is a critical window of time to reduce potential impacts of a disaster after a forecast for heightened risk is issued and before an extreme event occurs. The concept of Forecast-based Financing focuses on this window of opportunity. Through advanced preparation during system set-up, tailored methodologies are used to 1) analyze a range of potential extreme event forecasts, 2) identify emergency preparedness measures that can be taken when factoring in forecast lead time and inherent uncertainty and 3) develop standard operating procedures that are agreed on and tied to guaranteed funding sources to facilitate emergency measures led by the Red Cross or government actors when preparedness measures are triggered. This presentation will focus on a broad overview of the current state of theory and approaches used in developing a forecast-based financing systems - with a specific focus on hydrologic events, case studies of success and challenges in various contexts where this approach is being piloted, as well as what is on the horizon to be further explored and developed from a research perspective as the application of this approach continues to expand.
Assessing the Value of Information for Identifying Optimal Floodplain Management Portfolios
NASA Astrophysics Data System (ADS)
Read, L.; Bates, M.; Hui, R.; Lund, J. R.
2014-12-01
Floodplain management is a complex portfolio problem that can be analyzed from an integrated perspective incorporating traditionally structural and nonstructural options. One method to identify effective strategies for preparing, responding to, and recovering from floods is to optimize for a portfolio of temporary (emergency) and permanent floodplain management options. A risk-based optimization approach to this problem assigns probabilities to specific flood events and calculates the associated expected damages. This approach is currently limited by: (1) the assumption of perfect flood forecast information, i.e. implementing temporary management activities according to the actual flood event may differ from optimizing based on forecasted information and (2) the inability to assess system resilience across a range of possible future events (risk-centric approach). Resilience is defined here as the ability of a system to absorb and recover from a severe disturbance or extreme event. In our analysis, resilience is a system property that requires integration of physical, social, and information domains. This work employs a 3-stage linear program to identify the optimal mix of floodplain management options using conditional probabilities to represent perfect and imperfect flood stages (forecast vs. actual events). We assess the value of information in terms of minimizing damage costs for two theoretical cases - urban and rural systems. We use portfolio analysis to explore how the set of optimal management options differs depending on whether the goal is for the system to be risk-adverse to a specified event or resilient over a range of events.
Activity Recognition on Streaming Sensor Data.
Krishnan, Narayanan C; Cook, Diane J
2014-02-01
Many real-world applications that address the needs of a human require information about the activities the human is performing in real time. While advances in pervasive computing have led to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far been evaluated only on scripted or pre-segmented sequences of sensor events. In this paper we propose and evaluate a sliding-window-based approach that performs activity recognition in an online, streaming fashion, recognizing activities as new sensor events are recorded. To account for the fact that different activities are best characterized by different window lengths of sensor events, we incorporate time decay and mutual-information-based weighting of sensor events within a window. Additional contextual information, in the form of the previous activity and the activity of the previous window, is also appended to the feature describing a sensor window. Experiments evaluating these techniques on real-world smart home datasets suggest that combining mutual-information-based weighting of sensor events with past contextual information in the feature leads to the best performance for streaming activity recognition.
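A minimal sketch of the weighted window feature described above; the exponential decay form and the mutual-information lookup are illustrative renderings of the paper's weighting ideas, not its exact formulas.

```python
import numpy as np

def window_feature(sensor_ids, n_sensors, decay=0.9, mi=None):
    """Weighted sensor-count feature for one sliding window.

    sensor_ids : sensor index per event, oldest first; the newest
        event anchors the window. Each event is discounted by
        decay**age and, optionally, by a mutual-information weight
        between its sensor and the window's last sensor.
    """
    feat = np.zeros(n_sensors)
    last = sensor_ids[-1]
    for age, sid in enumerate(reversed(sensor_ids)):
        w = decay ** age
        if mi is not None:
            w *= mi[last, sid]
        feat[sid] += w
    return feat

print(window_feature([2, 0, 2, 1], n_sensors=3))  # -> [0.81, 1.0, 1.629]
```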
Detecting Earthquakes over a Seismic Network using Single-Station Similarity Measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-03-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network; event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected move-out. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to two weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalog (including 95% of the catalog events), and less than 1% of these candidate events are false detections.
Supervised Time Series Event Detector for Building Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-04-13
A machine learning based approach is developed to detect events that have rarely been seen in the historical data. The data can include building energy consumption, sensor data, environmental data, and any other data that may affect the building's energy consumption. The algorithm is a modified nonlinear Bayesian support vector machine, which examines daily energy consumption profiles, detects the days with abnormal events, and diagnoses the cause of the events.
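A minimal sketch of the detector's flavor using scikit-learn's OneClassSVM on daily load profiles; this is a simplified stand-in for the report's modified nonlinear Bayesian SVM, and all data are synthetic.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal_days = rng.normal(10.0, 1.0, size=(200, 24))  # 24 hourly kWh values
detector = OneClassSVM(kernel="rbf", nu=0.05).fit(normal_days)

new_day = rng.normal(10.0, 1.0, size=(1, 24))
new_day[0, 18:22] += 8.0            # inject an abnormal evening load
print(detector.predict(new_day))    # -1 flags the day as an event
```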
Perez, Miguel A; Sudweeks, Jeremy D; Sears, Edie; Antin, Jonathan; Lee, Suzanne; Hankey, Jonathan M; Dingus, Thomas A
2017-06-01
Understanding causal factors for traffic safety-critical events (e.g., crashes and near-crashes) is an important step in reducing their frequency and severity. Naturalistic driving data offers unparalleled insight into these factors, but requires identification of situations where crashes are present within large volumes of data. Sensitivity and specificity of these identification approaches are key to minimizing the resources required to validate candidate crash events. This investigation used data from the Second Strategic Highway Research Program Naturalistic Driving Study (SHRP 2 NDS) and the Canada Naturalistic Driving Study (CNDS) to develop and validate different kinematic thresholds that can be used to detect crash events. Results indicate that the sensitivity of many of these approaches can be quite low, but can be improved by selecting particular threshold levels based on detection performance. Additional improvements in these approaches are possible, and may involve leveraging combinations of different detection approaches, including advanced statistical techniques and artificial intelligence approaches, additional parameter modifications, and automation of validation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
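A minimal sketch of a kinematic-threshold detector of the kind evaluated above: flag samples where longitudinal acceleration drops below a deceleration threshold, with a debounce gap so one braking maneuver is not counted repeatedly. The threshold and gap values are illustrative, not the SHRP 2 NDS or CNDS settings.

```python
import numpy as np

def hard_braking_events(accel_g, fs, threshold_g=-0.45, min_gap_s=1.0):
    """Return candidate event times (s) from a longitudinal accel trace."""
    accel_g = np.asarray(accel_g, dtype=float)
    idx = np.flatnonzero(accel_g <= threshold_g)
    events, last = [], -np.inf
    for i in idx:
        if (i - last) / fs >= min_gap_s:   # debounce adjacent triggers
            events.append(i / fs)
        last = i
    return events
```

Raising the threshold magnitude improves specificity at the cost of sensitivity, which is exactly the trade-off such studies quantify.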
Tzallas, A T; Karvelis, P S; Katsis, C D; Fotiadis, D I; Giannopoulos, S; Konitsiotis, S
2006-01-01
The aim of the paper is to analyze transient events in inter-ictal EEG recordings and classify epileptic activity as focal or generalized epilepsy using an automated method. A two-stage approach is proposed. In the first stage, the observed transient events of a single channel are classified into four categories: epileptic spike (ES), muscle activity (EMG), eye blinking activity (EOG), and sharp alpha activity (SAA). The process is based on an artificial neural network; different architectures were tried, and the network with the lowest error was selected using the hold-out approach. In the second stage, a knowledge-based system produces a diagnosis of focal or generalized epileptic activity. The classification of transient events achieved high overall accuracy (84.48%), while the knowledge-based system for epilepsy diagnosis correctly classified nine out of ten cases. The proposed method is advantageous since it effectively detects and classifies the undesirable activity into appropriate categories and produces a final outcome related to the existence of epilepsy.
NASA Astrophysics Data System (ADS)
Nijland, Linda; Arentze, Theo; Timmermans, Harry
2014-01-01
Modeling multi-day planning has so far received scarce attention in activity-based transport demand modeling. However, new dynamic activity-based approaches are currently being developed. The frequency and inflexibility of planned activities and events in individuals' activity schedules indicate the importance of incorporating pre-planned activities into the new generation of dynamic travel demand models. Elaborating and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that takes into account the possible influences of pre-planned activities and events. This paper describes the theory and shows the results of simulations of the extension. The simulation was conducted for six different activities, and the parameter values used were consistent with an earlier estimation study. The results show that the model works well and that the influences of the parameters are consistent and logical and have clear interpretations. These findings offer further evidence of the face and construct validity of the suggested modeling approach.
Scott, J; Botsis, T; Ball, R
2014-01-01
Spontaneous Reporting Systems [SRS] are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations, as well as mimicking specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that using proportional reporting ratio [PRR] > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms. This approach is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations needs to be further explored by evaluating other types of simulated signals with a broader range of data mining approaches, and comparing network-based simulations with other simulation strategies where applicable.
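For reference, the PRR compared at the two thresholds above is computed from a 2 x 2 contingency table of reports; the counts in the example are invented.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio.
    a: reports with the product and the event   b: product, other events
    c: other products with the event            d: other products, other events
    """
    return (a / (a + b)) / (c / (c + d))

value = prr(a=30, b=970, c=400, d=98600)
print(value, value > 3.0)  # ≈ 7.4, True -> flagged at the stricter threshold
```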
Managing ecological drought and flood within a nature-based approach. Reality or illusion?
NASA Astrophysics Data System (ADS)
Halbac-Cotoara-Zamfir, Rares; Finger, David; Stolte, Jannes
2017-04-01
Water hazard events, amplified by improperly implemented water management, may lead to the ecological degradation of ecosystems. Traditional water management has generally sought to dampen the natural variability of water flows in different types of ecosystems to attain steady and dependable water supplies for domestic and industrial uses, irrigation, navigation, and hydropower, and to moderate extreme water conditions such as floods and droughts. Ecological drought can be defined as a prolonged and widespread deficit in available water supplies, including changes in natural and managed hydrology, that creates multiple stresses across ecosystems; it has become a critical concern among researchers, being a phenomenon much more complex than other types of drought and requiring a specific approach. The impact of drought on ecosystem services leads to the necessity of identifying and implementing eco-reclamation measures that can generate better ecological responses to droughts. Ecological flood is the type of flood analyzed in full consideration of ecological issues; the analysis addresses four key aspects: connectivity of the water system, landscapes of rivers and lakes, mobility of water bodies, and safety of flood control. Consequently, both ecological drought and ecological flood represent major challenges for ecologically sustainable water management when identifying structural and non-structural measures that cover human demands without causing the affected ecosystems to degrade or simplify. An ecological flood and drought control system will combine the needs of the ecosystems with flood and drought control measures. The components of an ecosystem's natural flow regime, defined by magnitude, frequency, duration, and peak timing (high or low flows), interact to maintain ecosystem productivity. This productivity can be impaired by altered flow regimes, generally due to structural measures designed to control flooding. However, from an ecological perspective, floods are not disasters in the sense that human society typically views them. Considering all of the above, it is clear that events such as floods and droughts cannot be avoided, but the hydrological extremes related to these events can be managed sustainably through actions based on two interconnected approaches: a prevention approach and a post-event management approach. The main objective remains to limit the consequences of water hazards on socio-economic sectors, together with the need to recover quickly and sustainably after such an event. However, the question remains: can ecological flood and ecological drought be managed through a nature-based approach? This paper focuses on a theoretical analysis of these "ecological" hydro-meteorological events and debates a possible nature-based approach for their sustainable management.
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammar by learning scene semantics. The framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically, and abnormal behaviors that disobey scene semantics or event grammar rules are detected. This method yields an approach to understanding video scenes; furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
Three decades of disasters: a review of disaster-specific literature from 1977-2009.
Smith, Erin; Wasiak, Jason; Sen, Ayan; Archer, Frank; Burkle, Frederick M
2009-01-01
The potential for disasters exists in all communities. To mitigate the potential catastrophes that confront humanity in the new millennium, an evidence-based approach to disaster management is required urgently. This study moves toward such an evidence-based approach by identifying peer-reviewed publications following a range of disasters and events over the past three decades. Peer-reviewed, event-specific literature was identified using a comprehensive search of the electronically indexed database, MEDLINE (1956-January 2009). An extended comprehensive search was conducted for one event to compare the event-specific literature indexed in MEDLINE to other electronic databases (EMBASE, CINAHL, AMED, CENTRAL, Psych Info, Maternity and Infant Care, EBM Reviews). Following 25 individual disasters or overwhelming crises, a total of 2,098 peer-reviewed, event-specific publications were published in 789 journals (652 publications following disasters/events caused by natural hazards, 966 following human-made/technological disasters/events, and 480 following conflict/complex humanitarian events). The event with the greatest number of peer-reviewed, event-specific publications was the 11 September 2001 terrorist attacks (686 publications). Prehospital and Disaster Medicine published the greatest number of peer-reviewed, event-specific publications (54), followed by Journal of Traumatic Stress (42), Military Medicine (40), and Psychiatric Services (40). The primary topics of event-specific publications were mental health, medical health, and response. When an extended, comprehensive search was conducted for one event, 75% of all peer-reviewed, event-specific publications were indexed in MEDLINE. A broad range of multi-disciplinary journals publish peer-reviewed, event-specific publications. While the majority of peer-reviewed, event-specific literature is indexed in MEDLINE, comprehensive search strategies should include EMBASE to increase yield.
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Machine learning methods may complement traditional analytic methods for medical device surveillance. Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs, drawing on two propensity score (PS) models: one specified by subject-matter experts (PS-SME) and the other by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement among safety signals detected/not detected was 90.9% between the time-to-event and DELTA approaches (360 of 396, κ = 0.068), 91.7% between the time-to-event and embedded feature-selection approaches (363 of 396, κ = -0.028), and 88.1% between the DELTA and embedded feature-selection approaches (349 of 396, κ = -0.042). Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
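The agreement figures above pair raw per-pair agreement with Cohen's kappa; with rare detected signals, raw agreement can be high while kappa stays near zero, as in the reported values. A sketch of the computation (call vectors fabricated stand-ins for the 396 device-signal pairs):

    # Raw agreement plus chance-corrected kappa over paired
    # detected(1)/not-detected(0) calls from two approaches.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    calls_tte   = np.array([1, 1, 0, 0, 1, 0, 0, 0])
    calls_delta = np.array([1, 0, 0, 0, 0, 0, 1, 0])
    raw_agreement = np.mean(calls_tte == calls_delta)   # fraction identical
    kappa = cohen_kappa_score(calls_tte, calls_delta)   # chance-corrected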
A process-based standard for the Solar Energetic Particle Event Environment
NASA Astrophysics Data System (ADS)
Gabriel, Stephen
For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several available models should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a 'process-based' standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which not only could embody quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard approach overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used have to be clearly and unambiguously defined and subject to peer review. If the model meets all of these requirements then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; most importantly, however, it allows something which has so far been impossible without ambiguities and disagreement: a comparison of the results of the various models. To date, one of the problems (if not the major one) in comparing the results of the different SEPE statistical models has been caused by two things: 1) the data set and 2) the definition of an event. Because unravelling the dependencies of the outputs of different statistical models on these two parameters is extremely difficult, if not impossible, comparison of the results from the different models is currently also extremely difficult and can lead to controversies, especially over which model is the correct one. Hence, when it comes to using these models for engineering purposes to calculate, for example, the radiation dose for a particular mission, the user, who in all likelihood is not an expert in this field, could be given two (or even more) very different environments and find it impossible to know how to select one (or even how to compare them). What is proposed, then, is a process-based standard which, in common with nearly all of the current models, is composed of three elements: a standard data set, a standard event definition, and a resulting standard event list. The standard event list is the output of this standard and can then be used with any of the existing (or indeed future) models that are based on events. This standard event list is completely traceable and transparent and represents a reference event list for the whole community. When coupled with a statistical model, the results, when compared, will depend only on the statistical model and not on the data set or event definition.
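To make the proposal concrete, a sketch of how a standard event list follows mechanically from a data set plus an agreed event definition (threshold, minimum quiet gap, and data are placeholders, not the working group's values):

    # "Standard event definition -> standard event list": with an agreed
    # flux threshold and minimum gap, the event list is reproducible.
    def extract_events(times, flux, threshold, min_gap):
        events, start, last_end = [], None, None
        for t, f in zip(times, flux):
            if f >= threshold and start is None:
                if events and t - last_end < min_gap:
                    start = events.pop()[0]   # merge across a too-short gap
                else:
                    start = t
            elif f < threshold and start is not None:
                events.append((start, t))
                last_end, start = t, None
        if start is not None:
            events.append((start, times[-1]))
        return events

    flux = [1, 2, 50, 80, 3, 2, 60, 1, 1]
    print(extract_events(range(len(flux)), flux, threshold=10, min_gap=3))
    # -> [(2, 7)]: the short dip below threshold is merged into one event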
Tsunami Modeling of Hikurangi Trench M9 Events: Case Study for Napier, New Zealand
NASA Astrophysics Data System (ADS)
Williams, C. R.; Nyst, M.; Farahani, R.; Bryngelson, J.; Lee, R.; Molas, G.
2015-12-01
RMS has developed a tsunami model of New Zealand for the insurance industry to price and manage tsunami risk. A key tsunamigenic source for New Zealand is the Hikurangi Trench, which lies offshore on the east side of the North Island. The trench is the result of the subduction of the Pacific Plate beneath the North Island at a rate of 40-45 mm/yr. Though there have been no historical M9 events on the Hikurangi Trench, events in this magnitude range are considered in the latest version of the National Seismic Hazard Maps for New Zealand (Stirling et al., 2012). The RMS modeling approaches the tsunami lifecycle in three stages: event generation, ocean wave propagation, and coastal inundation. Tsunami event generation is modeled based on seafloor deformation resulting from an event rupture model. Ocean wave propagation and coastal inundation are modeled using an RMS-developed numerical solver, implemented on graphics processing units using a finite-volume approach to approximate the two-dimensional shallow-water wave equations over the ocean and complex topography. As the tsunami waves enter shallow water and approach the coast, the RMS model calculates the propagation of the waves along the wet-dry interface, considering variable land friction. The initiation and characteristics of the tsunami are based on the event rupture model. As there have been no historical M9 events on the Hikurangi Trench, this rupture characterization posed unique challenges. This study examined the impacts of a suite of event rupture models to understand the key drivers of the variations in the tsunami inundation footprints. The goal was to develop a suite of tsunamigenic event characterizations that represent the range of potential tsunami outcomes for M9 events on the Hikurangi Trench. The focus of this case study is the Napier region, as it represents an important exposure concentration and has experienced tsunami inundation in the past, including during the 1931 Ms 7.8 Hawkes Bay earthquake.
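For reference, the two-dimensional shallow-water equations that such finite-volume solvers approximate can be written in conservative form (a textbook statement with friction terms omitted, not RMS's exact formulation; h is water depth, u and v depth-averaged velocities, g gravity, z bed elevation):

    \partial_t h + \partial_x (hu) + \partial_y (hv) = 0
    \partial_t (hu) + \partial_x\!\left(hu^2 + \tfrac{1}{2} g h^2\right) + \partial_y (huv) = -gh\,\partial_x z
    \partial_t (hv) + \partial_x (huv) + \partial_y\!\left(hv^2 + \tfrac{1}{2} g h^2\right) = -gh\,\partial_y z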
An Event Driven Hybrid Identity Management Approach to Privacy Enhanced e-Health
Sánchez-Guerrero, Rosa; Almenárez, Florina; Díaz-Sánchez, Daniel; Marín, Andrés; Arias, Patricia; Sanvido, Fabio
2012-01-01
Credential-based authorization offers interesting advantages for ubiquitous scenarios involving limited devices such as sensors and personal mobile equipment: the verification can be done locally; it has a lower computational cost than its competitors for issuing, storing, and verification; and it naturally supports rights delegation. The main drawback is the revocation of rights. Revocation requires handling potentially large revocation lists, or using protocols to check the revocation status, bringing extra communication costs not acceptable for sensors and other limited devices. Moreover, effective revocation of consent, considered a privacy rule in sensitive scenarios, has not been fully addressed. This paper proposes an event-based mechanism empowering a new concept, the sleepyhead credentials, which allows time constraints and explicit revocation to be replaced by activating and deactivating authorization rights according to events. Our approach integrates this concept into IdM systems in a hybrid model supporting delegation, which can be an interesting alternative for scenarios where revocation of consent and user privacy are critical. The delegation includes a SAML-compliant protocol, which we have validated through a proof-of-concept implementation. This article also explains the mathematical model describing the event-based approach and offers estimations of the overhead introduced by the system. The paper focuses on health care scenarios, where we show the flexibility of the proposed event-based user consent revocation mechanism. PMID:22778634
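A toy sketch of the sleepyhead idea (not the paper's protocol or SAML flow; names are illustrative): rights granted by a credential are woken up and put back to sleep by events, instead of being bounded by expiry times or revocation lists.

    class SleepyheadCredential:
        def __init__(self, rights):
            self.rights = set(rights)   # rights the credential can ever grant
            self.active = set()         # rights currently awake

        def on_event(self, event, wake_map, sleep_map):
            self.active |= self.rights & set(wake_map.get(event, ()))
            self.active -= set(sleep_map.get(event, ()))

        def authorized(self, right):
            return right in self.active

    wake = {"consent_given": ["read_ehr"]}
    sleep = {"consent_revoked": ["read_ehr"]}
    cred = SleepyheadCredential({"read_ehr", "write_ehr"})
    cred.on_event("consent_given", wake, sleep)
    assert cred.authorized("read_ehr")
    cred.on_event("consent_revoked", wake, sleep)   # consent revoked by event
    assert not cred.authorized("read_ehr")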
Cross scale interactions, nonlinearities, and forecasting catastrophic events
Peters, Debra P.C.; Pielke, Roger A.; Bestelmeyer, Brandon T.; Allen, Craig D.; Munson-McGee, Stuart; Havstad, Kris M.
2004-01-01
Catastrophic events share characteristic nonlinear behaviors that are often generated by cross-scale interactions and feedbacks among system elements. These events result in surprises that cannot easily be predicted based on information obtained at a single scale. Progress on catastrophic events has focused on one of the following two areas: nonlinear dynamics through time without an explicit consideration of spatial connectivity [Holling, C. S. (1992) Ecol. Monogr. 62, 447–502] or spatial connectivity and the spread of contagious processes without a consideration of cross-scale interactions and feedbacks [Zeng, N., Neeling, J. D., Lau, L. M. & Tucker, C. J. (1999) Science 286, 1537–1540]. These approaches rarely have ventured beyond traditional disciplinary boundaries. We provide an interdisciplinary, conceptual, and general mathematical framework for understanding and forecasting nonlinear dynamics through time and across space. We illustrate the generality and usefulness of our approach by using new data and recasting published data from ecology (wildfires and desertification), epidemiology (infectious diseases), and engineering (structural failures). We show that decisions that minimize the likelihood of catastrophic events must be based on cross-scale interactions, and such decisions will often be counterintuitive. Given the continuing challenges associated with global change, approaches that cross disciplinary boundaries to include interactions and feedbacks at multiple scales are needed to increase our ability to predict catastrophic events and develop strategies for minimizing their occurrence and impacts. Our framework is an important step in developing predictive tools and designing experiments to examine cross-scale interactions.
Causal learning and inference as a rational process: the new synthesis.
Holyoak, Keith J; Cheng, Patricia W
2011-01-01
Over the past decade, an active line of research within the field of human causal learning and inference has converged on a general representational framework: causal models integrated with Bayesian probabilistic inference. We describe this new synthesis, which views causal learning and inference as a fundamentally rational process, and review a sample of the empirical findings that support the causal framework over associative alternatives. Causal events, like all events in the distal world as opposed to our proximal perceptual input, are inherently unobservable. A central assumption of the causal approach is that humans (and potentially nonhuman animals) have been designed in such a way as to infer the most invariant causal relations for achieving their goals based on observed events. In contrast, the associative approach assumes that learners only acquire associations among important observed events, omitting the representation of the distal relations. By incorporating Bayesian inference over distributions of causal strength and causal structures, along with noisy-logical (i.e., causal) functions for integrating the influences of multiple causes on a single effect, human judgments about causal strength and structure can be predicted accurately for relatively simple causal structures. Dynamic models of learning based on the causal framework can explain patterns of acquisition observed with serial presentation of contingency data and are consistent with available neuroimaging data. The approach has been extended to a diverse range of inductive tasks, including category-based and analogical inferences.
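One concrete noisy-logical function is the noisy-OR likelihood for independent generative causes; a minimal sketch (causal strengths fabricated):

    # Noisy-OR: probability of the effect given present causes, each with
    # causal strength w_i -- one instance of the "noisy-logical" functions
    # mentioned above; a sketch, not the authors' full model.
    def noisy_or(weights, causes):
        p_no_effect = 1.0
        for w, c in zip(weights, causes):
            p_no_effect *= (1.0 - w) ** c   # each present cause fails w.p. 1-w
        return 1.0 - p_no_effect

    p = noisy_or([0.8, 0.5], [1, 1])   # two present causes: 1 - 0.2*0.5 = 0.9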
Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J
2016-06-01
Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, and (3) splitting each such observation into two observations: one where the event occurs and one where it does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data, including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system.
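A minimal sketch of IPCW with a hand-rolled Kaplan-Meier estimate of the censoring survival function (fabricated data; no guard against a zero censoring-survival estimate; events are weighted at their own time, subjects followed past the horizon at the horizon, and censored-before-horizon subjects get weight zero):

    import numpy as np

    def ipcw_weights(times, events, horizon):
        # Kaplan-Meier estimate of G(t) = P(censoring time > t): censoring
        # (event = 0) is treated as the "event", observed outcomes as censored.
        order = np.argsort(times)
        t = np.asarray(times, dtype=float)[order]
        e = np.asarray(events)[order]
        G, G_steps = 1.0, []
        for i in range(len(t)):
            if e[i] == 0:                       # a censoring occurrence
                G *= 1.0 - 1.0 / (len(t) - i)   # len(t)-i still at risk
            G_steps.append(G)
        G_steps = np.asarray(G_steps)

        def G_of(x):                            # step-function lookup G(x)
            idx = np.searchsorted(t, x, side="right") - 1
            return G_steps[idx] if idx >= 0 else 1.0

        w = np.zeros(len(t))
        for i in range(len(t)):
            # horizon status known: event before horizon, or followed past it
            if (e[i] == 1 and t[i] <= horizon) or t[i] > horizon:
                w[i] = 1.0 / G_of(min(t[i], horizon))
        return w, t, e                          # aligned to time-sorted order

    w, t, e = ipcw_weights([2.1, 5.7, 3.3, 6.2], [1, 0, 0, 1], horizon=5.0)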
Detecting earthquakes over a seismic network using single-station similarity measures
NASA Astrophysics Data System (ADS)
Bergen, Karianne J.; Beroza, Gregory C.
2018-06-01
New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network: event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. the FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected moveout. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is both sensitive and maintains a low false detection rate. As a test case, we apply our approach to 2 weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalogue (including 95 per cent of the catalogue events), and less than 1 per cent of these candidate events are false detections.
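A sketch of the pseudo-association idea under simplifying assumptions (illustrative only, not the FAST implementation): single-station similar-pair detections (t1, t2) are kept as network candidates when both occurrence times match at another station within a tolerance, with no moveout model.

    def pseudo_associate(station_pairs, tol=3.0):
        # station_pairs: {station: [(t1, t2), ...]} detection times in seconds
        names = sorted(station_pairs)
        candidates = []
        for i, s1 in enumerate(names):
            for s2 in names[i + 1:]:
                for a1, a2 in station_pairs[s1]:
                    for b1, b2 in station_pairs[s2]:
                        if abs(a1 - b1) <= tol and abs(a2 - b2) <= tol:
                            candidates.append(((s1, s2), (a1, a2)))
        return candidates   # event pairs seen at two or more stations

    pairs = {"ST1": [(100.0, 860.0)], "ST2": [(101.5, 861.2), (400.0, 500.0)]}
    print(pseudo_associate(pairs))   # -> [(('ST1', 'ST2'), (100.0, 860.0))]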
Context-sensitive patch histograms for detecting rare events in histopathological data
NASA Astrophysics Data System (ADS)
Diaz, Kristians; Baust, Maximilian; Navab, Nassir
2017-03-01
Assessment of histopathological data is difficult not only due to its varying appearance, e.g. caused by staining artifacts, but also due to its sheer size: common whole-slide images feature a resolution of 6000x4000 pixels. Therefore, finding rare events in such data sets is a challenging and tedious task, and developing sophisticated computerized tools is not easy, especially when no or little training data is available. In this work, we propose a learning-free yet effective approach based on context-sensitive patch histograms for finding extramedullary hematopoiesis events in Hematoxylin-Eosin-stained images. When combined with a simple nucleus detector, this approach achieves performance levels in terms of sensitivity (0.7146), specificity (0.8476), and accuracy (0.8353) that are comparable to a recently published approach based on random forests.
NASA Astrophysics Data System (ADS)
Peres, David Johnny; Cancelliere, Antonino
2016-04-01
Assessment of shallow landslide hazard is important for appropriate planning of mitigation measures. Generally, the return period of slope instability is assumed as a quantitative metric to map landslide triggering hazard in a catchment. The most commonly applied approach to estimate such a return period consists in coupling a physically based landslide triggering model (hydrological and slope stability) with rainfall intensity-duration-frequency (IDF) curves. Among the drawbacks of such an approach, the following assumptions may be mentioned: (1) prefixed initial conditions, with no regard to their probability of occurrence, and (2) constant-intensity hyetographs. In our work we propose the use of a Monte Carlo simulation approach in order to investigate the effects of the two above-mentioned assumptions. The approach is based on coupling a physically based hydrological and slope stability model with a stochastic rainfall time series generator. With this methodology, a long series of synthetic rainfall data can be generated and given as input to a physically based landslide triggering model, in order to compute the return period of landslide triggering as the mean inter-arrival time of a factor of safety less than one. In particular, we couple the Neyman-Scott rectangular pulses model for hourly rainfall generation with the TRIGRS v.2 unsaturated model for the computation of the transient response to individual rainfall events. Initial conditions are computed by a water table recession model that links the initial conditions of a given event to the final response of the preceding event, thus taking into account the variable inter-arrival time between storms. One thousand years of synthetic hourly rainfall are generated to estimate return periods up to 100 years. Applications are first carried out to map landslide triggering hazard in the Loco catchment, located in a highly landslide-prone area of the Peloritani Mountains, Sicily, Italy. Then a set of additional simulations is performed in order to compare the results obtained by the traditional IDF-based method with the Monte Carlo ones. Results indicate that both the variability of initial conditions and that of intra-event rainfall intensity significantly affect return period estimation. In particular, the common assumption of an initial water table depth at the base of the pervious strata may in practice lead to an overestimation of the return period by up to one order of magnitude, while the assumption of constant-intensity hyetographs may yield an overestimation by a factor of two or three. Hence, it may be concluded that the analysed simplifications involved in the traditional IDF-based approach generally imply a non-conservative assessment of landslide triggering hazard.
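A toy sketch of the Monte Carlo return-period definition used above (the rainfall and stability models are stand-ins, not Neyman-Scott or TRIGRS):

    # Return period of triggering = mean inter-arrival time of factor-of-
    # safety values below one over a long synthetic record.
    import numpy as np

    rng = np.random.default_rng(1)
    years, n_storms = 1000.0, 20000                      # ~20 storms/year
    fs_min = 1.2 + 0.08 * rng.standard_normal(n_storms)  # min FS per storm
    storm_times = np.sort(rng.uniform(0.0, years, n_storms))

    trigger_times = storm_times[fs_min < 1.0]
    return_period = np.mean(np.diff(trigger_times))      # years between triggers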
NASA Astrophysics Data System (ADS)
Santillan, J. R.; Amora, A. M.; Makinano-Santillan, M.; Marqueso, J. T.; Cutamora, L. C.; Serviano, J. L.; Makinano, R. M.
2016-06-01
In this paper, we present a combined geospatial and two-dimensional (2D) flood modeling approach to assess the impacts of flooding due to extreme rainfall events. We developed and applied this approach to the Tago River Basin in the province of Surigao del Sur in Mindanao, Philippines, an area which suffered great damage from flooding caused by Tropical Storms Lingling and Jangmi in 2014. The geospatial component of the approach involves extraction of several layers of information, such as detailed topography/terrain and man-made features (buildings, roads, bridges) from 1-m spatial resolution LiDAR Digital Surface and Terrain Models (DSMs/DTMs), and recent land cover from Landsat 7 ETM+ and Landsat 8 OLI images. We then used these layers as inputs in developing a Hydrologic Engineering Center Hydrologic Modeling System (HEC-HMS)-based hydrologic model, and a hydraulic model based on the 2D module of the latest version of HEC River Analysis System (RAS), to dynamically simulate and map the depth and extent of flooding due to extreme rainfall events. The extreme rainfall events used in the simulation represent six hypothetical rainfall events with return periods of 2, 5, 10, 25, 50, and 100 years. For each event, maximum flood depth maps were generated from the simulations, and these maps were further transformed into hazard maps by categorizing the flood depth into low, medium, and high hazard levels. Using both the flood hazard maps and the layers of information extracted from remotely sensed datasets in spatial overlay analysis, we were then able to estimate and assess the impacts of these flooding events on buildings, roads, bridges, and land cover. The assessments revealed increases in the number of buildings, roads, and bridges, and in the area of land cover, exposed to various flood hazards as rainfall events become more extreme. The wealth of information generated from the flood impact assessment using this approach can be very useful to the local government units and the concerned communities within the Tago River Basin as an aid in determining in advance which infrastructure (buildings, roads, and bridges) and land cover can be affected by different extreme rainfall flood scenarios.
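A sketch of the depth-to-hazard categorization step (the depth breakpoints are illustrative placeholders, not the study's values):

    import numpy as np

    def hazard_class(depth_m, low_max=0.5, medium_max=1.5):
        # 0 = dry, 1 = low, 2 = medium, 3 = high
        return np.digitize(depth_m, [1e-6, low_max, medium_max])

    depths = np.array([0.0, 0.2, 0.9, 2.4])   # simulated max depths, metres
    print(hazard_class(depths))               # -> [0 1 2 3]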
Comparing Stream DOC Fluxes from Sensor- and Sample-Based Approaches
NASA Astrophysics Data System (ADS)
Shanley, J. B.; Saraceno, J.; Aulenbach, B. T.; Mast, A.; Clow, D. W.; Hood, K.; Walker, J. F.; Murphy, S. F.; Torres-Sanchez, A.; Aiken, G.; McDowell, W. H.
2015-12-01
DOC transport by streamwater is a significant flux that does not consistently show up in ecosystem carbon budgets. In an effort to quantify stream DOC flux, we analyzed three to four years of high-frequency in situ fluorescing dissolved organic matter (FDOM) concentrations and turbidity measured by optical sensors at the five diverse forested and/or alpine headwater sites of the U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) program. FDOM serves as a proxy for DOC. We also took discrete samples over a range of hydrologic conditions, using both manual weekly and automated event-based sampling. After compensating FDOM for temperature effects and turbidity interference, which was successful even at the high-turbidity Luquillo, PR site, we evaluated the DOC-FDOM relation based on discrete sample DOC analyses matched to corrected FDOM at the time of sampling. FDOM was a moderately robust predictor of DOC, with r2 from 0.60 to more than 0.95 among sites. We then formed continuous DOC time series by two independent approaches: (1) DOC predicted from FDOM; and (2) the composite method, based on modeled DOC from regression on stream discharge, season, air temperature, and time, forcing the model to observations and adjusting modeled concentrations between observations by linearly interpolated model residuals. DOC flux for each approach was then computed directly as concentration times discharge. DOC fluxes based on the sensor approach were consistently greater than those from the sample-based approach. At Loch Vale, CO (2.5 years) and Panola Mountain, GA (1 year), the difference was 5-17%. At Sleepers River, VT (3 years), preliminary differences were greater than 20%. The difference is driven by the highest events, but we are investigating these results further. We will also present comparisons from Luquillo, PR, and Allequash Creek, WI. The higher sensor-based DOC fluxes could result from their accuracy during hysteresis, which is difficult to model. In at least one case the higher sensor-based DOC flux was linked to an unsampled event outside the range of the concentration model. Sensors require upkeep and vigilance with the data, but have the potential to yield more accurate fluxes than sample-based approaches.
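A sketch of the sensor-based flux chain under the setup above (all numbers fabricated): calibrate DOC against corrected FDOM using discrete samples, predict a continuous DOC series, then integrate concentration times discharge.

    import numpy as np

    fdom_samples = np.array([10.0, 25.0, 40.0, 60.0])   # sensor units
    doc_samples = np.array([1.2, 2.6, 4.1, 6.3])        # mg/L from the lab
    slope, intercept = np.polyfit(fdom_samples, doc_samples, 1)

    fdom = np.array([12.0, 30.0, 55.0, 35.0])   # corrected 15-min FDOM record
    q = np.array([0.5, 1.8, 3.2, 1.1])          # discharge, m^3/s
    doc = slope * fdom + intercept              # predicted DOC, mg/L
    dt = 15 * 60                                # seconds per time step
    # mg/L * m^3/s * 1000 L/m^3 * s = mg; divide by 1e6 for kg
    flux_kg = np.sum(doc * q * 1000 * dt) / 1e6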
Sun, Xu; May, Andrew; Wang, Qingfeng
2016-05-01
This article describes an experimental study investigating the impact on user experience of two approaches to personalization of content provided on a mobile device for spectators at large sports events. A lab-based experiment showed that a system-driven approach to personalization was generally preferable, but that there were advantages to retaining some user control over the process. Usability implications for a hybrid approach, and design implications, are discussed, with general support for countermeasures designed to overcome recognised limitations of adaptive systems.
NASA Astrophysics Data System (ADS)
Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret
2003-12-01
A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.
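A generic sketch of the dynamic-programming idea (a placeholder within-segment cost stands in for the TD model error; this is not the authors' SBEL formulation): choose K segments of the signal minimizing total segment cost, then read the segment boundaries off as event locations.

    import math
    import numpy as np

    def dp_segment(n, K, cost):
        # best[k][j]: minimal cost of splitting frames [0, j) into k segments
        best = [[math.inf] * (n + 1) for _ in range(K + 1)]
        back = [[0] * (n + 1) for _ in range(K + 1)]
        best[0][0] = 0.0
        for k in range(1, K + 1):
            for j in range(k, n + 1):
                for i in range(k - 1, j):
                    c = best[k - 1][i] + cost(i, j)
                    if c < best[k][j]:
                        best[k][j], back[k][j] = c, i
        bounds, j = [], n
        for k in range(K, 0, -1):       # backtrack segment start indices
            bounds.append(back[k][j])
            j = back[k][j]
        return sorted(bounds[:-1])      # interior boundaries = event locations

    x = np.array([0.0, 0.1, 0.05, 1.0, 1.1, 0.9, 2.0, 2.1])
    cost = lambda i, j: float(np.var(x[i:j])) * (j - i)
    print(dp_segment(len(x), 3, cost))  # -> [3, 6]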
Rezaeian, Sanaz; Hartzell, Stephen; Sun, Xiaodan; Mendoza, Carlos
2017-01-01
Earthquake ground‐motion recordings are scarce in the central and eastern United States (CEUS) for large‐magnitude events and at close distances. We use two different simulation approaches, a deterministic physics‐based method and a site‐based stochastic method, to simulate ground motions over a wide range of magnitudes. Drawing on previous results for the modeling of recordings from the 2011 Mw 5.8 Mineral, Virginia, earthquake and using the 2001 Mw 7.6 Bhuj, India, earthquake as a tectonic analog for a large magnitude CEUS event, we are able to calibrate the two simulation methods over this magnitude range. Both models show a good fit to the Mineral and Bhuj observations from 0.1 to 10 Hz. Model parameters are then adjusted to obtain simulations for Mw 6.5, 7.0, and 7.6 events in the CEUS. Our simulations are compared with the 2014 U.S. Geological Survey weighted combination of existing ground‐motion prediction equations in the CEUS. The physics‐based simulations show comparable response spectral amplitudes and a fairly similar attenuation with distance. The site‐based stochastic simulations suggest a slightly faster attenuation of the response spectral amplitudes with distance for larger magnitude events and, as a result, slightly lower amplitudes at distances greater than 200 km. Both models are plausible alternatives and, given the few available data points in the CEUS, can be used to represent the epistemic uncertainty in modeling of postulated CEUS large‐magnitude events.
Shannon, Robin; Glowacki, David R
2018-02-15
The chemical master equation is a powerful theoretical tool for analyzing the kinetics of complex multiwell potential energy surfaces in a wide range of different domains of chemical kinetics spanning combustion, atmospheric chemistry, gas-surface chemistry, solution phase chemistry, and biochemistry. There are two well-established methodologies for solving the chemical master equation: a stochastic "kinetic Monte Carlo" approach and a matrix-based approach. In principle, the results yielded by both approaches are identical; the decision of which approach is better suited to a particular study depends on the details of the specific system under investigation. In this Article, we present a rigorous method for accelerating stochastic approaches by several orders of magnitude, along with a method for unbiasing the accelerated results to recover the "true" value. The approach we take in this paper is inspired by the so-called "boxed molecular dynamics" (BXD) method, which has previously only been applied to accelerate rare events in molecular dynamics simulations. Here we extend BXD to design a simple algorithmic strategy for accelerating rare events in stochastic kinetic simulations. Tests on a number of systems show that the results obtained using the BXD rare event strategy are in good agreement with unbiased results. To carry out these tests, we have implemented a kinetic Monte Carlo approach in MESMER, which is a cross-platform, open-source, and freely available master equation solver.
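The stochastic route named above is commonly realized as the Gillespie algorithm; a minimal sketch for a two-state toy system A <-> B (rates fabricated; the BXD biasing itself is not shown):

    import numpy as np

    rng = np.random.default_rng(2)
    k_ab, k_ba = 1e-3, 1.0      # s^-1; the slow A->B hop is the rare event
    state, t, t_end, n_ab = "A", 0.0, 1.0e4, 0

    while True:
        rate = k_ab if state == "A" else k_ba
        t += rng.exponential(1.0 / rate)   # exponential waiting time
        if t > t_end:
            break
        if state == "A":
            state, n_ab = "B", n_ab + 1
        else:
            state = "A"

    # crude rate estimate from hop counts (state A dominates the trajectory)
    k_ab_est = n_ab / t_end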
A new practice-driven approach to develop software in a cyber-physical system environment
NASA Astrophysics Data System (ADS)
Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei
2016-02-01
Cyber-physical system (CPS) is an emerging area, which cannot work efficiently without proper software handling of the data and business logic. Software and middleware is the soul of the CPS. The software development of CPS is a critical issue because of its complicity in a large scale realistic system. Furthermore, object-oriented approach (OOA) is often used to develop CPS software, which needs some improvements according to the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice, and has been evolved from software companies. It consists of (A) Requirement analysis in event-oriented way, (B) architecture design in data-oriented way, (C) detailed design and coding in object-oriented way and (D) testing in event-oriented way. It is a new approach based on OOA; the difference when compared with OOA is that the proposed approach has different emphases and measures in every stage. It is more accord with the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. It shows that the approach is also easy to be operated in the practice owing to some simplifications. The running result illustrates the validity of this approach.
User Centric Job Monitoring - a redesign and novel approach in the STAR experiment
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.; Zulkarneeva, Y.
2014-06-01
User Centric Monitoring (UCM) has been a long-awaited feature in STAR, whereby programs, workflows, and system "events" can be logged, broadcast, and later analyzed. UCM collects and filters available job monitoring information from various resources and presents it to users in a user-centric rather than an administrative-centric view. The first attempt at and implementation of a UCM approach was made in STAR in 2004 using a log4cxx plug-in back-end; it then evolved with an attempt to push toward a scalable database back-end (2006) and finally a Web-Service approach (2010, CSW4DB SBIR). The latter proved to be incomplete and did not address the evolving needs of the experiment, where streamlined messages for online (data acquisition) purposes and continuous support for data mining and event analysis need to coexist and be unified in a seamless approach. The code also proved hard to maintain. This paper presents the next evolutionary step of the UCM toolkit: a redesign and redirection of our latest attempt, acknowledging and integrating recent technologies in a simpler, maintainable, and yet scalable manner. The extended version of the job logging package is built upon a three-tier approach based on Task, Job, and Event, and features a Web-Service-based logging API, a responsive AJAX-powered user interface, and a database back-end relying on MongoDB, which is uniquely suited to STAR's needs. In addition, we present details of the integration of this logging package with the STAR offline and online software frameworks. Leveraging the reported experience of the ATLAS and CMS experiments with the ESPER engine, we discuss and show how such an approach has been implemented in STAR for meta-data event-triggered stream processing and filtering. An ESPER-based solution seems to fit well into the online data acquisition system, where many systems are monitored.
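A sketch of what the three-tier Task/Job/Event layout could look like on MongoDB with pymongo (collection and field names are hypothetical, not STAR's actual schema; assumes a local MongoDB server):

    from datetime import datetime, timezone
    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017")["ucm"]

    task_id = db.tasks.insert_one(
        {"name": "reco-pass", "user": "alice"}).inserted_id
    job_id = db.jobs.insert_one(
        {"task_id": task_id, "node": "wn042"}).inserted_id
    db.events.insert_one({
        "job_id": job_id,                      # event belongs to one job
        "level": "INFO",
        "msg": "stage 1 complete",
        "ts": datetime.now(timezone.utc),
    })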
A systematic comparison of recurrent event models for application to composite endpoints.
Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine
2018-01-04
Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects the fact that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints can be analyzed with models for recurrent events. A number of such models exist, e.g. regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there exists no systematic investigation for the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators which can deviate considerably under commonly met data scenarios. Based on the conducted simulation study, this paper helps in understanding the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
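For readers unfamiliar with these models, the counting-process data layout they consume looks like this (a sketch; column names and values are illustrative):

    # One row per at-risk interval per patient, as used by Andersen-Gill-type
    # recurrent event models.
    rows = [
        # (id, start, stop, event, group)
        (1, 0.0, 2.3, 1, "treatment"),   # first event at t = 2.3
        (1, 2.3, 5.1, 1, "treatment"),   # second event at t = 5.1
        (1, 5.1, 6.0, 0, "treatment"),   # censored at t = 6.0
        (2, 0.0, 4.2, 0, "control"),     # no event before censoring
    ]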
Cloud-Assisted UAV Data Collection for Multiple Emerging Events in Distributed WSNs
Cao, Huiru; Liu, Yongxin; Yue, Xuejun; Zhu, Wenjian
2017-01-01
In recent years, UAVs (Unmanned Aerial Vehicles) have been widely applied for data collection and image capture. Specifically, UAVs have been integrated with wireless sensor networks (WSNs) to create data collection platforms with high flexibility. However, most studies in this domain focus on system architecture and UAV flight trajectory planning, while event-related factors and other important issues are neglected. To address these challenges, we propose a cloud-assisted data gathering strategy for UAV-based WSNs in the light of emerging events. We also provide a cloud-assisted approach for deriving a UAV's optimal flying and data acquisition sequence over a WSN cluster. We validate our approach through simulations and experiments, which show that our methodology outperforms conventional approaches in terms of flying time, energy consumption, and integrity of data acquisition. We also conducted a real-world experiment using a UAV to collect data wirelessly from multiple clusters of sensor nodes deployed in a farm for monitoring an emerging event. Compared with the traditional method, the proposed approach requires less than half the flying time and achieves almost perfect data integrity. PMID:28783100
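A nearest-neighbour heuristic gives the flavor of deriving a visiting sequence over cluster heads; a sketch (coordinates fabricated; the paper's cloud-assisted optimisation is more involved):

    import math

    def visit_order(depot, clusters):
        # greedily fly to the closest unvisited cluster head
        order, here, todo = [], depot, dict(clusters)
        while todo:
            name = min(todo, key=lambda n: math.dist(here, todo[n]))
            order.append(name)
            here = todo.pop(name)
        return order

    clusters = {"c1": (120, 40), "c2": (30, 90), "c3": (60, 10)}
    print(visit_order((0, 0), clusters))   # -> ['c3', 'c1', 'c2']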
NASA Astrophysics Data System (ADS)
Chiadamrong, N.; Piyathanavong, V.
2017-12-01
Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies a pre-determined termination criterion. The effectiveness of the proposed approach is illustrated by an example, which shows results close to optimal obtained in much less solving time than with a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
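A minimal sketch of the iteration skeleton (solve_design and simulate are placeholders, not the paper's MILP or discrete-event models):

    # Analytical stage and simulation stage exchange parameters until
    # successive solutions agree within a tolerance.
    def hybrid_optimize(solve_design, simulate, tol=1e-3, max_iter=50):
        params, prev_cost = {}, float("inf")
        for _ in range(max_iter):
            design, cost = solve_design(params)   # analytical (e.g. MILP) stage
            params = simulate(design)             # simulation refines parameters
            if abs(prev_cost - cost) <= tol:      # termination criterion
                break
            prev_cost = cost
        return design, cost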
NASA Astrophysics Data System (ADS)
Bao, X.; Cai, X.; Liu, Y.
2009-12-01
Understanding the spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and the Open Geospatial Consortium KML standards show great promise for collaborative exploration of such events using visual analytical approaches. However, there are currently two barriers to wider usage of such approaches. First, there is no easy way to use open source tools to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web services-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is currently not available; users usually share their files in a portal but have no means to explore them visually without leaving the portal environment with which they are familiar. We have developed a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet standard 2.0 (JSR-286), and it is currently deployable in one of the most popular open source portal frameworks, Liferay. We have also developed an open source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate our tool with example cases in which drought and storm events with both time and space dimensions can be explored in this web-based KML animation portlet. The tool provides an easy-to-use web browser-based portal environment in which multiple users can collaboratively share and explore their time-aware KML files and improve their understanding of the spatiotemporal dynamics of hydrological events.
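For illustration, time-aware KML rests on the standard OGC <TimeSpan> element; a minimal generator sketch (names and coordinates fabricated, not the toolkit's code):

    # Emit one time-stamped KML placemark as a string.
    def timespan_placemark(name, lon, lat, begin, end):
        return f"""<Placemark>
      <name>{name}</name>
      <TimeSpan><begin>{begin}</begin><end>{end}</end></TimeSpan>
      <Point><coordinates>{lon},{lat},0</coordinates></Point>
    </Placemark>"""

    kml = timespan_placemark("storm cell", -88.2, 40.1,
                             "2008-06-07T00:00:00Z", "2008-06-08T00:00:00Z")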
Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.
Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong
2015-11-01
Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
Detecting modification of biomedical events using a deep parsing approach.
Mackinlay, Andrew; Martinez, David; Baldwin, Timothy
2012-04-30
This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
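A sketch of the shallow sliding-window bag-of-words features described above (feature naming is illustrative, not the system's exact scheme):

    # Collect tokens within `width` positions of the event trigger word.
    def window_features(tokens, trigger_idx, width=3):
        lo = max(0, trigger_idx - width)
        hi = min(len(tokens), trigger_idx + width + 1)
        return {f"bow:{tokens[i]}" for i in range(lo, hi) if i != trigger_idx}

    toks = "analysis of IkappaBalpha phosphorylation in cells".split()
    feats = window_features(toks, toks.index("phosphorylation"))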
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-04-01
The SplitLab package (Wüstefeld et al., Computers and Geosciences, 2008), written in MATLAB, is a powerful and widely used tool for analysing seismological shear wave splitting from single event measurements. However, in many cases, especially for temporary station deployments close to the sea or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be installed within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the user the most flexible processing of individual multi-event splitting measurements for a single recording station. Beyond the functions provided by the plugin, no external program is needed for the multi-event analyses, since StackSplit operates entirely within the available SplitLab structure.
NASA Astrophysics Data System (ADS)
Tweed, Fiona S.
2017-08-01
This special edition of Zeitschrift für Geomorphologie (ZfG) is based on presentations given at a conference entitled 'Hydrological Extreme Events in Historic and Prehistoric Times' which took place in Bonn in June 2014. The volume consists of an editorial introduction and nine research papers reflecting a range of approaches to understanding past events, including modelling, analysis of historical data and studies that focus on a consistent approach to collection and analysis of data from different areas. The HEX project, which generated the conference in Bonn, adopted a multidisciplinary approach and this is reflected in the collection of papers, which emphasise the importance of combining a range of approaches and analyses as tools for decoding both landscapes and processes.
Huang, Wei Tao; Luo, Hong Qun; Li, Nian Bing
2014-05-06
The most serious, and yet unsolved, problem in constructing molecular computing devices lies in connecting all of these molecular events into a usable device. This report demonstrates the use of a Boolean logic tree for analyzing a chemical event network based on graphene, an organic dye, a thrombin aptamer, and the Fenton reaction, organizing and connecting these basic chemical events. This chemical event network can be utilized to implement fluorescent combinatorial logic (including basic logic gates and complex integrated logic circuits) and fuzzy logic computing. On the basis of the Boolean logic tree analysis and logic computing, these basic chemical events can be treated as programmable "words" and chemical interactions as "syntax" logic rules to construct a molecular search engine for performing intelligent molecular search queries. Our approach is helpful in developing advanced logic programs based on molecules for applications in biosensing, nanotechnology, and drug delivery.
Yao, Bin; Kang, Hong; Miao, Qi; Zhou, Sicheng; Liang, Chen; Gong, Yang
2017-01-01
Patient falls are a common safety event type that impairs healthcare quality. Strategies for preventing patient falls, including solution tools and reporting systems, have been developed and implemented in the U.S. However, the current strategies do not include timely knowledge support, which is greatly needed to bridge the gap between reporting and learning. In this study, we constructed a knowledge base of fall events from expert-reviewed fall prevention solutions and then integrated it into a reporting system. The knowledge base enables timely and tailored knowledge support and thus can serve as a readily available fall prevention tool. This effort holds promise in making knowledge acquisition and management a routine process for enhancing the reporting and understanding of patient safety events.
Tsatsoulis, C; Amthauer, H
2003-01-01
A novel methodological approach for identifying clusters of similar medical incidents by analyzing large databases of incident reports is described. The discovery of similar events allows the identification of patterns and trends, and makes possible the prediction of future events and the establishment of barriers and best practices. Two techniques from the fields of information science and artificial intelligence, namely case-based reasoning and information retrieval, have been integrated, and very good clustering accuracies have been achieved on a test data set of incident reports from transfusion medicine. This work suggests that clustering should integrate the features of an incident captured in traditional form-based records together with the detailed information found in the narrative included in event reports. PMID:14645892
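A minimal sketch of the information-retrieval half of such a system might cluster free-text incident narratives by TF-IDF cosine similarity and take connected groups of linked reports as clusters. The toy reports and the 0.3 linking threshold below are invented; the actual system also integrates case-based reasoning over structured fields.

```python
# Sketch: cluster incident narratives by TF-IDF cosine similarity, then take
# connected components of the thresholded similarity graph as clusters.
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reports = [
    "unit mislabeled at collection, detected before transfusion",
    "mislabeled sample at collection site",
    "patient identification band missing at bedside check",
    "wrong patient wristband, caught at bedside verification",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(reports)
sim = cosine_similarity(tfidf)

# Link reports whose similarity exceeds a threshold (0.3 chosen arbitrarily)
adj = csr_matrix(sim > 0.3)
n_clusters, labels = connected_components(adj, directed=False)
print(n_clusters, labels)
```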
Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees.
Martínez-Aquino, Andrés
2016-08-01
Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host-parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence indices and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetic inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages implementing cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a "compass" when "walking" through jungles of tangled phylogenetic trees.
Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees
2016-01-01
Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host–parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence indices and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetic inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages implementing cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a “compass” when “walking” through jungles of tangled phylogenetic trees. PMID:29491928
NASA Astrophysics Data System (ADS)
Strauss, Cesar; Rosa, Marcelo Barbio; Stephany, Stephan
2013-12-01
Convective cells are cloud formations whose growth, maturation and dissipation are of great interest among meteorologists since they are associated with severe storms with large precipitation structures. Some works suggest a strong correlation between lightning occurrence and convective cells. The current work proposes a new approach to analyze the correlation between precipitation and lightning, and to identify electrically active cells. Such cells may be employed for tracking convective events in the absence of weather radar coverage. This approach employs a new spatio-temporal clustering technique based on a temporal sliding-window and a standard kernel density estimation to process lightning data. Clustering allows the identification of the cells from lightning data and density estimation bounds the contours of the cells. The proposed approach was evaluated for two convective events in Southeast Brazil. Image segmentation of radar data was performed to identify convective precipitation structures using the Steiner criteria. These structures were then compared and correlated to the electrically active cells in particular instants of time for both events. It was observed that most precipitation structures have associated cells, by comparing the ground tracks of their centroids. In addition, for one particular cell of each event, its temporal evolution was compared to that of the associated precipitation structure. Results show that the proposed approach may improve the use of lightning data for tracking convective events in countries that lack weather radar coverage.
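The clustering step can be approximated in a few lines: collect the lightning strokes inside a temporal sliding window, fit a standard kernel density estimate over their locations, and bound the cells by a density contour. The synthetic strokes, window length, and contour level below are illustrative assumptions.

```python
# Rough sketch of the spatio-temporal clustering idea: KDE over strokes in a
# temporal sliding window, with cells bounded by a density contour.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Synthetic strokes: (x_km, y_km, t_min); two active cells plus background
cell_a = np.column_stack([rng.normal(10, 2, 200), rng.normal(10, 2, 200), rng.uniform(0, 10, 200)])
cell_b = np.column_stack([rng.normal(40, 3, 150), rng.normal(30, 3, 150), rng.uniform(0, 10, 150)])
noise = np.column_stack([rng.uniform(0, 60, 30), rng.uniform(0, 60, 30), rng.uniform(0, 10, 30)])
strokes = np.vstack([cell_a, cell_b, noise])

window = strokes[(strokes[:, 2] >= 0) & (strokes[:, 2] < 10)]  # 10-min window
kde = gaussian_kde(window[:, :2].T)

xx, yy = np.meshgrid(np.linspace(0, 60, 121), np.linspace(0, 60, 121))
density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

# Cells = regions above a density contour; report the densest grid point
level = 0.5 * density.max()
print("grid points inside cell contours:", int((density > level).sum()))
i, j = np.unravel_index(density.argmax(), density.shape)
print(f"strongest cell near x={xx[i, j]:.0f} km, y={yy[i, j]:.0f} km")
```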
Using Knowledge Base for Event-Driven Scheduling of Web Monitoring Systems
NASA Astrophysics Data System (ADS)
Kim, Yang Sok; Kang, Sung Won; Kang, Byeong Ho; Compton, Paul
Web monitoring systems report any changes to their target web pages by revisiting them frequently. As they operate under significant resource constraints, it is essential to minimize revisits while ensuring minimal delay and maximum coverage. Various statistical scheduling methods have been proposed to resolve this problem; however, they are static and cannot easily cope with events in the real world. This paper proposes a new scheduling method that manages unpredictable events. An MCRDR (Multiple Classification Ripple-Down Rules) document classification knowledge base was reused to detect events and to initiate a prompt web monitoring process independent of a static monitoring schedule. Our experiment demonstrates that the approach improves monitoring efficiency significantly.
An assessment of the tracer-based approach to quantifying groundwater contributions to streamflow
NASA Astrophysics Data System (ADS)
Jones, J. P.; Sudicky, E. A.; Brookfield, A. E.; Park, Y.-J.
2006-02-01
The use of conservative geochemical and isotopic tracers along with mass balance equations to determine the pre-event groundwater contributions to streamflow during a rainfall event is widely used for hydrograph separation; however, aspects related to the influence of surface and subsurface mixing processes on the estimates of the pre-event contribution remain poorly understood. Moreover, the lack of a precise definition of "pre-event" versus "event" contributions on the one hand and "old" versus "new" water components on the other hand has seemingly led to confusion within the hydrologic community about the role of Darcian-based groundwater flow during a storm event. In this work, a fully integrated surface and subsurface flow and solute transport model is used to analyze flow system dynamics during a storm event, concomitantly with advective-dispersive tracer transport, and to investigate the role of hydrodynamic mixing processes on the estimates of the pre-event component. A number of numerical experiments are presented, including an analysis of a controlled rainfall-runoff experiment, that compare the computed Darcian-based groundwater fluxes contributing to streamflow during a rainfall event with estimates of these contributions based on a tracer-based separation. It is shown that hydrodynamic mixing processes can dramatically influence estimates of the pre-event water contribution estimated by a tracer-based separation. Specifically, it is demonstrated that the actual amount of bulk flowing groundwater contributing to streamflow may be much smaller than the quantity indirectly estimated from a separation based on tracer mass balances, even if the mixing processes are weak.
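The tracer mass balance that such separations rest on is the standard two-component mixing model, sketched below with invented concentrations. The paper's central caveat is that hydrodynamic mixing can make the pre-event fraction computed this way much larger than the actual Darcian groundwater flux.

```python
# Two-component tracer-based hydrograph separation; values are illustrative.
def pre_event_fraction(c_stream, c_pre, c_event):
    """Mass balance: Q_pre/Q = (C_stream - C_event) / (C_pre - C_event)."""
    return (c_stream - c_event) / (c_pre - c_event)

q_stream = 2.0        # streamflow during the event, m^3/s
c_pre = 120.0         # tracer concentration of pre-event groundwater
c_event = 10.0        # tracer concentration of event (rain) water
c_stream = 75.0       # observed in-stream concentration

f = pre_event_fraction(c_stream, c_pre, c_event)
print(f"pre-event fraction: {f:.2f}; pre-event discharge: {f * q_stream:.2f} m^3/s")
```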
A Survey of Insider Attack Detection Research
2008-08-25
modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through...forms of attack that have been reported. For example: • Unauthorized extraction, duplication, or exfiltration...network level. Schultz pointed out that not one approach will work but solutions need to be based on multiple sensors to be able to find any combination
Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim
2013-01-01
Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.
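For readers unfamiliar with the technique, a discrete-event prescribing workflow of the kind compared in the paper might look like the following sketch, written with the third-party simpy package. The stages, service times, and arrival rates are hypothetical; note how naturally the linear flow is expressed, and how awkward the feedback loops and entity interactions the authors describe would be to add.

```python
# Minimal DES sketch (simpy) of a handwritten-prescription pipeline.
import random
import simpy

def prescription(env, physician, pharmacist, waits):
    arrive = env.now
    with physician.request() as req:       # physician writes the script
        yield req
        yield env.timeout(random.expovariate(1 / 5.0))
    with pharmacist.request() as req:      # pharmacy verifies and fills it
        yield req
        yield env.timeout(random.expovariate(1 / 3.0))
    waits.append(env.now - arrive)

def arrivals(env, physician, pharmacist, waits):
    for _ in range(200):
        env.process(prescription(env, physician, pharmacist, waits))
        yield env.timeout(random.expovariate(1 / 4.0))

random.seed(42)
env = simpy.Environment()
physician = simpy.Resource(env, capacity=1)
pharmacist = simpy.Resource(env, capacity=1)
waits = []
env.process(arrivals(env, physician, pharmacist, waits))
env.run()
print(f"mean turnaround: {sum(waits) / len(waits):.1f} min over {len(waits)} scripts")
```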
On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.
Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen
2018-04-01
In this paper, based on the adaptive critic learning technique, the control of a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data- and event-driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game, and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in combining the data-driven learning identifier with the event-driven design formulation to develop the adaptive critic controller and thereby accomplish the nonlinear control. The event-driven optimal control law and the time-driven worst-case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event-driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue for integrating data-based control and event-triggering mechanisms into establishing advanced adaptive critic systems.
A comparison of synthesis and integrative approaches for meaning making and information fusion
NASA Astrophysics Data System (ADS)
Eggleston, Robert G.; Fenstermacher, Laurie
2017-05-01
Traditionally, information fusion approaches to meaning making have been integrative or aggregative in nature, creating meaning "containers" in which to put content (e.g., attributes) about object classes. In a large part, this was due to the limits in technology/tools for supporting information fusion (e.g., computers). A different synthesis based approach for meaning making is described which takes advantage of computing advances. The approach is not focused on the events/behaviors being observed/sensed; instead, it is human work centric. The former director of the Defense Intelligence Agency once wrote, "Context is king. Achieving an understanding of what is happening - or will happen - comes from a truly integrated picture of an area, the situation and the various personalities in it…a layered approach over time that builds depth of understanding."1 The synthesis based meaning making framework enables this understanding. It is holistic (both the sum and the parts, the proverbial forest and the trees), multi-perspective and emulative (as opposed to representational). The two approaches are complementary, with the synthesis based meaning making framework as a wrapper. The integrative approach would be dominant at level 0,1 fusion: data fusion, track formation and the synthesis based meaning making becomes dominant at higher fusion levels (levels 2 and 3), although both may be in play. A synthesis based approach to information fusion is thus well suited for "gray zone" challenges in which there is aggression and ambiguity and which are inherently perspective dependent (e.g., recent events in Ukraine).
NASA Astrophysics Data System (ADS)
Gou, Yabin; Ma, Yingzhao; Chen, Haonan; Wen, Yixin
2018-05-01
Quantitative precipitation estimation (QPE) is one of the important applications of weather radars. However, in complex terrain such as Tibetan Plateau, it is a challenging task to obtain an optimal Z-R relation due to the complex spatial and temporal variability in precipitation microphysics. This paper develops two radar QPE schemes respectively based on Reflectivity Threshold (RT) and Storm Cell Identification and Tracking (SCIT) algorithms using observations from 11 Doppler weather radars and 3264 rain gauges over the Eastern Tibetan Plateau (ETP). These two QPE methodologies are evaluated extensively using four precipitation events that are characterized by different meteorological features. Precipitation characteristics of independent storm cells associated with these four events, as well as the storm-scale differences, are investigated using short-term vertical profile of reflectivity (VPR) clusters. Evaluation results show that the SCIT-based rainfall approach performs better than the simple RT-based method for all precipitation events in terms of score comparison using validation gauge measurements as references. It is also found that the SCIT-based approach can effectively mitigate the local error of radar QPE and represent the precipitation spatiotemporal variability better than the RT-based scheme.
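At the heart of any such QPE scheme is a Z-R power law. The sketch below inverts it with the classic Marshall-Palmer coefficients (a = 200, b = 1.6) purely as an illustrative default; the paper's point is precisely that storm-dependent relations, selected per identified storm cell, outperform a single fixed pair.

```python
# Convert radar reflectivity (dBZ) to rain rate via Z = a * R**b.
import numpy as np

def rain_rate(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b, with Z in linear units (mm^6 m^-3) from dBZ."""
    z = 10.0 ** (np.asarray(dbz) / 10.0)
    return (z / a) ** (1.0 / b)

for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> {rain_rate(dbz):.1f} mm/h")
```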
Nuclear Test Depth Determination with Synthetic Modelling: Global Analysis from PNEs to DPRK-2016
NASA Astrophysics Data System (ADS)
Rozhkov, Mikhail; Stachnik, Joshua; Baker, Ben; Epiphansky, Alexey; Bobrov, Dmitry
2016-04-01
Seismic event depth determination is critical for the event screening process at the International Data Center, CTBTO. A thorough determination of the event depth can be conducted mostly through additional special analysis because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have depth constrained to the surface, making the depth screening criterion inapplicable. Further, it may result in a heavier workload when manually distinguishing between subsurface and deeper crustal events. Since the shape of the first few seconds of signal of very shallow events is very sensitive to the depth phases, cross-correlation between observed and theoretical seismograms can provide a basis for event depth estimation, and thus an extension of the screening process. We applied this approach mostly to events at teleseismic and, in part, regional distances. The approach was found to be efficient for the seismic event screening process, with certain caveats related mostly to poorly defined source and receiver crustal models, which can shift the depth estimate. An adjustable teleseismic attenuation model (t*) for synthetics was used since this characteristic is not known for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNE) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with the hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified to account for complex source topography. The software prototype is designed to be used for the Expert Technical Analysis at the IDC. The design effectively reuses the NDC-in-a-Box code and can be comfortably utilized by NDC users. The package uses Geotool as a front-end for data retrieval and pre-processing. After the event database is compiled, the control is passed to the driver software, running the external processing and plotting toolboxes, which controls the final stage and produces the final result. The modules are mostly Python-coded, with C code (Raysynth3D complex-topography regional synthetics) and FORTRAN-coded synthetics from the CPS330 software package by Robert Herrmann of Saint Louis University. An extension of this single-station depth determination method is under development and uses joint information from all stations participating in processing. It is based on simultaneous depth and moment tensor determination for both short- and long-period seismic phases. A novel approach recently developed for microseismic event location utilizing only phase waveform information was migrated to a global scale. It should provide faster computation, as it does not require intensive synthetic modelling, and might benefit the processing of noisy signals. A consistent depth estimate for all recent nuclear tests was produced for the large number of IMS stations (primary and auxiliary) used in processing.
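The cross-correlation depth search can be caricatured in a few lines: the depth phase pP trails P by a delay of roughly 2h/vp, so correlating an observed trace against trial synthetics selects the best-fitting depth. The spike "seismograms" and velocity below are schematic assumptions, not the hudson96 or Raysynth3D synthetics used in the study.

```python
# Schematic depth search by waveform correlation against trial synthetics.
import numpy as np

dt = 0.05                                     # sample interval (s)
n = 400

def synthetic(depth_km, vp=6.0):
    """P spike plus a surface-reflected pP spike delayed by ~2*depth/vp."""
    tr = np.zeros(n)
    tr[50] = 1.0
    tr[50 + int(round(2 * depth_km / vp / dt))] = -0.6
    return tr

observed = synthetic(1.2) + np.random.default_rng(2).normal(0, 0.02, n)

depths = np.arange(0.2, 3.01, 0.1)            # trial depths (km)
cc = [np.dot(observed, synthetic(d))
      / (np.linalg.norm(observed) * np.linalg.norm(synthetic(d)))
      for d in depths]
print(f"best-fitting depth: {depths[int(np.argmax(cc))]:.1f} km")
```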
Kittipittayakorn, Cholada; Ying, Kuo-Ching
2016-01-01
Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.
Kittipittayakorn, Cholada
2016-01-01
Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606
HOTS: A Hierarchy of Event-Based Time-Surfaces for Pattern Recognition.
Lagorce, Xavier; Orchard, Garrick; Galluppi, Francesco; Shi, Bertram E; Benosman, Ryad B
2017-07-01
This paper describes novel event-based spatio-temporal features called time-surfaces and how they can be used to create a hierarchical event-based pattern recognition architecture. Unlike existing hierarchical architectures for pattern recognition, the presented model relies on a time oriented approach to extract spatio-temporal features from the asynchronously acquired dynamics of a visual scene. These dynamics are acquired using biologically inspired frameless asynchronous event-driven vision sensors. Similarly to cortical structures, subsequent layers in our hierarchy extract increasingly abstract features using increasingly large spatio-temporal windows. The central concept is to use the rich temporal information provided by events to create contexts in the form of time-surfaces which represent the recent temporal activity within a local spatial neighborhood. We demonstrate that this concept can robustly be used at all stages of an event-based hierarchical model. First layer feature units operate on groups of pixels, while subsequent layer feature units operate on the output of lower level feature units. We report results on a previously published 36 class character recognition task and a four class canonical dynamic card pip task, achieving near 100 percent accuracy on each. We introduce a new seven class moving face recognition task, achieving 79 percent accuracy.
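A compact version of the time-surface computation reads as follows; the array sizes, neighborhood radius, and decay constant tau are illustrative, and the polarity handling of the full HOTS model is omitted.

```python
# Sketch of a time-surface: for the neighborhood of an incoming event,
# exponentially decay the time elapsed since the latest event at each pixel.
import numpy as np

H, W, R, TAU = 32, 32, 2, 50e-3           # sensor size, radius, decay (s)
last_ts = np.full((H, W), -np.inf)        # timestamp of latest event per pixel

def update_and_surface(x, y, t):
    """Record event (x, y, t) and return its (2R+1)^2 time-surface context."""
    last_ts[y, x] = t
    patch = last_ts[max(0, y - R):y + R + 1, max(0, x - R):x + R + 1]
    return np.exp(-(t - patch) / TAU)     # 1 at fresh pixels, ~0 at stale ones

# Feed a short synthetic event stream and inspect the final context
events = [(10, 10, 0.000), (11, 10, 0.005), (12, 11, 0.012), (11, 11, 0.020)]
for x, y, t in events:
    surface = update_and_surface(x, y, t)
print(np.round(surface, 2))
```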
Assessing the severity of sleep apnea syndrome based on ballistocardiogram
Zhou, Xingshe; Zhao, Weichao; Liu, Fan; Ni, Hongbo; Yu, Zhiwen
2017-01-01
Background Sleep Apnea Syndrome (SAS) is a common sleep-related breathing disorder, which affects about 4-7% of males and 2-4% of females around the world. Different approaches have been adopted to diagnose SAS and measure its severity, including the gold standard Polysomnography (PSG) in the sleep study field as well as several alternative techniques such as single-channel ECG, pulse oximetry, and so on. However, many shortcomings still limit their generalization to the home environment. In this study, we aim to propose an efficient approach to automatically assess the severity of sleep apnea syndrome based on the ballistocardiogram (BCG) signal, which is non-intrusive and suitable for the home environment. Methods We develop an unobtrusive sleep monitoring system to capture the BCG signals, based on which we put forward a three-stage sleep apnea syndrome severity assessment framework, i.e., data preprocessing, sleep-related breathing event (SBE) detection, and sleep apnea syndrome severity evaluation. First, in the data preprocessing stage, to overcome the limits of BCG signals (e.g., low precision and reliability), we utilize wavelet decomposition to obtain the outline information of heartbeats and apply an RR correction algorithm to handle missing or spurious RR intervals. Afterwards, in the event detection stage, we propose an automatic sleep-related breathing event detection algorithm named Physio_ICSS based on the iterative cumulative sums of squares (i.e., the ICSS algorithm), which is originally used to detect structural breakpoints in a time series. In particular, to efficiently detect sleep-related breathing events in the obtained time series of RR intervals, the proposed algorithm not only exploits the practical characteristics of sleep-related breathing events (e.g., limits on duration and the sleep stages in which they can occur) but also overcomes the event segmentation issue of existing approaches (e.g., an equal-length segmentation method might divide one sleep-related breathing event into different fragments and lead to incorrect results). Finally, by fusing features extracted from multiple domains, we can identify sleep-related breathing events and assess the severity level of sleep apnea syndrome effectively. Conclusions Experimental results on 136 individuals with different sleep apnea syndrome severities validate the effectiveness of the proposed framework, with an accuracy of 94.12% (128/136). PMID:28445548
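The ICSS statistic at the core of Physio_ICSS can be stated in three lines: with C_k the cumulative sum of squares of the centered series, D_k = C_k/C_T - k/T peaks in magnitude where the variance of the series changes. The RR series below is synthetic.

```python
# Centered cumulative-sums-of-squares statistic; |D_k| peaks flag candidate
# variance-change points, here the boundaries of an apnea-perturbed segment.
import numpy as np

rng = np.random.default_rng(3)
rr = np.concatenate([rng.normal(0.9, 0.01, 150),    # quiet breathing
                     rng.normal(0.9, 0.06, 60),     # apnea-perturbed segment
                     rng.normal(0.9, 0.01, 150)])

x = rr - rr.mean()
c = np.cumsum(x ** 2)                 # C_k, cumulative sum of squares
k = np.arange(1, len(x) + 1)
d = c / c[-1] - k / len(x)            # D_k = C_k/C_T - k/T

# The extremum lies near one of the variance breakpoints (samples 150, 210)
print("max |D_k| at sample", int(np.argmax(np.abs(d))))
```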
Climate Hazard Assessment for Stakeholder Adaptation Planning in New York City
NASA Technical Reports Server (NTRS)
Horton, Radley M.; Gornitz, Vivien; Bader, Daniel A.; Ruane, Alex C.; Goldberg, Richard; Rosenzweig, Cynthia
2011-01-01
This paper describes a time-sensitive approach to climate change projections, developed as part of New York City's climate change adaptation process, that has provided decision support to stakeholders from 40 agencies, regional planning associations, and private companies. The approach optimizes production of projections given constraints faced by decision makers as they incorporate climate change into long-term planning and policy. New York City stakeholders, who are well-versed in risk management, helped pre-select the climate variables most likely to impact urban infrastructure, and requested a projection range rather than a single 'most likely' outcome. The climate projections approach is transferable to other regions and consistent with broader efforts to provide climate services, including impact, vulnerability, and adaptation information. The approach uses 16 Global Climate Models (GCMs) and three emissions scenarios to calculate monthly change factors based on 30-year average future time slices relative to a 30-year model baseline. Projecting these model mean changes onto observed station data for New York City yields dramatic changes in the frequency of extreme events such as coastal flooding and dangerous heat events. Based on these methods, the current 1-in-10 year coastal flood is projected to occur more than once every 3 years by the end of the century, and heat events are projected to approximately triple in frequency. These frequency changes are of sufficient magnitude to merit consideration in long-term adaptation planning, even though the precise changes in extreme event frequency are highly uncertain.
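The monthly change-factor (delta) method described above amounts to shifting observed station values by the model's future-minus-baseline mean change, as in this sketch with invented numbers; even a modest mean shift visibly inflates the frequency of threshold-exceeding extremes.

```python
# Monthly delta change-factor method in miniature; all values are invented.
import numpy as np

obs_july_tmax = np.array([29.0, 31.5, 33.2, 30.8, 35.1])  # observed station data (C)
model_baseline_july = 28.4    # 30-year GCM mean, baseline period
model_future_july = 31.1      # 30-year GCM mean, future time slice

delta = model_future_july - model_baseline_july
projected = obs_july_tmax + delta
print(projected)              # observed variability preserved, mean shifted

# Frequency change of an extreme-heat threshold under the shifted series
threshold = 32.0
print("days over threshold:", int((obs_july_tmax > threshold).sum()),
      "->", int((projected > threshold).sum()))
```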
Wong, Man Sing; Ho, Hung Chak; Yang, Lin; Shi, Wenzhong; Yang, Jinxin; Chan, Ta-Chien
2017-07-24
Dust events have long been recognized to be associated with a higher mortality risk. However, no study has investigated how prolonged dust events affect the spatial variability of mortality across districts in a downwind city. In this study, we applied a spatial regression approach to estimate the district-level mortality during two extreme dust events in Hong Kong. We compared spatial and non-spatial models to evaluate the ability of each regression to estimate mortality. We also compared prolonged dust events with non-dust events to determine the influences of community factors on mortality across the city. The density of the built environment (estimated by the sky view factor) had a positive association with excess mortality in each district, while socioeconomic deprivation contributed by lower income and lower education induced a higher mortality impact in each territory planning unit during a prolonged dust event. Based on the model comparison, spatial error modelling with first-order queen contiguity consistently outperformed the other models. The high-risk areas with the greatest increases in mortality were located in urban high-density environments with higher socioeconomic deprivation. Our model design shows the ability to predict the spatial variability of mortality risk during an extreme weather event, which cannot be estimated by traditional time-series analyses or ecological studies. Our spatial protocol can be used for public health surveillance, sustainable planning, and disaster preparation when relevant data are available.
Effective dynamical coupling of hydrodynamics and transport for heavy-ion collisions
NASA Astrophysics Data System (ADS)
Oliinychenko, Dmytro; Petersen, Hannah
2017-04-01
Present hydrodynamics-based simulations of heavy-ion collisions neglect the feedback from the frozen-out particles flying back into the hydrodynamical region. This causes an artefact called “negative Cooper-Frye contributions”, which is negligible for high collision energies, but becomes significant for lower RHIC BES energies and for event-by-event simulations. To avoid negative Cooper-Frye contributions, while still preserving hydrodynamical behavior, we propose a pure hadronic transport approach with forced thermalization in the regions of high energy density. It is demonstrated that this approach exhibits enhancement of strangeness and mean transverse momenta compared to conventional transport - an effect typical for hydrodynamical approaches.
Wang, Yuanjia; Chen, Tianle; Zeng, Donglin
2016-01-01
Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for the different at-risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using the kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating the covariate-specific hazard function from the population average hazard function, and we establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and standard conventional approaches. Finally, we analyze two real-world biomedical study datasets where we use clinical markers and neuroimaging biomarkers to predict the age-at-onset of a disease, and we demonstrate the superiority of SVHM in distinguishing high-risk from low-risk subjects.
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Background Machine learning methods may complement traditional analytic methods for medical device surveillance. Methods and results Using data from the National Cardiovascular Data Registry for implantable cardioverter–defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other one by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%–20.9%; nonfatal ICD-related adverse events, 19.3%–26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%–37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, k=0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, k=−0.028), and between the DELTA and embedded feature selection approaches was 88.1% (349 of 396, k=−0.042). Conclusion Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance. PMID:28860874
A comparative assessment of statistical methods for extreme weather analysis
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing partial duration series, PDS) being superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may by far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from the standardly used graphical diagnosis (mean residual life plot), but only from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduce biases to the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of this study are of relevance for a broad range of environmental variables, including meteorological and hydrological quantities.
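Under the hood, the two frameworks compare as follows: a GEV distribution fitted to annual maxima versus a GPD fitted to threshold excesses. The sketch below uses scipy's maximum likelihood fitting on synthetic data (the study's preferred L-moment estimators would replace that step) and shows how return levels are read from each model.

```python
# AMS/GEV versus PDS/GPD on synthetic daily precipitation.
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(7)
daily = rng.gamma(shape=0.4, scale=8.0, size=(50, 365))   # 50 "years" (mm/day)

# Block maxima (AMS) with a GEV fit
ams = daily.max(axis=1)
gev = genextreme.fit(ams)

# Threshold excesses (PDS) with a GPD fit; threshold choice is the crux
u = np.quantile(daily, 0.995)
excesses = daily[daily > u] - u
gpd = genpareto.fit(excesses, floc=0.0)

for T in (2, 20, 100):                                    # return periods, years
    gev_level = genextreme.ppf(1 - 1 / T, *gev)
    n_exc_per_year = excesses.size / 50
    gpd_level = u + genpareto.ppf(1 - 1 / (T * n_exc_per_year), *gpd)
    print(f"T={T:>3} yr: AMS/GEV {gev_level:6.1f} mm, PDS/GPD {gpd_level:6.1f} mm")
```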
Parental Health and Children's Economic Well-Being
ERIC Educational Resources Information Center
Wagmiller, Robert L., Jr.; Lennon, Mary Clare; Kuang, Li
2008-01-01
The life course perspective emphasizes that past economic experiences and stage in the life course influence a family's ability to cope with negative life events such as poor health. However, traditional analytic approaches are not well-suited to examine how the impact of negative life events differs based on a family's past economic experiences,…
History, rare, and multiple events of mechanical unfolding of repeat proteins
NASA Astrophysics Data System (ADS)
Sumbul, Fidan; Marchesi, Arin; Rico, Felix
2018-03-01
Mechanical unfolding of proteins consisting of repeat domains is an excellent tool to obtain large statistics. Force spectroscopy experiments using atomic force microscopy on proteins presenting multiple domains have revealed that unfolding forces depend on the number of folded domains (history) and have reported intermediate states and rare events. However, the common use of unspecific attachment approaches to pull the protein of interest imposes important limitations on studying unfolding history and may lead to discarding rare and multiple probing events due to the presence of unspecific adhesion and uncertainty about the pulling site. Site-specific methods that have recently emerged minimize this uncertainty and would be excellent tools to probe unfolding history and rare events. However, detailed characterization of these approaches is required to identify their advantages and limitations. Here, we characterize a site-specific binding approach based on the ultrastable dockerin/cohesin III complex, revealing its advantages and limitations for assessing the unfolding history and investigating rare and multiple events during the unfolding of repeated domains. We show that this approach is more robust and reproducible and provides larger statistics than conventional unspecific methods. We show that the method is optimal for revealing the history of unfolding from the very first domain and for detecting rare events, while being more limited for assessing intermediate states. Finally, we quantify the forces required to unfold two molecules pulled in parallel, which is difficult when using unspecific approaches. The proposed method represents a step forward toward more reproducible measurements to probe protein unfolding history and opens the door to systematic probing of rare and multiple molecule unfolding mechanisms.
NASA Astrophysics Data System (ADS)
Bergen, K.; Yoon, C. E.; OReilly, O. J.; Beroza, G. C.
2015-12-01
Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
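The graph view described above can be prototyped directly with sparse matrix tools: detections are vertices, similarities are weighted edges, connected components give event groups, and summed edge weights give a crude confidence score. The similarity values below are placeholders for FAST-style detector output.

```python
# Candidate detections as a sparse similarity graph; components = event groups.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

n = 6                                          # candidate detections (vertices)
pairs = [(0, 1, 0.92), (1, 2, 0.88), (0, 2, 0.75),   # one repeating family
         (3, 4, 0.81)]                               # a second family; 5 is isolated
i, j, w = map(np.array, zip(*pairs))
sim = coo_matrix((w, (i, j)), shape=(n, n))
sim = (sim + sim.T).tocsr()                    # undirected similarity graph

n_groups, labels = connected_components(sim, directed=False)
confidence = np.asarray(sim.sum(axis=1)).ravel()   # summed edge weights per vertex

print("event groups:", labels)                 # [0 0 0 1 1 2]
print("confidence:", np.round(confidence, 2))  # isolated vertex 5 scores 0
```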
NASA Astrophysics Data System (ADS)
Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang
2009-09-01
Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available to classify the criticality of the process is represented by the historical signal database as well as the current measurement. This paper presents an approach to detect and predict critical events, based on pattern recognition and discriminant analysis.
What's a Lab to Do During and After a Hurricane?
Rodriguez, Fred; Selvaratnam, Rajeevan; Mann, Peggy; Kalariya, Rina; Petersen, John R
2018-03-21
Although laboratories may be able to rely on a comprehensive Hurricane Plan during a hurricane, alarming and unanticipated events frequently occur. To minimize disruption of lab operations, it is important to try to mitigate the impact of these unexpected events as quickly as possible, in the quest to minimize negative outcomes. In this article, we discuss approaches to dealing with unanticipated events during and after hurricanes, based on our personal experiences.
Hignett, Sue; Wolf, Laurie; Taylor, Ellen; Griffiths, Paula
2015-11-01
The aim of this study was to use a theoretical model (bench) for human factors and ergonomics (HFE) and a comparison with occupational slips, trips, and falls (STFs) risk management to discuss patient STF interventions (bedside). Risk factors for patient STFs have been identified and reported since the 1950s and are mostly unchanged in the 2010s. The prevailing clinical view has been that STF events indicate underlying frailty or illness, and so many of the interventions over the past 60 years have focused on assessing and treating physiological factors (dizziness, illness, vision/hearing, medicines) rather than designing interventions to reduce risk factors at the time of the STF. Three case studies are used to discuss how HFE has been, or could be, applied to STF risk management as (a) a design-based (building) approach to embed safety into the built environment, (b) a staff- (and organization-) based approach, and (c) a patient behavior-based approach to explore and understand patient perspectives of STF events. The results from the case studies suggest taking a similar HFE integration approach to other industries, that is, a sustainable design intervention for the person who experiences the STF event-the patient. This paper offers a proactive problem-solving approach to reduce STFs by patients in acute hospitals. Authors of the three case studies use HFE principles (bench/book) to understand the complex systems for facility and equipment design and include the perspective of all stakeholders (bedside). © 2015, Human Factors and Ergonomics Society.
Learning-automaton-based online discovery and tracking of spatiotemporal event patterns.
Yazidi, Anis; Granmo, Ole-Christoffer; Oommen, B John
2013-06-01
Discovering and tracking of spatiotemporal patterns in noisy sequences of events are difficult tasks that have become increasingly pertinent due to recent advances in ubiquitous computing, such as community-based social networking applications. The core activities for applications of this class include the sharing and notification of events, and the importance and usefulness of these functionalities increase as event sharing expands into larger areas of one's life. Ironically, instead of being helpful, an excessive number of event notifications can quickly render the functionality of event sharing to be obtrusive. Indeed, any notification of events that provides redundant information to the application/user can be seen to be an unnecessary distraction. In this paper, we introduce a new scheme for discovering and tracking noisy spatiotemporal event patterns, with the purpose of suppressing reoccurring patterns, while discerning novel events. Our scheme is based on maintaining a collection of hypotheses, each one conjecturing a specific spatiotemporal event pattern. A dedicated learning automaton (LA), the spatiotemporal pattern LA (STPLA), is associated with each hypothesis. By processing events as they unfold, we attempt to infer the correctness of each hypothesis through a real-time guided random walk. Consequently, the scheme that we present is computationally efficient, with a minimal memory footprint. Furthermore, it is ergodic, allowing adaptation. Empirical results involving extensive simulations demonstrate the superior convergence and adaptation speed of STPLA, as well as an ability to operate successfully with noise, including both the erroneous inclusion and omission of events. An empirical comparison study was performed and confirms the superiority of our scheme compared to a similar state-of-the-art approach. In particular, the robustness of the STPLA to inclusion as well as to omission noise constitutes a unique property compared to other related approaches. In addition, the results included, which involve the so-called "presence sharing" application, are both promising and, in our opinion, impressive. It is thus our opinion that the proposed STPLA scheme is, in general, ideal for improving the usefulness of event notification and sharing systems, since it is capable of significantly, robustly, and adaptively suppressing redundant information.
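A toy version of the random-walk automaton conveys the mechanism: a bounded confidence counter per hypothesis, rewarded when the conjectured pattern matches the next event and penalized otherwise. The event encoding, noise level, and counter depth below are invented.

```python
# Toy learning automaton: a bounded random-walk confidence counter per
# hypothesized spatiotemporal pattern, driven by matching/non-matching events.
import random

random.seed(4)
N = 10                       # counter depth of the automaton

def stpla(event_stream, pattern, n_states=N):
    state = 1                # 1 = weakest belief, N = strongest
    for event in event_stream:
        if event == pattern:             # reward: step toward certainty
            state = min(n_states, state + 1)
        else:                            # penalty: step away
            state = max(1, state - 1)
    return state

# A weekly "gym on Monday 18:00" pattern observed with 20% noise
stream = ["mon18" if random.random() < 0.8 else "other" for _ in range(100)]
final = stpla(stream, "mon18")
print("belief state:", final, "-> suppress notification" if final == N else "")
```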
NASA Astrophysics Data System (ADS)
Jia, Chaoqing; Hu, Jun; Chen, Dongyan; Liu, Yurong; Alsaadi, Fuad E.
2018-07-01
In this paper, we discuss the event-triggered resilient filtering problem for a class of time-varying systems subject to stochastic uncertainties and successive packet dropouts. The event-triggered mechanism is employed with the hope of reducing the communication burden and saving network resources. The stochastic uncertainties are considered in order to describe the modelling errors, and the phenomenon of successive packet dropouts is characterized by a random variable obeying the Bernoulli distribution. The aim of the paper is to provide a resilient event-based filtering approach for the addressed time-varying systems such that, for all stochastic uncertainties, successive packet dropouts and filter gain perturbations, an optimized upper bound of the filtering error covariance is obtained by designing the filter gain. Finally, simulations are provided to demonstrate the effectiveness of the proposed robust optimal filtering strategy.
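A scalar caricature of such a scheme is easy to write down: the sensor transmits only when the innovation exceeds a threshold, and the filter otherwise propagates its prediction. The generic event-triggered Kalman-style recursion below illustrates the mechanism; it is not the paper's exact filter.

```python
# Scalar event-triggered filtering toy: transmit only large innovations.
import numpy as np

rng = np.random.default_rng(11)
a, q, r, delta = 0.95, 0.05, 0.2, 0.8      # dynamics, noise vars, trigger level

x, xhat, p = 0.0, 0.0, 1.0
sent = 0
for k in range(200):
    x = a * x + rng.normal(0, np.sqrt(q))          # true state
    y = x + rng.normal(0, np.sqrt(r))              # sensor measurement
    xhat, p = a * xhat, a * a * p + q              # time update (prediction)
    if abs(y - xhat) > delta:                      # event-triggering condition
        sent += 1
        kgain = p / (p + r)                        # measurement update
        xhat, p = xhat + kgain * (y - xhat), (1 - kgain) * p
print(f"transmissions: {sent}/200")
```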
Detecting modification of biomedical events using a deep parsing approach
2012-01-01
Background This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. Method To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Results Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Conclusions Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification. PMID:22595089
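The shallow half of the feature setup is easy to reproduce: bag-of-words features from a window of 3 tokens on either side of the trigger word, fed to a maximum entropy classifier (logistic regression). The sentences and labels below are toy stand-ins for the BioNLP 2009 data, and the deep-parser features are omitted.

```python
# Context-window bag-of-words features + maximum entropy (logistic regression).
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

def window_features(tokens, trigger_idx, size=3):
    lo, hi = max(0, trigger_idx - size), trigger_idx + size + 1
    return {f"bow={w.lower()}": 1 for w in tokens[lo:hi]}

examples = [
    ("inhibition of IkappaBalpha phosphorylation".split(), 3, "negated"),
    ("analysis of IkappaBalpha phosphorylation".split(), 3, "speculated"),
    ("we observed IkappaBalpha phosphorylation".split(), 3, "none"),
    ("no detectable STAT1 activation".split(), 3, "negated"),
    ("possible role of STAT1 activation".split(), 4, "speculated"),
    ("strong STAT1 activation was seen".split(), 2, "none"),
]

vec = DictVectorizer()
X = vec.fit_transform(window_features(t, i) for t, i, _ in examples)
y = [label for _, _, label in examples]

maxent = LogisticRegression(max_iter=1000).fit(X, y)
test = "inhibition of STAT1 activation".split()
print(maxent.predict(vec.transform([window_features(test, 3)])))
```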
Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.
Das, Rahul Deb; Winter, Stephan
2016-11-23
Understanding travel behavior is critical for an effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries as well as for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited for real time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. In order to address all these challenges a novel, state based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that can work in real time, near-real time and offline mode and in different environmental conditions with their varying quality of sensor traces. The results show the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation.
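A stripped-down version of the bottom-up procedure classifies fixed-length atomic windows into states and grows a segment while the state persists, as below. The speed-only state model, window length, and thresholds are simplifying assumptions rather than the paper's full feature set.

```python
# Bottom-up trajectory segmentation sketch: atomic windows -> states -> runs.
import numpy as np

def state_of(speeds_mps):
    m = speeds_mps.mean()
    if m < 0.5:
        return "still"
    if m < 2.5:
        return "walk"
    return "vehicle"

def bottom_up_segments(speeds, atomic=5):
    """Label consecutive atomic windows and merge runs of equal states."""
    states = [state_of(speeds[i:i + atomic])
              for i in range(0, len(speeds) - atomic + 1, atomic)]
    segments, start = [], 0
    for k in range(1, len(states) + 1):
        if k == len(states) or states[k] != states[start]:
            segments.append((start * atomic, k * atomic, states[start]))
            start = k
    return segments

rng = np.random.default_rng(5)
trace = np.concatenate([rng.normal(1.4, 0.2, 40),    # walking
                        rng.normal(0.1, 0.05, 20),   # waiting at a stop
                        rng.normal(9.0, 1.5, 60)])   # bus ride
print(bottom_up_segments(trace))
```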
Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation
Das, Rahul Deb; Winter, Stephan
2016-01-01
Understanding travel behavior is critical for an effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers’ smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries as well as for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited for real time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. In order to address all these challenges a novel, state based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that can work in real time, near-real time and offline mode and in different environmental conditions with their varying quality of sensor traces. The results show the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation. PMID:27886053
NASA Technical Reports Server (NTRS)
Ali, Moonis; Whitehead, Bruce; Gupta, Uday K.; Ferber, Harry
1989-01-01
This paper describes an expert system which is designed to perform automatic data analysis, identify anomalous events, and determine the characteristic features of these events. We have employed both artificial intelligence and neural net approaches in the design of this expert system. The artificial intelligence approach is useful because it provides (1) the use of human experts' knowledge of sensor behavior and faulty engine conditions in interpreting data; (2) the use of engine design knowledge and physical sensor locations in establishing relationships among the events of multiple sensors; (3) the use of stored analysis of past data of faulty engine conditions; and (4) the use of knowledge-based reasoning in distinguishing sensor failure from actual faults. The neural network approach appears promising because neural nets (1) can be trained on extremely noisy data and produce classifications which are more robust under noisy conditions than other classification techniques; (2) avoid the necessity of noise removal by digital filtering and therefore avoid the need to make assumptions about frequency bands or other signal characteristics of anomalous behavior; (3) can, in effect, generate their own feature detectors based on the characteristics of the sensor data used in training; and (4) are inherently parallel and therefore are potentially implementable in special-purpose parallel hardware.
A data base approach for prediction of deforestation-induced mass wasting events
NASA Technical Reports Server (NTRS)
Logan, T. L.
1981-01-01
A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The objective of the present investigation is to examine the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes which are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.
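A toy raster overlay conveys the modeling step: co-registered hazard-factor grids are normalized, weighted, and summed into a susceptibility score, and cells above a cutoff are flagged. The layers, weights, and cutoff below are invented.

```python
# Toy raster-overlay susceptibility model; layers and weights are invented.
import numpy as np

rng = np.random.default_rng(9)
shape = (50, 50)                                # 50 x 50 raster cells
slope = rng.uniform(0, 45, shape)               # slope angle (deg)
clearcut = rng.random(shape) < 0.3              # recently clear-cut cells
wet_soil = rng.random(shape) < 0.4              # high-moisture soil class

# Normalize and overlay: weighted sum of the encoded layers
score = (0.5 * (slope / 45.0)
         + 0.3 * clearcut.astype(float)
         + 0.2 * wet_soil.astype(float))

high_risk = score > 0.7                         # candidate debris-avalanche cells
print(f"{high_risk.sum()} of {high_risk.size} cells flagged high-risk")
```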
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standard annual maxima series (AMS) in order to give recommendations for performing extreme value statistics on meteorological hazards. Results show the merits of robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing PDS) is superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels compared with the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from standard graphical diagnostics (mean residual life plot), but only from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduce biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing performance on extreme events we recommend the use of conditional performance measures that focus on rare events only, in addition to standard unconditional indicators. The findings of the study directly address road and traffic management but can be transferred to a range of other environmental variables, including meteorological and hydrological quantities.
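A minimal numpy/scipy sketch of the two sampling approaches the study compares: annual maxima (AMS, block maxima fitted with the GEV) versus partial duration series (PDS, threshold excesses fitted with the GPD). scipy fits by maximum likelihood; the robust L-moment estimation the paper favours would need a dedicated package (e.g., lmoments3). The synthetic data, threshold choice, and lack of declustering are simplifications.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
daily = rng.gamma(shape=0.4, scale=8.0, size=365 * 30)  # 30 yrs synthetic rainfall
years = np.arange(daily.size) // 365

# AMS: one block maximum per year -> GEV fit
ams = np.array([daily[years == y].max() for y in np.unique(years)])
gev = stats.genextreme.fit(ams)

# PDS: excesses over a high threshold -> GPD fit
# (a real PDS analysis would also decluster dependent exceedances)
threshold = np.quantile(daily, 0.98)
excess = daily[daily > threshold] - threshold
gpd = stats.genpareto.fit(excess, floc=0.0)

T = 100.0
rl_ams = stats.genextreme.ppf(1 - 1 / T, *gev)       # 1 - 1/T annual quantile
rate = excess.size / 30.0                            # exceedances per year
rl_pds = threshold + stats.genpareto.ppf(1 - 1 / (T * rate), *gpd)
print(f"100-yr return level  AMS/GEV: {rl_ams:.1f}  PDS/GPD: {rl_pds:.1f}")
```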
One commonly used approach to CSO pollution abatement is to rely on a storm-event based design of storage-tank volume to capture CSO for pump-back and/or bleed-back (gravity flow) to the existing WWTP for treatment. However, this approach may not, by itself, be the most economical...
The Generation of a Stochastic Flood Event Catalogue for Continental USA
NASA Astrophysics Data System (ADS)
Quinn, N.; Wing, O.; Smith, A.; Sampson, C. C.; Neal, J. C.; Bates, P. D.
2017-12-01
Recent advances in the acquisition of spatiotemporal environmental data and improvements in computational capabilities have enabled the generation of large-scale, even global, flood hazard layers, which serve as a critical decision-making tool for a range of end users. However, these datasets are designed to indicate only the probability and depth of inundation at a given location and are unable to describe the likelihood of concurrent flooding across multiple sites. Recent research has highlighted that although the estimation of large, widespread flood events is of great value to the flood mitigation and insurance industries, to date it has been difficult to deal with this spatial dependence structure in flood risk over relatively large scales. Many existing approaches have been restricted to empirical estimates of risk based on historic events, limiting their capability to assess risk over the full range of plausible scenarios. Therefore, this research utilises a recently developed model-based approach to describe the multisite joint distribution of extreme river flows across continental USA river gauges. Given an extreme event at a site, the model characterises the likelihood that neighbouring sites are also impacted. This information is used to simulate an ensemble of plausible synthetic extreme event footprints, from which flood depths are extracted from an existing global flood hazard catalogue. Expected economic losses are then estimated by overlaying flood depths with national datasets defining asset locations, characteristics and depth-damage functions. The ability of this approach to quantify probabilistic economic risk and rare threshold-exceeding events is expected to be of value to those interested in the flood mitigation and insurance sectors. This work describes the methodological steps taken to create the flood loss catalogue at a national scale; highlights the uncertainty in the expected annual economic vulnerability within the USA from extreme river flows; and presents future developments to the modelling approach.
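A hedged sketch of the core idea of spatially dependent extremes at multiple gauges, here with a Gaussian copula and GPD margins. The study's actual multisite model (a conditional exceedance formulation) is more sophisticated, and every parameter below is invented; the point is only to show how dependence controls joint event footprints.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sites, n_events = 5, 100_000

# Assumed inter-gauge correlation of extremes (illustrative values)
corr = 0.6 * np.ones((n_sites, n_sites)) + 0.4 * np.eye(n_sites)
L = np.linalg.cholesky(corr)

z = L @ rng.standard_normal((n_sites, n_events))          # dependent Gaussians
u = stats.norm.cdf(z)                                     # copula: uniform margins
flows = stats.genpareto.ppf(u, c=0.1, loc=50.0, scale=20.0)  # GPD margins per site

# Probability that an extreme at site 0 coincides with extremes elsewhere
q = np.quantile(flows, 0.99, axis=1, keepdims=True)
hit = flows > q
p_joint = hit[1:, hit[0]].mean()
print(f"P(another site is extreme | site 0 extreme) = {p_joint:.2f}")
```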
NASA Astrophysics Data System (ADS)
Zhang, Rong-Hua; Tao, Ling-Jiang; Gao, Chuan
2017-09-01
Large uncertainties exist in real-time predictions of the 2015 El Niño event, which have systematic intensity biases that are strongly model-dependent. It is critically important to characterize those model biases so they can be reduced appropriately. In this study, the conditional nonlinear optimal perturbation (CNOP)-based approach was applied to an intermediate coupled model (ICM) equipped with a four-dimensional variational data assimilation technique. The CNOP-based approach was used to quantify prediction errors that can be attributed to initial conditions (ICs) and model parameters (MPs). Two key MPs were considered in the ICM: one represents the intensity of the thermocline effect, and the other represents the relative coupling intensity between the ocean and atmosphere. Two experiments were performed to illustrate the effects of error correction, one with a standard simulation and another with an optimized simulation in which errors in the ICs and MPs derived from the CNOP-based approach were optimally corrected. The results indicate that simulations of the 2015 El Niño event can be effectively improved by using CNOP-derived error correction. In particular, the El Niño intensity in late 2015 was adequately captured when simulations were started from early 2015. Quantitatively, the Niño3.4 SST index simulated for Dec. 2015 increased to 2.8 °C in the optimized simulation, compared with only 1.5 °C in the standard simulation. The feasibility and effectiveness of using the CNOP-based technique to improve ENSO simulations are demonstrated in the context of the 2015 El Niño event. The limitations and further applications are also discussed.
Undergraduates Learn Evolution Through Teaching Kindergartners About Blind Mexican Cavefish
Gross, Joshua B.; Gangidine, Andrew; Schafer, Rachel E.
2017-01-01
The development and implementation of a scientific outreach activity comes with a number of challenges. A successful outreach event must match the sophistication of content to the audience, be engaging, expand the knowledge base for participants, and be inclusive for a diverse audience. Ideally, a successful event will also convey the importance of scientific outreach for future scientists and citizens. In this paper, we present a simple, hands-on guide to a scientific outreach event targeted to kindergarten learners. This activity also pursued a second goal: the inclusion of undergraduate students in the development and delivery of the event. We provided a detailed set of four activities, focusing on the blind Mexican cavefish, which were enthusiastically received by kindergarten audiences. The engagement of undergraduate students in the development of this activity encouraged public outreach involvement and fostered new scientific and communication skills. The format of the outreach event we describe is flexible. We provide a set of guidelines and suggestions for adapting this approach to other biological topics. The activity and approach we describe enables the implementation of effective scientific outreach, using active learning approaches, which benefits both elementary school learners and undergraduate students. PMID:28936469
Koutkias, Vassilis; Stalidis, George; Chouvarda, Ioanna; Lazou, Katerina; Kilintzis, Vassilis; Maglaveras, Nicos
2009-01-01
Adverse Drug Events (ADEs) are currently considered as a major public health issue, endangering patients' safety and causing significant healthcare costs. Several research efforts are currently concentrating on the reduction of preventable ADEs by employing Information Technology (IT) solutions, which aim to provide healthcare professionals and patients with relevant knowledge and decision support tools. In this context, we present a knowledge engineering approach towards the construction of a Knowledge-based System (KBS) regarded as the core part of a CDSS (Clinical Decision Support System) for ADE prevention, all developed in the context of the EU-funded research project PSIP (Patient Safety through Intelligent Procedures in Medication). In the current paper, we present the knowledge sources considered in PSIP and the implications they pose to knowledge engineering, the methodological approach followed, as well as the components defining the knowledge engineering framework based on relevant state-of-the-art technologies and representation formalisms.
Improved EEG Event Classification Using Differential Energy.
Harati, A; Golmohammadi, M; Lopez, S; Obeid, I; Picone, J
2015-12-01
Feature extraction for automatic classification of EEG signals typically relies on time-frequency representations of the signal. Techniques such as cepstral-based filter banks or wavelets are popular analysis techniques in many signal processing applications, including EEG classification. In this paper, we present a comparison of a variety of approaches to estimating and postprocessing features. To further aid in discrimination of periodic signals from aperiodic signals, we add a differential energy term. We evaluate our approaches on the TUH EEG Corpus, which is the largest publicly available EEG corpus and an exceedingly challenging task due to the clinical nature of the data. We demonstrate that a variant of a standard filter bank-based approach, coupled with first and second derivatives, provides a substantial reduction in the overall error rate. The combination of differential energy and derivatives produces a 24 % absolute reduction in the error rate and improves our ability to discriminate between signal events and background noise. This relatively simple approach proves to be comparable to other popular feature extraction approaches such as wavelets, but is much more computationally efficient.
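A sketch of augmenting frame-level features with first/second derivatives and a differential energy term. The paper's exact definition of differential energy may differ; here it is taken as the spread (max minus min) of the energy coefficient over a short context window, one plausible reading that highlights periodic versus aperiodic behavior. Edge handling via np.roll is a simplification.

```python
import numpy as np

def deltas(x, k=2):
    """Standard regression-based delta features along the time axis
    (edges wrap around, which is acceptable for a sketch)."""
    num = sum(i * (np.roll(x, -i, axis=0) - np.roll(x, i, axis=0))
              for i in range(1, k + 1))
    return num / (2 * sum(i * i for i in range(1, k + 1)))

def features(frames, win=9):
    """frames: (T, D) filter-bank/cepstral features with log-energy in column 0.
    Returns frames augmented with deltas, delta-deltas and differential energy."""
    d1 = deltas(frames)                    # first derivative
    d2 = deltas(d1)                        # second derivative
    e, half = frames[:, 0], win // 2
    diff_e = np.array([np.ptp(e[max(0, t - half):t + half + 1])
                       for t in range(len(e))])   # differential energy term
    return np.hstack([frames, d1, d2, diff_e[:, None]])

print(features(np.random.randn(100, 13)).shape)   # (100, 40)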
Taking the CCDs to the ultimate performance for low threshold experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haro, Miguel; Moroni, Guillermo; Tiffenberg, Javier
2016-11-14
Scientific-grade CCDs show attractive capabilities for the detection of particles with small energy deposition in matter. Their very low threshold of approximately 40 eV and their good spatial reconstruction of the event are key properties for currently running experiments: CONNIE and DAMIC. Both experiments can benefit from any increase in the detection efficiency of nuclear recoils at low energy. In this work we present two different approaches to increasing this efficiency by increasing the SNR of events. The first is based on the reduction of the readout noise of the device, which is the main contribution of uncertainty to the signal measurement. New studies on the electronic noise from the integrated output amplifier and the readout electronics are presented, together with results from a new configuration showing a lower limit on the readout noise that can be implemented on the current setup of the CCD-based experiments. The second approach to increasing the SNR of events at low energy is the study of the spatial conformation of nuclear recoil events at different depths in the active volume, through studies of new effects that deviate from models based on a non-interacting diffusion model of electrons in the semiconductor.
NASA Astrophysics Data System (ADS)
Gavrishchaka, V. V.; Ganguli, S. B.
2001-12-01
Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore, many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases, as these approaches are not easily extended to multivariate cases. A support vector machine (SVM) is a machine learning system that can provide optimal generalization using very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make SVM suitable for modeling rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
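A minimal scikit-learn sketch of an SVM on a small, imbalanced, high-dimensional data set, in the spirit of the abstract; the data here is synthetic, and the class-weighting choice is one common way to compensate for scarce positive (event) examples, not necessarily the authors' configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 50))              # few samples, many inputs
y = (X[:, :3].sum(axis=1) > 2.5).astype(int)    # rare positive class (~7 %)

# class_weight='balanced' upweights the rare class during training
clf = SVC(kernel="rbf", C=1.0, gamma="scale", class_weight="balanced")
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc"))
```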
Development of a GCR Event-based Risk Model
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee
2009-01-01
A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how GCR event rates mapped to biological signaling induction and relaxation times. We considered several hypotheses related to signaling and cancer risk, and then performed simulations for conditions where aberrant or adaptive signaling would occur on long-duration space missions. Our results do not support the conventional assumptions of dose, linearity and additivity. A discussion on how event-based systems biology models, which focus on biological signaling as the mechanism to propagate damage or adaptation, can be further developed for cancer and CNS space radiation risk projections is given.
Quasi-continuous stochastic simulation framework for flood modelling
NASA Astrophysics Data System (ADS)
Moustakis, Yiannis; Kossieris, Panagiotis; Tsoukalas, Ioannis; Efstratiadis, Andreas
2017-04-01
Typically, flood modelling in the context of everyday engineering practices is addressed through event-based deterministic tools, e.g., the well-known SCS-CN method. A major shortcoming of such approaches is the ignorance of uncertainty, which is associated with the variability of soil moisture conditions and the variability of rainfall during the storm event. In event-based modelling, the sole expression of uncertainty is the return period of the design storm, which is assumed to represent the acceptable risk of all output quantities (flood volume, peak discharge, etc.). On the other hand, the varying antecedent soil moisture conditions across the basin are represented by means of scenarios (e.g., the three AMC types by SCS), while the temporal distribution of rainfall is represented through standard deterministic patterns (e.g., the alternative blocks method). In order to address these major inconsistencies, simultaneously preserving the simplicity and parsimony of the SCS-CN method, we have developed a quasi-continuous stochastic simulation approach, comprising the following steps: (1) generation of synthetic daily rainfall time series; (2) update of potential maximum soil moisture retention, on the basis of accumulated five-day rainfall; (3) estimation of daily runoff through the SCS-CN formula, using as inputs the daily rainfall and the updated value of soil moisture retention; (4) selection of extreme events and application of the standard SCS-CN procedure for each specific event, on the basis of synthetic rainfall. This scheme requires the use of two stochastic modelling components, namely the CastaliaR model, for the generation of synthetic daily data, and the HyetosMinute model, for the disaggregation of daily rainfall to finer temporal scales. Outcomes of this approach are a large number of synthetic flood events, allowing for expressing the design variables in statistical terms and thus properly evaluating the flood risk.
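For step (3), the SCS-CN runoff formula itself is standard and easily sketched; the curve number value and the initial abstraction ratio below are conventional illustrative choices, and the CN updating from five-day antecedent rainfall described in step (2) is not shown.

```python
def retention_mm(cn):
    """Potential maximum retention S (mm) for a given curve number CN."""
    return 25400.0 / cn - 254.0

def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
    """Daily runoff Q (mm) from daily rainfall P (mm) via the SCS-CN formula:
    Q = (P - Ia)^2 / (P - Ia + S), with Ia = ia_ratio * S."""
    s = retention_mm(cn)
    ia = ia_ratio * s                     # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(scs_cn_runoff(p_mm=80.0, cn=75))   # ~27 mm of runoff
```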
Modeling hard clinical end-point data in economic analyses.
Kansal, Anuraag R; Zheng, Ying; Palencia, Roberto; Ruffolo, Antonio; Hass, Bastian; Sorensen, Sonja V
2013-11-01
The availability of hard clinical end-point data, such as that on cardiovascular (CV) events among patients with type 2 diabetes mellitus, is increasing, and as a result there is growing interest in using hard end-point data of this type in economic analyses. This study investigated published approaches for modeling hard end-points from clinical trials and evaluated their applicability in health economic models with different disease features. A review of cost-effectiveness models of interventions in clinically significant therapeutic areas (CV diseases, cancer, and chronic lower respiratory diseases) was conducted in PubMed and Embase using a defined search strategy. Only studies integrating hard end-point data from randomized clinical trials were considered. For each study included, clinical input characteristics and modeling approach were summarized and evaluated. A total of 33 articles (23 CV, eight cancer, two respiratory) were accepted for detailed analysis. Decision trees, Markov models, discrete event simulations, and hybrids were used. Event rates were incorporated either as constant rates, time-dependent risks, or risk equations based on patient characteristics. Risks dependent on time and/or patient characteristics were used where major event rates were >1%/year in models with fewer health states (<7). Models of infrequent events or with numerous health states generally preferred constant event rates. The detailed modeling information and terminology varied, sometimes requiring interpretation. Key considerations for cost-effectiveness models incorporating hard end-point data include the frequency and characteristics of the relevant clinical events and how the trial data is reported. When event risk is low, simplification of both the model structure and event rate modeling is recommended. When event risk is common, such as in high risk populations, more detailed modeling approaches, including individual simulations or explicitly time-dependent event rates, are more appropriate to accurately reflect the trial data.
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relies on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; it likewise utilizes source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e., it does not rely on a priori events, which is particularly important for nuclear monitoring; and it does not rely on individualized signal detection thresholds, because it is the network solution that matters.
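A toy illustration (emphatically not the ProbDet implementation) of the final combination step: fusing per-station conditional probability traces into one network posterior under a conditional-independence assumption. It also shows why non-detecting stations carry information: a station reporting a probability below the prior pulls the network posterior down.

```python
import numpy as np

def fuse(posteriors, prior=0.01):
    """posteriors: (n_stations, T) per-station P(event | that station's data),
    each computed with the same prior. Naive-Bayes fusion in log-odds space."""
    n = posteriors.shape[0]
    lo_prior = np.log(prior / (1 - prior))
    lo = np.log(posteriors / (1 - posteriors))
    total = lo.sum(axis=0) - (n - 1) * lo_prior   # avoid double-counting the prior
    return 1.0 / (1.0 + np.exp(-total))

# 3 stations, 2 time samples: quiet everywhere, then mildly elevated everywhere
p = np.array([[0.02, 0.6], [0.03, 0.5], [0.01, 0.7]])
print(fuse(p))   # sample 1 stays low; sample 2 becomes a confident detection
```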
ERIC Educational Resources Information Center
Pain, Paromita; Masullo Chen, Gina; Campbell, Christopher P.
2016-01-01
In-depth qualitative interviews with participants of a high school journalism workshop reveal that immersing students in coverage of a historically important news event enhances learning of multimedia journalism. Study explores how using a team-based approach to coverage of the 50th anniversary of "Freedom Summer," a key event in…
Toward a Healthy Community (Organizing Events for Community Health Promotion).
ERIC Educational Resources Information Center
Public Health Service (DHHS), Rockville, MD. Office of Disease Prevention and Health Promotion.
This booklet suggests the first steps communities can take in assessing their needs and resources and mobilizing public interest and support for health promotion. It is based on an approach to health education and community organization that recognizes the value of a highly visible, time-limited event, such as a health fair, a marathon, or an…
The power of the adverse outcome pathway (AOP) framework arises from its utilization of pathway-based data to describe the initial interaction of a chemical with a molecular target (molecular initiating event; MIE), followed by a progression through a series of key events that l...
Representation of photon limited data in emission tomography using origin ensembles
NASA Astrophysics Data System (ADS)
Sitek, A.
2008-06-01
Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained using tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of the origins of detected events in three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable image (e.g. a grid-based image) is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a 10 000 000 event acquisition of simulated data ranged from 0.1 to 34.8%, depending on object size and sampling density. The method was also applied to experimental data, and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to the representation and reconstruction of data obtained by photon-limited emission tomography measurements.
A browser-based event display for the CMS experiment at the LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hategan, M.; McCauley, T.; Nguyen, P.
2012-01-01
The line between native and web applications is becoming increasingly blurred as modern web browsers are becoming powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications permit a way to deploy tools in a highly restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser supporting the HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to the display of events can have broader usage and impact for experts and public alike.
Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko
2016-07-01
A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of the stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis, such as picking P- and S-wave arrivals, the MFA approach offers the potential for a significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation of the induced microseismicity and novel insights into dynamic rupture processes, based on the average temporal (foreshock-aftershock) relationship of child events to parents.
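A bare-bones sketch of the detection step: normalized cross-correlation of continuous data against a parent template on each channel, stacked across channels, with child events declared at peaks above a threshold. The real MFA works on multicomponent downhole arrays with rotation into ray-centred coordinates and a statistically derived threshold; the fixed threshold here is illustrative.

```python
import numpy as np
from scipy.signal import correlate

def ncc(trace, template):
    """Normalized cross-correlation of one channel (template slides over trace)."""
    t = (template - template.mean()) / (template.std() * len(template))
    x = trace - trace.mean()
    num = correlate(x, t, mode="valid")
    # running variance of the trace for the normalization denominator
    c1 = np.cumsum(np.r_[0.0, x]); c2 = np.cumsum(np.r_[0.0, x * x])
    n = len(template)
    win_var = (c2[n:] - c2[:-n]) / n - ((c1[n:] - c1[:-n]) / n) ** 2
    return num / np.sqrt(np.maximum(win_var, 1e-12))

def detect_children(traces, templates, thresh=0.6):
    """Stack per-channel NCC over all channels; return candidate sample indices."""
    stack = np.mean([ncc(tr, tp) for tr, tp in zip(traces, templates)], axis=0)
    return np.flatnonzero(stack > thresh), stack
```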
Estimating the Probability of Rare Events Occurring Using a Local Model Averaging.
Chen, Jin-Hua; Chen, Chun-Shu; Huang, Meng-Fan; Lin, Hung-Chih
2016-10-01
In statistical applications, logistic regression is a popular method for analyzing binary data accompanied by explanatory variables. But when one of the two outcomes is rare, the estimation of model parameters has been shown to be severely biased and hence estimating the probability of rare events occurring based on a logistic regression model would be inaccurate. In this article, we focus on estimating the probability of rare events occurring based on logistic regression models. Instead of selecting a best model, we propose a local model averaging procedure based on a data perturbation technique applied to different information criteria to obtain different probability estimates of rare events occurring. Then an approximately unbiased estimator of Kullback-Leibler loss is used to choose the best one among them. We design complete simulations to show the effectiveness of our approach. For illustration, a necrotizing enterocolitis (NEC) data set is analyzed. © 2016 Society for Risk Analysis.
Market-based control mechanisms for patient safety
Coiera, E; Braithwaite, J
2009-01-01
A new model is proposed for enhancing patient safety using market-based control (MBC), inspired by successful approaches to environmental governance. Emissions trading, enshrined in the Kyoto Protocol, set a carbon price and created a carbon market; is it possible to set a patient safety price and let the marketplace find ways of reducing clinical adverse events? To “cap and trade,” a regulator would need to establish system-wide and organisation-specific targets based on the cost of adverse events, create a safety market for trading safety credits, and then police the market. Organisations are given a clear policy signal to reduce adverse event rates, are told by how much, but are free to find the mechanisms best suited to their local needs. The market would inevitably generate novel ways of creating safety credits, and accountability becomes hard to evade when adverse events are explicitly measured and accounted for in an organisation's bottom line. PMID:19342522
A practical approach to screen for authorised and unauthorised genetically modified plants.
Waiblinger, Hans-Ulrich; Grohmann, Lutz; Mankertz, Joachim; Engelbert, Dirk; Pietsch, Klaus
2010-03-01
In routine analysis, screening methods based on real-time PCR are most commonly used for the detection of genetically modified (GM) plant material in food and feed. In this paper, it is shown that the combination of five DNA target sequences can be used as a universal screening approach for at least 81 GM plant events authorised or unauthorised for placing on the market and described in publicly available databases. Except for maize event LY038, soybean events DP-305423 and BPS-CV127-9 and cotton event 281-24-236 x 3006-210-23, at least one of the five genetic elements has been inserted in these GM plants and is targeted by this screening approach. For the detection of these sequences, fully validated real-time PCR methods have been selected. A screening table is presented that describes the presence or absence of the target sequences for most of the listed GM plants. These data have been verified either theoretically according to available databases or experimentally using available reference materials. The screening table will be updated regularly by a network of German enforcement laboratories.
NASA Astrophysics Data System (ADS)
Gou, Y.
2017-12-01
Quantitative Precipitation Estimation (QPE) is one of the important applications of weather radars. However, in complex terrain such as the Tibetan Plateau, it is a challenging task to obtain an optimal Z-R relation due to the complex space-time variability in precipitation microphysics. This paper develops two radar QPE schemes, based respectively on Reflectivity Threshold (RT) and Storm Cell Identification and Tracking (SCIT) algorithms, using observations from 11 Doppler weather radars and 3294 rain gauges over the Eastern Tibetan Plateau (ETP). These two QPE methodologies are evaluated extensively using four precipitation events that are characterized by different meteorological features. Precipitation characteristics of independent storm cells associated with these four events, as well as the storm-scale differences, are investigated using short-term vertical profiles of reflectivity clusters. Evaluation results show that the SCIT-based rainfall approach performs better than the simple RT-based method in all precipitation events in terms of score comparison using validation gauge measurements as references, with higher correlation (in 75.74 % of the comparative frames), lower mean absolute error (82.38 %) and lower root-mean-square error (89.04 %). It is also found that the SCIT-based approach can effectively mitigate local radar QPE error and represent precipitation spatiotemporal variability better than the RT-based scheme.
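At the core of any reflectivity-based QPE is a Z-R power law, Z = a·R^b. The coefficients below are the classic Marshall-Palmer values for illustration only; a cell-adaptive scheme such as the SCIT-based one would tune them per storm cell rather than use fixed values.

```python
import numpy as np

def rain_rate_mm_per_h(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b for rain rate R (mm/h) from reflectivity in dBZ."""
    z_linear = 10.0 ** (dbz / 10.0)      # dBZ -> Z in mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

print(rain_rate_mm_per_h(np.array([20.0, 35.0, 50.0])).round(2))
# light (~0.6), moderate (~5.6), heavy (~48.6) mm/h
```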
A Process Study of the Development of Virtual Research Environments
NASA Astrophysics Data System (ADS)
Ahmed, I.; Cooper, K.; McGrath, R.; Griego, G.; Poole, M. S.; Hanisch, R. J.
2014-05-01
In recent years, cyberinfrastructures have been deployed to create virtual research environments (VREs) - such as the Virtual Astronomical Observatory (VAO) - to enhance the quality and speed of scientific research, and to foster global scientific communities. Our study utilizes process methodology to study the evolution of VREs. This approach focuses on a series of events that bring about or lead to some outcome, and attempts to specify the generative mechanism that could produce the event series. This paper briefly outlines our approach and describes initial results of a case study of the VAO, one of the participating VREs. The case study is based on interviews with seven individuals participating in the VAO, and on analysis of project documents and online resources. These sources are hand-tagged to identify events related to the thematic tracks, to yield a narrative of the project. Results demonstrate the event series of an organization through traditional methods augmented by virtual sources.
GAC: Gene Associations with Clinical, a web based application.
Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne
2017-01-01
We present GAC, a Shiny R-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that uses high-dimensional data, such as gene expression, combined with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results based on a forest plot for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analysis to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The developmental repository is available at https://github.com/manalirupji/GAC.
NASA Astrophysics Data System (ADS)
Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo
2017-10-01
This study presents a storm-event based bivariate frequency analysis approach to determine design rainfalls, in which the number, intensity and duration of actual rainstorm events were considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach could give a more realistic description of the rainfall characteristics of rainstorm events and design rainfalls. As a result, the design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This stems from the occurrence probability assigned to each rainstorm event and from a different angle on rainfall frequency analysis, and it could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
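A sketch of the storm-event based idea: fit margins to event intensity and duration, couple them with a copula, and convert the joint exceedance probability into a return period using the mean number of independent rainstorm events per year. The Gumbel-Hougaard family is a common choice for such analyses, but the paper's fitted copula, margins, and all parameter values below are assumptions.

```python
import numpy as np
from scipy import stats

def gumbel_copula(u, v, theta=2.0):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence."""
    lu, lv = (-np.log(u)) ** theta, (-np.log(v)) ** theta
    return np.exp(-((lu + lv) ** (1.0 / theta)))

events_per_year = 60.0                      # mean number of storm events/year
i, d = 60.0, 24.0                           # design intensity (mm/h), duration (h)
u = stats.gamma.cdf(i, a=1.2, scale=8.0)    # fitted intensity margin (assumed)
v = stats.expon.cdf(d, scale=4.0)           # fitted duration margin (assumed)

# "OR" return period: either intensity or duration is exceeded
t_or = 1.0 / (events_per_year * (1.0 - gumbel_copula(u, v)))
print(f"OR-joint return period: {t_or:.1f} years")
```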
Jiang, Guoqian; Wang, Liwei; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G
2013-01-01
A semantically coded knowledge base of adverse drug events (ADEs) with severity information is critical for clinical decision support systems and translational research applications. However, it remains challenging to measure and identify the severity information of ADEs. The objective of this study is to develop and evaluate a semantic-web-based approach to building a knowledge base of severe ADEs based on FDA Adverse Event Reporting System (AERS) reporting data. We utilized a normalized AERS reporting dataset and extracted putative drug-ADE pairs and their associated outcome codes in the domain of cardiac disorders. We validated the drug-ADE associations using ADE datasets from the Side Effect Resource (SIDER) and the UMLS. We leveraged the Common Terminology Criteria for Adverse Events (CTCAE) grading system and classified the ADEs into CTCAE grades in the Web Ontology Language (OWL). We identified and validated 2,444 unique drug-ADE pairs in the domain of cardiac disorders, of which 760 pairs are in Grade 5, 775 pairs in Grade 4 and 2,196 pairs in Grade 3.
the ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents: food allergens, nutrients, pathogenic microorganisms, and ...
Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.
Soleimani, Hossein; Hensman, James; Saria, Suchi
2017-08-21
Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
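The abstaining decision policy can be sketched in a few lines. The cost-ratio threshold is the standard Bayes-decision form; the abstention margin here is a simple stand-in for the paper's derived confidence criteria, which are based on the joint model's posterior.

```python
def decide(p_event, cost_miss=10.0, cost_false_alarm=1.0, margin=0.05):
    """Alert when the estimated event probability clears the cost-based
    threshold; abstain when the estimate sits too close to it to commit."""
    threshold = cost_false_alarm / (cost_false_alarm + cost_miss)
    if abs(p_event - threshold) < margin:
        return "abstain"                  # uncertainty too high to commit
    return "alert" if p_event > threshold else "no-alert"

for p in (0.02, 0.11, 0.70):
    print(p, decide(p))                   # no-alert, abstain, alert
```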
Symbolic Processing Combined with Model-Based Reasoning
NASA Technical Reports Server (NTRS)
James, Mark
2009-01-01
A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.
Deep Recurrent Neural Network-Based Autoencoders for Acoustic Novelty Detection
Vesperini, Fabio; Schuller, Björn
2017-01-01
In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short-term frame are predicted from the previous frames by means of Long Short-Term Memory recurrent denoising autoencoders. The reconstruction error between the input and the output of the autoencoder is used as the activation signal to detect novel events. There is no evidence of studies focused on comparing previous efforts to automatically recognize novel events from audio signals and giving a broad and in-depth evaluation of recurrent neural network-based autoencoders. The present contribution aims to consistently evaluate our recent novel approaches to fill this gap in the literature and provide insight by extensive evaluations carried out on three databases: A3Novelty, PASCAL CHiME, and PROMETHEUS. Besides providing an extensive analysis of novel and state-of-the-art methods, the article shows how RNN-based autoencoders outperform statistical approaches by up to 16.4 % absolute in average F-measure over the three databases. PMID:28182121
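A compact PyTorch sketch of the prediction-based scheme described above: an LSTM regresses the next spectral frame from past frames, trained on normal audio only, and the per-frame reconstruction error serves as the novelty activation signal. Architecture sizes, training length, and the random stand-in data are all illustrative, not the paper's configuration.

```python
import torch
import torch.nn as nn

class NextFramePredictor(nn.Module):
    def __init__(self, n_bands=40, hidden=128):
        super().__init__()
        self.rnn = nn.LSTM(n_bands, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_bands)

    def forward(self, x):                  # x: (batch, T, n_bands)
        h, _ = self.rnn(x)
        return self.out(h)                 # prediction of frame t+1 at step t

model = NextFramePredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(8, 100, 40)                # stand-in "normal" spectral frames
for _ in range(10):                        # train on normal data only
    loss = loss_fn(model(x[:, :-1]), x[:, 1:])
    opt.zero_grad(); loss.backward(); opt.step()

# At test time, frames whose prediction error exceeds a threshold estimated
# on normal data are flagged as novel events.
with torch.no_grad():
    err = ((model(x[:, :-1]) - x[:, 1:]) ** 2).mean(dim=2)   # (batch, T-1)
```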
Estimating the probability of rare events: addressing zero failure data.
Quigley, John; Revie, Matthew
2011-07-01
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
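The closing approximation is directly usable; for context, it is noticeably less pessimistic than, say, the familiar "rule of three" upper bound 3/n (the comparison below is an illustration, not one drawn from the article).

```python
def minimax_zero_event_estimate(n):
    """Approximate minimax probability estimate after n trials, zero events."""
    return 1.0 / (2.5 * n)

for n in (10, 100, 1000):
    print(n, minimax_zero_event_estimate(n), 3.0 / n)  # vs. rule-of-three bound
```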
NASA Astrophysics Data System (ADS)
Stancanelli, Laura Maria; Peres, David Johnny; Cancelliere, Antonino; Foti, Enrico
2017-07-01
Rainfall-induced shallow slides can evolve into debris flows that move rapidly downstream with devastating consequences. Mapping the susceptibility to debris flow is an important aid for risk mitigation. We propose a novel practical approach to derive debris flow inundation maps useful for susceptibility assessment, based on the integrated use of DEM-based spatially distributed hydrological and slope stability models with debris flow propagation models. More specifically, the TRIGRS infiltration and infinite slope stability model and the FLO-2D model for the simulation of the related debris flow propagation and deposition are combined. An empirical instability-to-debris-flow triggering threshold, calibrated on the basis of observed events, is applied to link the two models and to accomplish the task of determining the amount of unstable mass that develops as a debris flow. Calibration of the proposed methodology is carried out using real data from the debris flow event that occurred on 1 October 2009 in the Peloritani mountains area (Italy). Model performance, assessed by receiver-operating-characteristic (ROC) indexes, evidences fairly good reproduction of the observed event. Comparison with the performance of the traditional debris flow modeling procedure, in which sediment and water hydrographs are input as lumped quantities at selected points at the top of the streams, is also performed, in order to assess quantitatively the limitations of such a commonly applied approach. Results show that the proposed method, besides being more process-consistent than the traditional hydrograph-based approach, can potentially provide a more accurate simulation of debris-flow phenomena, in terms of spatial patterns of erosion and deposition as well as in the quantification of mobilized volumes and depths, avoiding overestimation of the debris flow triggering volume and, thus, of maximum inundation flow depths.
Echterhoff, Gerald; Hirst, William
2006-06-01
Extant research shows that people use retrieval ease, a feeling-based cue, to judge how well they remember life periods. Extending this approach, we investigated the role of retrieval ease in memory judgments for single events. In Experiment 1, participants who were asked to recall many memories of an everyday event (New Year's Eve) rated retrieval as more difficult and judged their memory as worse than did participants asked to recall only a few memories. In Experiment 2, this ease-of-retrieval effect was found to interact with the shocking character of the remembered event: There was no effect when the event was highly shocking (i.e., learning about the attacks of September 11, 2001), whereas an effect was found when the event was experienced as less shocking (due either to increased distance to "9/11" or to the nonshocking nature of the event itself). Memory vividness accounted for additional variance in memory judgments, indicating an independent contribution of content-based cues in judgments of event memories.
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
Physics-based, Bayesian sequential detection method and system for radioactive contraband
Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E
2014-03-18
A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy) low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing built on the representation of a radionuclide as a decomposition into monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence interval condition-based discriminator for the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS is either identified as the target radionuclide or not, and if not, then repeating the process for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
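A generic sequential probability ratio test (SPRT) of the kind described can be sketched as follows; the thresholds are Wald's classical bounds set by the allowed error rates, and the density functions here are placeholders for the physics-based monoenergetic source models.

```python
import math

def sprt(events, pdf_target, pdf_other, alpha=0.01, beta=0.01):
    """Wald SPRT: accumulate the log-likelihood ratio over photon events
    (e.g. (energy, interarrival-time) pairs) until a threshold is crossed."""
    upper = math.log((1 - beta) / alpha)      # accept "target radionuclide"
    lower = math.log(beta / (1 - alpha))      # reject
    llr = 0.0
    for e in events:
        llr += math.log(pdf_target(e) / pdf_other(e))
        if llr >= upper:
            return "target identified"
        if llr <= lower:
            return "not the target"
    return "undecided - keep measuring"
```

Usage only requires two callables returning densities for an event under each hypothesis; the sequential structure means a decision is typically reached long before all events are consumed.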
An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.
Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein
2017-12-22
The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
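A sketch of the event-triggered idea: locate the impact (the acceleration-magnitude peak above a trigger level) and carve pre-impact, impact, and post-impact segments around it so that features align with fall stages. The trigger level and window lengths below are illustrative, not the paper's tuned values.

```python
import numpy as np

def event_triggered_segments(acc_mag, fs=50, trigger_g=1.8,
                             pre_s=1.0, impact_s=0.4, post_s=1.0):
    """acc_mag: 1-D acceleration magnitude in g, sampled at fs (Hz).
    Returns (pre, impact, post) stage segments, or None if no trigger."""
    above = np.flatnonzero(acc_mag > trigger_g)
    if above.size == 0:
        return None                           # no candidate fall event
    peak = above[np.argmax(acc_mag[above])]   # strongest exceedance = impact
    h = int(impact_s / 2 * fs)
    pre = acc_mag[max(0, peak - int(pre_s * fs) - h):max(0, peak - h)]
    impact = acc_mag[max(0, peak - h):peak + h]
    post = acc_mag[peak + h:peak + h + int(post_s * fs)]
    return pre, impact, post                  # per-stage feature extraction follows
```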
Considering context: reliable entity networks through contextual relationship extraction
NASA Astrophysics Data System (ADS)
David, Peter; Hawes, Timothy; Hansen, Nichole; Nolan, James J.
2016-05-01
Existing information extraction techniques can only partially address the problem of exploiting unreadably large amounts of text. When discussion of events and relationships is limited to simple, past-tense, factual descriptions of events, current NLP-based systems can identify events and relationships and extract a limited amount of additional information. But the simple subset of available information that existing tools can extract from text is only useful to a small set of users and problems. Automated systems need to find and separate information based on whether it is threatened or planned to occur, has occurred in the past, or could potentially occur. We address the problem of advanced event and relationship extraction with our event and relationship attribute recognition system, which labels generic, planned, recurring, and potential events. The approach is based on a combination of new machine learning methods, novel linguistic features, and crowd-sourced labeling. The attribute labeler closes the gap between structured event and relationship models and the complicated and nuanced language that people use to describe them. Our operational-quality event and relationship attribute labeler enables Warfighters and analysts to more thoroughly exploit information in unstructured text. This is made possible through 1) more precise event and relationship interpretation, 2) more detailed information about extracted events and relationships, and 3) more reliable and informative entity networks that acknowledge the different attributes of entity-entity relationships.
Kazaryan, Airazat M.; Røsok, Bård I.; Edwin, Bjørn
2013-01-01
Background. Morbidity is a cornerstone of assessing surgical treatment; nevertheless, surgeons have not reached extensive consensus on this problem. Methods and Findings. Clavien, Dindo, and Strasberg with coauthors (1992, 2004, 2009, and 2010) made significant efforts toward the standardization of surgical morbidity (the Clavien-Dindo-Strasberg classification; last revision, the Accordion classification). However, this classification includes only postoperative complications and has two principal shortcomings: disregard of intraoperative events and confusing terminology. Postoperative events have a major impact on patient well-being. However, intraoperative events should also be recorded and reported even if they do not evidently affect the patient's postoperative well-being. The term surgical complication, as applied in the Clavien-Dindo-Strasberg classification, may be read as implying an incident caused by technical failure of surgery, in contrast to the so-called medical complications. The term surgical complication therefore contributes to misinterpretation of perioperative morbidity. The term perioperative adverse events, comprising both intraoperative unfavourable incidents and postoperative complications, could be regarded as a better alternative. In 2005, Satava suggested a simple grading to evaluate intraoperative surgical errors. Based on that approach, we have elaborated a 3-grade classification of intraoperative incidents so that it can be used to grade intraoperative events of any type of surgery. Refinements have been made to the Accordion classification of postoperative complications. Interpretation. The proposed systematization of perioperative adverse events, utilizing the combined application of two appraisal tools, that is, the elaborated classification of intraoperative incidents on the basis of the Satava approach to surgical error evaluation together with the modified Accordion classification of postoperative complications, appears to be an effective tool for comprehensive assessment of surgical outcomes. This concept was validated in regard to various surgical procedures. Broad implementation of this approach will promote the development of surgical science and practice. PMID:23762627
Non-Lipschitz Dynamics Approach to Discrete Event Systems
NASA Technical Reports Server (NTRS)
Zak, M.; Meyers, R.
1995-01-01
This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics, which allows one to relax Lipschitz conditions at some discrete points.
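The finite-time behavior that the terminal model exploits can be made concrete with a standard non-Lipschitz example; the following derivation is illustrative of the general idea and is not taken from the paper itself.

```latex
% Terminal attractor: the right-hand side is non-Lipschitz at x = 0
\dot{x} = -x^{1/3}, \qquad x(0) = x_0 > 0 .
% Separation of variables gives
x(t) = \left(x_0^{2/3} - \tfrac{2}{3}\,t\right)^{3/2},
% so the equilibrium x = 0 is reached in the finite time
t^{*} = \tfrac{3}{2}\, x_0^{2/3} .
% This is possible only because \partial\dot{x}/\partial x = -\tfrac{1}{3}x^{-2/3}
% diverges as x \to 0, i.e., the Lipschitz condition fails at the equilibrium.
```

In a Lipschitz system a trajectory can only approach an equilibrium asymptotically; relaxing the condition at discrete points is what permits the event-like, finite-time transitions needed to mimic discrete event dynamics.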
Synergy between Molecular and Contextual Views of Coping among Four Ethnic Groups of Older Adults
ERIC Educational Resources Information Center
Conway, Francine; Magai, Carol; McPherson-Salandy, Renee; Milano, Kate
2010-01-01
The coping styles of four ethnic groups of older adults in response to negative life events were analyzed in a population-based study of 1118 residents of Brooklyn, New York. Using a molecular approach, data regarding the context of events and the corresponding coping responses were obtained. Open-ended semi-structured interviews allowed…
Large Scale Meteorological Pattern of Extreme Rainfall in Indonesia
NASA Astrophysics Data System (ADS)
Kuswanto, Heri; Grotjahn, Richard; Rachmi, Arinda; Suhermi, Novri; Oktania, Erma; Wijaya, Yosep
2014-05-01
Extreme Weather Events (EWEs) cause negative impacts socially, economically, and environmentally, so forecasting EWEs is crucial work. Indonesia has been identified as being among the countries most vulnerable to the risk of natural disasters, such as floods, heat waves, and droughts. Current forecasting of extreme events in Indonesia is carried out by interpreting synoptic maps for several fields without taking into account the link between the observed events in the 'target' area and remote conditions. This situation may cause misidentification of the event, leading to an inaccurate prediction. Grotjahn and Faure (2008) compute composite maps from extreme events (including heat waves and intense rainfall) to help forecasters identify such events in model output. The composite maps show large-scale meteorological patterns (LSMPs) that occurred during historical EWEs. Some vital information about the EWEs can be acquired from studying such maps, in addition to providing forecaster guidance. Such maps have robust mid-latitude meteorological patterns (for Sacramento and California Central Valley, USA, EWEs). We study the performance of the composite approach for tropical weather conditions such as those of Indonesia. Initially, the composite maps are developed to identify and forecast extreme weather events in Indramayu district, West Java, the main rice producer in Indonesia, contributing about 60% of the national total rice production. Studying extreme weather events happening in Indramayu is important since EWEs there affect national agricultural and fisheries activities. During a recent EWE, more than a thousand houses in Indramayu suffered serious flooding, with each home more than one meter underwater. The flood also destroyed a thousand hectares of rice plantings in 5 regencies. Identifying the dates of extreme events is one of the most important steps and has to be carried out carefully. An approach involving observations from multiple sites (rain gauges) has been applied to identify the dates. The approach combines POT (Peaks Over Threshold) with 'declustering' of the data to approximate independence based on the autocorrelation structure of each rainfall series. The cross-correlation among sites is also considered in developing the event criteria, yielding a rational choice of the extreme dates given the 'spotty' nature of the intense convection. Based on the identified dates, we are developing a supporting tool for forecasting extreme rainfall based on the corresponding large-scale meteorological patterns (LSMPs). The LSMP methodology focuses on the larger-scale patterns that the models are better able to forecast, as those larger-scale patterns create the conditions fostering the local EWE. A bootstrap resampling method is applied to highlight the key features that are statistically significantly associated with the extreme events. Grotjahn, R., and G. Faure, 2008: Composite Predictor Maps of Extraordinary Weather Events in the Sacramento California Region. Weather and Forecasting, 23, 313-335.
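The date-identification step described above can be sketched compactly. The following is a minimal, hedged illustration of peaks-over-threshold selection with runs declustering; the threshold quantile and separation window are invented placeholders, not values from the study.

```python
import numpy as np

def pot_decluster(rain, threshold, min_sep):
    """Peaks-over-threshold with runs declustering: exceedances closer
    together than min_sep days are treated as one event (cluster), and
    only the cluster maximum is kept as the event date."""
    exceed = np.flatnonzero(rain > threshold)
    if exceed.size == 0:
        return np.array([], dtype=int)
    events, cluster = [], [exceed[0]]
    for i in exceed[1:]:
        if i - cluster[-1] <= min_sep:
            cluster.append(i)                      # same dependent cluster
        else:
            events.append(cluster[np.argmax(rain[cluster])])
            cluster = [i]
    events.append(cluster[np.argmax(rain[cluster])])
    return np.asarray(events)

rng = np.random.default_rng(1)
rain = rng.gamma(0.4, 12.0, size=30 * 365)         # toy 30-year daily series
dates = pot_decluster(rain, threshold=np.quantile(rain, 0.98), min_sep=3)
```

In practice min_sep would be chosen from the autocorrelation structure of each gauge's series, and candidate dates from multiple gauges would then be merged using their cross-correlations.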
Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model
NASA Astrophysics Data System (ADS)
Anderson, K. R.
2016-12-01
Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics-based, mixed deterministic-probabilistic eruption forecasting approach in reducing and quantifying these uncertainties.
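The mixed deterministic-probabilistic logic described here can be sketched in a few lines: the deterministic part propagates an assumed eruption-rate model forward, while the stochastic part marginalizes over the unknown event duration. All parameter values below are invented placeholders, not estimates from Kīlauea data.

```python
import numpy as np

rng = np.random.default_rng(0)

q0   = 2.0                      # assumed nominal eruption rate, m^3/s
drop = 0.8                      # assumed fractional rate reduction in an event
mu, sigma = np.log(30.0), 0.5   # assumed lognormal event duration (hours)

def forecast_volume(horizon_h, n=100_000):
    """Cumulative erupted volume over a forecast horizon, marginalizing
    over the stochastically modeled (unknown) event duration."""
    D = rng.lognormal(mu, sigma, size=n)          # sampled durations, hours
    t_low  = np.minimum(horizon_h, D)             # time at reduced rate
    t_full = np.maximum(0.0, horizon_h - D)       # time back at nominal rate
    return 3600.0 * (q0 * (1.0 - drop) * t_low + q0 * t_full)  # m^3

v = forecast_volume(24.0)
print(np.percentile(v, [5, 50, 95]))  # forecast as a probability distribution
```

A full Bayesian treatment would also propagate posterior uncertainty in q0, drop, and the duration parameters, conditioned on deformation and lava-lake observations.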
Spatial event cluster detection using an approximate normal distribution.
Torabi, Mahmoud; Rosychuk, Rhonda J
2008-12-12
In geographic surveillance of disease, areas with large numbers of disease cases are to be identified so that investigations of the causes of high disease rates can be pursued. Areas with high rates are called disease clusters, and statistical cluster detection tests are used to identify geographic areas with higher disease rates than expected by chance alone. Typically cluster detection tests are applied to incident or prevalent cases of disease, but surveillance of disease-related events, where an individual may have multiple events, may also be of interest. Previously, a compound Poisson approach that detects clusters of events by testing individual areas that may be combined with their neighbours has been proposed. However, the relevant probabilities from the compound Poisson distribution are obtained from a recursion relation that can be cumbersome if the number of events is large or if analyses by strata are performed. We propose a simpler approach that uses an approximate normal distribution. This method is very easy to implement and is applicable to situations where the population sizes are large and the population distribution by important strata may differ by area. We demonstrate the approach on pediatric self-inflicted injury presentations to emergency departments and compare the results for probabilities based on the recursion and the normal approach. We also implement a Monte Carlo simulation to study the performance of the proposed approach. In a self-inflicted injury data example, the normal approach identifies twelve out of thirteen of the same clusters as the compound Poisson approach, noting that the compound Poisson method detects twelve significant clusters in total. Through simulation studies, the normal approach well approximates the compound Poisson approach for a variety of different population sizes and case and event thresholds. A drawback of the compound Poisson approach is that the relevant probabilities must be determined through a recursion relation, and such calculations can be computationally intensive if the cluster size is relatively large or if analyses are conducted with strata variables. On the other hand, the normal approach is very flexible and easily implemented, and hence more appealing for users. Moreover, the concepts may be more easily conveyed to non-statisticians interested in understanding the methodology associated with cluster detection test results.
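A minimal sketch of the proposed test statistic follows; the counts, expectation, and variance are invented placeholders, and in the actual method the expectation and variance are built from strata-specific population rates for the area combined with its neighbours.

```python
import numpy as np
from scipy.stats import norm

def cluster_test(obs_events, expected, var_events):
    """Approximate-normal test that a candidate cluster has more
    disease-related events than expected by chance alone."""
    z = (obs_events - expected) / np.sqrt(var_events)
    return z, norm.sf(z)                 # one-sided p-value

z, p = cluster_test(obs_events=87, expected=61.4, var_events=75.2)
print(z, p)
```

This replaces the recursive evaluation of compound Poisson probabilities with a single closed-form tail probability, which is what makes the approach cheap for large populations and stratified analyses.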
Rahman, Md Motiur; Alatawi, Yasser; Cheng, Ning; Qian, Jingjing; Peissig, Peggy L; Berg, Richard L; Page, David C; Hansen, Richard A
2017-12-01
The US Food and Drug Administration Adverse Event Reporting System (FAERS), a post-marketing safety database, can be used to differentiate brand versus generic safety signals. To explore the methods for identifying and analyzing brand versus generic adverse event (AE) reports. Public release FAERS data from January 2004 to March 2015 were analyzed using alendronate and carbamazepine as examples. Reports were classified as brand, generic, and authorized generic (AG). Disproportionality analyses compared reporting odds ratios (RORs) of selected known labeled serious adverse events stratifying by brand, generic, and AG. The homogeneity of these RORs was compared using the Breslow-Day test. The AG versus generic was the primary focus since the AG is identical to brand but marketed as a generic, therefore minimizing generic perception bias. Sensitivity analyses explored how methodological approach influenced results. Based on 17,521 US event reports involving alendronate and 3733 US event reports involving carbamazepine (immediate and extended release), no consistently significant differences were observed across RORs for the AGs versus generics. Similar results were obtained when comparing reporting patterns over all time and just after generic entry. The most restrictive approach for classifying AE reports yielded smaller report counts but similar results. Differentiation of FAERS reports as brand versus generic requires careful attention to risk of product misclassification, but the relative stability of findings across varying assumptions supports the utility of these approaches for potential signal detection.
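The disproportionality measure used here, the reporting odds ratio, is simple to compute from a 2x2 table of reports; the sketch below uses invented counts for illustration.

```python
import numpy as np

def ror(a, b, c, d):
    """Reporting odds ratio with a 95% CI.

    a: reports of the target AE for product 1 (e.g., authorized generic)
    b: all other AE reports for product 1
    c: reports of the target AE for product 2 (e.g., generic)
    d: all other AE reports for product 2
    """
    est = (a / b) / (c / d)
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(ROR)
    lo, hi = np.exp(np.log(est) + np.array([-1.96, 1.96]) * se)
    return est, lo, hi

print(ror(a=120, b=5230, c=95, d=6100))          # illustrative counts only
```

Comparing stratum-specific RORs for homogeneity (the Breslow-Day step in the study) then asks whether the brand, generic, and AG strata share a common odds ratio.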
Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie
2018-05-22
Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell-conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4 + T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitments at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4 + T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.
Drug delivery, cell-based therapies, and tissue engineering approaches for spinal cord injury.
Kabu, Shushi; Gao, Yue; Kwon, Brian K; Labhasetwar, Vinod
2015-12-10
Spinal cord injury (SCI) results in devastating neurological and pathological consequences, causing major dysfunction to the motor, sensory, and autonomic systems. The primary traumatic injury to the spinal cord triggers a cascade of acute and chronic degenerative events, leading to further secondary injury. Many therapeutic strategies have been developed to potentially intervene in these progressive neurodegenerative events and minimize secondary damage to the spinal cord. Additionally, significant efforts have been directed toward regenerative therapies that may facilitate neuronal repair and establish connectivity across the injury site. Despite the promise that these approaches have shown in preclinical animal models of SCI, challenges with respect to successful clinical translation still remain. The factors that could have contributed to failure include important biologic and physiologic differences between the preclinical models and the human condition, study designs that do not mirror clinical reality, discrepancies in dosing and the timing of therapeutic interventions, and dose-limiting toxicity. With a better understanding of the pathobiology of events following acute SCI, developing integrated approaches aimed at preventing secondary damage and also facilitating neuroregenerative recovery is possible and hopefully will lead to effective treatments for this devastating injury. The focus of this review is to highlight the progress that has been made in drug therapies and delivery systems, and also cell-based and tissue engineering approaches for SCI. Copyright © 2015 Elsevier B.V. All rights reserved.
An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.
Nguyen, Ngan; Watson, William D; Dominguez, Edward
2016-01-01
Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and to discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., the competencies being trained and the learning objectives) and performance assessment. The EBAT process involves defining the targeted competencies and learning objectives, framing a clinical scenario around critical events, specifying the targeted responses (knowledge, skills, and attitudes, KSAs) to those events, and assessing performance against them. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on the existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) it provides formative and constructive feedback to bridge the gap between the learners' KSAs and the targeted KSAs. The EBAT methodology guides the design of simulation that incorporates these 4 features and, thus, enhances training effectiveness with simulation. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Xu, Stanley; Clarke, Christina L; Newcomer, Sophia R; Daley, Matthew F; Glanz, Jason M
2018-05-16
Vaccine safety studies are often electronic health record (EHR)-based observational studies. These studies often face significant methodological challenges, including confounding and misclassification of adverse events. Vaccine safety researchers use the self-controlled case series (SCCS) study design to handle confounding and employ medical chart review to ascertain cases that are identified using EHR data. However, for common adverse events, limited resources often make it impossible to adjudicate all adverse events observed in electronic data. In this paper, we considered four approaches for analyzing SCCS data with confirmation rates estimated from an internal validation sample: (1) observed cases, (2) confirmed cases only, (3) known confirmation rate, and (4) multiple imputation (MI). We conducted a simulation study to evaluate these four approaches using type I error rates, percent bias, and empirical power. Our simulation results suggest that when misclassification of adverse events is present, the observed-cases, confirmed-cases-only, and known-confirmation-rate approaches may inflate the type I error, yield biased point estimates, and affect statistical power. The multiple imputation approach accounts for the uncertainty of confirmation rates estimated from an internal validation sample and yields a proper type I error rate, a largely unbiased point estimate, a proper variance estimate, and adequate statistical power. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
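The multiple-imputation idea can be sketched as follows. This is a deliberately simplified stand-in for SCCS estimation (a crude Poisson rate-ratio contrast of risk and control windows rather than a conditional within-person likelihood); the counts and validation-sample figures are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def mi_rate_ratio(n_exp, n_unexp, t_exp, t_unexp, k_conf, m_sampled, M=100):
    """MI for outcome misclassification: draw a confirmation rate from the
    validation sample's Beta posterior, impute true case counts, estimate,
    then pool with Rubin's rules on the log scale."""
    logs, vars_ = [], []
    for _ in range(M):
        p = rng.beta(k_conf + 1, m_sampled - k_conf + 1)
        y1, y0 = rng.binomial(n_exp, p), rng.binomial(n_unexp, p)
        if y1 == 0 or y0 == 0:
            continue
        logs.append(np.log((y1 / t_exp) / (y0 / t_unexp)))
        vars_.append(1.0 / y1 + 1.0 / y0)        # Poisson log-IRR variance
    logs, vars_ = np.asarray(logs), np.asarray(vars_)
    qbar = logs.mean()                           # Rubin's rules pooling
    tvar = vars_.mean() + (1 + 1 / len(logs)) * logs.var(ddof=1)
    return np.exp(qbar), tvar

print(mi_rate_ratio(42, 310, 1200.0, 14800.0, k_conf=36, m_sampled=50))
```

The essential point, matching the simulation findings, is that the between-imputation variance term carries the uncertainty in the estimated confirmation rate into the final interval, which the known-confirmation-rate approach ignores.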
Kariuki, Jacob K; Gona, Philimon; Leveille, Suzanne G; Stuart-Shor, Eileen M; Hayman, Laura L; Cromwell, Jerry
2018-06-01
The non-lab Framingham algorithm, which substitutes body mass index for lipids in the laboratory-based (lab-based) Framingham algorithm, has been validated among African Americans (AAs). However, its cost-effectiveness and economic tradeoffs have not been evaluated. This study examines the incremental cost-effectiveness ratio (ICER) of two cardiovascular disease (CVD) prevention programs guided by the non-lab versus lab-based Framingham algorithm. We simulated the World Health Organization CVD prevention guidelines on a cohort of 2690 AA participants in the Atherosclerosis Risk in Communities (ARIC) cohort. Costs were estimated using Medicare fee schedules (diagnostic tests, drugs, and visits), Bureau of Labor Statistics data (RN wages), and estimates for managing incident CVD events. Outcomes were assumed to be true positive cases detected at a data-driven treatment threshold. Both algorithms had the best balance of sensitivity/specificity at the moderate risk threshold (>10% risk). Over 12 years, 82% and 77% of 401 incident CVD events were accurately predicted via the non-lab and lab-based Framingham algorithms, respectively. There were 20 fewer false negative cases with the non-lab approach, translating into over $900,000 in savings over 12 years. The ICER was -$57,153 for every extra CVD event prevented when using the non-lab algorithm. The approach guided by the non-lab Framingham strategy dominated the lab-based approach with respect to both costs and predictive ability. Consequently, the non-lab Framingham algorithm could potentially provide a highly effective screening tool at lower cost to address the high burden of CVD, especially among AAs and in resource-constrained settings where lab tests are unavailable. Copyright © 2017 Elsevier Inc. All rights reserved.
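The headline comparison reduces to the standard incremental cost-effectiveness ratio. The sketch below uses event counts consistent with the abstract's detection fractions (82% and 77% of 401 events); the cost totals are invented placeholders chosen only so the ratio reproduces the reported ICER.

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: incremental cost per
    additional unit of effect (here, per extra CVD event detected)."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

events_nonlab = round(0.82 * 401)    # 329 events detected
events_lab    = round(0.77 * 401)    # 309 events detected
# Placeholder costs; a negative ICER with more events detected means
# the non-lab strategy dominates (cheaper and more effective).
print(icer(cost_new=2_000_000, cost_ref=3_143_060,
           effect_new=events_nonlab, effect_ref=events_lab))   # -57153.0
```

A dominant strategy like this needs no willingness-to-pay threshold: it detects more events and costs less, so the choice is unambiguous on both axes.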
Gould, A Lawrence
2016-12-30
Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
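A minimal CUSUM sketch for pooled (blinded) event counts follows; the reference value and decision limit are invented, whereas in the Bayesian construction described here they would be derived from the informative prior for the event-generating process.

```python
import numpy as np

def cusum(counts, k, h):
    """One-sided CUSUM: accumulate excesses of each interval's pooled AE
    count over the reference value k; signal when the sum crosses h."""
    s, path = 0.0, []
    for x in counts:
        s = max(0.0, s + (x - k))
        path.append(s)
    alarm = next((i for i, v in enumerate(path) if v > h), None)
    return np.asarray(path), alarm

path, alarm = cusum([3, 2, 4, 3, 6, 7, 8, 9], k=4.0, h=6.0)
print(path, alarm)       # alarm gives the first interval crossing the limit
```

Because only pooled counts enter the statistic, the chart can run while treatment assignment remains blinded, with the prior encoding what 'no between-arm difference' should look like.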
Exceptional aerosol pollution plume observed using a new ULA-lidar approach
NASA Astrophysics Data System (ADS)
Chazette, Patrick
2016-09-01
An exceptional particulate pollution event was sampled in June 2005 over the Ardèche region in Southern France. Airborne (at the wavelength of 355 nm) and ground-based (at the wavelength of 532 nm) lidars performed measurements simultaneously. Airborne observations were performed from an ultra-light aircraft (ULA); they offer an opportunity to test a new method for inverting lidar profiles which enables their quantitative use while the aircraft flies within a scattering layer. Using the results of this approach and the ground-based lidar measurements, the aerosol plumes have been optically quantified, and the diversity of particle sources (from Western Europe, North Africa and even North America) which contributed to the event has been highlighted using both spaceborne observations and multiple air mass back-trajectories.
Myint, Soe W.; Yuan, May; Cerveny, Randall S.; Giri, Chandra P.
2008-01-01
Remote sensing techniques have been shown effective for large-scale damage surveys after a hazardous event, in both near-real-time and post-event analyses. The paper aims to compare the accuracy of common image processing techniques for detecting tornado damage tracks from Landsat TM data. We employed the direct change detection approach, using two sets of images acquired before and after the tornado event to produce principal component composite images and a set of image-difference bands. Techniques in the comparison include supervised classification, unsupervised classification, and an object-oriented classification approach with a nearest neighbor classifier. Accuracy assessment is based on the Kappa coefficient calculated from error matrices, which cross-tabulate correctly identified cells on the TM image and commission and omission errors in the result. Overall, the object-oriented approach exhibits the highest degree of accuracy in tornado damage detection. PCA and image differencing methods show comparable outcomes. While selected PCs can improve detection accuracy by 5 to 10%, the object-oriented approach performs significantly better, with 15-20% higher accuracy than the other two techniques. PMID:27879757
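The accuracy measure used in this study, the Kappa coefficient, can be computed directly from the error matrix; the 2-class matrix below is an invented illustration (damage vs. no-damage cells).

```python
import numpy as np

def kappa(error_matrix):
    """Cohen's kappa from an error (confusion) matrix whose rows are
    classified cells and whose columns are reference cells."""
    m = np.asarray(error_matrix, dtype=float)
    n = m.sum()
    po = np.trace(m) / n                                # observed agreement
    pe = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2   # chance agreement
    return (po - pe) / (1.0 - pe)

print(kappa([[412, 48],
             [61, 9479]]))                              # illustrative counts
```

Kappa discounts the agreement expected by chance, which matters here because undamaged cells vastly outnumber damaged ones and raw percent agreement would look deceptively high.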
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated physics simulation toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the analysis of flooding.
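At the core of Smoothed Particle Hydrodynamics is a kernel-weighted sum over neighbouring fluid particles. The following is a minimal density-summation sketch (standard cubic-spline kernel, brute-force neighbour search); it illustrates the method generally and is not code from the RISMC toolkit.

```python
import numpy as np

def w_cubic(r, h):
    """Monaghan cubic-spline kernel in 3-D (compact support of radius 2h)."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    return sigma * np.where(q < 1.0, 1.0 - 1.5*q**2 + 0.75*q**3,
                   np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

def sph_density(pos, mass, h):
    """Density at each particle: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    return (mass[None, :] * w_cubic(d, h)).sum(axis=1)

rng = np.random.default_rng(3)
pos = rng.uniform(0.0, 1.0, size=(200, 3))    # toy blob of water particles
rho = sph_density(pos, mass=np.full(200, 1.0), h=0.15)
```

Production SPH codes replace the O(N^2) distance computation with spatial hashing and add momentum and pressure equations, but the kernel summation above is the building block that lets flooding be simulated without a fixed mesh.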
From Goal-Oriented Requirements to Event-B Specifications
NASA Technical Reports Server (NTRS)
Aziz, Benjamin; Arenas, Alvaro E.; Bicarregui, Juan; Ponsard, Christophe; Massonet, Philippe
2009-01-01
In goal-oriented requirements engineering methodologies, goals are structured into refinement trees from high-level system-wide goals down to fine-grained requirements assigned to specific software/ hardware/human agents that can realise them. Functional goals assigned to software agents need to be operationalised into specification of services that the agent should provide to realise those requirements. In this paper, we propose an approach for operationalising requirements into specifications expressed in the Event-B formalism. Our approach has the benefit of aiding software designers by bridging the gap between declarative requirements and operational system specifications in a rigorous manner, enabling powerful correctness proofs and allowing further refinements down to the implementation level. Our solution is based on verifying that a consistent Event-B machine exhibits properties corresponding to requirements.
Embryoids, organoids and gastruloids: new approaches to understanding embryogenesis
2017-01-01
ABSTRACT Cells have an intrinsic ability to self-assemble and self-organize into complex and functional tissues and organs. By taking advantage of this ability, embryoids, organoids and gastruloids have recently been generated in vitro, providing a unique opportunity to explore complex embryological events in a detailed and highly quantitative manner. Here, we examine how such approaches are being used to answer fundamental questions in embryology, such as how cells self-organize and assemble, how the embryo breaks symmetry, and what controls timing and size in development. We also highlight how further improvements to these exciting technologies, based on the development of quantitative platforms to precisely follow and measure subcellular and molecular events, are paving the way for a more complete understanding of the complex events that help build the human embryo. PMID:28292844
Foreign Object Damage Identification in Turbine Engines
NASA Technical Reports Server (NTRS)
Strack, William; Zhang, Desheng; Turso, James; Pavlik, William; Lopez, Isaac
2005-01-01
This report summarizes the collective work of a five-person team from different organizations examining the problem of detecting foreign object damage (FOD) events in turbofan engines from gas path thermodynamic and bearing accelerometer sensors, and determining the severity of damage to each component (diagnosis). Several detection and diagnostic approaches were investigated, and a software tool (FODID) was developed to help researchers detect and diagnose FOD events. These approaches include (1) fan efficiency deviation computed from upstream and downstream temperature/pressure measurements, (2) gas path weighted least squares estimation of component health parameter deficiencies, (3) Kalman filter estimation of component health parameters, and (4) use of structural vibration signal processing to detect both large and small FOD events. The last three of these approaches require a significant amount of computation in conjunction with a physics-based analytic model of the underlying phenomenon: the NPSS thermodynamic cycle code for approaches 1 to 3, and the DyRoBeS reduced-order rotor dynamics code for approach 4. A potential application of the FODID software tool, in addition to its detection/diagnosis role, is using its sensitivity results to help identify the best types of sensors and their optimum locations within the gas path, and similarly for bearing accelerometers.
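Approach 3 rests on standard linear Kalman filtering of health-parameter deviations from gas-path measurements. The sketch below shows one predict/update cycle with invented toy dimensions and noise levels; it is a generic filter, not the NPSS-coupled implementation.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter estimating
    component health parameters x from sensor measurements z."""
    x = F @ x                              # predict state
    P = F @ P @ F.T + Q                    # predict covariance
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)                # correct with measurement residual
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

n, m = 2, 3                                # toy: 2 health params, 3 sensors
x, P = np.zeros(n), np.eye(n)
F, H = np.eye(n), np.ones((m, n))          # placeholder model matrices
Q, R = 1e-4 * np.eye(n), 1e-2 * np.eye(m)
x, P = kalman_step(x, P, z=np.array([0.10, 0.05, 0.08]), F=F, H=H, Q=Q, R=R)
```

A sudden FOD event appears as a persistent jump in the innovation z - Hx, so monitoring the residual sequence is what turns the estimator into a detector.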
2013-01-01
Background Sensitizing events may trigger and stimulate discursive renewal. From a discursive institutional perspective, changing discourses are the driving force behind the institutional dynamics of policy domains. Theoretically informed by discursive institutionalism, this article assesses the impact of a series of four sensitizing events that triggered serious environmental health concerns in Flanders from the 1970s to the 1990s and led to the gradual institutionalization of a Flemish environmental health arrangement. Methods The Policy Arrangement Approach is used as the analytical framework to structure the empirical results of the historical analysis, based on document analysis and in-depth interviews. Results Until the 1990s, environmental health was characterized as an ad hoc policy field in Flanders, where agenda setting was based on sensitizing events, i.e., it was incident-driven. Each of these events contributed to a gradual rethinking of the epistemological discourses about environmental health risks and uncertainties. These new discourses were the driving forces behind institutional dynamics, as they gradually resulted in an increased need for: 1) long-term, policy-oriented, interdisciplinary environmental health research; 2) policy coordination and integration between the environmental and public health policy fields; and 3) new forms of science-policy interactions based on mutual learning. These changes are desirable in order to detect environmental health problems as fast as possible, to react immediately, and to communicate appropriately. Conclusions The series of four events that triggered serious environmental health concerns in Flanders provided the opportunity to rethink and reorganize the current affairs concerning environmental health, and gradually resulted in the institutionalization of a Flemish environmental health arrangement. PMID:23758822
Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J
2013-01-01
Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional, and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional, and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes demonstrates similar or better predictive ability than the latter.
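Once the relations frequency matrix F is in hand, the entropy-based indices reduce to standard information-theoretic sums. The toy matrix below is invented; it only illustrates how Shannon, joint, and mutual entropies would be computed from F.

```python
import numpy as np

def shannon(freq):
    """Shannon entropy (bits) of a frequency vector."""
    p = np.asarray(freq, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

def joint_and_mutual(F):
    """Joint entropy H(X,Y) of F and mutual information I(X;Y) of its margins."""
    p = np.asarray(F, dtype=float)
    p /= p.sum()
    hj = -(p[p > 0] * np.log2(p[p > 0])).sum()
    hx, hy = shannon(p.sum(axis=1)), shannon(p.sum(axis=0))
    return hj, hx + hy - hj

F = [[4, 1, 0],
     [2, 3, 1],
     [0, 1, 2]]              # toy relations frequency matrix
print(joint_and_mutual(F))
```

Conditional entropy follows as H(X|Y) = H(X,Y) - H(Y), so every index family mentioned above is obtained from the same normalized F.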
Series: Pragmatic trials and real world evidence: Paper 7. Safety, quality and monitoring.
Irving, Elaine; van den Bor, Rutger; Welsing, Paco; Walsh, Veronica; Alfonso-Cristancho, Rafael; Harvey, Catherine; Garman, Nadia; Grobbee, Diederick E
2017-11-01
Pragmatic trials offer the opportunity to obtain real-life data on the relative effectiveness and safety of a treatment before or after market authorization. This is the penultimate paper in a series of eight, describing the impact of design choices on the practical implementation of pragmatic trials. This paper focuses on the practical challenges of collecting and reporting safety data and of monitoring trial conduct while maintaining routine clinical care practice. Current ICH guidance recommends that all serious adverse events and all drug-related events must be reported in an interventional trial. In line with current guidance, we propose a risk-based approach to the collection of non-drug-related non-serious adverse events and even serious events not related to treatment based on the risk profile of the medicine/class in the patient population of interest. Different options available to support the collection and reporting of safety data while minimizing study-related follow-up visits are discussed. A risk-based approach to monitoring trial conduct is also discussed, highlighting the difference in the balance of risks likely to occur in a pragmatic trial compared to traditional clinical trials and the careful consideration that must be given to the mitigation and management of these risks to maintain routine care. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
USDA-ARS's Scientific Manuscript database
A time-scale-free approach was developed for the estimation of water fluxes at the boundaries of a monitored soil profile using water content time series. The approach uses the soil water budget to compute soil water budget components, i.e. surface-water excess (Sw), infiltration less evapotranspiration (I-E...
Evidence-based support for the all-hazards approach to emergency preparedness
2012-01-01
Background During the last decade there has been a need to respond and recover from various types of emergencies including mass casualty events (MCEs), mass toxicological/chemical events (MTEs), and biological events (pandemics and bio-terror agents). Effective emergency preparedness is more likely to be achieved if an all-hazards response plan is adopted. Objectives To investigate if there is a relationship among hospitals' preparedness for various emergency scenarios, and whether components of one emergency scenario correlate with preparedness for other emergency scenarios. Methods Emergency preparedness levels of all acute-care hospitals for MCEs, MTEs, and biological events were evaluated, utilizing a structured evaluation tool based on measurable parameters. Evaluations were made by professional experts in two phases: evaluation of standard operating procedures (SOPs) followed by a site visit. Relationships among total preparedness and different components' scores for various types of emergencies were analyzed. Results Significant relationships were found among preparedness for different emergencies. Standard Operating Procedures (SOPs) for biological events correlated with preparedness for all investigated emergency scenarios. Strong correlations were found between training and drills with preparedness for all investigated emergency scenarios. Conclusions Fundamental critical building blocks such as SOPs, training, and drill programs improve preparedness for different emergencies including MCEs, MTEs, and biological events, more than other building blocks, such as equipment or knowledge of personnel. SOPs are especially important in unfamiliar emergency scenarios. The findings support the adoption of an all-hazards approach to emergency preparedness. PMID:23098065
Assessing Inhalation Exposures Associated with ...
This paper presents a simulation-based approach for assessing short-term, water-distribution-system-wide inhalation exposures that could result from showering and the use of humidifiers during contamination events.
Concept Cartoons Supported Problem Based Learning Method in Middle School Science Classrooms
ERIC Educational Resources Information Center
Balim, Ali Günay; Inel-Ekici, Didem; Özcan, Erkan
2016-01-01
Problem based learning, in which events from daily life are presented as interesting scenarios, is one of the active learning approaches that encourage students to self-direct their learning. Problem based learning, generally used in higher education, requires students to use higher-order thinking skills in learning environments. In order to use…
Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization
NASA Astrophysics Data System (ADS)
Lee, Kyungbook; Song, Seok Goo
2017-09-01
Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture this variability in simulation-based ground motion predictions. In this study, we improved a previously developed stochastic earthquake rupture modeling method by applying nonparametric co-regionalization, proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and therefore enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of the true input correlation models in stochastic modeling after they have been deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.
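A simplified stand-in for the co-regionalization step: generating two cross-correlated Gaussian source fields (say, slip and rupture velocity) along strike from a joint covariance built as a Kronecker product. The exponential correlogram and all parameter values are invented; the study itself estimates the correlation models nonparametrically from dynamic rupture simulations.

```python
import numpy as np

rng = np.random.default_rng(11)

def coregional_fields(x, corr_range, rho):
    """Two correlated 1-D Gaussian fields via a linear-model-of-
    co-regionalization-style joint covariance kron(B, C)."""
    h = np.abs(x[:, None] - x[None, :])
    C = np.exp(-3.0 * h / corr_range)          # spatial correlation model
    B = np.array([[1.0, rho], [rho, 1.0]])     # between-field correlation
    joint = np.kron(B, C) + 1e-9 * np.eye(2 * x.size)
    L = np.linalg.cholesky(joint)
    z = L @ rng.standard_normal(2 * x.size)
    return z[:x.size], z[x.size:]

x = np.linspace(0.0, 40.0, 200)                # km along strike
slip, vrupt = coregional_fields(x, corr_range=10.0, rho=0.6)
```

Because sampling reduces to one Cholesky factorization and a matrix-vector product per scenario, large ensembles of rupture realizations (including M > 7 cases) stay computationally cheap.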
NASA Astrophysics Data System (ADS)
Schilirò, L.; Esposito, C.; Scarascia Mugnozza, G.
2015-09-01
Rainfall-induced shallow landslides are a widespread phenomenon that frequently causes substantial damage to property, as well as numerous casualties. In recent years a wide range of physically based models have been developed to analyze the triggering process of these events. Specifically, in this paper we propose an approach for the evaluation of different shallow landslide-triggering scenarios by means of the TRIGRS (transient rainfall infiltration and grid-based slope stability) numerical model. For the validation of the model, a back analysis of the landslide event that occurred in the study area (located SW of Messina, northeastern Sicily, Italy) on 1 October 2009 was performed, using different methods and techniques for the definition of the input parameters. After evaluating the reliability of the model through comparison with the 2009 landslide inventory, different triggering scenarios were defined using rainfall values derived from rainfall probability curves, reconstructed on the basis of daily and hourly historical rainfall data. The results emphasize how these phenomena are likely to occur in the area, given that even short-duration (1-3 h) rainfall events with a relatively low return period (e.g., 10-20 years) can trigger numerous slope failures. Furthermore, for the same rainfall amount, the daily simulations underestimate the instability conditions. The high susceptibility of this area to shallow landslides is testified by the high number of landslide/flood events that have occurred in the past and are summarized in this paper by means of archival research. Considering the main features of the proposed approach, the authors suggest that this methodology could be applied to different areas, even for the development of landslide early warning systems.
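The stability side of TRIGRS-type models is the one-dimensional infinite-slope factor of safety, evaluated at depth as the pressure head rises during infiltration. The sketch below uses the standard formula with invented soil parameters, purely to illustrate how a rainfall scenario translates into instability (FS < 1).

```python
import numpy as np

def factor_of_safety(z, psi, slope_deg, c=5.0e3, phi_deg=32.0,
                     gamma_s=19.0e3, gamma_w=9.81e3):
    """Infinite-slope factor of safety at depth z [m] with pressure
    head psi [m]; c [Pa], unit weights [N/m^3], angles in degrees."""
    b, phi = np.radians(slope_deg), np.radians(phi_deg)
    return (np.tan(phi) / np.tan(b)
            + (c - psi * gamma_w * np.tan(phi))
              / (gamma_s * z * np.sin(b) * np.cos(b)))

# A storm raising the pressure head drives the slope toward failure
for psi in (0.0, 0.5, 1.0):
    print(psi, round(float(factor_of_safety(z=1.5, psi=psi, slope_deg=38.0)), 2))
```

TRIGRS supplies the transient psi(z, t) from the rainfall input; the scenario analysis then amounts to mapping which cells drop below FS = 1 for each rainfall duration and return-period pair.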
A Metastatistical Approach to Satellite Estimates of Extreme Rainfall Events
NASA Astrophysics Data System (ADS)
Zorzetto, E.; Marani, M.
2017-12-01
The estimation of the average recurrence interval of intense rainfall events is a central issue for both hydrologic modeling and engineering design. These estimates require the inference of the properties of the right tail of the statistical distribution of precipitation, a task often performed using the Generalized Extreme Value (GEV) distribution, estimated either from a sample of annual maxima (AM) or with a peaks-over-threshold (POT) approach. However, these approaches require long and homogeneous rainfall records, which often are not available, especially in the case of remote-sensed rainfall datasets. We use here, and tailor to remotely sensed rainfall estimates, an alternative approach based on the metastatistical extreme value distribution (MEVD), which produces estimates of rainfall extreme values based on the probability distribution function (pdf) of all measured 'ordinary' rainfall events. This methodology also accounts for the interannual variations observed in the pdf of daily rainfall by integrating over the sample space of its random parameters. We illustrate the application of this framework to the TRMM Multi-satellite Precipitation Analysis rainfall dataset, where MEVD optimally exploits the relatively short datasets of satellite-sensed rainfall, while taking full advantage of its high spatial resolution and quasi-global coverage. Accuracy of TRMM precipitation estimates and scale issues are here investigated for a case study located in the Little Washita watershed, Oklahoma, using a dense network of rain gauges for independent ground validation. The methodology contributes to our understanding of the risk of extreme rainfall events, as it allows i) an optimal use of the TRMM datasets in estimating the tail of the probability distribution of daily rainfall, and ii) a global mapping of daily rainfall extremes and distributional tail properties, bridging the existing gaps in rain gauge networks.
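A compact sketch of the MEVD computation: each year contributes a Weibull fit to its ordinary daily events, and the distribution of the annual maximum is the average of the yearly compounds. The (C, W, n) triples below are randomly invented stand-ins for fitted values.

```python
import numpy as np

def mev_cdf(x, C, W, n):
    """MEVD for the annual maximum:
    F(x) = mean over years j of [1 - exp(-(x / C_j)^W_j)]^(n_j)."""
    x = np.atleast_1d(np.asarray(x, dtype=float))[:, None]
    return ((1.0 - np.exp(-(x / C) ** W)) ** n).mean(axis=1)

def mev_return_level(T, C, W, n):
    """Daily rainfall amount with average recurrence interval T (years)."""
    grid = np.linspace(1.0, 500.0, 5000)
    return np.interp(1.0 - 1.0 / np.asarray(T, dtype=float),
                     mev_cdf(grid, C, W, n), grid)

rng = np.random.default_rng(5)
C = rng.uniform(8.0, 14.0, 15)        # yearly Weibull scale (mm)
W = rng.uniform(0.7, 0.9, 15)         # yearly Weibull shape
n = rng.integers(60, 120, 15)         # wet events per year
print(mev_return_level([10.0, 50.0, 100.0], C, W, n))
```

Unlike an AM fit that consumes one value per year, every ordinary event informs the tail here, which is exactly why the approach suits short, high-resolution satellite records.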
Mazumdar, Atmadeep; Sen, Krishna Nirmalya; Lahiri, Balendra Nath
2007-01-01
The Haddon matrix is a potential tool for recognizing hazards in any operating engineering system. This paper presents a case study of operational hazards at a large construction site. The fishbone structure helps to visualize and relate the chain of events which led to the failure of the system. The two-tier Haddon matrix approach helps to analyze the problem and subsequently prescribe preventive steps. A cybernetic approach has been undertaken to establish the relationships among event variables and to identify the ones with the most potential. In this case study, based on cybernetic concepts such as control responsiveness and controllability salience, those event variables are (a) uncontrolled swing of the sheet contributing to energy, (b) slippage of the sheet from the anchor, (c) restricted longitudinal and transverse swing or rotation about the suspension, (d) guilt or uncertainty of the crane driver, and (e) safe working practices and environment.
Terminal Dynamics Approach to Discrete Event Systems
NASA Technical Reports Server (NTRS)
Zak, Michail; Meyers, Ronald
1995-01-01
This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system designed to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.
An event-based approach for examining the effects of wildland fire decisions on communities
Stephen F. McCool; James A. Burchfield; Daniel R. Williams; Matthew S. Carroll
2006-01-01
Public concern over the consequences of forest fire to wildland interface communities has led to increased resources devoted to fire suppression, fuel treatment, and management of fire events. The social consequences of the decisions involved in these and other fire-related actions are largely unknown, except in an anecdotal sense, but do occur at a variety of temporal...
Clayton, Hilary M.
2015-01-01
The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride event detection methods. To date, no method has been validated for movement on a circle, while algorithms are commonly specific to front/hind limbs or gait. In this study, we aimed to develop and validate kinematic stride segmentation methods applicable to movement on a straight line and in a circle at walk and trot, which rely exclusively on a single, dorsal hoof marker. The advantage of such marker placement is its robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected based on the vertical force profile and used as the reference values. Kinematic events were detected based on the displacement, velocity, or acceleration signals of the dorsal hoof marker, depending on the algorithm, using (i) defined thresholds associated with derived movement signals and (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision, and within-horse precision based on differences between kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of kinetic events. The two approaches resulted in very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot-on and foot-off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between-horse precision ≤8 ms. While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. While we found that use of velocity thresholds for foot-on detection results in biased event estimates for the foot on the inside of the circle at trot, adjusting thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Different force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (up to 16 ms on average per condition), greater than the effect on foot-on estimates or on foot-off estimates in the forelimbs (up to ±7 ms on average per condition). PMID:26157641
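The kinetic reference events in this study are force-threshold crossings; a minimal version of that detection step is sketched below with a synthetic stance-phase pulse (the 50 N default reflects the 50-150 N range examined, while the signal itself is invented).

```python
import numpy as np

def kinetic_events(fz, threshold=50.0, fs=1000.0):
    """Foot-on / foot-off times (s) as upward / downward crossings of a
    vertical ground reaction force threshold; fs is the sampling rate."""
    above = fz > threshold
    on  = (np.flatnonzero(~above[:-1] &  above[1:]) + 1) / fs
    off = (np.flatnonzero( above[:-1] & ~above[1:]) + 1) / fs
    return on, off

t = np.arange(0.0, 1.0, 0.001)                     # 1 s at 1 kHz
fz = np.where((t > 0.3) & (t < 0.6),               # toy 0.3 s stance phase
              3000.0 * np.sin(np.pi * (t - 0.3) / 0.3), 0.0)
print(kinetic_events(fz))
```

The kinematic algorithms are validated against exactly these timestamps, and the reported sensitivity to thresholds between 50 and 150 N plausibly reflects how gradually the force passes through that band around hind-limb foot-off.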
Minimum datasets to establish a CAR-mediated mode of action for rodent liver tumors.
Peffer, Richard C; LeBaron, Matthew J; Battalora, Michael; Bomann, Werner H; Werner, Christoph; Aggarwal, Manoj; Rowe, Rocky R; Tinwell, Helen
2018-04-16
Methods for investigating the Mode of Action (MoA) for rodent liver tumors via constitutive androstane receptor (CAR) activation are outlined here, based on current scientific knowledge about CAR and feedback from regulatory agencies globally. The key events (i.e., CAR activation, altered gene expression, cell proliferation, altered foci and increased adenomas/carcinomas) can be demonstrated by measuring a combination of key events and associative events that are markers for the key events. For crop protection products, a primary dataset typically should include a short-term study in the species/strain that showed the tumor response at dose levels that bracket the tumorigenic and non-tumorigenic dose levels. The dataset may vary depending on the species and the test compound. As examples, Case Studies with nitrapyrin (in mice) and metofluthrin (in rats) are described. Based on qualitative differences between the species, the key events leading to tumors in mice or rats by this MoA are not operative in humans. In the future, newer approaches such as a CAR biomarker signature approach and/or in vitro CAR3 reporter assays for mouse, rat and human CAR may eventually be used to demonstrate a CAR MoA is operative, without the need for extensive additional studies in laboratory animals. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Silva, Adrian; Schmookler, Barak; Papadopoulou, Afroditi; Schmidt, Axel; Hen, Or; Khachatryan, Mariana; Weinstein, Lawrence
2017-09-01
Using wide phase-space electron scattering data, we study a novel technique for neutrino energy reconstruction for future neutrino oscillation experiments. Accelerator-based neutrino oscillation experiments require a detailed understanding of neutrino-nucleus interactions, which are complicated by the underlying nuclear physics that governs the process. One area of concern is that the neutrino energy must be reconstructed event-by-event from the final-state kinematics. In charged-current quasielastic scattering, the Fermi motion of nucleons prevents exact energy reconstruction. However, in scattering from deuterium, the momenta of the electron and proton constrain the neutrino energy exactly, offering a new avenue for reducing systematic uncertainties. To test this approach, we analyzed d(e,e'p) data taken with the CLAS detector at Jefferson Lab Hall B and made kinematic selection cuts to obtain quasielastic events. We estimated the remaining inelastic background by using d(e,e'pπ-) events to produce a simulated dataset of events with an undetected π-. These results demonstrate the feasibility of energy reconstruction in a hypothetical future deuterium-based neutrino detector. Supported by the Paul E. Gray UROP Fund, MIT.
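For charged-current quasielastic events the neutrino energy is conventionally reconstructed from the outgoing lepton alone under a target-nucleon-at-rest assumption, which is exactly the assumption that Fermi motion breaks and that the deuterium measurement is designed to test. A standard form of the estimator is sketched below (kinematic values illustrative).

```python
import numpy as np

def e_nu_qe(e_mu, cos_theta, m_mu=0.1057, m_n=0.9396, m_p=0.9383, eb=0.0):
    """Quasielastic estimator (GeV) for nu_mu + n -> mu- + p, assuming an
    initial neutron at rest with binding energy eb (eb ~ 0 for deuterium)."""
    mn = m_n - eb
    p_mu = np.sqrt(e_mu**2 - m_mu**2)
    return ((m_p**2 - mn**2 - m_mu**2 + 2.0 * mn * e_mu)
            / (2.0 * (mn - e_mu + p_mu * cos_theta)))

print(e_nu_qe(e_mu=0.8, cos_theta=0.9))      # GeV, illustrative kinematics
```

In the d(e,e'p) analogue, detecting the proton as well over-constrains the kinematics, so the smearing of this lepton-only estimator can be measured event by event.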
Pathway-based predictive approaches for non-animal assessment of acute inhalation toxicity.
Clippinger, Amy J; Allen, David; Behrsing, Holger; BéruBé, Kelly A; Bolger, Michael B; Casey, Warren; DeLorme, Michael; Gaça, Marianna; Gehen, Sean C; Glover, Kyle; Hayden, Patrick; Hinderliter, Paul; Hotchkiss, Jon A; Iskandar, Anita; Keyser, Brian; Luettich, Karsta; Ma-Hock, Lan; Maione, Anna G; Makena, Patrudu; Melbourne, Jodie; Milchak, Lawrence; Ng, Sheung P; Paini, Alicia; Page, Kathryn; Patlewicz, Grace; Prieto, Pilar; Raabe, Hans; Reinke, Emily N; Roper, Clive; Rose, Jane; Sharma, Monita; Spoo, Wayne; Thorne, Peter S; Wilson, Daniel M; Jarabek, Annie M
2018-06-20
New approaches are needed to assess the effects of inhaled substances on human health. These approaches will be based on mechanisms of toxicity, an understanding of dosimetry, and the use of in silico modeling and in vitro test methods. In order to accelerate wider implementation of such approaches, development of adverse outcome pathways (AOPs) can help identify and address gaps in our understanding of relevant parameters for model input and mechanisms, and optimize non-animal approaches that can be used to investigate key events of toxicity. This paper describes the AOPs and the toolbox of in vitro and in silico models that can be used to assess the key events leading to toxicity following inhalation exposure. Because the optimal testing strategy will vary depending on the substance of interest, here we present a decision tree approach to identify an appropriate non-animal integrated testing strategy that incorporates consideration of a substance's physicochemical properties, relevant mechanisms of toxicity, and available in silico models and in vitro test methods. This decision tree can facilitate standardization of the testing approaches. Case study examples are presented to provide a basis for proof-of-concept testing to illustrate the utility of non-animal approaches to inform hazard identification and risk assessment of humans exposed to inhaled substances. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Dahm, Torsten; Cesca, Simone; Hainzl, Sebastian; Braun, Thomas; Krüger, Frank
2015-04-01
Earthquakes occurring close to hydrocarbon fields under production often come under critical scrutiny as possibly induced or triggered. However, clear and testable rules to discriminate among such events have rarely been developed and tested. This unresolved scientific problem may lead to lengthy public disputes with unpredictable impact on the local acceptance of exploitation and field operations. We propose a quantitative approach to discriminate induced, triggered, and natural earthquakes, which is based on testable input parameters. Maxima of occurrence probabilities are compared for the cases under question, and a single probability of being triggered or induced is reported. The uncertainties of earthquake location and other input parameters are considered in terms of integration over probability density functions. The probability that events have been human triggered/induced is derived from modeling of Coulomb stress changes and a rate- and state-dependent seismicity model. In our case, a 3-D boundary element method has been adapted for the nuclei of strain approach to estimate the stress changes outside the reservoir, which are related to pore pressure changes in the field formation. The predicted rate of natural earthquakes is either derived from the background seismicity or, in the case of rare events, from an estimate of the tectonic stress rate. Instrumentally derived seismological information on the event location, source mechanism, and the size of the rupture plane is of advantage for the method. If the rupture plane has been estimated, the discrimination between induced and merely triggered events is theoretically possible if probability functions are convolved with a rupture fault filter. We apply the approach to three recent main shock events: (1) the Mw 4.3 Ekofisk 2001, North Sea, earthquake close to the Ekofisk oil field; (2) the Mw 4.4 Rotenburg 2004, Northern Germany, earthquake in the vicinity of the Söhlingen gas field; and (3) the Mw 6.1 Emilia 2012, Northern Italy, earthquake in the vicinity of a hydrocarbon reservoir. The three test cases cover the complete range of possible causes: clearly "human induced," "not even human triggered," and a third case in between both extremes.
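The seismicity-rate ingredient of such probability calculations is commonly taken from rate-and-state friction (Dieterich, 1994): a Coulomb stress step raises the rate above the background, and the perturbation decays over a characteristic aftershock time. A minimal sketch, with all parameter values invented:

```python
import numpy as np

def seismicity_rate(t, dcfs, r0, a_sigma, t_a):
    """Dieterich (1994) rate response to a Coulomb stress step dcfs:
    R(t) = r0 / (1 + (exp(-dcfs / (A*sigma)) - 1) * exp(-t / t_a))."""
    return r0 / (1.0 + (np.exp(-dcfs / a_sigma) - 1.0) * np.exp(-t / t_a))

t = np.linspace(0.0, 10.0, 200)                   # years after the step
R = seismicity_rate(t, dcfs=0.1e6, r0=0.02, a_sigma=0.05e6, t_a=5.0)
# Integrating R with and without the stress perturbation over the event
# window gives the two expected counts whose ratio feeds the
# triggered/induced probability.
print(R[:3])
```

In the full method the stress step at the hypocenter comes from the boundary-element reservoir model, and the location uncertainty is handled by integrating such rate ratios over the event's probability density.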
Angélil, Oliver; Stone, Dáithí; Wehner, Michael; ...
2016-12-16
The annual "State of the Climate" report, published in the Bulletin of the American Meteorological Society (BAMS), has included a supplement since 2011 composed of brief analyses of the human influence on recent major extreme weather events. There are now several dozen extreme weather events examined in these supplements, but these studies have all differed in their data sources as well as their approaches to defining the events, analyzing the events, and the consideration of the role of anthropogenic emissions. This study reexamines most of these events using a single analytical approach and a single set of climate model andmore » observational data sources. In response to recent studies recommending the importance of using multiple methods for extreme weather event attribution, results are compared from these analyses to those reported in the BAMS supplements collectively, with the aim of characterizing the degree to which the lack of a common methodological framework may or may not influence overall conclusions. Results are broadly similar to those reported earlier for extreme temperature events but disagree for a number of extreme precipitation events. Based on this, it is advised that the lack of comprehensive uncertainty analysis in recent extreme weather attribution studies is important and should be considered when interpreting results, but as yet it has not introduced a systematic bias across these studies.« less
NASA Astrophysics Data System (ADS)
Bösmeier, Annette; Glaser, Rüdiger; Stahl, Kerstin; Himmelsbach, Iso; Schönbein, Johannes
2017-04-01
Future estimations of flood hazard and risk for developing optimal coping and adaptation strategies inevitably include considerations of the frequency and magnitude of past events. Methods of historical climatology represent one way of assessing flood occurrences beyond the period of instrumental measurements and can thereby substantially help to extend the view into the past and to improve modern risk analysis. Such historical information can be of additional value and has been used in statistical approaches like Bayesian flood frequency analyses in recent years. However, the derivation of quantitative values from vague descriptive information in historical sources remains a crucial challenge. We explored possibilities for the parametrization of descriptive flood-related data, specifically for the assessment of historical floods, in a framework that combines a hermeneutical approach with mathematical and statistical methods. This study forms part of the transnational Franco-German research project TRANSRISK2 (2014-2017), funded by ANR and DFG, with the focus on exploring the flood history of the last 300 years for the regions of the Upper and Middle Rhine. A broad database of flood events was compiled, dating back to AD 1500. The events were classified using hermeneutical methods, according to intensity, spatial dimension, temporal structure, damage and mitigation measures associated with the specific events. This indexed database allowed the exploration of a link between descriptive data and quantitative information for the overlapping time period of classified floods and instrumental measurements since the end of the 19th century. Flood peak discharges, as a quantitative measure of the severity of a flood, were used to assess the discharge intervals for flood classes (upper and lower thresholds) within different time intervals, both for validating the flood classification and for examining the trend in the perception threshold over time. Furthermore, within a suitable time period, flood classes and other quantifiable indicators of flood intensity (number of damaged locations mentioned in historical sources, general availability of reports associated with a specific event) were combined with available peak discharge measurements. We argue that this information can be considered 'expert knowledge' and use it to develop a fuzzy rule-based model for deriving peak discharge estimates of pre-instrumental events that can finally be introduced into a flood frequency analysis.
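As an illustration of how descriptive flood classes might be turned into discharge estimates, the sketch below implements a toy Mamdani-style fuzzy rule base with triangular membership functions; the class ranges, rule weights and defuzzification choices are invented placeholders, not the calibrated model of the study.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def estimate_peak_discharge(flood_class, n_damaged_sites):
    """Crisp peak-discharge estimate (m^3/s) from descriptive indicators via
    a tiny fuzzy rule base; all class ranges are invented placeholders."""
    q = np.linspace(500, 5000, 451)                      # candidate discharges
    # Output fuzzy sets for 'moderate', 'severe' and 'extreme' flood classes
    out = {1: tri(q, 500, 1200, 2000),
           2: tri(q, 1500, 2500, 3500),
           3: tri(q, 3000, 4200, 5000)}
    # Rule activation: class membership clipped by the damage evidence
    damage_weight = min(n_damaged_sites / 20.0, 1.0)
    activated = np.minimum(out[flood_class], max(damage_weight, 0.3))
    # Defuzzify by centroid
    return float(np.sum(q * activated) / np.sum(activated))

print(round(estimate_peak_discharge(flood_class=2, n_damaged_sites=12)))
```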
Warrick, P A; Precup, D; Hamilton, E F; Kearney, R E
2007-01-01
We aimed to develop a singular-spectrum analysis (SSA) based change-point detection algorithm applicable to fetal heart rate (FHR) monitoring, to improve the detection of deceleration events. We present a method for decomposing a signal into near-orthogonal components via the discrete cosine transform (DCT) and apply this in a novel online manner to change-point detection based on SSA. The SSA technique forms models of the underlying signal that can be compared over time; models that are sufficiently different indicate signal change points. To adapt the algorithm to deceleration detection, where many successive similar change events can occur, we modify the standard SSA algorithm to hold the reference model constant under such conditions, an approach that we term "base-hold SSA". The algorithm is applied to a database of 15 FHR tracings that have been preprocessed to locate candidate decelerations and is compared to the markings of an expert obstetrician. Of the 528 true and 1285 false decelerations presented to the algorithm, the base-hold approach improved on standard SSA, reducing the number of missed decelerations from 64 to 49 (21.9%) while maintaining the same reduction in false positives (278). The standard SSA assumption that changes are infrequent does not apply to FHR analysis, where decelerations can occur successively and in close proximity; our base-hold SSA modification improves detection of these types of event series.
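The base-hold idea can be conveyed compactly: build a low-order DCT model of a reference window, flag a change when a test window's model deviates beyond a threshold, and keep the reference frozen while the change persists. The sketch below is a simplified stand-in for the authors' SSA machinery, with illustrative window sizes and thresholds.

```python
import numpy as np
from scipy.fft import dct

def dct_model(window, n_components=5):
    """Low-order DCT coefficients as a compact model of the windowed signal."""
    return dct(window, norm='ortho')[:n_components]

def base_hold_detect(signal, win=64, step=8, thresh=2.0):
    """Change-point detection in the spirit of 'base-hold SSA': compare each
    test window's DCT model to a reference model, and hold the reference
    fixed while successive change events persist. Thresholds are illustrative."""
    ref = dct_model(signal[:win])
    change_points, holding = [], False
    for start in range(step, len(signal) - win, step):
        cur = dct_model(signal[start:start + win])
        if np.linalg.norm(cur - ref) > thresh:
            if not holding:                 # a new change event begins
                change_points.append(start)
                holding = True              # hold the reference through the event
        else:
            ref, holding = cur, False       # re-adapt reference after the event
    return change_points

fhr = np.concatenate([140 + np.random.randn(300),
                      115 + np.random.randn(100),   # simulated deceleration
                      140 + np.random.randn(300)])
print(base_hold_detect(fhr))
```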
76 FR 77452 - Advisory Circular for Stall and Stick Pusher Training
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-13
... a recovery from a stall or approach. Scenario-based training that includes realistic events that... training, testing, and checking recommendations designed to maximize the likelihood that pilots will... circular was developed based on a review of recommended practices developed by major aircraft manufacturers...
An important challenge for an integrative approach to developmental systems toxicology is associating putative molecular initiating events (MIEs), cell signaling pathways, cell function and modeled fetal exposure kinetics. We have developed a chemical classification model based o...
A 2D flood inundation model based on cellular automata approach
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Todini, Ezio
2010-05-01
In recent years, the cellular automata (CA) approach has been successfully applied to two-dimensional modelling of flood events. When used in experimental applications, models based on this approach have provided good results, comparable to those obtained with more complex 2D models; moreover, CA models have proven significantly faster and easier to apply than most existing models, and these features make them a valuable tool for flood analysis, especially when dealing with large areas. However, to date the real degree of accuracy of such models has not been demonstrated, since they have been used mainly in experimental applications, while very few comparisons with theoretical solutions have been made. Also, the use of an explicit solution scheme, which is inherent in cellular automata models, forces them to work only with small time steps, thus reducing model computation speed. The present work describes a cellular automata model based on the continuity and diffusive wave equations. Several model versions based on different solution schemes have been realized and tested in a number of numerical cases, both 1D and 2D, comparing the results with theoretical and numerical solutions. In all cases, the model performed well compared to the reference solutions, and proved to be both stable and accurate. Finally, the version providing the best results in terms of stability was tested on a real flood event and compared with different hydraulic models. Again, the cellular automata model provided very good results, both in terms of computational speed and reproduction of the simulated event.
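A minimal one-dimensional sketch of a cellular-automata scheme built on the continuity and diffusive wave equations is given below; the Manning-type intercell flux and the explicit time stepping are simplified illustrative choices, not the authors' exact formulation.

```python
import numpy as np

def ca_flood_step(h, z, dx=10.0, dt=0.2, n_manning=0.05):
    """One explicit update of a 1-D cellular-automata flood model (a sketch):
    mass is exchanged between neighbouring cells according to a diffusive-wave,
    Manning-type flux. h: water depth (m), z: bed elevation (m), both per cell."""
    wse = z + h                                   # water surface elevation
    slope = -np.diff(wse) / dx                    # positive: flow to the right
    h_face = np.maximum(h[:-1], h[1:])            # conveyance depth at cell faces
    # Diffusive-wave unit-width flux q = h^(5/3)/n * sqrt(|S|), signed by slope
    q = np.sign(slope) * (h_face ** (5.0 / 3.0) / n_manning) \
        * np.sqrt(np.abs(slope) + 1e-12)
    dh = np.zeros_like(h)
    dh[:-1] -= q * dt / dx                        # mass leaves the upstream cell
    dh[1:] += q * dt / dx                         # and enters the downstream cell
    return np.maximum(h + dh, 0.0)

h = np.zeros(50); h[0] = 2.0                      # dam-break-like initial depth
z = np.linspace(1.0, 0.0, 50)                     # gentle downhill bed
for _ in range(500):                              # small dt: explicit scheme
    h = ca_flood_step(h, z)
print(h.round(3))
```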
Geospace Environment Modeling 2008-2009 Challenge: Ground Magnetic Field Perturbations
NASA Technical Reports Server (NTRS)
Pulkkinen, A.; Kuznetsova, M.; Ridley, A.; Raeder, J.; Vapirev, A.; Weimer, D.; Weigel, R. S.; Wiltberger, M.; Millward, G.; Rastatter, L.;
2011-01-01
Acquiring quantitative metrics-based knowledge about the performance of various space physics modeling approaches is central for the space weather community. Quantification of the performance helps the users of the modeling products to better understand the capabilities of the models and to choose the approach that best suits their specific needs. Further, metrics-based analyses are important for addressing the differences between various modeling approaches and for measuring and guiding the progress in the field. In this paper, the metrics-based results of the ground magnetic field perturbation part of the Geospace Environment Modeling 2008-2009 Challenge are reported. Predictions made by 14 different models, including an ensemble model, are compared to geomagnetic observatory recordings from 12 different northern hemispheric locations. Five different metrics are used to quantify the model performances for four storm events. It is shown that the ranking of the models is strongly dependent on the type of metric used to evaluate the model performance. None of the models ranks near or at the top systematically for all used metrics. Consequently, one cannot pick an absolute winner: the choice of the best model depends on the characteristics of the signal one is interested in. Model performances also vary from event to event. This is particularly clear for the root-mean-square difference and utility metric-based analyses. Further, analyses indicate that for some of the models, increasing the global magnetohydrodynamic model spatial resolution and including ring current dynamics improve the models' capability to generate more realistic ground magnetic field fluctuations.
Huang, Weiqing; Fan, Hongbo; Qiu, Yongfu; Cheng, Zhiyu; Qian, Yu
2016-02-15
Haze weather has become a serious environmental pollution problem in many Chinese cities. One of the most critical factors in the formation of haze weather is the exhaust of coal combustion, so it is meaningful to figure out the causation mechanism linking urban haze and coal combustion exhausts. Based on these considerations, the fault tree analysis (FTA) approach was employed for the first time to study the causation mechanism of urban haze in Beijing by considering the risk events related to coal combustion exhausts. Using this approach, the fault tree of the urban haze causation system connected with coal combustion exhausts was first established; the risk events were then discussed and identified; next, the minimal cut sets were determined using Boolean algebra; finally, the structure, probability and critical importance degree analyses of the risk events were completed for the qualitative and quantitative assessment. The results show that FTA is an effective and simple tool for the causation mechanism analysis and risk management of urban haze in China.
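For readers unfamiliar with the quantitative side of FTA, the sketch below computes a top-event probability from hypothetical minimal cut sets by inclusion-exclusion (assuming independent basic events), plus a simple risk-reduction importance per basic event; the events and probabilities are invented, not those of the Beijing study.

```python
from itertools import combinations

# Hypothetical minimal cut sets for a 'haze episode' top event, over basic
# risk events (e.g., B1: high coal consumption, B2: stagnant weather, ...).
cut_sets = [{"B1", "B2"}, {"B1", "B3", "B4"}, {"B5"}]
p = {"B1": 0.30, "B2": 0.20, "B3": 0.10, "B4": 0.25, "B5": 0.05}  # invented

def event_set_prob(events):
    """P(all basic events in the set occur), assuming independence."""
    prob = 1.0
    for e in events:
        prob *= p[e]
    return prob

def top_event_prob(cut_sets):
    """Exact top-event probability by inclusion-exclusion over minimal cut sets."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            union = set().union(*combo)
            total += (-1) ** (k + 1) * event_set_prob(union)
    return total

p_top = top_event_prob(cut_sets)
print("P(top event) =", round(p_top, 4))
# A simple risk-reduction importance: fraction of top-event probability
# removed when a basic event is made impossible.
for e in sorted(p):
    saved = p[e]; p[e] = 0.0
    print(e, "risk-reduction importance:", round(1 - top_event_prob(cut_sets) / p_top, 3))
    p[e] = saved
```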
Pseudo and conditional score approach to joint analysis of current count and current status data.
Wen, Chi-Chung; Chen, Yi-Hau
2018-04-17
We develop a joint analysis approach for recurrent and nonrecurrent event processes subject to case I interval censorship, which are known in the literature as current count and current status data, respectively. We use a shared frailty to link the recurrent and nonrecurrent event processes, while leaving the distribution of the frailty fully unspecified. Conditional on the frailty, the recurrent event is assumed to follow a nonhomogeneous Poisson process, and the mean function of the recurrent event and the survival function of the nonrecurrent event are assumed to follow some general form of semiparametric transformation models. Estimation of the models is based on the pseudo-likelihood and the conditional score techniques. The resulting estimators for the regression parameters and the unspecified baseline functions are shown to be consistent with rates of square and cubic roots of the sample size, respectively. Asymptotic normality with closed-form asymptotic variance is derived for the estimator of the regression parameters. We apply the proposed method to a fracture-osteoporosis survey data set to identify risk factors jointly for fracture and osteoporosis in the elderly, while accounting for the association between the two events within a subject.
Joint scale-change models for recurrent events and failure time.
Xu, Gongjun; Chiou, Sy Han; Huang, Chiung-Yu; Wang, Mei-Cheng; Yan, Jun
2017-01-01
Recurrent event data arise frequently in various fields such as biomedical sciences, public health, engineering, and social sciences. In many instances, the observation of the recurrent event process can be stopped by the occurrence of a correlated failure event, such as treatment failure and death. In this article, we propose a joint scale-change model for the recurrent event process and the failure time, where a shared frailty variable is used to model the association between the two types of outcomes. In contrast to the popular Cox-type joint modeling approaches, the regression parameters in the proposed joint scale-change model have marginal interpretations. The proposed approach is robust in the sense that no parametric assumption is imposed on the distribution of the unobserved frailty and that we do not need the strong Poisson-type assumption for the recurrent event process. We establish consistency and asymptotic normality of the proposed semiparametric estimators under suitable regularity conditions. To estimate the corresponding variances of the estimators, we develop a computationally efficient resampling-based procedure. Simulation studies and an analysis of hospitalization data from the Danish Psychiatric Central Register illustrate the performance of the proposed method.
A habituation based approach for detection of visual changes in surveillance camera
NASA Astrophysics Data System (ADS)
Sha'abani, M. N. A. H.; Adan, N. F.; Sabani, M. S. M.; Abdullah, F.; Nadira, J. H. S.; Yasin, M. S. M.
2017-09-01
This paper investigates a habituation-based approach to detecting visual changes with video surveillance systems in a passive environment. Various techniques have been introduced for dynamic environments, such as motion detection, object classification and behaviour analysis. However, in a passive environment, most of the scenes recorded by the surveillance system are normal. Therefore, running a complex analysis all the time in a passive environment is computationally expensive, especially at high video resolution. Thus, a mechanism of attention is required, where the system responds only to an abnormal event. This paper proposes a novelty detection mechanism for detecting visual changes and a habituation-based approach to measure the level of novelty. The objective of the paper is to investigate the feasibility of the habituation-based approach in detecting visual changes. Experimental results show that the approach is able to accurately detect the presence of novelty as deviations from the learned knowledge.
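The habituation principle can be sketched as a per-frame novelty score whose gain decays under repeated stimulation and recovers in its absence; the parameters and background model below are illustrative guesses, not the authors' implementation.

```python
import numpy as np

class HabituatingNoveltyDetector:
    """Minimal sketch of habituation-based novelty detection on frame features:
    the response to persistent novelty decays (habituation) and recovers when
    the stimulus disappears. All parameters are illustrative."""
    def __init__(self, tau=0.9, thresh=3.0):
        self.mean, self.var = None, None
        self.habituation = 1.0            # 1.0 = fully attentive
        self.tau, self.thresh = tau, thresh

    def update(self, feature):
        if self.mean is None:             # initialise the background model
            self.mean, self.var = feature.copy(), np.ones_like(feature)
            return 0.0
        z = np.abs(feature - self.mean) / np.sqrt(self.var)
        novelty = float(z.mean()) * self.habituation
        if novelty > self.thresh:
            self.habituation *= self.tau  # habituate to persistent novelty
        else:
            self.habituation = min(1.0, self.habituation / self.tau)  # recover
        # Slow adaptation of the learned background
        self.mean = 0.99 * self.mean + 0.01 * feature
        self.var = 0.99 * self.var + 0.01 * (feature - self.mean) ** 2
        return novelty

det = HabituatingNoveltyDetector()
frames = [np.zeros(16)] * 50 + [np.full(16, 5.0)] * 20   # abrupt scene change
scores = [det.update(f) for f in frames]
print(max(scores[:50]), max(scores[50:]))
```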
Novel Hierarchical Fall Detection Algorithm Using a Multiphase Fall Model
Hsieh, Chia-Yeh; Liu, Kai-Chun; Huang, Chih-Ning; Chu, Woei-Chyn; Chan, Chia-Tai
2017-01-01
Falls are the primary cause of accidents for the elderly in the living environment. Reducing hazards in the living environment and performing exercises for training balance and muscles are the common strategies for fall prevention. However, falls cannot be avoided completely; fall detection provides an alarm that can decrease injuries or death caused by the lack of rescue. The automatic fall detection system has opportunities to provide real-time emergency alarms for improving the safety and quality of home healthcare services. Two common technical challenges are also tackled in order to provide a reliable fall detection algorithm, including variability and ambiguity. We propose a novel hierarchical fall detection algorithm involving threshold-based and knowledge-based approaches to detect a fall event. The threshold-based approach efficiently supports the detection and identification of fall events from continuous sensor data. A multiphase fall model is utilized, including free fall, impact, and rest phases for the knowledge-based approach, which identifies fall events and has the potential to deal with the aforementioned technical challenges of a fall detection system. Seven kinds of falls and seven types of daily activities arranged in an experiment are used to explore the performance of the proposed fall detection algorithm. The overall performances of the sensitivity, specificity, precision, and accuracy using a knowledge-based algorithm are 99.79%, 98.74%, 99.05% and 99.33%, respectively. The results show that the proposed novel hierarchical fall detection algorithm can cope with the variability and ambiguity of the technical challenges and fulfill the reliability, adaptability, and flexibility requirements of an automatic fall detection system with respect to the individual differences. PMID:28208694
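The threshold-based front end combined with the multiphase fall model can be sketched as follows for a single accelerometer-magnitude stream; the phase durations and g-multiples are illustrative guesses, not the tuned values behind the reported performance figures.

```python
import numpy as np

G = 9.81  # gravity, m/s^2

def detect_falls(acc_mag, fs=50):
    """Threshold-plus-multiphase fall detection sketch: a candidate fall needs
    a free-fall dip, an impact peak shortly after, and a quiet rest phase.
    Thresholds and phase windows are illustrative, not the paper's values."""
    falls, i = [], 0
    while i < len(acc_mag):
        if acc_mag[i] < 0.5 * G:                           # free-fall phase
            impact_win = acc_mag[i:i + fs]                 # look ahead ~1 s
            if impact_win.size and impact_win.max() > 3.0 * G:   # impact phase
                j = i + int(np.argmax(impact_win))
                rest = acc_mag[j + fs:j + 3 * fs]          # 1-3 s after impact
                if rest.size and np.abs(rest - G).mean() < 0.2 * G:  # rest phase
                    falls.append(j)
                    i = j + 3 * fs
                    continue
        i += 1
    return falls

sig = np.full(500, G)        # standing still at 1 g
sig[100:110] = 2.0           # brief free fall
sig[112] = 40.0              # impact spike
print(detect_falls(sig))     # -> [112]
```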
A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment
NASA Astrophysics Data System (ADS)
Wu, Wenyan; Westra, Seth; Leonard, Michael
2017-04-01
Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood-producing mechanisms, such as extreme rainfall. Therefore, storm surge has always been a research focus in coastal flood risk assessment. Often, numerical models have been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015) and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a lot of input information, and difficulties arise when sufficient data are not available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge based on historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine long-term trends in storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). In these studies, often only the peak of surge events is used, which results in the loss of dynamic information within a tidal cycle or surge event (i.e. a time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine different attributes (e.g. peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High-quality hourly storm surge records from 15 tide gauges around Australia were examined. It was found that there is significant location and seasonal variability in the peak and duration of storm surge events, which provides additional insight into coastal flood risk. In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability analysis for flood risk assessment considering multiple flood-producing mechanisms. This is the first step in applying a Monte Carlo based joint probability method for flood risk assessment.
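Two-parameter shapes make for a very compact event representation. The sketch below, with invented peak and duration values on a synthetic surge record, shows exponential and triangular basis functions fitted by least squares; the paper's actual fitting procedure is not specified here.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_surge(t, peak, duration):
    """Two-parameter exponential basis function for a surge hydrograph:
    symmetric exponential rise and decay around the peak time (t = 0)."""
    return peak * np.exp(-np.abs(t) / duration)

def tri_surge(t, peak, duration):
    """Two-parameter triangular basis function: linear rise and recession."""
    return np.maximum(peak * (1.0 - np.abs(t) / duration), 0.0)

# Synthetic 'observed' surge event: 1.8 m peak, ~9 h e-folding, plus noise
t = np.linspace(-24, 24, 97)                       # hours around the peak
obs = 1.8 * np.exp(-np.abs(t) / 9.0) + 0.05 * np.random.randn(t.size)

for bf in (exp_surge, tri_surge):
    (peak, dur), _ = curve_fit(bf, t, obs, p0=(1.0, 6.0))
    rmse = np.sqrt(np.mean((bf(t, peak, dur) - obs) ** 2))
    print(bf.__name__, "peak:", round(peak, 2), "duration:", round(dur, 1),
          "RMSE:", round(rmse, 3))
```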
Ontology-supported Research on Vaccine Efficacy, Safety, and Integrative Biological Networks
He, Yongqun
2016-01-01
While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including the Vaccine Ontology, Ontology of Adverse Events, and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network (“OneNet”) Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms. PMID:24909153
Daily GRACE gravity field solutions track major flood events in the Ganges-Brahmaputra Delta
NASA Astrophysics Data System (ADS)
Gouweleeuw, Ben T.; Kvas, Andreas; Gruber, Christian; Gain, Animesh K.; Mayer-Gürr, Thorsten; Flechtner, Frank; Güntner, Andreas
2018-05-01
Two daily gravity field solutions based on observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission are evaluated against daily river runoff data for major flood events in the Ganges-Brahmaputra Delta (GBD) in 2004 and 2007. The trends over periods of a few days of the daily GRACE data reflect temporal variations in daily river runoff during major flood events. This is especially true for the larger flood in 2007, which featured two distinct periods of critical flood level exceedance in the Brahmaputra River. This first hydrological evaluation of daily GRACE gravity field solutions based on a Kalman filter approach confirms their potential for gravity-based large-scale flood monitoring. This particularly applies to short-lived, high-volume floods, as they occur in the GBD with a 4-5-year return period. The release of daily GRACE gravity field solutions in near-real time may enable flood monitoring for large events.
2014-01-01
Background: Cardiovascular diseases are the main cause of death worldwide, making their prevention a major health care challenge. In 2006, a German statutory health insurance company presented a novel individualised prevention programme (KardioPro), which focused on coronary heart disease (CHD) screening, risk factor assessment, early detection and secondary prevention. This study evaluates KardioPro in CHD risk subgroups, and analyses the cost-effectiveness of different individualised prevention strategies. Methods: The CHD risk subgroups were assembled based on routine data from the statutory health insurance company, making use of a quasi-beta regression model for risk prediction. The control group was selected via propensity score matching based on logistic regression and an approximate nearest neighbour approach. The main outcome was cost-effectiveness. Effectiveness was measured as event-free time, and events were defined as myocardial infarction, stroke and death. Incremental cost-effectiveness ratios comparing participants with non-participants were calculated for each subgroup. To assess the uncertainty of results, a bootstrapping approach was applied. Results: The cost-effectiveness of KardioPro in the group at high risk of CHD was €20,901 per event-free year; in the medium-risk group, €52,323 per event-free year; in the low-risk group, €186,074 per event-free year; and in the group with known CHD, €26,456 per event-free year. KardioPro was associated with a significant health gain but also a significant cost increase. However, statistical significance could not be shown for all subgroups. Conclusion: The cost-effectiveness of KardioPro differs substantially according to the group being targeted. Depending on the willingness-to-pay, it may be reasonable to only offer KardioPro to patients at high risk of further cardiovascular events. This high-risk group could be identified from routine statutory health insurance data. However, the long-term consequences of KardioPro still need to be evaluated. PMID:24938674
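The study's headline quantities are incremental cost-effectiveness ratios (ICERs) with bootstrapped uncertainty; the sketch below shows that computation on synthetic numbers, not the KardioPro data.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical per-patient costs (EUR) and event-free years for participants
# vs. matched non-participants in one risk subgroup (synthetic data).
cost_p, eff_p = rng.normal(3200, 900, 500), rng.normal(5.0, 1.0, 500)
cost_c, eff_c = rng.normal(2400, 800, 500), rng.normal(4.8, 1.0, 500)

def icer(cp, ep, cc, ec):
    """Incremental cost-effectiveness ratio: extra cost per extra event-free year."""
    return (cp.mean() - cc.mean()) / (ep.mean() - ec.mean())

print("ICER point estimate:", round(icer(cost_p, eff_p, cost_c, eff_c)))

# Bootstrap the uncertainty of the ICER, in the spirit of the study's approach
boots = []
for _ in range(2000):
    ip, ic = rng.integers(0, 500, 500), rng.integers(0, 500, 500)
    boots.append(icer(cost_p[ip], eff_p[ip], cost_c[ic], eff_c[ic]))
print("2.5-97.5 percentile interval:", np.percentile(boots, [2.5, 97.5]).round())
```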
Analyzing and Identifying Teens' Stressful Periods and Stressor Events From a Microblog.
Li, Qi; Xue, Yuanyuan; Zhao, Liang; Jia, Jia; Feng, Ling
2017-09-01
Increased health problems among adolescents caused by psychological stress have aroused worldwide attention. Long-standing stress without targeted assistance and guidance negatively impacts the healthy growth of adolescents, threatening the future development of our society. So far, research has focused on detecting adolescent psychological stress as revealed in individual posts on microblogs. However, beyond stressful moments, identifying teens' stressful periods and the stressor events that trigger each stressful period is more desirable for understanding the stress from appearance to essence. In this paper, we define the problem of identifying teens' stressful periods and stressor events from the open social media microblog. Starting from a case study of adolescents' posting behaviors during stressful school events, we build a Poisson-based probability model for the correlation between stressor events and stressful posting behaviors through a series of posts on Tencent Weibo (referred to as the microblog throughout the paper). With the model, we discover teens' maximal stressful periods and further extract details of possible stressor events that cause the stressful periods. We generalize and present the extracted stressor events in a hierarchy based on common stress dimensions and event types. Taking 122 scheduled stressful study-related events in a high school as the ground truth, we test the approach on 124 students' posts from January 1, 2012 to February 1, 2015 and obtain promising experimental results: (stressful periods: recall 0.761, precision 0.737, and F1-measure 0.734) and (top-3 stressor events: recall 0.763, precision 0.756, and F1-measure 0.759). The most prominent stressor events extracted are in the self-cognition domain, followed by the school life domain. This conforms to the adolescent psychology finding that problems in school life are usually accompanied by teens' inner cognition problems. Compared with the state-of-the-art top-1 personal life event detection approach, our stressor event detection method is 13.72% higher in precision, 19.18% higher in recall, and 16.50% higher in F1-measure, demonstrating the effectiveness of our proposed framework.
A Facet Theory Model for Integrating Contextual and Personal Experiences of International Students
ERIC Educational Resources Information Center
Hackett, Paul M. W.
2014-01-01
The purpose of this article is to use a facet theory research approach to provide a clear, coherent, and integrated model of international students' experiences based upon the findings of psychological research into students when studying abroad. In research that employs a facet theory approach events are classified in terms of their constituent…
Faculty as Border Crossers: A Study of Fulbright Faculty
ERIC Educational Resources Information Center
Eddy, Pamela L.
2014-01-01
As adult learners, faculty members approach new experiences based on events of the past, but this underlying framework of understanding is challenged when they work abroad for an extended period of time.
GAC: Gene Associations with Clinical, a web based application
Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne
2018-01-01
We present GAC, a shiny R-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that uses high-dimensional data, such as gene expression, combined with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results based on a forest plot for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analysis to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The development repository is available at https://github.com/manalirupji/GAC. PMID:29263780
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
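A minimal event coincidence analysis can be written in a few lines: count how often events in one series are preceded by events in another within a tolerance window, and compare against a Poisson-style null via random resampling. The window length, event counts and "flood/outbreak" labels below are synthetic illustrations.

```python
import numpy as np

def coincidence_rate(a_times, b_times, delta_t=7.0, tau=0.0):
    """Precursor coincidence rate: fraction of events in series B preceded by
    at least one event in series A within (tau, tau + delta_t] time units."""
    hits = sum(
        any(tau < b - a <= tau + delta_t for a in a_times) for b in b_times
    )
    return hits / len(b_times)

rng = np.random.default_rng(3)
floods = np.sort(rng.uniform(0, 3650, 40))             # 10 years of flood events
# Synthetic outbreaks: half triggered 1-5 days after a flood, half random
outbreaks = np.sort(np.concatenate([
    rng.choice(floods, 10) + rng.uniform(1, 5, 10),
    rng.uniform(0, 3650, 10),
]))

obs = coincidence_rate(floods, outbreaks)
# Null hypothesis: outbreak times drawn from a homogeneous Poisson process
null = [coincidence_rate(floods, rng.uniform(0, 3650, outbreaks.size))
        for _ in range(1000)]
print("observed rate:", obs, "p-value:", np.mean([n >= obs for n in null]))
```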
An Improved Forwarding of Diverse Events with Mobile Sinks in Underwater Wireless Sensor Networks.
Raza, Waseem; Arshad, Farzana; Ahmed, Imran; Abdul, Wadood; Ghouzali, Sanaa; Niaz, Iftikhar Azim; Javaid, Nadeem
2016-11-04
In this paper, a novel routing strategy to address energy consumption and delay sensitivity issues in deep underwater wireless sensor networks is proposed, named ESDR (Event Segregation based Delay-sensitive Routing). In this strategy, sensed events are segregated on the basis of their criticality and are forwarded to their respective destinations based on forwarding functions. These functions depend on different routing metrics, such as the Signal Quality Index, Localization-free Signal to Noise Ratio, Energy Cost Function and Depth Dependent Function. Incomparable values of the previously defined forwarding functions cause uneven delays in the forwarding process; hence, the forwarding functions are redefined to ensure comparable values in different depth regions. The packet forwarding strategy is based on the event segregation approach, which forwards one third of the generated events (delay-sensitive) to surface sinks, while the remaining two thirds (normal events) are forwarded to mobile sinks. The motion of the mobile sinks is influenced by the relative distribution of normal nodes. We have also incorporated two different mobility patterns for mobile sinks: adaptive mobility and uniform mobility. The latter is used for collecting the packets generated by the normal nodes. These improvements ensure optimum holding time, uniform delay and in-time reporting of delay-sensitive events. The scheme is compared with existing ones and outperforms them in terms of network lifetime, delay and throughput.
Patterns and Sequences: Interactive Exploration of Clickstreams to Understand Common Visitor Paths.
Liu, Zhicheng; Wang, Yang; Dontcheva, Mira; Hoffman, Matthew; Walker, Seth; Wilson, Alan
2017-01-01
Modern web clickstream data consists of long, high-dimensional sequences of multivariate events, making it difficult to analyze. Following the overarching principle that the visual interface should provide information about the dataset at multiple levels of granularity and allow users to easily navigate across these levels, we identify four levels of granularity in clickstream analysis: patterns, segments, sequences and events. We present an analytic pipeline consisting of three stages: pattern mining, pattern pruning and coordinated exploration between patterns and sequences. Based on this approach, we discuss properties of maximal sequential patterns, propose methods to reduce the number of patterns and describe design considerations for visualizing the extracted sequential patterns and the corresponding raw sequences. We demonstrate the viability of our approach through an analysis scenario and discuss the strengths and limitations of the methods based on user feedback.
Psychodynamic Assessment and Treatment of Traumatized Patients
Chertoff, Judith
1998-01-01
This article describes how psychodynamic assessment and treatment of traumatized patients can improve clinical acuity. The author describes an ego psychological, psychodynamic approach that involves 1) assessing the impact of trauma on the patient's ego defensive functioning and 2) elucidating the dynamic meaning of both the patient's presenting symptoms and the traumatic events that precipitated them. Clinical descriptions illustrate the ways in which psychodynamic psychotherapy may be particularly useful with patients whose acute symptoms develop following specific events. The author points out the advantages of an ego psychological, psychodynamic approach for her patients and the limitations of more symptom-based diagnostic assessments and treatments. PMID:9407474
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and the linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variations. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading.
Time to foster a rational approach to preventing cardiovascular morbid events.
Cohn, Jay N; Duprez, Daniel A
2008-07-29
Efforts to prevent atherosclerotic morbid events have focused primarily on risk factor prevention and intervention. These approaches, based on the statistical association of risk factors with events, have dominated clinical practice in the last generation. Because the cardiovascular abnormalities eventuating in morbid events are detectable in the arteries and heart before the development of symptomatic disease, recent efforts have focused on identifying the presence of these abnormalities as a more sensitive and specific guide to the need for therapy. Advances in noninvasive techniques for studying the vasculature and the left ventricle now provide the opportunity to use early disease rather than risk factors as the tool for clinical decision making. A disease scoring system has been developed using 10 tests of vascular and cardiac function and structure. More extensive data to confirm the sensitivity and specificity of this scoring system and to demonstrate its utility in tracking the response to therapy are needed to justify widespread application in clinical practice.
Rogue waves and entropy consumption
NASA Astrophysics Data System (ADS)
Hadjihoseini, Ali; Lind, Pedro G.; Mori, Nobuhito; Hoffmann, Norbert P.; Peinke, Joachim
2017-11-01
Based on data from the Sea of Japan and the North Sea, the occurrence of rogue waves is analyzed by a scale-dependent stochastic approach, which interlinks fluctuations of waves for different spacings. With this approach we are able to determine a stochastic cascade process, which provides information on the general multipoint statistics. Furthermore, the evolution of single trajectories in scale, which characterize wave height fluctuations in the surroundings of a chosen location, can be determined. Explicit knowledge of the stochastic process makes it possible to assign entropy values to all wave events. We show that for these entropies the integral fluctuation theorem, a basic law of non-equilibrium thermodynamics, is valid. This implies that positive- and negative-entropy events must occur. Extreme events like rogue waves are characterized as negative-entropy events. The statistics of these entropy fluctuations changes with the wave state; thus, for the Sea of Japan the statistics of the entropies has a more pronounced tail for negative entropy values, indicating a higher probability of rogue waves.
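For reference, the integral fluctuation theorem invoked here states that the exponential of the negative total entropy production averages to one over realizations; via Markov's inequality this immediately implies that negative-entropy events such as rogue waves must occur, albeit with exponentially suppressed probability:

```latex
\left\langle e^{-\Delta S_{\mathrm{tot}}} \right\rangle = 1
\qquad\Longrightarrow\qquad
P\!\left(\Delta S_{\mathrm{tot}} \le -s\right) \le e^{-s}, \quad s > 0.
```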
Forensic Disaster Analysis in Near-real Time
NASA Astrophysics Data System (ADS)
Kunz, Michael; Zschau, Jochen; Wenzel, Friedemann; Khazai, Bijan; Kunz-Plapp, Tina; Trieselmann, Werner
2014-05-01
The impacts of extreme hydro-meteorological and geophysical events are controlled by various factors including the severity of the event (intensity, duration, spatial extent), amplification with other phenomena (multi-hazard or cascading effects), interdependencies of technical systems and infrastructure, and the preparedness and resilience of the society. The Center for Disaster Management and Risk Reduction Technology (CEDIM) has adopted this comprehensive understanding of disasters and develops methodologies of near real-time FDA as a complementary component of the FORIN program of IRDR. The new research strategy 'Near Real-Time Forensic Disaster Analysis (FDA)' aims at scrutinizing disasters closely with a multi-disciplinary approach in order to assess the various aspects of disasters and to identify the mechanisms most relevant for an extreme event to become a disaster (e.g., causal loss analysis). Recent technology developments - which have opened unprecedented opportunities for real-time hazard, vulnerability and loss assessment - are used for analyzing disasters and their impacts in combination with databases of historical events. These developments cover modern empirical and analytical methods available in engineering and remote sensing for rapid impact assessments, rapid information extraction from crowdsourcing, and rapid assessments of socio-economic impacts and economic losses. The event-driven, science-based assessments of CEDIM are compiled on the basis of interdisciplinary expertise and include the critical evaluation, assessment, validation, and quantification of an event. An important component of CEDIM's FDA is the near real-time approach, which is expected to significantly speed up our understanding of natural disasters and to provide timely, relevant and valuable information to various user groups within their respective contexts. Currently, CEDIM has developed models and methodologies to assess different types of hazard. These approaches were applied to several disasters including, for example, Super Typhoon Haiyan/Yolanda (Nov. 2013), the Central European floods (June 2013), Hurricane Sandy (Oct. 2012), the US droughts (Summer 2012), and Typhoon Saola in Taiwan and the Philippines (July 2012).
Emergence of self and other in perception and action: an event-control approach.
Jordan, J Scott
2003-12-01
The present paper analyzes the regularities referred to via the concept 'self.' This is important, for cognitive science traditionally models the self as a cognitive mediator between perceptual inputs and behavioral outputs. This leads to the assertion that the self causes action. Recent findings in social psychology indicate this is not the case and, as a consequence, certain cognitive scientists model the self as being epiphenomenal. In contrast, the present paper proposes an alternative approach (i.e., the event-control approach) that is based on recently discovered regularities between perception and action. Specifically, these regularities indicate that perception and action planning utilize common neural resources. This leads to a coupling of perception, planning, and action in which the first two constitute aspects of a single system (i.e., the distal-event system) that is able to pre-specify and detect distal events. This distal-event system is then coupled with action (i.e., effector-control systems) in a constraining, as opposed to 'causal' manner. This model has implications for how we conceptualize the manner in which one infers the intentions of another, anticipates the intentions of another, and possibly even experiences another. In conclusion, it is argued that it may be possible to map the concept 'self' onto the regularities referred to in the event-control model, not in order to reify 'the self' as a causal mechanism, but to demonstrate its status as a useful concept that refers to regularities that are part of the natural order.
Application of in vitro based safety assessment requires reconciling chemical concentrations sufficient to produce bioactivity in vitro with those that trigger a molecular initiating event at the relevant in vivo target site. To address such need, computational tools such as phy...
Assessing Aridity, Hydrological Drought, and Recovery Using GRACE and GLDAS: a Case Study in Iraq
NASA Astrophysics Data System (ADS)
Moradkhani, H.; Almamalachy, Y. S.; Yan, H.; Ahmadalipour, A.; Irannezhad, M.
2016-12-01
Iraq has suffered from several drought events during the period 2003-2012, which imposed substantial impacts on the natural environment and socioeconomic sectors, e.g. lower discharge of the Tigris and Euphrates, groundwater depletion and increased salinity, population migration, and agricultural degradation. To investigate the aridity and climatology of Iraq, Global Land Data Assimilation System (GLDAS) monthly datasets of precipitation, temperature, and evapotranspiration at 0.25 degree spatial resolution are used. The Gravity Recovery and Climate Experiment (GRACE) satellite-derived monthly Terrestrial Water Storage (TWS) deficit is used as the hydrological drought indicator. The data are available globally at 1 degree spatial resolution. This study aims to monitor hydrological drought and assess drought recovery time for the period August 2002 to December 2015. Two approaches are implemented to derive the GRACE-based TWS deficit. The first approach estimates the TWS deficit based on the difference from its own climatology, while the second approach directly calculates the deficit from the TWS anomaly. The severity of drought events is calculated by integrating the monthly water deficit over the drought period. The results indicate that both methods are capable of capturing the severe drought events in Iraq, while the second approach quantifies a higher deficit and severity. In addition, two methods are employed to assess drought recovery time based on the estimated deficit. Both methods indicate similar drought recovery times, varying from less than a month to 9 months. The results demonstrate that GRACE TWS is a reliable indicator for drought assessment over Iraq, and provides useful information to decision makers for developing drought adaptation and mitigation strategies over data-sparse regions.
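A minimal sketch of the first approach described above (deficit as the departure below the month-of-year TWS climatology, with severity as the deficit integrated over contiguous drought spells) is given below; all numbers are synthetic, not GRACE data.

```python
import numpy as np

def tws_deficit(tws, months):
    """Monthly water deficit (cm) relative to the month-of-year climatology of
    total water storage; only departures below the climatology count."""
    clim = np.array([tws[months == m].mean() for m in range(1, 13)])
    resid = tws - clim[months - 1]
    return np.where(resid < 0, -resid, 0.0)

def drought_severity(deficit, threshold=0.0):
    """Integrate the monthly deficit over contiguous drought spells."""
    spells, cur = [], 0.0
    for d in deficit:
        if d > threshold:
            cur += d                 # accumulate water deficit (cm * month)
        elif cur > 0:
            spells.append(cur); cur = 0.0
    if cur > 0:
        spells.append(cur)
    return spells

rng = np.random.default_rng(0)
months = np.tile(np.arange(1, 13), 12)                   # 12 years of data
tws = 2 * np.sin(2 * np.pi * (months - 4) / 12) + rng.normal(0, 0.5, 144)
tws[60:75] -= 3.0                                        # imposed drought spell
deficit = tws_deficit(tws, months)
print([round(s, 1) for s in drought_severity(deficit)])  # severity per spell
```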
Vaccine adverse event text mining system for extracting features from vaccine safety reports.
Botsis, Taxiarchis; Buttolph, Thomas; Nguyen, Michael D; Winiecki, Scott; Woo, Emily Jane; Ball, Robert
2012-01-01
To develop and evaluate a text mining system for extracting key clinical features from vaccine adverse event reporting system (VAERS) narratives to aid in the automated review of adverse event reports. Based upon clinical significance to VAERS reviewing physicians, we defined the primary (diagnosis and cause of death) and secondary features (eg, symptoms) for extraction. We built a novel vaccine adverse event text mining (VaeTM) system based on a semantic text mining strategy. The performance of VaeTM was evaluated using a total of 300 VAERS reports in three sequential evaluations of 100 reports each. Moreover, we evaluated the VaeTM contribution to case classification; an information retrieval-based approach was used for the identification of anaphylaxis cases in a set of reports and was compared with two other methods: a dedicated text classifier and an online tool. The performance metrics of VaeTM were text mining metrics: recall, precision and F-measure. We also conducted a qualitative difference analysis and calculated sensitivity and specificity for classification of anaphylaxis cases based on the above three approaches. VaeTM performed best in extracting diagnosis, second level diagnosis, drug, vaccine, and lot number features (lenient F-measure in the third evaluation: 0.897, 0.817, 0.858, 0.874, and 0.914, respectively). In terms of case classification, high sensitivity was achieved (83.1%); this was equal and better compared to the text classifier (83.1%) and the online tool (40.7%), respectively. Our VaeTM implementation of a semantic text mining strategy shows promise in providing accurate and efficient extraction of key features from VAERS narratives.
Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter
2018-01-01
Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction effects between variables are uncovered, while they may not be accompanied by their corresponding main effects, and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected with a predictive value in observational studies with time-to-event outcomes. PMID:29453930
Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.
2016-01-01
Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have been focused on fecal indicator organisms (FIO), which act as a surrogate for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to source and to predict future events. The use of these two types of models require different levels of expertise and input; process-based models rely on theoretical physical constructs to explain present conditions and biological distribution while data-based, statistical models use extant paired data to do the same. The selection of the appropriate model and interpretation of results is critical to proper use of these tools in microbial source tracking. Integration of the modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling has great promise for microbial source tracking efforts; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models and molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination in order to protect human health better.
An event-triggered control approach for the leader-tracking problem with heterogeneous agents
NASA Astrophysics Data System (ADS)
Garcia, Eloy; Cao, Yongcan; Casbeer, David W.
2018-05-01
This paper presents an event-triggered control and communication framework for the cooperative leader-tracking problem with communication constraints. Continuous communication among agents is not assumed in this work and decentralised event-based strategies are proposed for agents with heterogeneous linear dynamics. Also, the leader dynamics are unknown and only intermittent measurements of its states are obtained by a subset of the followers. The event-based method not only represents a way to restrict communication among agents, but it also provides a decentralised scheme for scheduling information broadcasts. Notably, each agent is able to determine its own broadcasting instants independently of any other agent in the network. In an extension, the case where transmission of information is affected by time-varying communication delays is addressed. Finally, positive lower-bounds on the inter-event time intervals are obtained in order to show that Zeno behaviour does not exist and, therefore, continuous exchange of information is never needed in this framework.
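The flavour of such a decentralised trigger rule can be conveyed with a scalar toy example: each follower rebroadcasts its state only when its true state drifts from its last broadcast value by more than a time-decaying threshold with a positive floor (one simple way to preclude Zeno behaviour). The gains, thresholds and first-order dynamics below are illustrative, not the paper's heterogeneous linear setup.

```python
import numpy as np

def simulate_event_triggered(T=2000, dt=0.01, c0=0.05, alpha=1.0):
    """Sketch of a decentralised event-triggered broadcast rule for tracking a
    (here static) leader: each follower broadcasts only when its local error
    exceeds a decaying threshold; no agent needs any other agent's schedule."""
    leader = 1.0
    x = np.array([0.0, 0.5, -0.3])           # follower states
    xhat = x.copy()                           # last broadcast values
    events = np.zeros(3, dtype=int)
    for k in range(T):
        t = k * dt
        thresh = c0 * np.exp(-0.5 * t) + 1e-3    # positive floor -> no Zeno
        for i in range(3):
            if abs(x[i] - xhat[i]) >= thresh:    # purely local trigger test
                xhat[i] = x[i]                   # broadcast and reset the model
                events[i] += 1
        # Control uses only broadcast information (leader + neighbour average)
        u = -alpha * (xhat - leader) - alpha * (xhat - xhat.mean())
        x = x + dt * u
    return x, events

x, events = simulate_event_triggered()
print("final states:", x.round(3), "broadcasts per agent:", events)
```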
NASA Astrophysics Data System (ADS)
Palumbo, Manuela; Ascione, Alessandra; Santangelo, Nicoletta; Santo, Antonio
2017-04-01
We present the first results of an analysis of flood hazard in ungauged mountain catchments associated with intensely urbanized alluvial fans. Assessment of hydrological hazard has been based on the integration of rainfall/runoff modelling of drainage basins with geomorphological analysis and mapping. Some small, steep, ungauged mountain catchments located in various areas of the southern Apennines, in southern Italy, have been chosen as test sites. In recent centuries, the selected basins have been subject to heavy and intense precipitation events, which have caused flash floods with serious damage in the associated alluvial fan areas. Available spatial information (regional technical maps, DEMs, land use maps, geological/lithological maps, orthophotos) and an automated GIS-based procedure (ArcGis tools and ArcHydro tools) have been used to extract morphological, hydrological and hydraulic parameters. These parameters have been used to run the HEC (Hydrologic Engineering Center of the US Army Corps of Engineers) software (GeoHMS, GeoRAS, HMS and RAS) based on rainfall-runoff models, which has allowed the hydrological and hydraulic simulations. As the floods that occurred in the studied catchments were debris-flow dominated, the solid load was also simulated. To validate the simulations, we compared the modelling results with the effects produced by past floods. These effects have been quantified through estimations of both the sediment volumes within each catchment that have the potential to be mobilised during a sediment transfer event (pre-event), and the volume of sediments delivered by the debris flows at basin outlets (post-event). The post-event sediment volume has been quantified through post-event surveys and Lidar data. Evaluation of the pre-event sediment volumes in single catchments has been based on mapping of sediment storages that may constitute source zones of bed load transport and debris flows. This approach relies on process-based geomorphological mapping, based on data derived from GIS analysis using high-resolution DEMs, field measurements and aerial photograph interpretation. Our integrated approach, which allows quantification of the flow rate and a semi-quantitative assessment of the sediment that can be mobilized during hydro-meteorological events, is applied for the first time to torrential catchments of the southern Apennines and may significantly contribute to forecasting studies aimed at risk mitigation in the study region.
A Micro-Level Event-Centered Approach to Investigating Armed Conflict and Population Responses
Williams, Nathalie E.; Ghimire, Dirgha J.; Axinn, William G.; Jennings, Elyse A.; Pradhan, Meeta S.
2012-01-01
In this article, we construct and test a micro-level event-centered approach to the study of armed conflict and behavioral responses in the general population. Event-centered approaches have been successfully used in the macro-political study of armed conflict but have not yet been adopted in micro-behavioral studies. The micro-level event-centered approach that we advocate here includes decomposition of a conflict into discrete political and violent events, examination of the mechanisms through which they affect behavior, and consideration of differential risks within the population. We focus on two mechanisms: instability and threat of harm. We test this approach empirically in the context of the recent decade-long armed conflict in Nepal, using detailed measurements of conflict-related events and a longitudinal study of first migration, first marriage, and first contraceptive use. Results demonstrate that different conflict-related events independently shaped migration, marriage, and childbearing and that they can simultaneously influence behaviors in opposing directions. We find that violent events increased migration, but political events slowed migration. Both violent and political events increased marriage and contraceptive use net of migration. Overall, this micro-level event-centered approach yields a significant advance for the study of how armed conflict affects civilian behavioral responses. PMID:22911154
A networks-based discrete dynamic systems approach to volcanic seismicity
NASA Astrophysics Data System (ADS)
Suteanu, Mirela
2013-04-01
The detection and relevant description of pattern change concerning earthquake events is an important but challenging task. In this paper, earthquake events related to volcanic activity are considered manifestations of a dynamic system evolving over time. The system dynamics are seen as a succession of events with point-like appearance both in time and in space. Each event is characterized by a position in three-dimensional space, a moment of occurrence, and an event size (magnitude). A weighted directed network is constructed to capture the effects of earthquakes on subsequent events. Each seismic event represents a node. Relations among events represent edges. Edge directions are given by the temporal succession of the events. Edges are also characterized by weights reflecting the strength of the relation between the nodes. Weights are calculated as a function of (i) the time interval separating the two events, (ii) the spatial distance between the events, and (iii) the magnitude of the earlier of the two events. Different ways of addressing the weight components are explored, and their implications for the properties of the produced networks are analyzed. The resulting networks are then characterized in terms of degree and weight distributions. Subsequently, the distribution of system transitions is determined for all the edges connecting related events in the network. Two- and three-dimensional diagrams are constructed to reflect transition distributions for each set of events. Networks are thus generated for successive temporal windows of different size, and the evolution of (a) network properties and (b) system transition distributions is followed over time and compared to the timeline of documented geologic processes. Applications concerning volcanic seismicity on the Big Island of Hawaii show that this approach is capable of revealing novel aspects of change occurring in the volcanic system on different scales in time and in space.
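A minimal sketch of the network construction just described, assuming exponential decay kernels for the time and distance terms (the paper explores several weighting choices; the kernels and parameter values here are assumptions):

import numpy as np
import networkx as nx

def build_event_network(events, tau=3600.0, d0=5.0):
    # events: list of dicts with keys 't' (s), 'x', 'y', 'z' (km), 'mag'.
    # Directed edges point forward in time; weights decay with
    # inter-event time and distance and scale with the magnitude of
    # the earlier event.
    G = nx.DiGraph()
    for i, e in enumerate(events):
        G.add_node(i, **e)
    for i, a in enumerate(events):
        for j, b in enumerate(events):
            if i == j or b['t'] <= a['t']:
                continue
            dt = b['t'] - a['t']
            dist = np.sqrt((a['x'] - b['x'])**2 + (a['y'] - b['y'])**2
                           + (a['z'] - b['z'])**2)
            G.add_edge(i, j, weight=a['mag'] * np.exp(-dt / tau)
                                    * np.exp(-dist / d0))
    return G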
Event-based cluster synchronization of coupled genetic regulatory networks
NASA Astrophysics Data System (ADS)
Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang
2017-09-01
In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.
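As a hedged illustration (the precise condition appears only in the full paper), a triggering threshold of the type described, combining neighbours' last-broadcast states with a state-independent exponential decay, can be written as

\[ \|e_i(t)\| \ge \alpha \sum_{j \in \mathcal{N}_i} a_{ij}\,\|\hat{x}_j(t) - \hat{x}_i(t)\| + c\, e^{-\beta t}, \]

where e_i(t) is agent i's measurement error since its last event, \hat{x}_j is the state broadcast by neighbour j at its own latest event instant, a_{ij} encodes the directed topology, and \alpha, c, \beta > 0 are design parameters; the decay term keeps the threshold positive and helps exclude Zeno behaviour.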
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
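A toy illustration of probabilistic branching in an object-based scenario tree (not the OBEST code itself; the states and probabilities below are invented, with a runway-incursion flavour):

def enumerate_scenarios(branching, state, prob=1.0, path=None):
    # branching maps a state to a list of (next_state, probability)
    # pairs; states absent from the map are terminal. Every generated
    # scenario carries the product of its branch probabilities, which
    # is how each OBEST scenario acquires a likelihood estimate.
    path = (path or []) + [state]
    if state not in branching:
        yield path, prob
        return
    for nxt, p in branching[state]:
        yield from enumerate_scenarios(branching, nxt, prob * p, path)

model = {
    'approach': [('hold_short', 0.95), ('cross_runway', 0.05)],
    'cross_runway': [('detected', 0.9), ('incursion', 0.1)],
}
for path, p in enumerate_scenarios(model, 'approach'):
    print(' -> '.join(path), 'p = %.4f' % p)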
Discovering Event Structure in Continuous Narrative Perception and Memory.
Baldassano, Christopher; Chen, Janice; Zadbood, Asieh; Pillow, Jonathan W; Hasson, Uri; Norman, Kenneth A
2017-08-02
During realistic, continuous perception, humans automatically segment experiences into discrete events. Using a novel model of cortical event dynamics, we investigate how cortical structures generate event representations during narrative perception and how these events are stored to and retrieved from memory. Our data-driven approach allows us to detect event boundaries as shifts between stable patterns of brain activity without relying on stimulus annotations and reveals a nested hierarchy from short events in sensory regions to long events in high-order areas (including angular gyrus and posterior medial cortex), which represent abstract, multimodal situation models. High-order event boundaries are coupled to increases in hippocampal activity, which predict pattern reinstatement during later free recall. These areas also show evidence of anticipatory reinstatement as subjects listen to a familiar narrative. Based on these results, we propose that brain activity is naturally structured into nested events, which form the basis of long-term memory representations. Copyright © 2017 Elsevier Inc. All rights reserved.
An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.
2002-01-01
Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.
Stranges, P. Benjamin; Palla, Mirkó; Kalachikov, Sergey; Nivala, Jeff; Dorwart, Michael; Trans, Andrew; Kumar, Shiv; Porel, Mintu; Chien, Minchen; Tao, Chuanjuan; Morozova, Irina; Li, Zengmin; Shi, Shundi; Aberra, Aman; Arnold, Cleoma; Yang, Alexander; Aguirre, Anne; Harada, Eric T.; Korenblum, Daniel; Pollard, James; Bhat, Ashwini; Gremyachinskiy, Dmitriy; Bibillo, Arek; Chen, Roger; Davis, Randy; Russo, James J.; Fuller, Carl W.; Roever, Stefan; Ju, Jingyue; Church, George M.
2016-01-01
Scalable, high-throughput DNA sequencing is a prerequisite for precision medicine and biomedical research. Recently, we presented a nanopore-based sequencing-by-synthesis (Nanopore-SBS) approach, which used a set of nucleotides with polymer tags that allow discrimination of the nucleotides in a biological nanopore. Here, we designed and covalently coupled a DNA polymerase to an α-hemolysin (αHL) heptamer using the SpyCatcher/SpyTag conjugation approach. These porin–polymerase conjugates were inserted into lipid bilayers on a complementary metal oxide semiconductor (CMOS)-based electrode array for high-throughput electrical recording of DNA synthesis. The designed nanopore construct successfully detected the capture of tagged nucleotides complementary to a DNA base on a provided template. We measured over 200 tagged-nucleotide signals for each of the four bases and developed a classification method to uniquely distinguish them from each other and background signals. The probability of falsely identifying a background event as a true capture event was less than 1.2%. In the presence of all four tagged nucleotides, we observed sequential additions in real time during polymerase-catalyzed DNA synthesis. Single-polymerase coupling to a nanopore, in combination with the Nanopore-SBS approach, can provide the foundation for a low-cost, single-molecule, electronic DNA-sequencing platform. PMID:27729524
NASA Astrophysics Data System (ADS)
Sulaiman, M.; El-Shafie, A.; Karim, O.; Basri, H.
2011-10-01
Flood forecasting models are a necessity, as they help in planning for flood events, and thus help prevent loss of lives and minimize damage. At present, artificial neural networks (ANN) have been successfully applied in river flow and water level forecasting studies. An ANN requires historical data to develop a forecasting model. However, long-term historical water level data, such as hourly data, pose two crucial problems for data training. First, the high volume of data slows the computation process. Second, training reaches its apparent optimum within a few cycles because normal water level data dominate the training set, while forecasting performance for high water level events remains poor. In this study, the zoning matching approach (ZMA) is used in ANN to accurately monitor flood events in real time by focusing the development of the forecasting model on high water level zones. ZMA is a trial-and-error approach in which several training datasets using high water level data are tested to find the best training dataset for forecasting high water level events. The advantage of ZMA is that relevant knowledge of water level patterns in historical records is used. Importantly, the forecasting model developed based on ZMA achieves highly accurate forecasts at 1 to 3 h ahead and satisfactory performance at 6 h. Seven performance measures are adopted in this study to describe the accuracy and reliability of the developed forecasting model.
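A minimal sketch of zone-focused selection of training data, assuming a simple fixed zone boundary (ZMA itself tunes the zone and candidate datasets by trial and error; the names and the context width are illustrative):

import numpy as np

def zone_training_set(levels, zone_min, context=6):
    # Keep only the stretches of the hourly water-level record that
    # reach the high-water zone (level >= zone_min), plus `context`
    # hours on either side, so high-level events are not swamped by
    # the far more numerous normal-level samples.
    levels = np.asarray(levels, dtype=float)
    keep = np.zeros(len(levels), dtype=bool)
    for i in np.flatnonzero(levels >= zone_min):
        keep[max(0, i - context):i + context + 1] = True
    return levels[keep]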
Shortt, Colleen; Phan, Kim; Hill, Stephen A; Worster, Andrew; Kavsak, Peter A
2015-03-01
The application of "undetectable" high-sensitivity cardiac troponin (hs-cTn) concentrations to "rule-out" myocardial infarction is appealing, but there are analytical concerns and a lack of consensus on what concentration should be used to define the lower reportable limit; i.e., limit of detection (LoD) or limit of blank. An alternative approach is to utilize a measurable hs-cTn concentration that identifies patients at low-risk for a future cardiovascular event combined with another prognostic test, such as glucose. We assessed both of these approaches in different emergency department (ED) cohorts to rule-out an event. We used cohort 1 (all-comer ED population, n=4773; derivation cohort) to determine the most appropriate approach at presentation (i.e., Dual Panel test: hs-cTn/glucose vs. LoD vs. LoD/glucose) for an early rule-out of hospital death using the Abbott ARCHITECT hs-cTnI assay. We used cohort 2 (n=144) and cohort 3 (n=127), both early chest pain onset ED populations as the verification datasets (outcome: composite cardiovascular event at 72h) with three hs-cTn assays assessed (Abbott Laboratories, Beckman Coulter, Roche Diagnostics). In cohort 1, the sensitivity was >99% for all three approaches; however the specificity (11%; 95% CI: 10-12%) was significantly higher for the Dual Panel as compared to the LoD approach (specificity=5%; 95% CI: 4-6%). Verification of the Dual Panel in cohort 2 and cohort 3 revealed 100% sensitivity and negative predictive values for all three hs-cTn assays. The combination of a "healthy" hs-cTn concentration with glucose might effectively rule-out patients for an acute cardiovascular event at ED presentation. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Joint Modeling Approach for Semicompeting Risks Data with Missing Nonterminal Event Status
Hu, Chen; Tsodikov, Alex
2014-01-01
Semicompeting risks data, where a subject may experience sequential non-terminal and terminal events, and the terminal event may censor the non-terminal event but not vice versa, are widely available in many biomedical studies. We consider the situation when a proportion of subjects’ non-terminal events is missing, such that the observed data become a mixture of “true” semicompeting risks data and partially observed terminal event only data. An illness-death multistate model with proportional hazards assumptions is proposed to study the relationship between non-terminal and terminal events, and provide covariate-specific global and local association measures. Maximum likelihood estimation based on semiparametric regression analysis is used for statistical inference, and asymptotic properties of proposed estimators are studied using empirical process and martingale arguments. We illustrate the proposed method with simulation studies and data analysis of a follicular cell lymphoma study. PMID:24430204
[Causal analysis approaches in epidemiology].
Dumas, O; Siroux, V; Le Moual, N; Varraso, R
2014-02-01
Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper aims to present an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible confounding sources in a study. Structural equation models, the main extension of path analysis, combine a system of equations and a path diagram, representing a set of possible causal relationships. They allow quantification of direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has exposed the limits of traditional methods in addressing some causality questions. In particular, in longitudinal studies, when there is time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the formulation of causal hypotheses, which will be a basis for all methodological choices. Beyond this step, recently developed statistical analysis tools offer new possibilities to delineate complex relationships, in particular in life course epidemiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
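For concreteness, the counterfactual contrast underlying this approach is the average causal effect

\[ \mathrm{ACE} = E[Y^{a=1}] - E[Y^{a=0}], \]

the difference between the outcome that would be observed if everyone were exposed and if no one were. Marginal structural models estimate such contrasts under time-varying confounding by weighting each subject with stabilized inverse-probability-of-treatment weights,

\[ sw_i = \prod_{t} \frac{f(A_{it} \mid \bar{A}_{i,t-1})}{f(A_{it} \mid \bar{A}_{i,t-1}, \bar{L}_{it})}, \]

where \bar{A} and \bar{L} denote the treatment and confounder histories up to time t.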
Chad D. Pierskalla; Dorothy H. Anderson; David W. Lime
2000-01-01
To manage various recreation opportunities, managers and planners must consider the spatial and temporal scale of social process when identifying opportunities on base maps. However, analyses of social process and spatial form are often treated as two distinct approaches: sociological and geographical. A sociologist might control for spatial form by adopting...
Bayesian Phase II optimization for time-to-event data based on historical information.
Bertsche, Anja; Fleischer, Frank; Beyersmann, Jan; Nehmiz, Gerhard
2017-01-01
After exploratory drug development, companies face the decision whether to initiate confirmatory trials based on limited efficacy information. This proof-of-concept decision is typically performed after a Phase II trial studying a novel treatment versus either placebo or an active comparator. The article aims to optimize the design of such a proof-of-concept trial with respect to decision making. We incorporate historical information and develop pre-specified decision criteria accounting for the uncertainty of the observed treatment effect. We optimize these criteria based on sensitivity and specificity, given the historical information. Specifically, time-to-event data are considered in a randomized 2-arm trial with additional prior information on the control treatment. The proof-of-concept criterion uses treatment effect size, rather than significance. Criteria are defined on the posterior distribution of the hazard ratio given the Phase II data and the historical control information. Event times are exponentially modeled within groups, allowing for group-specific conjugate prior-to-posterior calculation. While a non-informative prior is placed on the investigational treatment, the control prior is constructed via the meta-analytic-predictive approach. The design parameters including sample size and allocation ratio are then optimized, maximizing the probability of taking the right decision. The approach is illustrated with an example in lung cancer.
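The group-wise conjugate update mentioned above has a simple closed form: with event times exponentially distributed with hazard \lambda and a Gamma(a, b) prior (shape a, rate b),

\[ \lambda \sim \mathrm{Gamma}(a, b) \quad \Longrightarrow \quad \lambda \mid \text{data} \sim \mathrm{Gamma}(a + d,\; b + T), \]

where d is the number of observed events and T the total exposure time in that arm. The posterior of the hazard ratio then follows from the two independent group posteriors; in the design described, the control prior (a, b) is built from historical data via the meta-analytic-predictive approach, while the investigational arm uses a non-informative prior.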
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. The software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
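A toy version of the threshold-and-window event detection described, to make the idea concrete (parameter names are illustrative, not g-PRIME's actual interface):

import numpy as np

def detect_events(signal, fs, threshold, refractory_ms=2.0):
    # Mark an event at each upward crossing of `threshold`, then skip
    # a short refractory window so a single spike is not counted twice.
    # Returns event times in seconds.
    window = max(1, int(refractory_ms * 1e-3 * fs))
    above = np.asarray(signal) >= threshold
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    events, last = [], -window
    for i in crossings:
        if i - last >= window:
            events.append(i / fs)
            last = i
    return np.array(events)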
Events as Organisational Stories: an Event-Based Approach for Learning Media Production
NASA Astrophysics Data System (ADS)
Numento, Tomi; Uotila, Pekka
Google 'storytelling', and you get 22.1 million hits (17.10.2007). By comparison, 'shareholder value' gets you 2.1 million (17.10.2007). Storytelling is hot, since a good story and storytelling skills can differentiate you from your competitors. The question is how to create such a story, how to advance from an informing level to an inspiring, engaging, touching and captivating story for your audience.
Enhancing the Teaching of Introductory Economics with a Team-Based, Multi-Section Competition
ERIC Educational Resources Information Center
Beaudin, Laura; Berdiev, Aziz N.; Kaminaga, Allison Shwachman; Mirmirani, Sam; Tebaldi, Edinaldo
2017-01-01
The authors describe a unique approach to enhancing student learning at the introductory economics level that utilizes a multi-section, team-based competition. The competition is structured to supplement learning throughout the entire introductory course. Student teams are presented with current economic issues, trends, or events, and use economic…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boero, Riccardo; Edwards, Brian Keith
Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model's degree of accuracy and the assessed total damage caused by Hurricane Sandy.
The impact of a national mental health arts and film festival on stigma and recovery.
Quinn, N; Shulman, A; Knifton, L; Byrne, P
2011-01-01
This study aims to evaluate the impact of a national mental health arts festival for the general public, encompassing a wide variety of art forms and themes. An evaluation was undertaken with 415 attendees from 20 different events, combining qualitative and quantitative approaches. The findings demonstrate positive impact on the relationship between arts and mental health. Events increased positive attitudes, including positive representations of people's contributions, capabilities and potential to recover. They did not decrease negative attitudes. Intended behaviour change was modest and one film event increased audience perceptions of dangerousness. The paper argues that the arts can change stigma by constructing shared meanings and engaging audiences on an emotional level. Carefully programmed, collaborative, community-based arts festivals should form an integral part of national programmes to address stigma and to promote mental health and wellbeing, alongside traditional social marketing and public education approaches. © 2010 John Wiley & Sons A/S.
Pulse shape discrimination for Gerda Phase I data
NASA Astrophysics Data System (ADS)
Agostini, M.; Allardt, M.; Andreotti, E.; Bakalyarov, A. M.; Balata, M.; Barabanov, I.; Barnabé Heider, M.; Barros, N.; Baudis, L.; Bauer, C.; Becerici-Schmidt, N.; Bellotti, E.; Belogurov, S.; Belyaev, S. T.; Benato, G.; Bettini, A.; Bezrukov, L.; Bode, T.; Brudanin, V.; Brugnera, R.; Budjáš, D.; Caldwell, A.; Cattadori, C.; Chernogorov, A.; Cossavella, F.; Demidova, E. V.; Domula, A.; Egorov, V.; Falkenstein, R.; Ferella, A.; Freund, K.; Frodyma, N.; Gangapshev, A.; Garfagnini, A.; Gotti, C.; Grabmayr, P.; Gurentsov, V.; Gusev, K.; Guthikonda, K. K.; Hampel, W.; Hegai, A.; Heisel, M.; Hemmer, S.; Heusser, G.; Hofmann, W.; Hult, M.; Inzhechik, L. V.; Ioannucci, L.; Janicskó Csáthy, J.; Jochum, J.; Junker, M.; Kihm, T.; Kirpichnikov, I. V.; Kirsch, A.; Klimenko, A.; Knöpfle, K. T.; Kochetov, O.; Kornoukhov, V. N.; Kuzminov, V. V.; Laubenstein, M.; Lazzaro, A.; Lebedev, V. I.; Lehnert, B.; Liao, H. Y.; Lindner, M.; Lippi, I.; Liu, X.; Lubashevskiy, A.; Lubsandorzhiev, B.; Lutter, G.; Macolino, C.; Machado, A. A.; Majorovits, B.; Maneschg, W.; Misiaszek, M.; Nemchenok, I.; Nisi, S.; O'Shaughnessy, C.; Pandola, L.; Pelczar, K.; Pessina, G.; Pullia, A.; Riboldi, S.; Rumyantseva, N.; Sada, C.; Salathe, M.; Schmitt, C.; Schreiner, J.; Schulz, O.; Schwingenheuer, B.; Schönert, S.; Shevchik, E.; Shirchenko, M.; Simgen, H.; Smolnikov, A.; Stanco, L.; Strecker, H.; Tarka, M.; Ur, C. A.; Vasenko, A. A.; Volynets, O.; von Sturm, K.; Wagner, V.; Walter, M.; Wegmann, A.; Wester, T.; Wojcik, M.; Yanovich, E.; Zavarise, P.; Zhitnikov, I.; Zhukov, S. V.; Zinatulina, D.; Zuber, K.; Zuzel, G.
2013-10-01
The Gerda experiment located at the Laboratori Nazionali del Gran Sasso of INFN searches for neutrinoless double beta (0νββ) decay of 76Ge using germanium diodes as source and detector. In Phase I of the experiment eight semi-coaxial and five BEGe type detectors have been deployed. The latter type is used in this field of research for the first time. All detectors are made from material with enriched 76Ge fraction. The experimental sensitivity can be improved by analyzing the pulse shape of the detector signals with the aim to reject background events. This paper documents the algorithms developed before the data of Phase I were unblinded. The double escape peak (DEP) and Compton edge events of 2.615 MeV γ rays from 208Tl decays as well as two-neutrino double beta (2νββ) decays of 76Ge are used as proxies for 0νββ decay. For BEGe detectors the chosen selection is based on a single pulse shape parameter. It accepts 0.92±0.02 of signal-like events while about 80 % of the background events at Qββ = 2039 keV are rejected. For semi-coaxial detectors three analyses are developed. The one based on an artificial neural network is used for the search of 0νββ decay. It retains 90 % of DEP events and rejects about half of the events around Qββ. The 2νββ events have an efficiency of 0.85±0.02 and the one for 0νββ decays is estimated to be 0.90 +0.05 -0.09. A second analysis uses a likelihood approach trained on Compton edge events. The third approach uses two pulse shape parameters. The latter two methods confirm the classification of the neural network since about 90 % of the data events rejected by the neural network are also removed by both of them. In general, the selection efficiency extracted from DEP events agrees well with those determined from Compton edge events or from 2νββ decays.
Motivation and temporal distance: effect on cognitive and affective manifestations.
Bjørnebekk, Gunnar; Gjesme, Torgrim
2009-10-01
The implications of temporal distance on motivation-related concepts were examined. The results of an experiment, based on 585 Grade 6 students, indicated that both positive (approach) and negative (avoidance) motivation increased as the future goal or event approached in time. This increase in approach and avoidance motivation influenced the performance of the pupils differently. For pupils with success orientation, the performance increased. For pupils with failure orientation, the performance remained about the same.
Counting Unfolding Events in Stretched Helices with Induced Oscillation by Optical Tweezers
NASA Astrophysics Data System (ADS)
Bacabac, Rommel Gaud; Otadoy, Roland
Correlation measures based on embedded probe fluctuations, single or paired, are now widely used for characterizing the viscoelastic properties of biological samples. However, more robust applications using this technique are still lacking. Considering that the study of living matter routinely demonstrates new and complex phenomena, mathematical and experimental tools for analysis have to catch up in order to arrive at newer insights. Therefore, we derive ways of probing non-equilibrium events in helical biopolymers provided by stretching beyond thermal forces. We generalize, for the first time, calculations for winding turn probabilities to account for unfolding events in single fibrous biopolymers and globular proteins under tensile stretching using twin optical traps. The approach is based on approximating the ensuing probe fluctuations as originating from a damped harmonic oscillator under oscillatory forcing.
Castillo, R.L; Carrasco Loza, R; Romero-Dapueto, C
2015-01-01
Experimental approaches have been implemented to investigate the mechanisms underlying lung damage. In animals, these models reproduce pathophysiological events of acute respiratory distress syndrome (ARDS), such as neutrophil activation, reactive oxygen species burst, pulmonary vascular hypertension, exudative edema, and other events associated with organ dysfunction. However, these approaches have not reproduced the clinical features of lung damage. Lung inflammation is a relevant event in the development of ARDS as a component of the host immune response to various stimuli, such as cytokines, antigens and endotoxins. In patients who survive the local inflammatory state, the transition from injury to resolution is an active mechanism regulated by immuno-inflammatory signaling pathways. Indeed, the inflammatory process is regulated by the dynamics of the cell populations that migrate to the lung, such as neutrophils, and, on the other hand, by the modulation of transcription factors and sources of reactive oxygen species (ROS), such as nuclear factor kappaB and NADPH oxidase. These experimental animal models reproduce key components of the injury and resolution phases of human ALI/ARDS and provide a methodology to explore mechanisms and potential new therapies. PMID:26312099
Patterns of precipitation and soil moisture extremes in Texas, US: A complex network analysis
NASA Astrophysics Data System (ADS)
Sun, Alexander Y.; Xia, Youlong; Caldwell, Todd G.; Hao, Zengchao
2018-02-01
Understanding of the spatial and temporal dynamics of extreme precipitation not only improves prediction skills, but also helps to prioritize hazard mitigation efforts. This study seeks to enhance the understanding of spatiotemporal covariation patterns embedded in precipitation (P) and soil moisture (SM) by using an event-based, complex-network-theoretic approach. Event concurrences are quantified using a nonparametric event synchronization measure, and spatial patterns of hydroclimate variables are analyzed by using several network measures and a community detection algorithm. SM-P coupling is examined using a directional event coincidence analysis measure that takes the order of event occurrences into account. The complex network approach is demonstrated for Texas, US, a region possessing a rich set of hydroclimate features and frequented by catastrophic flooding. Gridded daily observed P data and simulated SM data are used to create complex networks of P and SM extremes. The uncovered high degree centrality regions and community structures are qualitatively in agreement with the overall existing knowledge of hydroclimate extremes in the study region. Our analyses provide new visual insights into the propagation, connectivity, and synchronicity of P extremes, as well as the SM-P coupling, in this flood-prone region, and can be readily used as a basis for event-driven predictive analytics for other regions.
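A simplified, directional version of the event-coincidence idea used here (the published measures additionally adapt the tolerance window to local inter-event times; tau_max below is an assumption):

import numpy as np

def directional_coincidence(t_a, t_b, tau_max=3.0):
    # Fraction of events in series A (e.g., precipitation extremes)
    # that are followed within tau_max days by an event in series B
    # (e.g., soil-moisture extremes); the order of occurrence matters.
    t_a, t_b = np.asarray(t_a), np.asarray(t_b)
    if len(t_a) == 0:
        return 0.0
    hits = sum(np.any((t_b > ta) & (t_b <= ta + tau_max)) for ta in t_a)
    return hits / len(t_a)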
Hagan, Melissa J; Sulik, Michael J; Lieberman, Alicia F
2016-07-01
Studies of the association between traumatic experiences and psychopathology in early childhood have primarily focused on specific types of events (e.g., sexual abuse) or aggregated different types of events without differentiating among them. We extend this body of work by investigating patterns of traumatic event exposure in a high-risk, ethnically diverse sample of children ages 3-6 (N = 211; 51 % female) and relating these different patterns to parents' reports of child externalizing, internalizing, and post-traumatic stress symptomatology. Using latent class analysis, which divides a heterogeneous population into homogenous subpopulations, we identified three patterns of traumatic events based on parents' responses to an interview-based assessment of trauma exposure in young children: (1) severe exposure, characterized by a combination of family violence and victimization; (2) witnessing family violence without victimization; and (3) moderate exposure, characterized by an absence of family violence but a moderate probability of other events. The severe exposure class exhibited elevated internalizing and post-traumatic stress symptoms relative to the witness to violence and moderate exposure classes, controlling for average number of traumatic events. Results highlight the need for differentiation between profiles of traumatic life event exposure and the potential for person-centered methods to complement the cumulative risk perspective.
Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring †
Mao, Yingchi; Qi, Hai; Ping, Ping; Li, Xiaofang
2017-01-01
Time series data for multiple water quality parameters are obtained from the water sensor networks deployed in an agricultural water supply network. The accurate and efficient detection of contamination events, and warning to prevent pollution from spreading, is one of the most important issues when pollution occurs. In order to comprehensively reduce event detection deviation, a spatial-temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) was proposed. The M-STED approach includes three parts. The first part adopts a Rule K algorithm to select backbone nodes as the nodes of a connected dominating set (CDS) and to forward the sensed data of multiple water parameters. The second part determines the state of each backbone node in the current timestamp with back propagation neural network models and sequential Bayesian analysis. The third part establishes a spatial model with Bayesian networks to estimate the state of the backbones in the next timestamp and traces an "outlier" node to its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED and the false detection rate is lower than 9%. The M-STED approach can improve the rate of detection by about 40% and reduce the false alarm rate by about 45%, compared with S-STED, an event detection approach using a single water parameter. Moreover, the proposed M-STED exhibits better performance in terms of detection delay and scalability. PMID:29207535
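As a hedged sketch of the sequential Bayesian element in the second part (the actual M-STED models are back propagation neural networks plus Bayesian networks; the likelihood ratio below is an invented stand-in):

def bayes_update(p_event, lr):
    # One recursive Bayes step: p_event is the current probability that
    # the node is in the 'event' state, lr the likelihood ratio
    # P(obs | event) / P(obs | normal) of the latest reading.
    odds = p_event / (1.0 - p_event) * lr
    return odds / (1.0 + odds)

# Two consecutive anomalous readings, each 10x likelier under an event,
# push the event probability from 0.05 to about 0.84:
p = 0.05
for _ in range(2):
    p = bayes_update(p, 10.0)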
Setting objective thresholds for rare event detection in flow cytometry
Richards, Adam J.; Staats, Janet; Enzor, Jennifer; McKinnon, Katherine; Frelinger, Jacob; Denny, Thomas N.; Weinhold, Kent J.; Chan, Cliburn
2014-01-01
The accurate identification of rare antigen-specific cytokine-positive cells from peripheral blood mononuclear cells (PBMC) after antigenic stimulation in an intracellular staining (ICS) flow cytometry assay is challenging, as cytokine-positive events may be fairly diffusely distributed and lack an obvious separation from the negative population. Traditionally, the approach by flow operators has been to manually set a positivity threshold to partition events into cytokine-positive and cytokine-negative. This approach suffers from subjectivity and inconsistency across different flow operators. The use of statistical clustering methods does not remove the need to find an objective threshold between positive and negative events, since consistent identification of rare event subsets is highly challenging for automated algorithms, especially when there is distributional overlap between the positive and negative events ("smear"). We present a new approach, based on the Fβ measure, that is similar to manual thresholding in providing a hard cutoff, but has the advantage of being determined objectively. The performance of this algorithm is compared with results obtained by expert visual gating. Several ICS data sets from the External Quality Assurance Program Oversight Laboratory (EQAPOL) proficiency program were used to make the comparisons. We first show that visually determined thresholds are difficult to reproduce and pose a problem when comparing results across operators or laboratories, and we illustrate problems that occur with the use of commonly employed clustering algorithms. In contrast, a single parameterization of the Fβ method performs consistently across different centers, samples, and instruments because it optimizes the precision/recall tradeoff by using both negative and positive controls. PMID:24727143
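A compact sketch of objective threshold selection by the F-beta criterion, assuming labelled negative and positive control intensities (a sketch of the idea, not the EQAPOL implementation):

import numpy as np

def f_beta(precision, recall, beta=2.0):
    if precision + recall == 0:
        return 0.0
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

def best_threshold(neg_control, pos_control, beta=2.0):
    # Score every candidate cutoff by the precision/recall trade-off
    # it induces on the controls; beta > 1 favours recall.
    candidates = np.unique(np.concatenate([neg_control, pos_control]))
    best_cut, best_score = None, -1.0
    for c in candidates:
        tp = np.sum(pos_control >= c)
        fp = np.sum(neg_control >= c)
        fn = np.sum(pos_control < c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        score = f_beta(prec, rec, beta)
        if score > best_score:
            best_cut, best_score = c, score
    return best_cut, best_score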
Online track detection in triggerless mode for INO
NASA Astrophysics Data System (ADS)
Jain, A.; Padmini, S.; Joseph, A. N.; Mahesh, P.; Preetha, N.; Behere, A.; Sikder, S. S.; Majumder, G.; Behera, S. P.
2018-03-01
The India-based Neutrino Observatory (INO) is a proposed particle physics research project to study atmospheric neutrinos. The INO Iron Calorimeter (ICAL) will consist of 28,800 detectors with 3.6 million electronic channels, expected to fire at a singles rate of 100 Hz and produce data at a rate of 3 GBps. The collected data contain a few real hits generated by muon tracks, the remainder being noise-induced spurious hits. The estimated reduction factor after filtering out the data of interest from the generated data is of the order of 10^3. This makes trigger generation critical for efficient data collection and storage. A trigger is generated by detecting coincidences across multiple channels satisfying the trigger criteria within a small window of 200 ns in the trigger region. As the probability of neutrino interaction is very low, the track detection algorithm has to be efficient and fast enough to process 5 × 10^6 event candidates per second without introducing significant dead time, so that not even a single neutrino event is missed. A hardware-based trigger system is presently proposed for online track detection, considering the stringent timing requirements. Though such a trigger system can be designed with scalability, the many hardware devices and interconnections make it a complex and expensive solution with limited flexibility. A software-based track detection approach working on the hit information offers an elegant solution, with the possibility of varying the trigger criteria to select various potentially interesting physics events. An event selection approach for an alternative triggerless readout scheme has been developed. The algorithm is mathematically simple, robust and parallelizable. It has been validated by detecting simulated muon events with energies in the range of 1 GeV-10 GeV with 100% efficiency at a processing rate of 60 μs/event on a 16-core machine. The algorithm and the result of a proof-of-concept for its faster implementation over multiple cores are presented. The paper also discusses harnessing the computing capabilities of a multi-core computing farm, thereby optimizing the number of nodes required for the proposed system.
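A toy coincidence finder in the spirit of the event selection described (real ICAL track selection adds geometric criteria; the minimum-hit count below is an assumption):

import numpy as np

def coincidence_events(hit_times, window_ns=200.0, min_hits=4):
    # hit_times: timestamps (ns) of hits from all channels. Report
    # groups where at least `min_hits` hits fall inside a sliding
    # window of width `window_ns`, the basic coincidence criterion.
    hit_times = np.sort(np.asarray(hit_times))
    events, i = [], 0
    while i < len(hit_times):
        j = np.searchsorted(hit_times, hit_times[i] + window_ns, side='right')
        if j - i >= min_hits:
            events.append(hit_times[i:j])
            i = j
        else:
            i += 1
    return events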
Regression Analysis of Mixed Recurrent-Event and Panel-Count Data with Additive Rate Models
Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L.
2015-01-01
Summary: Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007; Zhao et al., 2011). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013). In this paper, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. PMID:25345405
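For reference, a common form of the additive rate model used in this line of work specifies, for the recurrent event process N(t) and covariates Z,

\[ E\{\mathrm{d}N(t) \mid Z\} = \mathrm{d}\mu_0(t) + \beta^{\top} Z \,\mathrm{d}t, \]

so covariates shift the baseline mean function \mu_0(t) additively, in contrast to the multiplicative effect of a proportional rates model.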
Adult fathead minnows were exposed to dilutions of a historically estrogenic wastewater treatment plant effluent in a 21-d reproduction study. This dataset is comprised of a variety of endpoints representing key events along adverse outcome pathways linking estrogen receptor activation and other molecular initiating events to reproductive impairment. This study demonstrates the value of using an integrative approach that encompasses analytical chemistry, in vitro bioassays, and in vivo apical and pathway-based approaches with endpoints spanning from molecular- (e.g., gene expression) to organismal- (e.g., reproduction) levels of biological organization to help infer causal relationships between chemistry and potential effects on reproduction. This dataset is associated with the following publication: Cavallin, J., K. Jensen, M. Kahl, D. Villeneuve, K. Lee, A. Schroeder, J. Mayasich, E. Eid, K. Nelson, R. Milsk, B. Blackwell, J. Berninger, C. LaLone, C. Blanksma, T. Jicha, C. Elonen, R. Johnson, and G. Ankley. Pathway-based approaches for assessment of real-time exposure to an estrogenic wastewater treatment plant effluent on fathead minnow reproduction. Environmental Toxicology and Chemistry. Society of Environmental Toxicology and Chemistry, Pensacola, FL, USA, 35(3): 702-716, (2016).
Winkler, Daniel; Zischg, Jonatan; Rauch, Wolfgang
2018-01-01
For communicating urban flood risk to authorities and the public, a realistic three-dimensional visual display is frequently more suitable than detailed flood maps. Virtual reality could also serve to plan short-term flooding interventions. We introduce here an alternative approach for simulating three-dimensional flooding dynamics in large- and small-scale urban scenes by reaching out to computer graphics. This approach, denoted 'particle in cell', is a particle-based CFD method that is used to predict physically plausible results instead of accurate flow dynamics. We exemplify the approach for the real flooding event in July 2016 in Innsbruck.
Testing the seismology-based landquake monitoring system
NASA Astrophysics Data System (ADS)
Chao, Wei-An
2016-04-01
I have developed a real-time landquake monitoring (RLM) system, which monitors large-scale landquake activity in Taiwan using the real-time seismic network of the Broadband Array in Taiwan for Seismology (BATS). The RLM system applies a grid-based general source inversion (GSI) technique to obtain a preliminary source location and force mechanism. A 2-D virtual source grid on the Taiwan Island is created with an interval of 0.2° in both latitude and longitude. The depth of each grid point is fixed on the free-surface topography. A database of synthetics is stored on disk; these are obtained using Green's functions computed by the propagator matrix approach for a 1-D average velocity model, at all stations from each virtual source grid point, for nine elementary source components: six elementary moment tensors and three orthogonal (north, east and vertical) single forces. The offline RLM system was run for events detected in previous studies. An important aspect of the RLM system is the implementation of the GSI approach for different source types (e.g., full moment tensor, double-couple faulting, and explosion source) via a grid search through the 2-D virtual sources to automatically identify landquake events based on the improvement in waveform fitness, and to evaluate the best-fit solution in the monitoring area. With this approach, not only the force mechanisms but also the event occurrence time and location can be obtained simultaneously, about 6-8 min after the occurrence of an event. To improve the limited accuracy of the GSI-determined location, I further apply a landquake epicenter determination (LED) method that maximizes the coherency of the high-frequency (1-3 Hz) horizontal envelope functions to determine the final source location. With good knowledge of the source location, I perform a landquake force history (LFH) inversion to investigate the source dynamics (e.g., trajectory) of relatively large landquake events. With the aforementioned source information provided in real time, government and emergency response agencies have sufficient reaction time for rapid assessment of and response to landquake hazards. The RLM system has operated online since 2016.
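A minimal sketch of the grid-search step at the core of GSI, assuming a precomputed synthetic database and a variance-reduction fitness (all names are illustrative):

import numpy as np

def grid_search_source(obs, synth_db):
    # obs      : dict station -> observed waveform (1-D arrays)
    # synth_db : dict (grid_point, mechanism) -> dict station -> synthetic
    # Scan every virtual source grid point and candidate mechanism and
    # return the pair maximizing the variance reduction of the fit.
    def fitness(o, s):
        num = sum(np.sum((o[k] - s[k])**2) for k in o)
        den = sum(np.sum(o[k]**2) for k in o)
        return 1.0 - num / den
    best_key, best_fit = None, -np.inf
    for key, synth in synth_db.items():
        f = fitness(obs, synth)
        if f > best_fit:
            best_key, best_fit = key, f
    return best_key, best_fit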
Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.
Housh, Mashor; Ohar, Ziv
2017-03-01
The fault detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for FD: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed the signal-processing-based FD approach, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulic simulation models, can outperform the signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
Human visual system-based smoking event detection
NASA Astrophysics Data System (ADS)
Odetallah, Amjad D.; Agaian, Sos S.
2012-06-01
Human action (e.g. smoking, eating, and phoning) analysis is an important task in various application domains like video surveillance, video retrieval, human-computer interaction systems, and so on. Smoke detection is a crucial task in many video surveillance applications and could greatly raise the level of safety of urban areas, public parks, airplanes, hospitals, schools and other spaces. The detection task is challenging since there is no prior knowledge about the object's shape, texture and color. In addition, its visual features change under different lighting and weather conditions. This paper presents a new scheme for a system that detects human smoking events, or small smoke, in a sequence of images. In the developed system, motion detection and background subtraction are combined with motion-region saving, skin-based image segmentation, and smoke-based image segmentation to capture potential smoke regions, which are further analyzed to decide on the occurrence of smoking events. Experimental results show the effectiveness of the proposed approach. The developed method is also capable of detecting small smoking events of uncertain actions with various cigarette sizes, colors, and shapes.
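A sketch of the motion/skin front end of such a pipeline using standard OpenCV building blocks (the HSV skin range and subtractor settings are illustrative choices, not the paper's):

import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=300)

def candidate_smoke_mask(frame):
    # Moving pixels from background subtraction, minus skin-coloured
    # pixels, leave candidate regions to be screened for smoke.
    motion = subtractor.apply(frame)
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 48, 80), (20, 255, 255))
    return cv2.bitwise_and(motion, cv2.bitwise_not(skin))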
NASA Astrophysics Data System (ADS)
Liu, G.; Aspinall, M. D.; Ma, X.; Joyce, M. J.
2009-08-01
The discrimination of neutron and γ-ray events in an organic scintillator has been investigated by using a method based on an artificial neural network (ANN). Voltage pulses arising from an EJ-301 organic liquid scintillation detector in a mixed radiation field have been recorded with a fast digital sampling oscilloscope. Piled-up events have been disentangled using a pile-up management unit based on a fitting method. Each individual pulse has subsequently been sent to a discrimination unit, which separates neutron and γ-ray events with a method based on an artificial neural network. This discrimination technique has been verified against the corresponding mixed-field data assessed by time of flight (TOF). It is shown that the characterization of the neutrons and photons achieved by the ANN-based discrimination method is consistent with that afforded by TOF. This approach enables events that often result from scattering or pile-up to be identified and returned to the data set, and affords digital discrimination of mixed radiation fields in a broad range of environments on the basis of training obtained with a single TOF dataset.
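A generic stand-in for the discrimination unit, training a small neural network on pulse-shape features with TOF-derived labels (the architecture and feature choice are assumptions, not the paper's):

from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def train_discriminator(X, y):
    # X: pulse-shape feature matrix (e.g., normalized samples or
    # charge-integration ratios per pulse); y: 0 = gamma, 1 = neutron,
    # labelled by time of flight.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0)
    clf.fit(X_tr, y_tr)
    return clf, clf.score(X_te, y_te)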
van Rosendael, Alexander R; Maliakal, Gabriel; Kolli, Kranthi K; Beecy, Ashley; Al'Aref, Subhi J; Dwivedi, Aeshita; Singh, Gurpreet; Panday, Mohit; Kumar, Amit; Ma, Xiaoyue; Achenbach, Stephan; Al-Mallah, Mouaz H; Andreini, Daniele; Bax, Jeroen J; Berman, Daniel S; Budoff, Matthew J; Cademartiri, Filippo; Callister, Tracy Q; Chang, Hyuk-Jae; Chinnaiyan, Kavitha; Chow, Benjamin J W; Cury, Ricardo C; DeLago, Augustin; Feuchtner, Gudrun; Hadamitzky, Martin; Hausleiter, Joerg; Kaufmann, Philipp A; Kim, Yong-Jin; Leipsic, Jonathon A; Maffei, Erica; Marques, Hugo; Pontone, Gianluca; Raff, Gilbert L; Rubinshtein, Ronen; Shaw, Leslee J; Villines, Todd C; Gransar, Heidi; Lu, Yao; Jones, Erica C; Peña, Jessica M; Lin, Fay Y; Min, James K
Machine learning (ML) is a field in computer science that has been shown to effectively integrate clinical and imaging data for the creation of prognostic scores. The current study investigated whether a ML score, incorporating only the 16-segment coronary tree information derived from coronary computed tomography angiography (CCTA), provides enhanced risk stratification compared with current CCTA-based risk scores. From the multi-center CONFIRM registry, patients were included with complete CCTA risk score information and ≥3 year follow-up for myocardial infarction and death (primary endpoint). Patients with prior coronary artery disease were excluded. Conventional CCTA risk scores (conventional CCTA approach, segment involvement score, Duke prognostic index, segment stenosis score, and the Leaman risk score) and a score created using ML were compared for the area under the receiver operating characteristic curve (AUC). Only 16-segment-based coronary stenosis (0%, 1-24%, 25-49%, 50-69%, 70-99% and 100%) and composition (calcified, mixed and non-calcified plaque) were provided to the ML model. A boosted ensemble algorithm (extreme gradient boosting; XGBoost) was used and the entire data was randomly split into a training set (80%) and testing set (20%). First, tuned hyperparameters were used to generate a trained model from the training set (80% of data). Second, the performance of this trained model was independently tested on the unseen test set (20% of data). In total, 8844 patients (mean age 58.0 ± 11.5 years, 57.7% male) were included. During a mean follow-up time of 4.6 ± 1.5 years, 609 events occurred (6.9%). No CAD was observed in 48.7% (3.5% event), non-obstructive CAD in 31.8% (6.8% event), and obstructive CAD in 19.5% (15.6% event). Discrimination of events, as expressed by the AUC, was significantly better for the ML-based approach (0.771) than for the other scores (ranging from 0.685 to 0.701), P < 0.001. Net reclassification improvement analysis showed that the improved risk stratification was the result of down-classification of risk among patients who did not experience events (non-events). A risk score created by a ML-based algorithm, utilizing standard 16-coronary-segment stenosis and composition information derived from detailed CCTA reading, has greater prognostic accuracy than current CCTA-integrated risk scores. These findings indicate that a ML-based algorithm can improve the integration of CCTA-derived plaque information to improve risk stratification. Published by Elsevier Inc.
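A skeletal version of the described setup, using a boosted ensemble on per-segment stenosis/composition features with an 80/20 split (the hyperparameters are placeholders, not the tuned CONFIRM values):

import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

def fit_and_evaluate(X, y):
    # X: per-patient features from the 16-segment CCTA reading
    # (stenosis grade and plaque composition per segment);
    # y: 1 if death or myocardial infarction during follow-up.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                              stratify=y, random_state=0)
    model = xgb.XGBClassifier(n_estimators=300, max_depth=3,
                              learning_rate=0.05)
    model.fit(X_tr, y_tr)
    return roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])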
NASA Astrophysics Data System (ADS)
Lanni, Cristiano; Mazzorana, Bruno; Volcan, Claudio; Bertagnolli, Rudi
2015-04-01
Flood hazard is generally assessed by assuming the return period of the rainfall to be a proxy for the return period of the discharge and the related hydrograph. Frequently this deterministic view is extended to the straightforward application of hydrodynamic models. However, the climate (i.e., precipitation), the catchment (i.e., geology, soil, and antecedent soil-moisture condition), and the anthropogenic (i.e., drainage system and its regulation) systems interact in a complex way, and the occurrence probability of a flood inundation event can differ significantly from the occurrence probability of the triggering event (i.e., rainfall). In order to reliably determine the spatial patterns of flood intensities and probabilities, the rigorous determination of flood event scenarios is beneficial because it provides a clear, rational method to recognize and unveil the inherent stochastic behavior of natural processes. Therefore, a multi-scenario approach to hazard assessment should be applied, considering the possible events taking place in the area potentially subject to flooding (i.e., floodplains). Here, we apply a multi-scenario approach to the assessment of the flood hazard around Lake Idro (Italy). We consider and estimate the probability of occurrence of several scenarios related to the initial (i.e., initial water level in the lake) and boundary (i.e., shape of the hydrograph, downslope drainage, spillway opening operations) conditions characterizing the lake. Finally, we discuss the advantages and issues of the presented methodological procedure compared with traditional (and essentially deterministic) approaches.
Regression analysis of mixed recurrent-event and panel-count data with additive rate models.
Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L
2015-03-01
Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.
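For readers unfamiliar with the model class, the additive rate model referred to above is commonly written as follows; the notation is an assumption and the article should be consulted for the precise formulation.

```latex
% Additive rate model for a recurrent-event counting process N(t) with
% covariate vector Z (notation assumed; see the article for details):
E\{\,\mathrm{d}N(t) \mid Z\,\} = \mathrm{d}\mu_0(t) + \beta^{\top} Z \,\mathrm{d}t
% mu_0: unspecified baseline mean function; beta: regression parameters,
% estimated from estimating equations over both the continuously observed
% and the panel-count observation periods.
```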
2015-01-01
Background Modern methods for mining biomolecular interactions from literature typically make predictions based solely on the immediate textual context, in effect a single sentence. No prior work has been published on extending this context to information automatically gathered from the whole biomedical literature. Thus, our motivation for this study is to explore whether mutually supporting evidence, aggregated across several documents, can be utilized to improve the performance of state-of-the-art event extraction systems. In this paper, we describe our participation in the latest BioNLP Shared Task using the large-scale text mining resource EVEX. We participated in the Genia Event Extraction (GE) and Gene Regulation Network (GRN) tasks with two separate systems. In the GE task, we implemented a re-ranking approach to improve the precision of an existing event extraction system, incorporating features from the EVEX resource. In the GRN task, our system relied solely on the EVEX resource and utilized a rule-based conversion algorithm between the EVEX and GRN formats. Results In the GE task, our re-ranking approach led to a modest performance increase and resulted in the first rank of the official Shared Task results, with an F-score of 50.97%. Additionally, in this paper we explore and evaluate the usage of distributed vector representations for this challenge. In the GRN task, we ranked fifth in the official results, with strict/relaxed SER scores of 0.92/0.81, respectively. To improve upon these results, we implemented a novel machine-learning-based conversion system and benchmarked its performance against the original rule-based system. Conclusions For the GRN task, we were able to produce a gene regulatory network from the EVEX data, warranting the use of such generic large-scale text mining data in network biology settings. A detailed performance and error analysis provides more insight into the relatively low recall rates. In the GE task we demonstrate that both the re-ranking approach and the word vectors can provide slight performance improvements. A manual evaluation of the re-ranking results pinpoints some of the challenges faced in applying large-scale text mining knowledge to event extraction. PMID:26551766
ERIC Educational Resources Information Center
Charlson, M. E.; Peterson, J. C.; Boutin-Foster, C.; Briggs, W. M.; Ogedegbe, G. G.; McCulloch, C. E.; Hollenberg, J.; Wong, C.; Allegrante, J. P.
2008-01-01
Patients who have undergone angioplasty experience difficulty modifying at-risk behaviors for subsequent cardiac events. The purpose of this study was to test whether an innovative approach to framing of risk, based on "net present value" economic theory, would be more effective in behavioral intervention than the standard "future value approach"…
Nonbinary Tree-Based Phylogenetic Networks.
Jetten, Laura; van Iersel, Leo
2018-01-01
Rooted phylogenetic networks are used to describe evolutionary histories that contain non-treelike evolutionary events such as hybridization and horizontal gene transfer. In some cases, such histories can be described by a phylogenetic base-tree with additional linking arcs, which can, for example, represent gene transfer events. Such phylogenetic networks are called tree-based. Here, we consider two possible generalizations of this concept to nonbinary networks, which we call tree-based and strictly-tree-based nonbinary phylogenetic networks. We give simple graph-theoretic characterizations of tree-based and strictly-tree-based nonbinary phylogenetic networks. Moreover, we show for each of these two classes that it can be decided in polynomial time whether a given network is contained in the class. Our approach also provides a new view on tree-based binary phylogenetic networks. Finally, we discuss two examples of nonbinary phylogenetic networks in biology and show how our results can be applied to them.
NASA Astrophysics Data System (ADS)
Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph
2017-04-01
Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion, and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variability on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new approach for coupling the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to disaggregating rainfall data with a constant time step, such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998). Firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions. Secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous-time-step rainfall generator has been developed and tested using 10 min and hourly rainfall data from four stations in north-eastern Germany. The model performs well in comparison with observed rainfall in terms of event durations and mean event intensities as well as wet spell and dry spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
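The two-step generator can be sketched as follows: a Poisson rectangular pulse model draws alternating dry spells and rectangular rainfall events, and each event total is then disaggregated by a microcanonical cascade whose splits conserve mass. The parameter values, beta-distributed weights, and fixed number of cascade levels below are illustrative assumptions; in particular, the sigmoid weight parameterization and the constrained branching of the first and last box described in the abstract are not reproduced.

```python
# Sketch of a Poisson rectangular pulse model coupled to a microcanonical
# cascade; all parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def poisson_events(n_events, mean_dry_h=30.0, mean_dur_h=6.0, mean_int=1.5):
    """Alternating dry spells and rectangular rainfall pulses (hours, mm/h)."""
    dry = rng.exponential(mean_dry_h, n_events)
    dur = rng.exponential(mean_dur_h, n_events)
    intensity = rng.exponential(mean_int, n_events)
    return dry, dur, intensity

def cascade(total, levels=5, p01=0.2):
    """Microcanonical cascade: each box splits its mass with weights (w, 1-w),
    so sums are preserved; with probability p01 all mass goes to one half."""
    series = np.array([total])
    for _ in range(levels):
        w = np.where(rng.random(series.size) < p01,
                     rng.integers(0, 2, series.size).astype(float),
                     rng.beta(2, 2, series.size))
        series = np.column_stack([w * series, (1 - w) * series]).ravel()
    return series

dry, dur, intensity = poisson_events(3)
for d, i in zip(dur, intensity):
    fine = cascade(d * i)                 # mm per sub-interval
    assert np.isclose(fine.sum(), d * i)  # mass-conservation check
    print(f"event {d:5.1f} h, total {d*i:6.1f} mm ->", np.round(fine, 2))
```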
An emergency medical planning guide for commercial spaceflight events.
Law, Jennifer; Vanderploeg, James
2012-09-01
Commercial spaceflight events transporting paying passengers into space will begin to take place at various spaceports around the country within the next few years. Many spaceports are located in remote areas that are far from major hospitals and trauma centers. Spaceport medical directors should develop emergency medical plans (EMPs) to prepare for potential medical contingencies that may occur during commercial spaceflight events. The aim of this article is to guide spaceport medical directors in emergency medical planning for commercial spaceflight events. This guide is based on our experience and a recently developed EMP for Spaceport America, which incorporated a literature review of mass gathering medicine, existing planning guides for mass gathering events, and EMPs for analogous aerospace events. We propose a multipronged approach to emergency medical planning, consisting of event planning, medical reconnaissance, medical personnel, protocols, physical facility and hardware, and documentation. Medical directors should use this guide to develop an emergency medical plan tailored to the resources and constraints specific to their events.
NASA Astrophysics Data System (ADS)
Nardi, F.; Grimaldi, S.; Petroselli, A.
2012-12-01
Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic model optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series that are routed along the channel using a two-dimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events of channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model removes the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard methods for flood mapping. Selected case studies show results and performances of the proposed procedure with respect to standard event-based approaches.
To use or not to use: a stage-based approach to understanding condom use among homeless youth.
Tucker, Joan S; Ober, Allison; Ryan, Gery; Golinelli, Daniela; Ewing, Brett; Wenzel, Suzanne L
2014-01-01
This study used a stage-based approach to understand condom use behavior in a representative sample of 309 sexually active homeless youth recruited from shelters, drop-in centers, and street sites in Los Angeles County. Focusing on the youth's most recent sexual event, the three stages of condom use examined were: (1) whether the partners decided prior to the event about using condoms; (2) whether a condom was available at the event; and (3) whether a condom was used at the event. Logistic regression analysis was used to identify attitudinal, relationship, and contextual correlates of each of these three stages. Deciding ahead of time about condom use was associated with being Hispanic, level of education, condom attitudes, and various relationship characteristics (e.g., partner type, monogamy, relationship abuse), with the nature of these associations varying depending on the type of decision (i.e., deciding to use, deciding to not use). Condom availability was more likely to be reported by males, if the event was described as being special in some way, or if the event lacked privacy. Condom use was more likely among youth with more positive condom attitudes and among youth who decide ahead of time to use a condom, but less likely among those in monogamous relationships or when hard drugs were used prior to sex. Whether sexual intercourse is protected or unprotected is the end result of a series of decisions and actions by sexual partners. Results from this study illustrate how condom use can be better understood by unpacking the stages and identifying influential factors at each stage. Each stage may, in and of itself, be an important target for intervention with homeless youth.
Oliva, Elizabeth M; Bowe, Thomas; Tavakoli, Sara; Martins, Susana; Lewis, Eleanor T; Paik, Meenah; Wiechers, Ilse; Henderson, Patricia; Harvey, Michael; Avoundjian, Tigran; Medhanie, Amanuel; Trafton, Jodie A
2017-02-01
Concerns about opioid-related adverse events, including overdose, prompted the Veterans Health Administration (VHA) to launch an Opioid Safety Initiative and Overdose Education and Naloxone Distribution program. To mitigate risks associated with opioid prescribing, a holistic approach that takes into consideration both risk factors (e.g., dose, substance use disorders) and risk mitigation interventions (e.g., urine drug screening, psychosocial treatment) is needed. This article describes the Stratification Tool for Opioid Risk Mitigation (STORM), a tool developed in VHA that reflects this holistic approach and facilitates patient identification and monitoring. STORM prioritizes patients for review and intervention according to their modeled risk for overdose/suicide-related events and displays risk factors and risk mitigation interventions obtained from VHA electronic medical record (EMR) data extracts. Patients' estimated risk is based on a predictive risk model developed using fiscal year 2010 (FY2010: 10/1/2009-9/30/2010) EMR data extracts and mortality data among 1,135,601 VHA patients prescribed opioid analgesics to predict risk for an overdose/suicide-related event in FY2011 (2.1% experienced an event). Cross-validation was used to validate the model, with receiver operating characteristic curves for the training and test data sets performing well (area under the curve > 0.80). The predictive risk model distinguished patients based on risk for overdose/suicide-related adverse events, allowing for the identification of high-risk patients and enrichment of target populations of patients with greater safety concerns for proactive monitoring and application of risk mitigation interventions. Results suggest that clinical informatics can leverage EMR-extracted data to identify patients at risk for overdose/suicide-related events and provide clinicians with actionable information to mitigate risk. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
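A generic sketch of the modeling step behind a STORM-like tool: fit a risk model on EMR-style predictors and check discrimination by cross-validated AUC, as the abstract reports (>0.80). The features, coefficients, and logistic model below are synthetic stand-ins, not the actual STORM variable set or algorithm.

```python
# Generic risk-model sketch on synthetic EMR-style data (assumptions only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 20000
dose = rng.gamma(2.0, 30.0, n)        # morphine-equivalent daily dose (assumed)
sud = rng.integers(0, 2, n)           # substance use disorder flag
psych = rng.integers(0, 2, n)         # psychiatric comorbidity flag
X = np.c_[dose, sud, psych]
logit = -5.5 + 0.012 * dose + 0.9 * sud + 0.6 * psych
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000)
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))
```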
Modeling Compound Flood Hazards in Coastal Embayments
NASA Astrophysics Data System (ADS)
Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.
2017-12-01
Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple factors (e.g., sea level rise and river flooding) drive increasing flood hazards. Quantitative risk assessment is required for the administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels, such as 100- and 500-year return period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood level intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach for flood risk assessment that coordinates multivariate extreme analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of the correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in the unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (with the highest joint probability density), the preferred marginal scenario, and reproduced time series of ensembles based on Monte Carlo sampling of the bivariate hazard domain. The comparison between the resulting extreme water dynamics under the compound hazard scenarios explained above provides insight into the strengths and weaknesses of each approach and helps modelers choose the scenario that best fits the needs of their project. The proposed risk assessment approach can help flood hazard modeling practitioners achieve a more reliable estimate of risk by cautiously reducing the dimensionality of the hazard analysis.
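The scenario-selection idea can be sketched as follows: couple assumed marginals for river discharge and coastal water level through a Gaussian copula, sample the bivariate hazard domain, and take the "most likely" compound event, i.e., the sample with the highest joint density among those exceeding a crude joint-exceedance cut. The marginal families, the correlation value, and the 0.95 cut are all assumptions for illustration, not the study's fitted models.

```python
# Sketch of copula-based selection of a most-likely compound design event.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, rho = 5000, 0.5                               # assumed dependence
cov = [[1, rho], [rho, 1]]
z = rng.multivariate_normal([0, 0], cov, n)
u = stats.norm.cdf(z)                            # copula samples in [0,1]^2

# Assumed marginals: gamma discharge, Gumbel sea level.
q = stats.gamma.ppf(u[:, 0], a=3.0, scale=150.0)       # m^3/s
h = stats.gumbel_r.ppf(u[:, 1], loc=1.0, scale=0.25)   # m

# Joint density = Gaussian copula density x marginal densities.
c = stats.multivariate_normal([0, 0], cov).pdf(z) / \
    (stats.norm.pdf(z[:, 0]) * stats.norm.pdf(z[:, 1]))
dens = c * stats.gamma.pdf(q, a=3.0, scale=150.0) * \
       stats.gumbel_r.pdf(h, loc=1.0, scale=0.25)

rare = (u[:, 0] > 0.95) & (u[:, 1] > 0.95)       # crude joint-exceedance cut
best = np.argmax(np.where(rare, dens, -np.inf))
print("design scenario: Q = %.0f m3/s, level = %.2f m" % (q[best], h[best]))
```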
NASA Astrophysics Data System (ADS)
López Comino, José Ángel; Cesca, Simone; Heimann, Sebastian; Grigoli, Francesco; Milkereit, Claus; Dahm, Torsten; Zang, Arno
2017-04-01
A crucial issue in analysing induced seismicity during hydraulic fracturing is the detection and location of massive microseismic or acoustic emission (AE) activity with robust and sufficiently accurate automatic algorithms. Waveform stacking and coherence analysis have been tested for local seismic monitoring and mining-induced seismicity, improving on the classical detection and location methods (e.g. short-term-average/long-term-average and automatic picking of the P and S wave first arrivals). These techniques are here applied using a full waveform approach to a hydraulic fracturing experiment (Nova project 54-14-1) that took place 410 m below the surface in the Äspö Hard Rock Laboratory (Sweden). Continuous waveform recordings from a near-field network composed of eleven AE sensors are processed. The piezoelectric sensors have their highest sensitivity in the frequency range 1 to 100 kHz, but sampling rates were extended to 1 MHz. We present the results obtained during the conventional, continuous water-injection experiment HF2 (Hydraulic Fracture 2). The event detector is based on the stacking of characteristic functions. It follows a delay-and-stack approach, where the likelihood of the hypocenter location in a pre-selected seismogenic volume is mapped by assessing the coherence of the P onset times at different stations. A low detector threshold is chosen in order not to lose weaker events. This approach also increases the number of false detections. Therefore, the dataset has been revised manually, and detected events classified in terms of true AE events related to the fracturing process, electronic noise related to 50 Hz overtones, long-period signals, and other signals. The location of the AE events is further refined using a more accurate waveform stacking method which uses both P and S phases. A 3D grid is generated around the hydraulic fracturing volume and we retrieve a multidimensional matrix whose absolute maximum corresponds to the spatial coordinates of the seismic event. The relative location accuracy is improved using a master-event approach to correct for travel time perturbations. The master event is selected based on a good signal-to-noise ratio, leading to a robust location with small uncertainties. Relative magnitudes are finally estimated from the decay of the maximal recorded amplitude with distance from the AE location. The resulting catalogue is composed of more than 4000 AEs. Their hypocenters are spatially clustered in a planar region, resembling the main fracture plane; its orientation and size are estimated from the spatial distribution of AEs. This work is funded by the EU H2020 SHEER project. Nova project 54-14-1 was financially supported by the GFZ German Research Center for Geosciences (75%), the KIT Karlsruhe Institute of Technology (15%), and the Nova Center for University Studies, Research and Development (10%). An additional in-kind contribution of SKB for using the Äspö Hard Rock Laboratory as a test site for geothermal research is gratefully acknowledged.
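A minimal sketch of the delay-and-stack detector: per-station characteristic functions are shifted by the travel times from each trial grid node, and the node maximizing stack coherence is retained. The geometry, velocity, grid spacing, and characteristic function below are illustrative, not the Äspö network configuration.

```python
# Sketch of delay-and-stack detection/location on synthetic AE traces.
import numpy as np

rng = np.random.default_rng(5)
fs, v = 200_000.0, 5800.0                 # sampling rate (Hz), P velocity (m/s)
stations = rng.uniform(0, 30, (11, 3))    # 11 sensors in a 30 m volume (assumed)
source = np.array([12.0, 8.0, 20.0])
t0, n = 0.01, 4096

# Synthetic traces: noise plus an impulsive arrival at t0 + travel time.
traces = rng.normal(0, 1, (11, n))
for k, s in enumerate(stations):
    idx = int((t0 + np.linalg.norm(s - source) / v) * fs)
    traces[k, idx:idx + 40] += 8 * np.hanning(40)

cf = traces ** 2                          # simple energy characteristic function

def stack_coherence(node):
    """Shift each characteristic function by the node's travel time and stack."""
    shifts = ((np.linalg.norm(stations - node, axis=1) / v) * fs).astype(int)
    aligned = [np.roll(cf[k], -shifts[k]) for k in range(11)]
    return np.max(np.mean(aligned, axis=0))

# Coarse 3D grid search for the node maximizing stack coherence.
grid = [np.array([x, y, zc]) for x in range(0, 31, 6)
        for y in range(0, 31, 6) for zc in range(0, 31, 6)]
best = max(grid, key=stack_coherence)
print("best node:", best, "true source:", source)
```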
NASA Astrophysics Data System (ADS)
Porto, Paolo; Walling, Des E.; Cogliandro, Vanessa; Callegari, Giovanni
2016-07-01
Use of the fallout radionuclides cesium-137 and excess lead-210 offers important advantages over traditional methods of quantifying erosion and soil redistribution rates. However, both radionuclides provide information on longer-term (i.e., 50-100 years) average rates of soil redistribution. Beryllium-7, with its half-life of 53 days, can provide a basis for documenting short-term soil redistribution and it has been successfully employed in several studies. However, the approach commonly used introduces several important constraints related to the timing and duration of the study period. A new approach proposed by the authors that overcomes these constraints has been successfully validated using an erosion plot experiment undertaken in southern Italy. Here, a further validation exercise undertaken in a small (1.38 ha) catchment is reported. The catchment was instrumented to measure event sediment yields and beryllium-7 measurements were employed to document the net soil loss for a series of 13 events that occurred between November 2013 and June 2015. In the absence of significant sediment storage within the catchment's ephemeral channel system and of a significant contribution from channel erosion to the measured sediment yield, the estimates of net soil loss for the individual events could be directly compared with the measured sediment yields to validate the former. The close agreement of the two sets of values is seen as successfully validating the use of beryllium-7 measurements and the new approach to obtain estimates of net soil loss for a sequence of individual events occurring over an extended period at the scale of a small catchment.
NASA Astrophysics Data System (ADS)
Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.
2017-12-01
Deep learning techniques have been successfully applied to solve many problems in climate science and geoscience using massive-scale observed and modeled data. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain performance that overshadows all previous handcrafted, expert-based methods. The issue arising, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two models using deep neural networks: (1) convolutional neural networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes, detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super resolution model reconstructs the resolution of the input to the localization CNNs. We present a network using the pixel recursive super resolution model that synthesizes details of tropical cyclones in ground truth data while enhancing their resolution. Therefore, this approach not only dramatically reduces the human effort, but also suggests the possibility of reducing the computing cost required for the downscaling process to increase the resolution of data.
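A minimal sketch of the detection step: a small binary CNN over coarse 2D climate fields, written here in PyTorch. The architecture, input size, and random stand-in data are assumptions; the paper's exact networks and the pixel recursive super resolution model are not reproduced.

```python
# Minimal binary CNN detector for coarse climate fields (illustrative only).
import torch
import torch.nn as nn

class EventCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.head = nn.Linear(16 * 8 * 8, 1)   # sized for 32x32 input patches

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = EventCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# One illustrative training step on random stand-in data.
x = torch.randn(32, 1, 32, 32)              # batch of coarse fields
y = torch.randint(0, 2, (32, 1)).float()    # event / no-event labels
opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
print("loss:", loss.item())
```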
Oisjöen, Fredrik; Schneiderman, Justin F; Astalan, Andrea Prieto; Kalabukhov, Alexey; Johansson, Christer; Winkler, Dag
2010-01-15
We demonstrate a one-step wash-free bioassay measurement system capable of tracking biochemical binding events. Our approach combines the high resolution of frequency-domain and the high speed of time-domain measurements in a single device, in combination with a fast one-step bioassay. The one-step nature of our magnetic nanoparticle (MNP) based assay reduces the time between sample extraction and quantitative results while mitigating the risks of contamination related to washing steps. Our method also enables the tracking of binding events, providing the possibility of, for example, investigating how chemical/biological environments affect the rate of a binding process, or studying the action of certain drugs. We detect specific biological binding events occurring on the surfaces of fluid-suspended MNPs that modify their magnetic relaxation behavior. Herein, we extrapolate a modest sensitivity to analyte of 100 ng/ml with the present setup using our rapid one-step bioassay. More importantly, we determine the size distributions of the MNP systems with theoretical fits to our data obtained from the two complementary measurement modalities and demonstrate quantitative agreement between them. Copyright 2009 Elsevier B.V. All rights reserved.
Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay
2016-04-01
Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables that represent clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model will be fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The authors' method is tested in a simulation comparison that is based on a sudden cardiac arrest case study with 23,041 patients' records. This DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
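The D-optimal selection idea can be sketched as a greedy search: starting from a pilot logistic fit, repeatedly add the chart whose inclusion most increases the log-determinant of the Fisher information X'WX, using only predictor values. The pilot coefficients, ridge seed, and greedy strategy are assumptions; the paper's exact optimization algorithm may differ.

```python
# Greedy D-optimal validation sampling for a logistic model (sketch).
import numpy as np

rng = np.random.default_rng(6)
X = np.c_[np.ones(2000), rng.normal(size=(2000, 3))]   # intercept + 3 predictors
beta_pilot = np.array([-3.0, 0.8, 0.5, -0.4])          # assumed pilot fit
p = 1 / (1 + np.exp(-X @ beta_pilot))
w = p * (1 - p)                                        # logistic Fisher weights

def greedy_d_optimal(X, w, budget, ridge=1e-6):
    """Pick `budget` rows maximizing log det of the Fisher information."""
    chosen, F = [], ridge * np.eye(X.shape[1])
    for _ in range(budget):
        gains = []
        for i in range(len(X)):
            if i in chosen:
                gains.append(-np.inf)
            else:
                Fi = F + w[i] * np.outer(X[i], X[i])
                gains.append(np.linalg.slogdet(Fi)[1])
        best = int(np.argmax(gains))
        chosen.append(best)
        F += w[best] * np.outer(X[best], X[best])
    return chosen

sample = greedy_d_optimal(X, w, budget=30)
print("charts selected for manual review:", sample[:10], "...")
```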
Chen, Gong; Qi, Peng; Guo, Zhao; Yu, Haoyong
2017-06-01
In the field of gait rehabilitation robotics, achieving human-robot synchronization is very important. In this paper, a novel human-robot synchronization method using gait event information is proposed. This method includes two steps. First, seven gait events in one gait cycle are detected in real time with a hidden Markov model; second, an adaptive oscillator is utilized to estimate the stride percentage of human gait using any one of the gait events. Synchronous reference trajectories for the robot are then generated with the estimated stride percentage. This method is based on a bioinspired adaptive oscillator, a mathematical tool first proposed to explain the phenomenon of synchronous flashing among fireflies. The proposed synchronization method is implemented in a portable knee-ankle-foot robot and tested in 15 healthy subjects. This method has the advantages of simple structure, flexible selection of gait events, and fast adaptation. Gait event is the only information needed, and hence the performance of synchronization holds when an abnormal gait pattern is involved. The results of the experiments reveal that our approach is efficient in achieving human-robot synchronization and feasible for rehabilitation robotics application.
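The second step can be sketched as an adaptive phase oscillator: at each detected gait event (heel strike, say), the oscillator's phase and frequency are nudged so that the event aligns with phase zero, and the running phase then serves as a continuous stride percentage. The gains and single event type below are illustrative; the HMM-based event detection of the first step is not shown.

```python
# Adaptive phase oscillator locking onto discrete gait events (sketch).
import numpy as np

dt = 0.01                      # integration step (s)
K_phi, K_om = 1.0, 0.5         # phase and frequency adaptation gains (assumed)
omega = 2 * np.pi / 1.2        # initial guess: 1.2 s stride period
phi = 0.0
true_period = 1.0              # actual stride period (s)
events = np.arange(1.0, 60.0, true_period)   # heel-strike times
e_idx = 0

for step in range(6000):       # 60 s simulation
    t = step * dt
    phi += omega * dt
    if e_idx < len(events) and t >= events[e_idx]:
        err = -np.sin(phi)     # heel strike should occur at phi = 0 (mod 2*pi)
        phi += K_phi * err     # phase resetting toward the event
        omega += K_om * err    # slow frequency adaptation
        e_idx += 1
    stride_pct = (phi % (2 * np.pi)) / (2 * np.pi)  # continuous stride percentage

print("estimated stride period: %.2f s (true %.2f s)"
      % (2 * np.pi / omega, true_period))
```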
Event-Based User Classification in Weibo Media
Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong
2014-01-01
Weibo media, known as real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under these circumstances, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235
Deterministic versus evidence-based attitude towards clinical diagnosis.
Soltani, Akbar; Moayyeri, Alireza
2007-08-01
Generally, two basic classes have been proposed for the scientific explanation of events. Deductive reasoning emphasizes reaching conclusions about a hypothesis based on verification of the universal laws pertinent to that hypothesis, while inductive or probabilistic reasoning explains an event by calculating the probabilities of that event being related to a given hypothesis. Although both types of reasoning are used in clinical practice, evidence-based medicine stresses the advantages of the second approach in most instances of medical decision making. While 'probabilistic or evidence-based' reasoning seems to involve more mathematical formulas at first glance, this attitude is more dynamic and less imprisoned by the rigidity of mathematics compared with the 'deterministic or mathematical attitude'. In the field of medical diagnosis, appreciation of uncertainty in clinical encounters and utilization of the likelihood ratio as a measure of accuracy seem to be the most important characteristics of evidence-based doctors. Other characteristics include the use of series of tests for refining probability, changing diagnostic thresholds in light of external evidence and the nature of the disease, and attention to confidence intervals to estimate the uncertainty of research-derived parameters.
Identifying spatially integrated floodplains/riparian areas and wetlands
Floodplain delineation may play an important role in managing wetlands and riparian areas at multiple scales - local, state, and federal. This poster demonstrates multiple GIS-based approaches to delimiting floodplains and contrasts these with observed flooding events from a majo...
Fine Dining and Fast Computers: A Commercial College's Recipe for Success.
ERIC Educational Resources Information Center
Borrego, Anne Marie
2001-01-01
Describes the "slow and steady" approach to growth embraced by the owner of Stratford College, a Virginia-based, for-profit school that offers degrees in the culinary arts, information technology, and hotel and event management and business. (EV)
Manuscript 116 Mechanisms: DNA Reactive Agents
ABSTRACT The U.S. Environmental Protection Agency’s Guidelines for Carcinogen Risk Assessment (2005) uses an analytical framework for conducting a quantitative cancer risk assessment that is based on mode of action/key events and human relevance. The approach stresses the enh...
Ground robotic measurement of aeolian processes
USDA-ARS?s Scientific Manuscript database
Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These d...
Particle Filter Based Tracking in a Detection Sparse Discrete Event Simulation Environment
2007-03-01
Figure 31. Particle Disqualification via Sanitization.
NASA Astrophysics Data System (ADS)
Gauduel, Y. A.
2017-05-01
A major challenge of spatio-temporal radiation biomedicine concerns the understanding of biophysical events triggered by an initial energy deposition inside confined ionization tracks. This contribution deals with an interdisciplinary approach covering cutting-edge advances in real-time radiation events, considering the potentialities of innovative strategies based on ultrafast laser science, from femtosecond photon sources to advanced techniques of ultrafast TW laser-plasma accelerators. Recent advances in powerful TW laser sources (~10¹⁹ W cm⁻²) and laser-plasma interactions providing ultra-short relativistic particle beams in the energy domain 5-200 MeV open promising opportunities for the development of high-energy radiation femtochemistry (HERF) in the prethermal regime of secondary low-energy electrons and for the real-time imaging of radiation-induced biomolecular alterations at the nanoscopic scale. New developments would make it possible to correlate early radiation events triggered by ultrashort radiation sources with a molecular approach to Relative Biological Effectiveness (RBE). These emerging research developments are crucial to understanding simultaneously, at the sub-picosecond and nanometric scales, the early consequences of ultra-short-pulsed radiation on biomolecular environments or integrated biological entities. This innovative approach would be applied to biomedically relevant concepts such as the emerging domain of real-time nanodosimetry for targeted pro-drug activation and pulsed radio-chemotherapy of cancers.
Tyler, Carl; Werner, James J.
2016-01-01
There is often a rich but untold history of events that occurred and relationships that formed prior to the launching of a practice-based research network (PBRN). This is particularly the case in PBRNs that are community-based and composed of partnerships outside of the health care system. In this article we summarize an organizational "prenatal history" prior to the birth of a PBRN devoted to persons with developmental disabilities. Using a case study approach, this article describes the historical events that preceded and fostered the evolution of this PBRN and contrasts how the processes leading to the creation of this multi-stakeholder community-based PBRN differ from those of typical academic-clinical practice PBRNs. We propose potential advantages and complexities inherent to this newest iteration of PBRNs. PMID:25381081
A logic programming approach to medical errors in imaging.
Rodrigues, Susana; Brandão, Paulo; Nelas, Luís; Neves, José; Alves, Victor
2011-09-01
In 2000, the Institute of Medicine reported disturbing numbers on the scope and impact of medical error in the process of health delivery. Nevertheless, a solution to this problem may lie in the adoption of adverse event reporting and learning systems that can help to identify hazards and risks. It is crucial to apply models that identify the root causes of adverse events and enhance the sharing of knowledge and experience. Progress in the efforts to improve patient safety has been frustratingly slow. Some of this lack of progress may be attributed to the lack of systems that take into account the characteristics of information about the real world. In our daily lives, we formulate most of our decisions based on incomplete, uncertain, and even forbidden or contradictory information. One's knowledge is based less on exact facts and more on hypotheses, perceptions, or indications. From the data collected in our adverse event treatment and learning system on medical imaging, and through the use of Extended Logic Programming for knowledge representation and reasoning, and the exploitation of new methodologies for problem solving, namely those based on the notion of an agent and/or multi-agent systems, we intend to generate reports that identify the most relevant causes of error and define improvement strategies, drawing conclusions about the impact, place of occurrence, and form or type of event recorded in the healthcare institutions. The Eindhoven Classification Model was extended and adapted to the medical imaging field and used to classify the root causes of adverse events. Extended Logic Programming was used for knowledge representation with defective information, allowing for the modelling of the universe of discourse in terms of default data and knowledge. A systematization of the evolution of the body of knowledge about Quality of Information embedded in the Root Cause Analysis was accomplished. An adverse event reporting and learning system was developed based on the presented approach to medical errors in imaging. This system was deployed in two Portuguese healthcare institutions, with an appealing outcome. The system made it possible to verify that the majority of occurrences were concentrated in a few events that could be avoided. The developed system allowed automatic knowledge extraction, enabling report generation with strategies for the improvement of quality of care. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, J.; Astitha, M.; Anagnostou, E. N.; Hartman, B.; Kallos, G. B.
2015-12-01
Weather prediction accuracy has become very important for the Northeast U.S. given the devastating effects of extreme weather events in recent years. Weather forecasting systems are used to build strategies that prevent catastrophic losses for human lives and the environment. Concurrently, weather forecast tools and techniques have evolved with improved forecast skill as numerical prediction techniques are strengthened by increased supercomputing resources. In this study, we examine the combination of two state-of-the-science atmospheric models (WRF and RAMS/ICLAMS) by utilizing a Bayesian regression approach to improve the prediction of extreme weather events for the Northeast U.S. The basic concept behind the Bayesian regression approach is to take advantage of the strengths of the two atmospheric modeling systems and, similar to the multi-model ensemble approach, limit their weaknesses, which are related to systematic and random errors in the numerical prediction of physical processes. The first part of this study is focused on retrospective simulations of seventeen storms that affected the region in the period 2004-2013. Optimal variances are estimated by minimizing the root mean square error and are applied to out-of-sample weather events. The applicability and usefulness of this approach are demonstrated by conducting an error analysis based on in-situ observations from meteorological stations of the National Weather Service (NWS) for wind speed and wind direction, and on NCEP Stage IV radar data, mosaicked from the regional multi-sensor precipitation estimates. The preliminary results indicate a significant improvement in the statistical metrics of the modeled-observed pairs for meteorological variables using various combinations of the sixteen events as predictors of the seventeenth. This presentation will illustrate the implemented methodology and the obtained results for wind speed, wind direction, and precipitation, as well as set out the research steps that will be followed in the future.
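A least-squares stand-in for the combination idea: learn weights for the two model forecasts by minimizing the error against observations on the training storms, then apply them to the held-out storm. The data are synthetic, and the actual approach is Bayesian with optimal variances per variable, which this sketch does not reproduce.

```python
# Least-squares blending of two model forecasts (sketch with synthetic data).
import numpy as np

rng = np.random.default_rng(7)
truth = 10 + 5 * rng.random(17)                  # observed wind speed, 17 storms
wrf = truth + rng.normal(1.0, 1.5, 17)           # model A: biased high
rams = truth + rng.normal(-0.5, 2.0, 17)         # model B: noisier

A = np.c_[wrf[:16], rams[:16], np.ones(16)]      # sixteen storms as predictors
w, *_ = np.linalg.lstsq(A, truth[:16], rcond=None)

blend = w[0] * wrf[16] + w[1] * rams[16] + w[2]  # out-of-sample seventeenth storm
print("weights:", np.round(w, 2))
print("storm 17: obs %.1f  WRF %.1f  RAMS %.1f  blend %.1f"
      % (truth[16], wrf[16], rams[16], blend))
```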
Integrated Safety Risk Reduction Approach to Enhancing Human-Rated Spaceflight Safety
NASA Astrophysics Data System (ADS)
Mikula, J. F. Kip
2005-12-01
This paper explores and defines the currently accepted concept and philosophy of safety improvement based on reliability enhancement (called here Reliability Enhancement Based Safety Theory [REBST]). In this theory a reliability calculation is used as a measure of the safety achieved on the program. This calculation may be based on a math model or a Fault Tree Analysis (FTA) of the system, or on an Event Tree Analysis (ETA) of the system's operational mission sequence. In each case, the numbers used in this calculation are hardware failure rates gleaned from past similar programs. As part of this paper, a fictional but representative case study is provided that helps to illustrate the problems and inaccuracies of this approach to safety determination. Then a safety determination and enhancement approach based on hazard analysis, worst-case analysis, and safety risk determination (called here Worst Case Based Safety Theory [WCBST]) is included. This approach is defined and detailed using the same example case study as the REBST discussion. In the end it is concluded that an approach combining the two theories works best to reduce safety risk.
Parental and Infant Gender Factors in Parent-Infant Interaction: State-Space Dynamic Analysis.
Cerezo, M Angeles; Sierra-García, Purificación; Pons-Salvador, Gemma; Trenado, Rosa M
2017-01-01
This study aimed to investigate the influence of parental gender on interaction with infants, considering as well the role of the infant's gender. The State Space Grid (SSG) method, a graphical tool based on the non-linear dynamic systems (NDS) approach, was used to analyze the interaction, in a free-play setting, of 52 infants aged 6 to 10 months, divided into two groups: half of the infants interacted with their fathers and half with their mothers. There were 50% boys in each group. MANOVA results showed no differential parenting of boys and girls. Additionally, mothers and fathers showed no differences in the Diversity of behavioral dyadic states nor in Predictability. However, differences associated with the parent's gender were found in that the paternal dyads were more "active" than the maternal dyads: they were faster in the rates per second of behavioral events and transitions or changes of state. In contrast, maternal dyads were more repetitive because, once they visited a certain dyadic state, they tended to be involved in more events. Results showed a significant discriminant function for the parental groups, fathers and mothers. Specifically, the content analyses carried out for the three NDS variables that previously showed differences between groups revealed particular dyadic behavioral states associated with the rate of Transitions and the Events per Visit ratio. Thus, the transitions involving 'in-out' of the 'Child Social Approach neutral - Sensitive Approach neutral' state and the repetitions of events in the dyadic state 'Child Play - Sensitive Approach neutral' distinguished fathers from mothers. The classification of dyads (with fathers and mothers) based on this discriminant function identified 73.10% (19/26) of the father-infant dyads and 88.5% (23/26) of the mother-infant dyads. The study of father-infant interaction using the SSG approach offers interesting possibilities because it characterizes and quantifies the actual moment-to-moment flow of parent-infant interactive dynamics. Our findings show how observational methods applied to natural contexts can offer new facets of father versus mother interactive behavior with infants that can inform further developments in this field.
Promoting the role of the personal narrative in teaching controversial socio-scientific issues
NASA Astrophysics Data System (ADS)
Levinson, Ralph
2008-09-01
Citizens participating in contemporary socio-scientific issues (SSI) need to draw on local knowledge and personal experience. If curricular developments in the teaching of controversial SSI are to reflect contemporary notions of citizenship, then the personal narrative is an indispensable instrument in bridging the gap between the local/personal and the emergent science. In the context of controversy, personal narratives help contending parties to see events in the light of those who do not share their views. A goal-oriented protagonist is the narrator in the personal narrative, which consists of three components (situation, event, reaction), the reaction being an evaluation of the event. Promoting personal narratives in science-based curricula is considered problematic given the dominant role of science's explanatory frameworks. An interdisciplinary approach is proposed based on McLaughlin's levels of disagreement.
Analysis of the geophysical data using a posteriori algorithms
NASA Astrophysics Data System (ADS)
Voskoboynikova, Gyulnara; Khairetdinov, Marat
2016-04-01
The monitoring, prediction, and prevention of extraordinary natural and technogenic events are among the priority problems of our time. These events include earthquakes, volcanic eruptions, the lunar-solar tides, landslides, falling celestial bodies, explosions of stockpiled ammunition, and the numerous quarry explosions in open-pit coal mines that provoke technogenic earthquakes. Monitoring is based on a number of successive stages, which include remote registration of the event responses and measurement of the main parameters, such as the arrival times of seismic waves or the original waveforms. At the final stage, the inverse problems associated with determining the geographic location and time of the registered event are solved. Therefore, improving the accuracy of parameter estimation from the original records under high noise is an important problem. As is known, the main measurement errors arise due to the influence of external noise, the difference between the real and model structures of the medium, imprecise timing at the event epicenter, and instrumental errors. Therefore, a posteriori algorithms that are more accurate than known algorithms are proposed and investigated. They are based on a combination of a discrete optimization method and a fractal approach for the joint detection and estimation of arrival times in quasi-periodic waveform sequences in problems of geophysical monitoring, with improved accuracy. Existing alternative approaches to solving these problems do not provide the required accuracy. The proposed algorithms are considered for the tasks of vibration sounding of the Earth during lunar and solar tides, and for the problem of monitoring the location of a borehole seismic source in commercial drilling.
Husak, Gregory J.; Michaelsen, Joel; Kyriakidis, P.; Verdin, James P.; Funk, Chris; Galu, Gideon
2011-01-01
Probabilistic forecasts are produced by a variety of outlets to help predict rainfall and other meteorological events for periods of one month or more. Such forecasts are expressed as probabilities of a rainfall event, e.g., being in the upper, middle, or lower third of the relevant distribution of rainfall in the region. The impact of these forecasts on the expectation for the event is not always clear or easily conveyed. This article proposes a technique based on Monte Carlo simulation for adjusting existing climatological statistical parameters to match forecast information, resulting in new parameters defining the probability of events for the forecast interval. The resulting parameters are shown to approximate the forecasts with reasonable accuracy. To show the value of the technique as an application for seasonal rainfall, it is used with the consensus forecast developed for the Greater Horn of Africa for the 2009 March-April-May season. An alternative, analytical approach is also proposed and discussed in comparison with the first, simulation-based technique.
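A sketch of the parameter-adjustment idea: shift a climatological gamma distribution until its tercile probabilities match a consensus forecast, here an assumed 40/35/25 split for below/near/above normal. The gamma family, the optimizer, and the target numbers are illustrative, and the article's Monte Carlo procedure is replaced here by direct optimization of the same matching objective.

```python
# Adjust gamma parameters so tercile probabilities match a forecast (sketch).
import numpy as np
from scipy import stats, optimize

clim = stats.gamma(a=4.0, scale=50.0)          # climatological rainfall (mm)
t1, t2 = clim.ppf([1/3, 2/3])                  # climatological tercile bounds
target = np.array([0.40, 0.35, 0.25])          # forecast tercile probabilities

def mismatch(params):
    a, scale = np.exp(params)                  # keep parameters positive
    g = stats.gamma(a=a, scale=scale)
    probs = np.diff([0.0, g.cdf(t1), g.cdf(t2), 1.0])
    return np.sum((probs - target) ** 2)

res = optimize.minimize(mismatch, np.log([4.0, 50.0]), method="Nelder-Mead")
a_new, scale_new = np.exp(res.x)
g = stats.gamma(a=a_new, scale=scale_new)
print("adjusted shape %.2f, scale %.1f" % (a_new, scale_new))
print("tercile probs:", np.round(np.diff([0, g.cdf(t1), g.cdf(t2), 1]), 3))
```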
Xu, Wenying; Wang, Zidong; Ho, Daniel W C
2018-05-01
This paper is concerned with the finite-horizon consensus problem for a class of discrete time-varying multiagent systems with external disturbances and missing measurements. To improve the communication reliability, redundant channels are introduced and the corresponding protocol is constructed for the information transmission over redundant channels. An event-triggered scheme is adopted to determine whether the information of agents should be transmitted to their neighbors. Subsequently, an observer-type event-triggered control protocol is proposed based on the latest received neighbors' information. The purpose of the addressed problem is to design a time-varying controller based on the observed information to achieve the consensus performance in a finite horizon. By utilizing a constrained recursive Riccati difference equation approach, some sufficient conditions are obtained to guarantee the consensus performance, and the controller parameters are also designed. Finally, a numerical example is provided to demonstrate the desired reliability of redundant channels and the effectiveness of the event-triggered control protocol.
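The event-triggered scheme can be sketched in its simplest form: each agent rebroadcasts its state only when it drifts from the last transmitted value by more than a threshold, and the consensus update runs on the broadcast values. The static graph, fixed threshold, and time-invariant gain below are simplifications; the paper's observer-based, time-varying, finite-horizon design with redundant channels is not reproduced.

```python
# Minimal event-triggered consensus simulation (sketch, not the paper's design).
import numpy as np

rng = np.random.default_rng(8)
A = np.array([[0, 1, 0, 1],                    # ring communication graph
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], float)
x = rng.normal(0, 5, 4)                        # initial agent states
x_hat = x.copy()                               # last transmitted values
eps, gain, dt = 0.1, 0.3, 1.0
transmissions = 0

for k in range(200):
    trig = np.abs(x - x_hat) > eps             # event-triggering condition
    x_hat[trig] = x[trig]
    transmissions += int(trig.sum())
    u = gain * (A @ x_hat - A.sum(1) * x_hat)  # consensus on broadcast states
    x = x + dt * u

print("final states:", np.round(x, 2))
print("transmissions: %d (vs %d for periodic)" % (transmissions, 200 * 4))
```

The point of the trigger is visible in the transmission count: once states settle within the eps bands, communication stops entirely.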
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory, except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory in a variety of ways. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
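One way to state the generalized additivity mentioned above (an assumption about how the definition reads; the article's formal statement may differ) is as a modular law over the event lattice:

```latex
% Generalized additivity over the event lattice (assumed reading):
% for all events A and B, not only disjoint ones,
P(A \vee B) = P(A) + P(B) - P(A \wedge B),
% which reduces to the classical rule P(A \cup B) = P(A) + P(B)
% whenever A \wedge B is the zero element of the lattice.
```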
Automatic identification of alpine mass movements based on seismic and infrasound signals
NASA Astrophysics Data System (ADS)
Schimmel, Andreas; Hübl, Johannes
2017-04-01
The automatic detection and identification of alpine mass movements such as debris flows, debris floods, or landslides is of increasing importance for mitigation measures in the densely populated and intensively used alpine regions. Since these mass movement processes emit characteristic seismic and acoustic waves in the low frequency range, the events can be detected and identified based on these signals. Several approaches for detection and warning systems based on seismic or infrasound signals have already been developed. However, a combination of both methods, which can increase detection probability and reduce false alarms, is currently used very rarely and is a promising basis for an automatic detection and identification system. This work therefore presents an approach for a detection and identification system based on a combination of seismic and infrasound sensors, which can detect sediment-related mass movements from a remote location unaffected by the process. The system is based on one infrasound sensor and one geophone, placed co-located, and a microcontroller on which a specially designed detection algorithm is executed that can detect mass movements in real time directly at the sensor site. Further, this work tries to extract more information from the seismic and infrasound spectra produced by different sediment-related mass movements in order to identify the process type and estimate the magnitude of the event. The system is currently installed and tested at five test sites in Austria, two in Italy, one in Switzerland, and one in Germany. This high number of test sites is used to build a large database of very different events, which will be the basis for a new identification method for alpine mass movements. These tests show promising results, and the system thus provides an easy-to-install and inexpensive approach to a detection and warning system.
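A sketch of the sensor-fusion trigger: compute an STA/LTA ratio on the seismic and the infrasound channels and declare an event only when both exceed their thresholds together, which is the false-alarm-reduction idea behind combining the two sensor types. Window lengths, thresholds, and the synthetic signals are illustrative, not the deployed algorithm.

```python
# Joint seismic/infrasound STA/LTA trigger on synthetic data (sketch).
import numpy as np

rng = np.random.default_rng(9)
fs, n = 100, 6000                              # 100 Hz, 60 s
seis = rng.normal(0, 1, n)
infra = rng.normal(0, 1, n)
seis[3000:3500] += 6 * rng.normal(0, 1, 500)   # synthetic debris-flow burst
infra[3050:3550] += 5 * rng.normal(0, 1, 500)

def sta_lta(x, sta_n, lta_n):
    """STA over the last sta_n samples divided by LTA over the preceding lta_n."""
    e = np.concatenate([[0.0], np.cumsum(x ** 2)])
    r = np.zeros(x.size)
    for i in range(lta_n + sta_n, x.size):
        sta = (e[i] - e[i - sta_n]) / sta_n
        lta = (e[i - sta_n] - e[i - sta_n - lta_n]) / lta_n
        r[i] = sta / lta
    return r

rs = sta_lta(seis, int(0.5 * fs), int(10 * fs))
ri = sta_lta(infra, int(0.5 * fs), int(10 * fs))
joint = (rs > 3.0) & (ri > 3.0)                # both sensors must agree
if joint.any():
    print("event declared at t = %.1f s" % (np.argmax(joint) / fs))
```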
Sundvall, Erik; Nyström, Mikael; Forss, Mattias; Chen, Rong; Petersson, Håkan; Ahlfeldt, Hans
2007-01-01
This paper describes selected earlier approaches to graphically relating events to each other and to time; some new combinations are also suggested. These are then combined into a unified prototyping environment for visualization and navigation of electronic health records. Google Earth (GE) is used for handling display and interaction of clinical information stored using openEHR data structures and 'archetypes'. The strength of the approach comes from GE's sophisticated handling of detail levels, from coarse overviews to fine-grained details, which has been combined with linear, polar and region-based views of clinical events related to time. The system should be easy to learn since all the visualization styles can use the same navigation. The structured and multifaceted approach to handling time that is possible with archetyped openEHR data lends itself well to visualization, and integration with openEHR components is provided in the environment.
NASA Astrophysics Data System (ADS)
Kumar, Rohit; Puri, Rajeev K.
2018-03-01
Employing the quantum molecular dynamics (QMD) approach for nucleus-nucleus collisions, we test the predictive power of the energy-based clusterization algorithm, i.e., the simulated annealing clusterization algorithm (SACA), to describe the experimental data of charge distribution and various event-by-event correlations among fragments. The calculations are constrained to the Fermi-energy domain and/or mildly excited nuclear matter. Our detailed study spans different system masses and system-mass asymmetries of colliding partners, and shows the importance of the energy-based clusterization algorithm for understanding multifragmentation. The present calculations are also compared with other available calculations that use one-body models, statistical models, and/or hybrid models.
A Patch-Based Method for Repetitive and Transient Event Detection in Fluorescence Imaging
Boulanger, Jérôme; Gidon, Alexandre; Kervran, Charles; Salamero, Jean
2010-01-01
Automatic detection and characterization of molecular behavior in large data sets obtained by fast imaging in advanced light microscopy has become a key issue for deciphering the dynamic architectures and their coordination in the living cell. Automatic quantification of the number of sudden and transient events observed in fluorescence microscopy is discussed in this paper. We propose a calibrated method based on the comparison of image patches, designed to distinguish suddenly appearing/vanishing fluorescent spots from other motion behaviors such as lateral movements. We analyze the performance of two statistical control procedures and compare the proposed approach to a frame-difference approach using the same controls on a benchmark of synthetic image sequences. We then selected a molecular model related to membrane trafficking and considered real image sequences obtained in cells stably expressing an endocytic-recycling trans-membrane protein, the Langerin-YFP, for validation. With this model, we targeted the efficient detection of fast and transient local fluorescence concentrations arising in image sequences from a database provided by two different microscopy modalities: wide field (WF) video microscopy using maximum intensity projection along the axial direction, and total internal reflection fluorescence microscopy. Finally, the proposed detection method is briefly used to statistically explore the effect of several perturbations on the rate of transient events detected in the pilot biological model. PMID:20976222
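The patch-comparison idea lends itself to a compact sketch. The following is not the authors' calibrated procedure; patch size, search radius, and the k-sigma control are placeholder choices that illustrate how appearance/vanishing is separated from lateral motion:

```python
import numpy as np

def best_match_dist(ref, frame, i, j, patch=7, search=4):
    """Smallest mean absolute difference between the reference patch and
    any patch of `frame` within +/- search pixels: a laterally moving
    spot finds a good shifted match, a suddenly appearing one does not."""
    h, w = frame.shape
    best = np.inf
    for di in range(-search, search + 1):
        for dj in range(-search, search + 1):
            a, b = i + di, j + dj
            if 0 <= a <= h - patch and 0 <= b <= w - patch:
                d = np.abs(ref - frame[a:a + patch, b:b + patch]).mean()
                best = min(best, d)
    return best

def transient_detections(stack, t, patch=7, step=7, alpha=3.0):
    """Patches of frame t lacking a counterpart in the previous or next
    frame (appearance/vanishing), with a crude k-sigma control."""
    prev_f, next_f, cur = stack[t - 1], stack[t + 1], stack[t]
    pos, scores = [], []
    for i in range(0, cur.shape[0] - patch + 1, step):
        for j in range(0, cur.shape[1] - patch + 1, step):
            ref = cur[i:i + patch, j:j + patch]
            s = max(best_match_dist(ref, prev_f, i, j, patch),
                    best_match_dist(ref, next_f, i, j, patch))
            pos.append((i, j))
            scores.append(s)
    scores = np.asarray(scores)
    thr = scores.mean() + alpha * scores.std()
    return [p for p, s in zip(pos, scores) if s > thr]
```

A spot that merely translates matches a shifted patch in both neighboring frames and scores low; a sudden appearance mismatches the past, a sudden vanishing mismatches the future.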
Improving the Emergency Manager’s Hurricane Evacuation Decision Making Through Serious Gaming
2016-06-17
Hayley J. Davison Reynolds, Maxwell H. Perlman, Darren P. Wilson (MIT Lincoln Laboratory; DHS Science and Technology Directorate) ... transfer it to an actual evacuation event. Through this work, a web-based 'serious gaming' approach was used to develop hurricane evacuation decision ... training for the emergency manager. This paper describes the iterative design approach to developing a training game and collecting initial feedback.
1989-12-29
1.1.2. General Performance Criteria for Gamma Ray Spectrometers; 1.1.3. Special Criteria for Space-Based Spectrometer Systems; 1.1.4. Prior Approaches ... calculations were performed for selected incident gamma ray energies and were used to generate tabular and graphical listings of gamma scattering results. The ... generated. These output presentations were studied to identify behavior patterns of "good" and "bad" event sequences. For the specific gamma energy ...
A Blind Segmentation Approach to Acoustic Event Detection Based on I Vector
2013-08-25
Hui Lee (School of ECE, Georgia Institute of Technology, Atlanta, GA 30332-0250, USA; School of Computing, University of Eastern Finland, Finland) ... recordings obtained in low signal-to-noise-ratio (SNR) environments with highly-mixed events in a single acoustic segment. Research in AED [1] is ... 2532-2535. [28] C.-C. Chang and C.-J. Lin, "LIBSVM: A library for support vector machines," ACM Transactions on Intelligent Systems and Technology.
Remembered or Forgotten?—An EEG-Based Computational Prediction Approach
Sun, Xuyun; Qian, Cunle; Chen, Zhongqin; Wu, Zhaohui; Luo, Benyan; Pan, Gang
2016-01-01
Prediction of memory performance (remembered or forgotten) has various potential applications, not only for knowledge learning but also for disease diagnosis. Recently, subsequent memory effects (SMEs)—the statistical differences in electroencephalography (EEG) signals before or during learning between subsequently remembered and forgotten events—have been found. This finding indicates that EEG signals convey information relevant to memory performance. In this paper, based on SMEs, we propose a computational approach to predict the memory performance of an event from EEG signals. We devise a convolutional neural network for EEG, called ConvEEGNN, to predict subsequently remembered and forgotten events from EEG recorded during the memory process. With ConvEEGNN, prediction of memory performance can be achieved by integrating two main stages: feature extraction and classification. To verify the proposed approach, we employ an auditory memory task to collect EEG signals from scalp electrodes. For ConvEEGNN, the average prediction accuracy was 72.07% using EEG data from the pre-stimulus and during-stimulus periods, outperforming other approaches. It was observed that signals from the pre-stimulus period and those from the during-stimulus period had comparable contributions to memory performance. Furthermore, the connection weights of the ConvEEGNN network can reveal prominent channels, which are consistent with the distribution of SMEs studied previously. PMID:27973531
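The abstract does not specify the ConvEEGNN architecture, so the following PyTorch sketch only illustrates the two-stage idea (convolutional feature extraction followed by classification); all layer sizes, channel counts, and the 32-channel/512-sample input shape are assumptions:

```python
import torch
import torch.nn as nn

class ConvEEGNetSketch(nn.Module):
    """Hypothetical ConvEEGNN-style remembered-vs-forgotten classifier."""
    def __init__(self, n_channels=32, n_samples=512, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=9, padding=4),  # temporal filters
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (n_samples // 16), n_classes)

    def forward(self, x):  # x: (batch, channels, samples)
        return self.classifier(self.features(x).flatten(1))

model = ConvEEGNetSketch()
logits = model(torch.randn(8, 32, 512))  # one batch of pre/during-stimulus EEG
print(logits.shape)                      # torch.Size([8, 2])
```

Inspecting the learned first-layer weights per input channel is one simple way to recover the kind of channel prominence the authors report.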
Trinh, T; Ishida, K; Kavvas, M L; Ercan, A; Carr, K
2017-05-15
Along with socioeconomic developments and population increase, natural disasters around the world have recently increased awareness of the harmful impacts they cause. Among natural disasters, drought is of great interest to scientists due to the extraordinary diversity of its severity and duration. Motivated by the development of a potential approach to investigating possible future droughts in a probabilistic framework based on climate change projections, a methodology that considers thirteen future climate projections based on four emission scenarios to characterize droughts is presented. The proposed approach uses a regional climate model coupled with a physically-based hydrology model (Watershed Environmental Hydrology Hydro-Climate Model; WEHY-HCM) to generate thirteen equally likely future water supply projections. The water supply projections were compared to the current water demand for the detection of drought events and the estimation of drought properties. The procedure was applied to the Shasta Dam watershed to analyze drought conditions at the watershed outlet, Shasta Dam. The results suggest increasing water scarcity at Shasta Dam, with more severe and longer future drought events in some future scenarios. An important advantage of the proposed approach to the probabilistic analysis of future droughts is that it provides the drought properties of the 100-year and 200-year return periods without resorting to any extrapolation of the frequency curve. Copyright © 2017 Elsevier B.V. All rights reserved.
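The supply-versus-demand detection step can be expressed with run theory: a drought event is a maximal run of time steps in which projected supply falls below demand, with duration and cumulative deficit as its properties. A minimal sketch (variable names and the monthly toy data are illustrative, not from the paper):

```python
import numpy as np

def drought_events(supply, demand):
    """Maximal runs with supply < demand; severity = cumulative deficit."""
    deficit = np.maximum(np.asarray(demand, float) - np.asarray(supply, float), 0.0)
    events, start = [], None
    for t, in_drought in enumerate(deficit > 0):
        if in_drought and start is None:
            start = t
        elif not in_drought and start is not None:
            events.append({"start": start, "duration": t - start,
                           "severity": float(deficit[start:t].sum())})
            start = None
    if start is not None:
        events.append({"start": start, "duration": len(deficit) - start,
                       "severity": float(deficit[start:].sum())})
    return events

# one synthetic supply projection against a constant demand
rng = np.random.default_rng(1)
supply = 100.0 + 30.0 * rng.standard_normal(120)   # e.g. monthly supply
print(drought_events(supply, demand=np.full(120, 95.0))[:3])
```

Running this over each of the thirteen projections yields an empirical distribution of event duration and severity from which return-period properties can be read off directly, without extrapolating a frequency curve.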
Explosive Yield Estimation using Fourier Amplitude Spectra of Velocity Histories
NASA Astrophysics Data System (ADS)
Steedman, D. W.; Bradley, C. R.
2016-12-01
The Source Physics Experiment (SPE) is a series of explosive shots of various sizes detonated at varying depths in a borehole in jointed granite. The testbed includes an extensive array of accelerometers for measuring the shock environment close to the explosive source. One goal of SPE is to develop a greater understanding of explosion phenomenology in all regimes, from near-source, non-linear response to the far-field linear elastic region, and to connect the analyses from the respective regimes. For example, near-field analysis typically involves review of kinematic response (i.e., acceleration, velocity and displacement) in the time domain and looks at various indicators (e.g., peaks, pulse duration) to facilitate comparison among events. Review of far-field data is more often based on study of the response in the frequency domain to facilitate comparison of event magnitudes. To "bridge the gap" between the approaches, we have developed a scaling law for Fourier amplitude spectra of near-field velocity histories that successfully collapses data from a wide range of yields (100 kg to 5000 kg) and ranges to sensors in jointed granite. Moreover, we show that we can apply this scaling law to data from a new event to accurately estimate the explosive yield of that event. This approach presents a new way of working with near-field data that is more compatible with traditional methods of analysis of seismic data and should facilitate end-to-end event analysis. The goal is that this new approach to data analysis will eventually result in improved methods for discriminating event type (i.e., nuclear or chemical explosion, or earthquake) and magnitude.
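The abstract does not give the scaling law itself; as a hedged stand-in, the sketch below uses classic cube-root similitude (frequencies scale with the cube root of yield W) and estimates yield by finding the trial W that best collapses a new spectrum onto a reference curve. All function names and the least-squares matching scheme are illustrative:

```python
import numpy as np

def velocity_spectrum(v, fs):
    """One-sided Fourier amplitude spectrum of a velocity history."""
    return np.fft.rfftfreq(len(v), d=1.0 / fs), np.abs(np.fft.rfft(v)) / fs

def cube_root_collapse(freq, amp, yield_kg, ref_yield_kg=1000.0):
    """Map a spectrum to the reference-yield axis via cube-root similitude."""
    s = (yield_kg / ref_yield_kg) ** (1.0 / 3.0)
    return freq * s, amp / s

def estimate_yield(freq, amp, ref_freq, ref_amp, trial_yields):
    """Trial yield whose scaled spectrum best matches the reference curve
    (least squares on a common log-frequency grid)."""
    grid = np.geomspace(ref_freq[1], ref_freq[-1], 64)
    ref = np.log(np.interp(grid, ref_freq, ref_amp) + 1e-20)
    errs = []
    for w in trial_yields:
        f_s, a_s = cube_root_collapse(freq, amp, w)
        errs.append(np.sum((np.log(np.interp(grid, f_s, a_s) + 1e-20) - ref) ** 2))
    return trial_yields[int(np.argmin(errs))]
```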
NASA Astrophysics Data System (ADS)
Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.
2009-04-01
When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedented erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focused on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes: an important share of the total soil eroded is produced in very short temporal intervals, i.e. in a few events mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive continuing attention, heightened at present by scenarios of global change. Notwithstanding, the time compression of geomorphological processes can be studied not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by their rank, whatever their magnitude. In previous research on the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded per plot and the percentage contribution to total soil erosion of the n largest aggregated daily events. Here we offer a further refined analysis comparing different agricultural regions in the USA. To do so we have analyzed data from 594 erosion plots from the USLE database with different record periods, located in different climatic regions. Results indicate that there are no significant differences in the mean contribution of the aggregated 5 largest daily erosion events between different agricultural divisions (i.e. different regional climates), and the differences detected can be attributed to specific site and plot conditions. The expected contribution of the 5 largest daily events per 100 recorded daily events is estimated at around 40% of total soil erosion. We discuss the possible causes of these results and their applicability to the design of field research on soil erosion plots.
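The rank-based measure is simple to compute per plot. A minimal sketch (the heavy-tailed toy data is illustrative; the ~40% figure comes from the 594-plot analysis, not from this code):

```python
import numpy as np

def largest_event_contribution(event_erosion, n=5):
    """Percent of total soil erosion carried by the n largest daily
    events, defined by rank regardless of magnitude."""
    e = np.sort(np.asarray(event_erosion, dtype=float))[::-1]
    return 100.0 * e[:n].sum() / e.sum()

rng = np.random.default_rng(42)
events = rng.pareto(1.5, size=100) + 0.01   # 100 daily erosive events, heavy-tailed
print(f"5 largest events contribute {largest_event_contribution(events):.1f}% of total")
```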
Unraveling multiple changes in complex climate time series using Bayesian inference
NASA Astrophysics Data System (ADS)
Berner, Nadine; Trauth, Martin H.; Holschneider, Matthias
2016-04-01
Change points in time series are perceived as heterogeneities in the statistical or dynamical characteristics of the observations. Unraveling such transitions yields essential information for the understanding of the observed system. The precise detection and basic characterization of underlying changes is therefore of particular importance in the environmental sciences. We present a kernel-based Bayesian inference approach to investigate direct as well as indirect climate observations for multiple generic transition events. In order to develop a diagnostic approach designed to capture a variety of natural processes, the basic statistical features of central tendency and dispersion are used to locally approximate a complex time series by a generic transition model. A Bayesian inversion approach is developed to robustly infer the location and the generic patterns of such a transition. To systematically investigate time series for multiple changes occurring at different temporal scales, the Bayesian inversion is extended to a kernel-based inference approach. By introducing basic kernel measures, the kernel inference results are composed into a proxy for the posterior distribution of multiple transitions. Thus, based on a generic transition model, a probability expression is derived that is capable of indicating multiple changes within a complex time series. We discuss the method's performance by investigating direct and indirect climate observations. The approach is applied to an environmental time series (about 100 a) from the weather station in Tuscaloosa, Alabama, and confirms documented instrumentation changes. Moreover, the approach is used to investigate a set of complex terrigenous dust records from ODP sites 659, 721/722 and 967, interpreted as climate indicators of the African region of the Plio-Pleistocene period (about 5 Ma). The detailed inference unravels multiple transitions underlying the indirect climate observations, coinciding with established global climate events.
Play it forward! A community-based participatory research approach to childhood obesity prevention.
Berge, Jerica M; Jin, Seok Won; Hanson, Carrie; Doty, Jennifer; Jagaraj, Kimberly; Braaten, Kent; Doherty, William J
2016-03-01
To date there has been limited success with childhood obesity prevention interventions. This may be due, in part, to the challenge of reaching and engaging parents in interventions. The current study used a community-based participatory research (CBPR) approach to engage parents in cocreating and pilot testing a childhood obesity prevention intervention. Because CBPR approaches to childhood obesity prevention are new, this study aims to detail the creation, including the formation of the citizen action group (CAG), and implementation of a childhood obesity prevention intervention using CBPR methods. A CBPR approach was used to recruit community members to partner with university researchers in the CAG (n = 12) to create and implement the Play It Forward! childhood obesity intervention. The intervention creation and implementation took 2 years. During Year 1 (2011-2012), the CAG carried out a community needs and resources assessment and designed a community-based and family-focused childhood obesity prevention intervention. During Year 2 (2012-2013), the CAG implemented the intervention and conducted an evaluation. Families (n = 50; 25 experimental/25 control group) with children ages 6-12 years participated in Play It Forward! Feasibility and process evaluation data suggested that the intervention was highly feasible and that participants in both the CAG and the intervention were highly satisfied. Specifically, over half of the families attended 75% of the Play It Forward! events and 33% of families attended all the events. Equal collaboration between parents and academic researchers to address childhood obesity may be a promising approach that merits further testing. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Approaches to the Surveillance of Foodborne Disease: A Review of the Evidence.
Ford, Laura; Miller, Megge; Cawthorne, Amy; Fearnley, Emily; Kirk, Martyn
2015-12-01
Foodborne disease surveillance aims to reduce the burden of illness due to contaminated food. There are several different types of surveillance systems, including event-based surveillance, indicator-based surveillance, and integrated food chain surveillance. These approaches are not mutually exclusive, have overlapping data sources, require distinct capacities and resources, and can be considered a hierarchy, with each level being more complex and resulting in a greater ability to detect and control foodborne disease. Event-based surveillance is generally the least resource-intensive system and makes use of informal data sources. Indicator-based surveillance is seen as traditional notifiable disease surveillance and consists of routinely collected data. Integrated food chain surveillance is viewed as the optimal practice for conducting continuous risk analysis for foodborne diseases, but also requires significant ongoing resources and greater multisectoral collaboration compared to the other systems. Each country must determine the most appropriate structure for their surveillance system for foodborne diseases based on their available resources. This review explores the evidence on the principles, minimum capabilities, and minimum requirements of each type of surveillance and discusses examples from a range of countries. This review forms the evidence base for the Strengthening the Surveillance and Response for Foodborne Diseases: A Practical Manual.
Comment on "drug discovery: turning the titanic".
Lesterhuis, W Joost; Bosco, Anthony; Lake, Richard A
2014-03-26
The pathobiology-based approach to research and development has been the dominant paradigm for successful drug discovery over the last decades. We propose that the molecular and cellular events that govern a resolving, rather than an evolving, disease may reveal new druggable pathways.
Locally Based Kernel PLS Regression De-noising with Application to Event-Related Potentials
NASA Technical Reports Server (NTRS)
Rosipal, Roman; Trejo, Leonard J.; Wheeler, Kevin; Tino, Peter
2002-01-01
Our approach exploits the close relation between signal de-noising and regression problems, which deal with estimating functions that reflect the dependency between a set of inputs and dependent outputs corrupted by some level of noise.
Picking vs Waveform-based detection and location methods for induced seismicity monitoring
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2017-04-01
Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas operations, mining, or geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by very many events with short inter-event times, which can even produce overlapping seismic signatures. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison, in terms of performance, between the waveform-based method LOKI and the pick-based detection and location methods (SCAUTOLOC and SCANLOC) implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at the local scale. Although recent applications have proved promising, an extensive test with induced seismicity datasets has not yet been performed. The method is based on a cluster search algorithm that associates detections with one or many potential earthquake sources. SCAUTOLOC, on the other hand, is a more "conventional" method and the basic tool for seismic event detection and location in SeisComP3. This approach was specifically designed for regional and teleseismic applications, so its performance with microseismic data may be limited. We analyze the performance of the three methodologies on a synthetic dataset with realistic noise conditions as well as on the first hour of continuous waveform data, including the Ml 3.5 St. Gallen earthquake, recorded by a microseismic network deployed in the area. We finally compare the results obtained with all three methods against a manually revised catalogue.
Solar Energetic Particle Forecasting Algorithms and Associated False Alarms
NASA Astrophysics Data System (ADS)
Swalwell, B.; Dalla, S.; Walsh, R. W.
2017-11-01
Solar energetic particle (SEP) events are known to occur following solar flares and coronal mass ejections (CMEs). However, some high-energy solar events do not result in SEPs being detected at Earth, and it is these types of event which may be termed "false alarms". We define two simple SEP forecasting algorithms based upon the occurrence of a magnetically well-connected CME with a speed in excess of 1500 km/s (a "fast" CME) or a well-connected X-class flare and analyse them with respect to historical datasets. We compare the parameters of those solar events which produced an enhancement of > 40 MeV protons at Earth (an "SEP event") and the parameters of false alarms. We find that an SEP forecasting algorithm based solely upon the occurrence of a well-connected fast CME produces fewer false alarms (28.8%) than an algorithm which is based solely upon a well-connected X-class flare (50.6%). Both algorithms fail to forecast a relatively high percentage of SEP events (53.2% and 50.6%, respectively). Our analysis of the historical datasets shows that false-alarm X-class flares were either not associated with any CME, or were associated with a CME slower than 500 km/s; false-alarm fast CMEs tended to be associated with flare classes lower than M3. A better approach to forecasting would be an algorithm which takes as its base the occurrence of both CMEs and flares. We define a new forecasting algorithm which uses a combination of CME and flare parameters, and we show that the false-alarm ratio is similar to that for the algorithm based upon fast CMEs (29.6%), but the percentage of SEP events not forecast is reduced to 32.4%. Lists of the solar events which gave rise to > 40 MeV protons and the false alarms have been derived and are made available to aid further study.
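The combined algorithm can be written down directly from the quantities in the abstract. The thresholds below for the combined scheme (a flare of at least M3 alongside a fast CME; an X-class flare with a CME of at least 500 km/s) are motivated by the quoted false-alarm properties but are illustrative, since the exact combined criteria are given in the article, not in this summary:

```python
def forecast_sep(flare_class, flare_mag, cme_speed_km_s, well_connected):
    """Alarm for a > 40 MeV proton enhancement at Earth (sketch)."""
    if not well_connected:
        return False
    fast_cme = cme_speed_km_s is not None and cme_speed_km_s >= 1500.0
    x_flare = flare_class == "X"
    # false-alarm fast CMEs tended to have flares below M3;
    # false-alarm X flares had no CME, or one slower than 500 km/s
    flare_ge_m3 = x_flare or (flare_class == "M" and flare_mag >= 3.0)
    cme_ge_500 = cme_speed_km_s is not None and cme_speed_km_s >= 500.0
    return (fast_cme and flare_ge_m3) or (x_flare and cme_ge_500)

def false_alarm_ratio(alarms, sep_observed):
    """Fraction of raised alarms not followed by an SEP event."""
    fired = [obs for alarm, obs in zip(alarms, sep_observed) if alarm]
    return sum(1 for obs in fired if not obs) / max(len(fired), 1)
```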
From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model
NASA Astrophysics Data System (ADS)
Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.
2014-12-01
European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand the potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new, innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. Knowledge of the evolution of winter storms in both time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.
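Once the probabilistic event set exists, the loss step reduces to combining event rates, hazard footprints, vulnerability, and exposure. A minimal sketch of that final computation (the data structures, the linear vulnerability curve, and all numbers are invented for illustration):

```python
import numpy as np

def annual_expected_loss(event_set, vulnerability, exposure):
    """event_set: (annual_rate, {site: wind intensity}) pairs;
    vulnerability: intensity -> mean damage ratio in [0, 1];
    exposure: {site: insured value}."""
    return sum(rate * sum(vulnerability(inten) * exposure[site]
                          for site, inten in footprint.items() if site in exposure)
               for rate, footprint in event_set)

vul = lambda v: float(np.clip((v - 20.0) / 40.0, 0.0, 1.0))  # damage starts at 20 m/s
events = [(0.02, {"A": 35.0, "B": 28.0}),    # two simulated storms
          (0.005, {"A": 35.0, "B": 35.0})]
print(annual_expected_loss(events, vul, {"A": 1e6, "B": 2e6}))  # -> 21125.0
```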
Using Geo-Data Corporately on the Response Phase of Emergency Management
NASA Astrophysics Data System (ADS)
Demir Ozbek, E.; Ates, S.; Aydinoglu, A. C.
2015-08-01
The response phase of emergency management is the most complex phase in the entire cycle because it requires cooperation between the various actors in the emergency sectors. A variety of geo-data is needed during emergency response, such as existing data provided by different institutions and dynamic data collected by different sectors at the time of the disaster. A disaster event is managed according to an elaborately defined activity-actor-task-geodata cycle. In this concept, every activity of emergency response is specified with a Standard Operation Procedure that enables users to understand their tasks and the required data in any activity. In this study, a general conceptual approach for a disaster and emergency management system is developed based on the regulations, to serve applications in the Istanbul Governorship Provincial Disaster and Emergency Directorate. The approach is illustrated with an industrial facility explosion example. In the preparation phase, optimum ambulance locations are determined according to the general response time of ambulances to all injury cases, in addition to areas that have industrial fire risk. The management of the industrial fire case is organized according to the defined actors, activities, and working cycle that specify the required geo-data. A response scenario was prepared and performed for an industrial facility explosion event to exercise an effective working cycle of actors. This scenario provides for using geo-data corporately between different actors, while the data required for each task is defined to manage the industrial facility explosion event. Following developments in web technologies, this scenario-based approach can be effective for using geo-data corporately on the web.
NASA Astrophysics Data System (ADS)
Dukas, Georg
Though research in emerging technologies is vital to fulfilling their incredible potential for educational applications, it is often fraught with analytic challenges related to large datasets. This thesis explores these challenges in researching multiuser virtual environments (MUVEs). In a MUVE, users assume a persona and traverse a virtual space often depicted as a physical world, interacting with other users and digital artifacts. As students participate in MUVE-based curricula, detailed records of their paths through the virtual world are typically collected in event logs. Although many studies have demonstrated the instructional power of MUVEs (e.g., Barab, Hay, Barnett, & Squire, 2001; Ketelhut, Dede, Clarke, Nelson, & Bowman, 2008), none have successfully quantified these student paths for analysis in the aggregate. This thesis constructs several frameworks for conducting research involving student navigational choices in MUVEs, based on a case study of data generated from the River City project. After providing a context for the research and an introduction to the River City dataset, the first part of this thesis explores the issues associated with data compression and presents a grounded theory approach (Glaser & Strauss, 1967) to the cleaning, compacting, and coding of MUVE datasets. Closing this section, I discuss the implications of preparation choices for further analysis. Second, two conceptually different approaches to analyzing behavioral sequences are investigated. For each approach, a theoretical context, a description of possible exploratory and confirmatory methods, and illustrative examples from River City are provided. The thesis then situates these specific analytic approaches within the constellation of possible research utilizing MUVE event log data. Finally, based on the lessons of River City and the investigation of a spectrum of possible event logs, a set of design heuristics for data collection in MUVEs is constructed and a possible future for research in these environments is envisioned.
Event attribution using data assimilation in an intermediate complexity atmospheric model
NASA Astrophysics Data System (ADS)
Metref, Sammy; Hannart, Alexis; Ruiz, Juan; Carrassi, Alberto; Bocquet, Marc; Ghil, Michael
2016-04-01
A new approach, coined DADA (Data Assimilation for Detection and Attribution), has recently been introduced by Hannart et al. (2015) and is potentially useful for near-real-time, systematic causal attribution of weather and climate-related events. The method is purposely designed to allow its operability at meteorological centers by synergizing causal attribution with Data Assimilation (DA) methods usually designed to deal with large nonlinear models. In Hannart et al. (2015), the DADA proposal is illustrated in the context of a low-order nonlinear model (the forced three-variable Lorenz model) that is of course not realistic enough to represent the events considered. As a continuation of this stream of work, we therefore propose an implementation of the DADA approach in a realistic intermediate-complexity atmospheric model (the ICTP AGCM, nicknamed SPEEDY). The SPEEDY model is based on a spectral dynamical core developed at the Geophysical Fluid Dynamics Laboratory (see Held and Suarez 1994). It is a hydrostatic, σ-coordinate, spectral-transform model in the vorticity-divergence form described by Bourke (1974). A synthetic dataset of observations of an extreme precipitation event over Southeastern South America is extracted from a long SPEEDY simulation under present climatic conditions (i.e., factual conditions). Then, following the DADA approach, observations of this event are assimilated twice in the SPEEDY model: first in the factual configuration of the model, and second under its counterfactual, pre-industrial configuration. We show that attribution can be performed based on the likelihood ratio as in Hannart et al. (2015), but we further extend this result by showing that the likelihood can be split in space, time and variables in order to help identify the specific physical features of the event that bear the causal signature. References: Hannart A., A. Carrassi, M. Bocquet, M. Ghil, P. Naveau, M. Pulido, J. Ruiz, P. Tandeo (2015): DADA: Data assimilation for the detection and attribution of weather and climate-related events. Climatic Change (in press). Held I. M. and M. J. Suarez (1994): A proposal for the intercomparison of the dynamical cores of atmospheric general circulation models. Bull. Amer. Meteor. Soc., 75, 1825-1830. Bourke W. (1974): A multi-level spectral model. I. Formulation and hemispheric integrations. Mon. Wea. Rev., 102, 687-701.
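In attribution terms, the by-product of the two assimilation runs is an evidence (marginal likelihood) ratio. A hedged summary of the construction, with the decomposition that enables the space/time/variable splitting (the notation is ours and assumes conditionally independent observation components):

```latex
p_1(y) = p(y \mid \text{factual}), \qquad
p_0(y) = p(y \mid \text{counterfactual}), \qquad
\Lambda = \frac{p_1(y)}{p_0(y)}, \qquad
\mathrm{FAR} \approx 1 - \frac{p_0(y)}{p_1(y)}
```

Because the log-evidence factorizes as \log p(y) = \sum_k \log p(y_k \mid y_{1:k-1}) over observation components y_k, the contribution of individual regions, times, or variables to \Lambda can be isolated, which is what localizes the causal signature of the event.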
A generic multi-hazard and multi-risk framework and its application illustrated in a virtual city
NASA Astrophysics Data System (ADS)
Mignan, Arnaud; Euchner, Fabian; Wiemer, Stefan
2013-04-01
We present a generic framework to implement hazard correlations in multi-risk assessment strategies. We consider hazard interactions (process I), time-dependent vulnerability (process II) and time-dependent exposure (process III). Our approach is based on the Monte Carlo method to simulate a complex system, which is defined from assets exposed to a hazardous region. We generate 1-year time series, sampling from a stochastic set of events. Each time series corresponds to one risk scenario, and the analysis of multiple time series allows for the probabilistic assessment of losses and for the recognition of more or less probable risk paths. Each sampled event is associated with a time of occurrence, a damage footprint and a loss footprint. The occurrence of an event depends on its rate, which is conditional on the occurrence of past events (process I, concept of a correlation matrix). Damage depends on the hazard intensity and on the vulnerability of the asset, which is conditional on previous damage to that asset (process II). Losses are the product of damage and exposure value, this value being the original exposure minus previous losses (process III, no reconstruction considered). The Monte Carlo method allows for a straightforward implementation of uncertainties and for the implementation of numerous interactions, which is otherwise challenging in an analytical multi-risk approach. We apply our framework to a synthetic data set, defined by a virtual city within a virtual region. This approach gives the opportunity to perform multi-risk analyses in a controlled environment while not requiring real data, which may be difficult to access or simply unavailable to the public. Based on this heuristic approach, we define a 100 by 100 km region where earthquakes, volcanic eruptions, fluvial floods, hurricanes and coastal floods can occur. All hazards are harmonized to a common format. We define a 20 by 20 km city composed of 50,000 identical buildings with a fixed economic value. Vulnerability curves are defined in terms of mean damage ratio as a function of hazard intensity. All data are based on simple equations found in the literature and on other simplifications. We show the impact of earthquake-earthquake interaction and hurricane-storm surge coupling, as well as of time-dependent vulnerability and exposure, on aggregated loss curves. One main result is the emergence of low-probability, high-consequence (extreme) events when correlations are implemented. While the concept of a virtual city can suggest the theoretical benefits of multi-risk assessment for decision support, identifying their real-world practicality will require the study of real test sites.
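A stripped-down version of the simulation loop shows how the three processes interact; all rates, damage ratios, and the triggering multiplier are invented for the sketch, and the paper's hazard harmonization and stochastic event set are reduced to daily Bernoulli draws:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_year(hazards, triggers, exposure0, n_steps=365):
    """One 1-year scenario. hazards: name -> {rate, mdr};
    triggers: (source, target) -> rate multiplier (process I)."""
    exposure, damage_state, losses = exposure0, 0.0, []
    boost = {h: 1.0 for h in hazards}
    for _ in range(n_steps):
        for name, h in hazards.items():
            if rng.random() < h["rate"] * boost[name]:
                mdr = min(h["mdr"] * (1.0 + damage_state), 1.0)  # process II
                loss = mdr * exposure
                exposure -= loss                                  # process III
                damage_state = min(damage_state + mdr, 1.0)
                losses.append((name, loss))
                for (src, tgt), mult in triggers.items():
                    if src == name:
                        boost[tgt] *= mult                        # process I
    return losses

hazards = {"quake": {"rate": 1e-3, "mdr": 0.3}, "flood": {"rate": 5e-3, "mdr": 0.1}}
triggers = {("quake", "quake"): 5.0}                 # aftershock-like triggering
annual = [sum(l for _, l in simulate_year(hazards, triggers, 1e9))
          for _ in range(1000)]
print("mean annual loss:", np.mean(annual))
```

Aggregating many such years produces the loss curves; switching the triggers and the process II/III terms off recovers the independent-hazards baseline, which is how the emergence of low-probability, high-consequence events can be isolated.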
A new approach for the assessment of temporal clustering of extratropical wind storms
NASA Astrophysics Data System (ADS)
Schuster, Mareike; Eddounia, Fadoua; Kuhnel, Ivan; Ulbrich, Uwe
2017-04-01
A widely-used methodology to assess the clustering of storms in a region is based on the dispersion statistics of a simple homogeneous Poisson process. This clustering measure is determined by the ratio of the variance to the mean of the local storm statistics per grid point. Resulting values larger than 1, i.e. when the variance is larger than the mean, indicate clustering, while values lower than 1 indicate a sequencing of storms that is more regular than a random process. However, a disadvantage of this methodology is that the characteristics are valid only for a pre-defined climatological time period, and it is not possible to identify temporal variability of clustering. Also, the absolute value of the dispersion statistic is not particularly intuitive. We have developed an approach to describing the temporal clustering of storms that offers a more intuitive comprehension and, at the same time, allows temporal variations to be assessed. The approach is based on the local distribution of waiting times between the occurrence of two individual storm events, the latter computed through the post-processing of individual windstorm tracks, which in turn are obtained by an objective tracking algorithm. Based on this distribution, a threshold can be set, either from the waiting time expected from a random process or from a quantile of the observed distribution. Thus, it can be determined whether two consecutive wind storm events count as part of a (temporal) cluster. We analyze extratropical wind storms in a reanalysis dataset and compare the results of the traditional clustering measure with our new methodology. We assess what range of clustering events (in terms of duration and frequency) is covered and identify whether the historically known clustered seasons are detectable by the new clustering measure in the reanalysis.
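Both measures fit in a few lines. A sketch of the classical dispersion statistic alongside the waiting-time rule described above (the 90-day counting period and the quantile choice are placeholders):

```python
import numpy as np

def dispersion_index(event_times, period_days=90.0):
    """Variance-to-mean ratio of storm counts per fixed period;
    > 1 indicates clustering, < 1 a more regular sequence."""
    counts = np.bincount((np.asarray(event_times) // period_days).astype(int))
    return counts.var() / counts.mean()

def clustered_pairs(event_times, quantile=0.25):
    """Consecutive storms whose waiting time falls below a threshold,
    here a low quantile of the observed waiting-time distribution (the
    waiting time expected from a Poisson process could be used instead)."""
    waits = np.diff(np.sort(np.asarray(event_times, dtype=float)))
    thr = np.quantile(waits, quantile)
    return waits < thr, thr
```

Unlike the dispersion index, the pair flags retain the time of each clustered pair, so seasonal and interannual variations in clustering remain visible.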
Endedijk, Maaike D; Brekelmans, Mieke; Sleegers, Peter; Vermunt, Jan D
Self-regulated learning has benefits for students' academic performance in school, but also for expertise development during their professional career. This study examined the validity of an instrument to measure student teachers' regulation of their learning to teach across multiple and different kinds of learning events in the context of a postgraduate professional teacher education programme. Based on an analysis of the literature, we developed a log with structured questions that could be used as a multiple-event instrument to determine the quality of student teachers' regulation of learning by combining data from multiple learning experiences. The findings showed that this structured version of the instrument measured student teachers' regulation of their learning in a valid and reliable way. Furthermore, with the aid of the Structured Learning Report, individual differences in student teachers' regulation of learning could be discerned. Together the findings indicate that a multiple-event instrument can be used to measure regulation of learning in multiple contexts for various learning experiences at the same time, without the necessity of relying on students' ability to rate themselves across all these different experiences. In this way, this instrument can make an important contribution to bridging the gap between the two dominant approaches to measuring SRL: the traditional aptitude measurement approach and the event measurement approach.
Impact of whole-genome duplication events on diversification rates in angiosperms.
Landis, Jacob B; Soltis, Douglas E; Li, Zheng; Marx, Hannah E; Barker, Michael S; Tank, David C; Soltis, Pamela S
2018-03-01
Polyploidy or whole-genome duplication (WGD) pervades the evolutionary history of angiosperms. Despite extensive progress in our understanding of WGD, the role of these events in promoting diversification is still not well understood. We seek to clarify the possible association between WGD and diversification rates in flowering plants. Using a previously published phylogeny spanning all land plants (31,749 tips) and WGD events inferred from analyses of the 1000 Plants (1KP) transcriptome data, we analyzed the association of WGDs and diversification rates following numerous WGD events across the angiosperms. We used a stepwise AIC approach (MEDUSA), a Bayesian mixture model approach (BAMM), and state-dependent diversification analyses (MuSSE) to investigate patterns of diversification. Sister-clade comparisons were used to investigate species richness after WGDs. Based on the density of 1KP taxon sampling, 106 WGDs were unambiguously placed on the angiosperm phylogeny. We identified 334-530 shifts in diversification rates. We found that 61 WGD events were tightly linked to changes in diversification rates, and state-dependent diversification analyses indicated higher speciation rates for subsequent rounds of WGD. Additionally, 70 of 99 WGD events showed an increase in species richness compared to the sister clade. Forty-six of the 106 WGDs analyzed appear to be closely associated with upshifts in the rate of diversification in angiosperms. Shifts in diversification do not appear more likely than random within a four-node lag phase following a WGD; however, younger WGD events are more likely to be followed by an upshift in diversification than older WGD events. © 2018 Botanical Society of America.
Sheehan, Katie J; Sobolev, Boris; Guy, Pierre; Bohm, Eric; Hellsten, Erik; Sutherland, Jason M; Kuramoto, Lisa; Jaglal, Susan
2016-02-01
Episodes of care defined by the event of hip fracture surgery are widely used for the assessment of surgical wait times and outcomes. However, this approach does not consider nonoperative deaths, implying that survival time begins at the time of procedure. This approach makes treatment effect implicitly conditional on surviving to treatment. The purpose of this article is to describe a novel conceptual framework for constructing an episode of hip fracture care to fully evaluate the incidence of adverse events related to time after admission for hip fracture. This admission-based approach enables the assessment of the full harm of delay by including deaths while waiting for surgery, not just deaths after surgery. Some patients wait until their conditions are optimized for surgery, whereas others have to wait until surgical service becomes available. We provide definitions, linkage rules, and algorithms to capture all hip fracture patients and events other than surgery. Finally, we discuss data elements for stratifying patients according to administrative factors for delay to allow researchers and policymakers to determine who will benefit most from expedited access to surgery. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
Earthquake Declustering via a Nearest-Neighbor Approach in Space-Time-Magnitude Domain
NASA Astrophysics Data System (ADS)
Zaliapin, I. V.; Ben-Zion, Y.
2016-12-01
We propose a new method for earthquake declustering based on nearest-neighbor analysis of earthquakes in the space-time-magnitude domain. The nearest-neighbor approach was recently applied to a variety of seismological problems that validate the general utility of the technique and reveal the existence of several different robust types of earthquake clusters. Notably, it was demonstrated that clustering associated with the largest earthquakes is statistically different from that of small-to-medium events. In particular, the characteristic bimodality of the nearest-neighbor distances that helps separate clustered and background events is often violated after the largest earthquakes, whose vicinity is dominated by triggered events. This prevents using a simple threshold between the two modes of the nearest-neighbor distance distribution for declustering. The current study resolves this problem, hence extending the nearest-neighbor approach to the problem of earthquake declustering. The proposed technique is applied to the seismicity of different areas in California (San Jacinto, Coso, Salton Sea, Parkfield, Ventura, Mojave, etc.), as well as to global seismicity, to demonstrate its stability and efficiency in treating various clustering types. The results are compared with those of alternative declustering methods.
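For concreteness, the underlying proximity is typically the product of rescaled time, distance, and parent magnitude. The parameter values (b-value, fractal dimension) and the brute-force O(n^2) loop below are illustrative, not the paper's implementation:

```python
import numpy as np

def nn_proximities(t, x, y, m, b=1.0, d_f=1.6):
    """eta_ij = dt * r**d_f * 10**(-b * m_parent), minimized over all
    earlier events i for each event j; returns eta and the parent index."""
    n = len(t)
    eta, parent = np.full(n, np.inf), np.full(n, -1)
    for j in range(n):
        for i in range(n):
            dt = t[j] - t[i]
            if dt <= 0.0:
                continue
            r = np.hypot(x[j] - x[i], y[j] - y[i])
            val = dt * max(r, 1e-3) ** d_f * 10.0 ** (-b * m[i])
            if val < eta[j]:
                eta[j], parent[j] = val, i
    return eta, parent
```

Classical usage splits events at a fixed threshold in log10(eta); the contribution described above is a declustering rule that stays valid where the bimodality of eta breaks down near the largest earthquakes.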
Two-layer symbolic representation for stochastic models with phase-type distributed events
NASA Astrophysics Data System (ADS)
Longo, Francesco; Scarpa, Marco
2015-07-01
Among the techniques that have been proposed for the analysis of non-Markovian models, the state space expansion approach has shown great flexibility in terms of modelling capabilities. The principal drawback is the explosion of the state space. This paper proposes a two-layer symbolic method for efficiently storing the expanded reachability graph of a non-Markovian model in the case in which continuous phase-type distributions are associated with the firing times of system events, and different memory policies are considered. At the lower layer, the reachability graph is symbolically represented in the form of a set of Kronecker matrices, while, at the higher layer, all the information needed to correctly manage event memory is stored in a multi-terminal multi-valued decision diagram. This information is collected by applying a symbolic algorithm based on a couple of theorems. The efficiency of the proposed approach, in terms of memory occupation and execution time, is shown by applying it to a set of non-Markovian stochastic Petri nets and comparing it with a classical explicit expansion algorithm. Moreover, a comparison with a classical symbolic approach is performed whenever possible.
Sensemaking of patient safety risks and hazards.
Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S
2006-08-01
In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate the risks and hazards that are a threat to patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning. Sensemaking is an essential part of the design process leading to risk-informed design. Sensemaking serves as a conceptual framework to bring together well-established approaches to the assessment of risks and hazards: (1) at the single-event level using root cause analysis (RCA), (2) at the process level using failure modes and effects analysis (FMEA), and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without the ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced.
Spatial-temporal event detection in climate parameter imagery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKenna, Sean Andrew; Gutierrez, Karen A.
Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and Southern India.
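A per-pixel version of the regression-based detector is easy to sketch: fit each pixel's history, then flag pixels whose newest residual is extreme relative to their own residual history. The linear-trend design matrix and the 3-sigma rule are simplifications of the statistical parametric mapping machinery:

```python
import numpy as np

def regression_anomalies(cube, k=3.0):
    """cube: (time, y, x) stack, e.g. weekly NDVI. Flags pixels whose
    latest residual from a per-pixel linear fit exceeds k sigma."""
    T = cube.shape[0]
    X = np.stack([np.ones(T), np.arange(T, dtype=float)], axis=1)
    flat = cube.reshape(T, -1)
    beta, *_ = np.linalg.lstsq(X, flat, rcond=None)
    resid = flat - X @ beta
    sigma = resid[:-1].std(axis=0) + 1e-12        # history only
    return (np.abs(resid[-1]) > k * sigma).reshape(cube.shape[1:])

rng = np.random.default_rng(3)
ndvi = 0.5 + 0.01 * rng.standard_normal((52, 40, 40))
ndvi[-1, 10:15, 10:15] -= 0.2                     # localized vegetation drop
print(regression_anomalies(ndvi).sum(), "anomalous pixels")
```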
A Novel Approach for Creating Activity-Aware Applications in a Hospital Environment
NASA Astrophysics Data System (ADS)
Bardram, Jakob E.
Context-aware and activity-aware computing has been proposed as a way to adapt the computer to the user’s ongoing activity. However, deductively moving from physical context - like location - to establishing human activity has proved difficult. This paper proposes a novel approach to activity-aware computing. Instead of inferring activities, this approach enables the user to explicitly model their activity, and then use sensor-based events to create, manage, and use these computational activities adjusted to a specific context. This approach was crafted through a user-centered design process in collaboration with a hospital department. We propose three strategies for activity-awareness: context-based activity matching, context-based activity creation, and context-based activity adaptation. We present the implementation of these strategies and present an experimental evaluation of them. The experiments demonstrate that rather than considering context as information, context can be a relational property that links ’real-world activities’ with their ’computational activities’.
Dynamic Event Tree advancements and control logic improvements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for Probabilistic Risk Assessment, uncertainty quantification, data mining analysis and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid Sampler, Factorials, etc.), Adaptive Samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been done in order to: start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide all MOOSE-based applications with a control logic capability, an initial migration activity was initiated this Fiscal Year, moving the control logic system, designed for RELAP-7 by the RAVEN team, into the MOOSE framework. In this document, a brief explanation of what has been done is reported. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named "Hybrid Dynamic Event Tree" (HDET) and its Adaptive variant, the "Adaptive Hybrid Dynamic Event Tree" (AHDET). As other authors have already reported, among the different types of uncertainties it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree treats the first class (aleatory) of uncertainties; the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees). The Monte Carlo employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed that does not limit the exploration of the epistemic space to a Monte Carlo method but uses all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified and Monte Carlo sampling in order to explore the epistemic space, without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy. The DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.
Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios
NASA Astrophysics Data System (ADS)
Ragno, E.; AghaKouchak, A.
2016-12-01
Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme value analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of existing and future infrastructure. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves for 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climate model projections. This presentation summarizes the projected changes in the statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
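A minimal non-stationary counterpart to the stationary fit is a GEV distribution whose location parameter drifts linearly with time; the paper's Bayesian treatment would place priors on these parameters, whereas the sketch below just maximizes the likelihood (the model form, starting values, and synthetic data are assumptions):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def ns_gev_nll(theta, z, t):
    """Negative log-likelihood of a GEV with mu(t) = mu0 + mu1 * t.
    Note scipy's shape convention c = -xi."""
    mu0, mu1, log_sigma, xi = theta
    return -genextreme.logpdf(z, c=-xi, loc=mu0 + mu1 * t,
                              scale=np.exp(log_sigma)).sum()

rng = np.random.default_rng(0)
t = np.arange(60, dtype=float)                       # 60 years of annual maxima
z = genextreme.rvs(c=-0.1, loc=40.0 + 0.2 * t, scale=8.0, random_state=rng)
fit = minimize(ns_gev_nll, x0=[40.0, 0.0, np.log(8.0), 0.1],
               args=(z, t), method="Nelder-Mead")
mu0, mu1, log_sigma, xi = fit.x
print(f"fitted trend in location: {mu1:.2f} per year")
```

Return levels, and hence IDF ordinates, then become functions of the year for which they are evaluated, which is exactly what makes the updated curves time-dependent.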
Optimal weighted averaging of event related activity from acquisitions with artifacts.
Vollero, Luca; Petrichella, Sara; Innello, Giulio
2016-08-01
In several biomedical applications that require signal processing of biological data, the starting procedure for noise reduction is the ensemble averaging of multiple repeated acquisitions (trials). This method is based on the assumption that each trial is composed of two additive components: (i) a time-locked activity related to some sensitive/stimulation phenomenon (ERA, Event Related Activity in the following) and (ii) a sum of several other non-time-locked background activities. The averaging aims at estimating the ERA under a very low Signal to Noise and Interference Ratio (SNIR). Although averaging is a well-established tool, its performance can be improved in the presence of high-power disturbances (artifacts) by a trial classification and removal stage. In this paper we propose, model and evaluate a new approach that avoids trial removal, managing trials classified as artifact-free and artifact-prone with two different weights. Based on the model, a tuning of the weights is possible, and through modeling and simulations we show that, when optimally configured, the proposed solution outperforms classical approaches.
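The two-weight rule itself is a one-liner; the optimal weight values come from the paper's SNIR model, so the numbers below are placeholders that merely demonstrate the mechanics:

```python
import numpy as np

def weighted_era(trials, artifact_flags, w_clean=1.0, w_artifact=0.2):
    """Weighted ensemble average: artifact-prone trials are down-weighted
    instead of being removed."""
    trials = np.asarray(trials, dtype=float)          # (n_trials, n_samples)
    w = np.where(np.asarray(artifact_flags), w_artifact, w_clean)
    return (w[:, None] * trials).sum(axis=0) / w.sum()

# 100 trials of a noisy evoked response, 15 contaminated by a strong artifact
rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 256)
era = np.sin(2 * np.pi * 3.0 * t) * np.exp(-4.0 * t)
trials = era + 2.0 * rng.standard_normal((100, 256))
flags = np.zeros(100, dtype=bool)
flags[:15] = True
trials[flags] += 20.0 * rng.standard_normal((15, 256))
estimate = weighted_era(trials, flags)
```

Setting w_artifact to 0 recovers classification-and-removal, and setting it to 1 recovers plain averaging, so the two classical approaches are the endpoints of the weight scale.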
Transient modeling in simulation of hospital operations for emergency response.
Paul, Jomon Aliyas; George, Santhosh K; Yi, Pengfei; Lin, Li
2006-01-01
Rapid estimates of hospital capacity after an event that may cause a disaster can assist disaster-relief efforts. Due to the dynamics of hospitals following such an event, it is necessary to accurately model the behavior of the system. A transient modeling approach using simulation and exponential functions is presented, along with its application to an earthquake situation. The parameters of the exponential model are regressed using outputs from designed simulation experiments. The developed model is capable of representing transient patient waiting times during a disaster. Most importantly, the modeling approach allows real-time capacity estimation for hospitals of various sizes and capabilities. Further, this research analyzes the effects of priority-based routing of patients within the hospital on patient waiting times, determined using various patient mixes. The model guides patients based on the severity of their injuries and queues patients requiring critical care depending on their remaining survivability time. The model also accounts for the impact of prehospital transport time on patient waiting time.
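The exponential transient can be made explicit. A sketch of the regression step (the functional form, parameter names, and synthetic data are assumptions consistent with the description, not taken from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

def transient_wait(t, w_ss, a, tau):
    """Waiting time relaxing exponentially toward a steady state w_ss."""
    return w_ss + a * np.exp(-t / tau)

t = np.linspace(0.0, 48.0, 25)                    # hours after the event
rng = np.random.default_rng(2)
obs = transient_wait(t, 20.0, 90.0, 8.0) + 3.0 * rng.standard_normal(t.size)
(w_ss, a, tau), _ = curve_fit(transient_wait, t, obs, p0=[30.0, 60.0, 5.0])
print(f"steady-state wait ~ {w_ss:.0f}, relaxation time ~ {tau:.1f} h")
```

With the parameters regressed from designed simulation runs per hospital size and patient mix, evaluating transient_wait is cheap enough for the real-time capacity estimates the paper targets.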
A Systems Approach to Scalable Transportation Network Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S
2006-01-01
Emerging needs in transportation network modeling and simulation are raising new challenges with respect to scalability of network size and vehicular traffic intensity, speed of simulation for simulation-based optimization, and fidelity of vehicular behavior for accurate capture of event phenomena. Parallel execution is warranted to sustain the required detail, size and speed. However, few parallel simulators exist for such applications, partly due to the challenges underlying their development. Moreover, many simulators are based on time-stepped models, which can be computationally inefficient for the purposes of modeling evacuation traffic. Here an approach is presented to designing a simulator with memory and speed efficiency as the goals from the outset, and, specifically, scalability via parallel execution. The design makes use of discrete event modeling techniques as well as parallel simulation methods. Our simulator, called SCATTER, is being developed, incorporating such design considerations. Preliminary performance results are presented on benchmark road networks, showing scalability to one million vehicles simulated on one processor.
Pervasive Monitoring—An Intelligent Sensor Pod Approach for Standardised Measurement Infrastructures
Resch, Bernd; Mittlboeck, Manfred; Lippautz, Michael
2010-01-01
Geo-sensor networks have traditionally been built up in closed monolithic systems, thus limiting trans-domain usage of real-time measurements. This paper presents the technical infrastructure of a standardised embedded sensing device, which has been developed in the course of the Live Geography approach. The sensor pod implements data provision standards of the Sensor Web Enablement initiative, including an event-based alerting mechanism and location-aware Complex Event Processing functionality for detection of threshold transgression and quality assurance. The goal of this research is that the resultant highly flexible sensing architecture will bring sensor network applications one step further towards the realisation of the vision of a “digital skin for planet earth”. The developed infrastructure can potentially have far-reaching impacts on sensor-based monitoring systems through the deployment of ubiquitous and fine-grained sensor networks. This in turn allows for the straight-forward use of live sensor data in existing spatial decision support systems to enable better-informed decision-making. PMID:22163537
An ensemble method for extracting adverse drug events from social media.
Liu, Jing; Zhao, Songzheng; Zhang, Xiaodi
2016-06-01
Because adverse drug events (ADEs) are a serious health problem and a leading cause of death, it is of vital importance to identify them correctly and in a timely manner. With the development of Web 2.0, social media has become a large data source for information on ADEs. The objective of this study is to develop a relation extraction system that uses natural language processing techniques to effectively distinguish between ADEs and non-ADEs in informal text on social media. We develop a feature-based approach that utilizes various lexical, syntactic, and semantic features. Information-gain-based feature selection is performed to address high-dimensional features. Then, we evaluate the effectiveness of four well-known kernel-based approaches (i.e., subset tree kernel, tree kernel, shortest dependency path kernel, and all-paths graph kernel) and several ensembles that are generated by adopting different combination methods (i.e., majority voting, weighted averaging, and stacked generalization). All of the approaches are tested using three data sets: two health-related discussion forums and one general social networking site (i.e., Twitter). When investigating the contribution of each feature subset, the feature-based approach attains the best area under the receiver operating characteristic curve (AUC) values, which are 78.6%, 72.2%, and 79.2% on the three data sets. When individual methods are used, we attain the best AUC values of 82.1%, 73.2%, and 77.0% using the subset tree kernel, shortest dependency path kernel, and feature-based approach on the three data sets, respectively. When using classifier ensembles, we achieve the best AUC values of 84.5%, 77.3%, and 84.5% on the three data sets, outperforming the baselines. Our experimental results indicate that ADE extraction from social media can benefit from feature selection. With respect to the effectiveness of different feature subsets, lexical features and semantic features can enhance the ADE extraction capability. Kernel-based approaches, which avoid the feature sparsity issue, are well suited to the ADE extraction problem. Combining different individual classifiers using suitable combination methods can further enhance the ADE extraction effectiveness. Copyright © 2016 Elsevier B.V. All rights reserved.
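To make the combination step concrete, here is a minimal sketch (assumed interfaces, invented numbers) of two of the ensemble methods named above, majority voting and weighted averaging, applied to the probability outputs of three base classifiers.

```python
# Fusing three base classifiers' P(ADE) outputs by majority voting and by
# weighted averaging; weights might come from validation AUC, for example.
import numpy as np

probs = np.array([   # P(ADE) from three hypothetical base classifiers
    [0.9, 0.2, 0.6],
    [0.7, 0.4, 0.3],
    [0.8, 0.1, 0.7],
])  # shape: (n_classifiers, n_instances)

majority = (probs > 0.5).sum(axis=0) > probs.shape[0] / 2
weights = np.array([0.5, 0.2, 0.3])   # e.g., proportional to validation AUC
weighted = weights @ probs            # weighted-average ensemble score
print(majority, weighted > 0.5)
```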
Top-down proteomics for the analysis of proteolytic events - Methods, applications and perspectives.
Tholey, Andreas; Becker, Alexander
2017-11-01
Mass spectrometry based proteomics is an indispensable tool for almost all research areas relevant for the understanding of proteolytic processing, ranging from the identification of substrates, products, and cleavage sites to the analysis of structural features influencing protease activity. The majority of methods for these studies are based on bottom-up proteomics, performing analysis at the peptide level. As this approach is characterized by a number of pitfalls, e.g. loss of molecular information, there is an ongoing effort to establish top-down proteomics, performing both separation and MS analysis at the intact protein level. We briefly introduce major approaches of bottom-up proteomics used in the field of protease research and highlight the shortcomings of these methods. We then discuss the present state-of-the-art of top-down proteomics. Together with the discussion of known challenges, we show the potential of this approach and present a number of successful applications of top-down proteomics in protease research. This article is part of a Special Issue entitled: Proteolysis as a Regulatory Event in Pathophysiology edited by Stefan Rose-John. Copyright © 2017 Elsevier B.V. All rights reserved.
Evaluating Water Demand Using Agent-Based Modeling
NASA Astrophysics Data System (ADS)
Lowry, T. S.
2004-12-01
The supply and demand of water resources are functions of complex, inter-related systems including hydrology, climate, demographics, economics, and policy. To assess the safety and sustainability of water resources, planners often rely on complex numerical models that relate some or all of these systems using mathematical abstractions. The accuracy of these models relies on how well the abstractions capture the true nature of the systems interactions. Typically, these abstractions are based on analyses of observations and/or experiments that account only for the statistical mean behavior of each system. This limits the approach in two important ways: 1) it cannot capture cross-system disruptive events, such as major drought, significant policy change, or terrorist attack, and 2) it cannot resolve sub-system level responses. To overcome these limitations, we are developing an agent-based water resources model that includes the systems of hydrology, climate, demographics, economics, and policy, to examine water demand during normal and extraordinary conditions. Agent-based modeling (ABM) develops functional relationships between systems by modeling the interaction between individuals (agents), who behave according to a probabilistic set of rules. ABM is a "bottom-up" modeling approach in that it defines macro-system behavior by modeling the micro-behavior of individual agents. While each agent's behavior is often simple and predictable, the aggregate behavior of all agents in each system can be complex, unpredictable, and different than behaviors observed in mean-behavior models. Furthermore, the ABM approach creates a virtual laboratory where the effects of policy changes and/or extraordinary events can be simulated. Our model, which is based on the demographics and hydrology of the Middle Rio Grande Basin in the state of New Mexico, includes agent groups of residential, agricultural, and industrial users. Each agent within each group determines its water usage based on its own condition and the condition of the world around it. For example, residential agents can make decisions to convert to or from xeriscaping and/or low-flow appliances based on policy implementation, economic status, weather, and climatic conditions. Agricultural agents may vary their usage by making decisions on crop distribution and irrigation design. Preliminary results show that water usage can be highly irrational under certain conditions. Results also identify sub-sectors within each group that have the highest influence on ensemble group behavior, providing a means for policy makers to target their efforts. Finally, the model is able to predict the impact of low-probability, high-impact events such as catastrophic denial of service due to natural and/or man-made events.
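A toy sketch of the agent-based mechanism described above, not the model itself: residential agents follow a simple probabilistic adoption rule, and aggregate demand emerges from their individual decisions. All rules and constants are invented.

```python
# Residential agents decide each year whether to adopt xeriscaping based on
# price and drought; macro demand emerges from micro behavior.
import random

class Household:
    def __init__(self):
        self.xeriscaped = False
        self.base_use = random.uniform(80, 160)  # gallons/day

    def step(self, price, drought):
        # probabilistic adoption rule: higher price and drought push adoption
        if not self.xeriscaped and random.random() < 0.02 * price + (0.05 if drought else 0):
            self.xeriscaped = True
        return self.base_use * (0.6 if self.xeriscaped else 1.0)

random.seed(3)
agents = [Household() for _ in range(10_000)]
for year in range(20):
    drought = year in (5, 6, 12)
    price = 1.0 + 0.05 * year
    demand = sum(a.step(price, drought) for a in agents)
    print(year, round(demand / 1e6, 3), "Mgal/day")
```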
Dynamically adaptive data-driven simulation of extreme hydrological flows
NASA Astrophysics Data System (ADS)
Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint
2018-02-01
Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
Anomaly Detection Based on Local Nearest Neighbor Distance Descriptor in Crowded Scenes
Hu, Shiqiang; Zhang, Huanlong; Luo, Lingkun
2014-01-01
We propose a novel local nearest neighbor distance (LNND) descriptor for anomaly detection in crowded scenes. Compared with the low-level feature descriptors commonly used in previous works, the LNND descriptor has two major advantages. First, the LNND descriptor efficiently incorporates spatial and temporal contextual information around the video event, which is important for detecting anomalous interactions among multiple events, while most existing feature descriptors only contain information about a single event. Second, the LNND descriptor is a compact representation whose dimensionality is typically much lower than that of low-level feature descriptors. Therefore, using the LNND descriptor in an anomaly detection method with offline training not only saves computation time and storage but also avoids the drawbacks of high-dimensional feature descriptors. We validate the effectiveness of the LNND descriptor by conducting extensive experiments on different benchmark datasets. Experimental results show the promising performance of the LNND-based method against state-of-the-art methods. Notably, the LNND-based approach requires fewer intermediate processing steps and no subsequent processing such as smoothing, yet achieves comparable or even better performance. PMID:25105164
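A hedged sketch of the descriptor's core idea: an event is scored by its distance to the nearest neighbors among events in its spatio-temporal context, so events far from the learned context receive high anomaly scores. Feature extraction is abstracted into random vectors here.

```python
# Local nearest neighbor distance as an anomaly score (illustrative only).
import numpy as np

def lnnd(event_feat, context_feats, k=3):
    """Mean distance from an event to its k nearest context events."""
    d = np.linalg.norm(context_feats - event_feat, axis=1)
    return np.sort(d)[:k].mean()

rng = np.random.default_rng(4)
context = rng.normal(0, 1, size=(200, 16))   # features of surrounding events
normal_event = rng.normal(0, 1, size=16)
anomalous_event = rng.normal(4, 1, size=16)  # far from the learned context
print(lnnd(normal_event, context), lnnd(anomalous_event, context))
```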
Predicting the Trends of Social Events on Chinese Social Media.
Zhou, Yang; Zhang, Lei; Liu, Xiaoqian; Zhang, Zhen; Bai, Shuotian; Zhu, Tingshao
2017-09-01
Growing interest in social events on social media came along with the rapid development of the Internet. Social events that occur in the "real" world can spread on social media (e.g., Sina Weibo) rapidly, which may trigger severe consequences and thus require the government's timely attention and responses. This article proposes to predict the trends of social events on Sina Weibo, which is currently the most popular social media in China. Based on the theories of social psychology and communication sciences, we extract an unprecedented amount of comprehensive and effective features that relate to the trends of social events on Chinese social media, and we construct trend prediction models using three classical regression algorithms. We found that lasso regression performed best, with a precision of 0.78 and a recall of 0.88. The results of our experiments demonstrated the effectiveness of our proposed approach.
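A minimal sketch of the modeling step under stated assumptions: scikit-learn's lasso fitted to invented feature vectors and thresholded into a trend label, evaluated with precision and recall as in the paper's evaluation style. None of the actual features or data are reproduced.

```python
# Lasso regression as a trend predictor on synthetic features.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 20))   # e.g., posts/hour, sentiment, reposts, ...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 0).astype(int)

model = Lasso(alpha=0.01).fit(X[:400], y[:400])
pred = (model.predict(X[400:]) > 0.5).astype(int)
print(precision_score(y[400:], pred), recall_score(y[400:], pred))
```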
Using HFACS-Healthcare to Identify Systemic Vulnerabilities During Surgery.
Cohen, Tara N; Francis, Sarah E; Wiegmann, Douglas A; Shappell, Scott A; Gewertz, Bruce L
2018-03-01
The Human Factors Analysis and Classification System for Healthcare (HFACS-Healthcare) was used to classify surgical near miss events reported via a hospital's event reporting system over the course of 1 year. Two trained analysts identified causal factors within each event narrative and subsequently categorized the events using HFACS-Healthcare. Of 910 original events, 592 could be analyzed further using HFACS-Healthcare, resulting in the identification of 726 causal factors. Most issues (n = 436, 60.00%) involved preconditions for unsafe acts, followed by unsafe acts (n = 257, 35.39%), organizational influences (n = 27, 3.72%), and supervisory factors (n = 6, 0.82%). These findings go beyond traditional methods of trending incident data, which typically focus on documenting frequency of occurrence. Analyzing near misses based on their underlying contributing human factors affords a greater opportunity to develop process improvements that reduce recurrence and better support patient safety.
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.
2016-01-01
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) during the postulated transient event.
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...
2017-01-24
We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Prospective memory: A comparative perspective
Crystal, Jonathon D.; Wilson, A. George
2014-01-01
Prospective memory consists of forming a representation of a future action, temporarily storing that representation in memory, and retrieving it at a future time point. Here we review the recent development of animal models of prospective memory. We review experiments using rats that focus on the development of time-based and event-based prospective memory. Next, we review a number of prospective-memory approaches that have been used with a variety of non-human primates. Finally, we review selected approaches from the human literature on prospective memory to identify targets for development of animal models of prospective memory. PMID:25101562
A simplified real time method to forecast semi-enclosed basins storm surge
NASA Astrophysics Data System (ADS)
Pasquali, D.; Di Risio, M.; De Girolamo, P.
2015-11-01
Semi-enclosed basins are often prone to storm surge events. Indeed, their meteorological exposure, the presence of a large continental shelf, and their shape can lead to strong sea-level set-up. A real-time system for forecasting storm surge can help protect human activities (e.g., by forecasting flooding due to storm surge events), manage ports, and safeguard coastal safety. This paper aims at illustrating a simple method able to forecast storm surge events in semi-enclosed basins in real time. The method is based on a mixed approach in which the results obtained by means of a simplified physics-based model with low computational costs are corrected by means of statistical techniques. The proposed method is applied to a point of interest located in the northern part of the Adriatic Sea. The comparison of forecasted levels against observed values shows the satisfactory reliability of the forecasts.
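A hedged sketch of the mixed approach: a deliberately simplified physics-based surge estimate is corrected by a linear statistical model fitted to its past errors. The physics term and predictors are stand-ins, not the authors' model.

```python
# Physics-based estimate plus a statistical correction fitted to residuals.
import numpy as np

rng = np.random.default_rng(6)
wind = rng.uniform(5, 25, 300)             # forcing observed in past events
pressure_drop = rng.uniform(0, 40, 300)
truth = 0.02 * wind**2 + 0.01 * pressure_drop + rng.normal(0, 0.05, 300)
physics = 0.018 * wind**2                  # simplified model, biased low

# fit a linear correction of the physics residual on available predictors
A = np.column_stack([np.ones_like(wind), physics, pressure_drop])
coef, *_ = np.linalg.lstsq(A, truth - physics, rcond=None)

def forecast(w, dp):
    p = 0.018 * w**2
    return p + coef @ np.array([1.0, p, dp])

print(forecast(20.0, 30.0))  # corrected surge estimate for new forcing
```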
A data-based model to locate mass movements triggered by seismic events in Sichuan, China.
de Souza, Fabio Teodoro
2014-01-01
Earthquakes affect the entire world and have catastrophic consequences. On May 12, 2008, an earthquake of magnitude 7.9 on the Richter scale occurred in the Wenchuan area of Sichuan province in China. This event, together with subsequent aftershocks, caused many avalanches, landslides, debris flows, collapses, and quake lakes and induced numerous unstable slopes. This work proposes a methodology that uses a data mining approach and geographic information systems to predict these mass movements based on their association with the main and aftershock epicenters, geologic faults, riverbeds, and topography. A dataset comprising 3,883 mass movements is analyzed, and some models to predict the location of these mass movements are developed. These predictive models could be used by the Chinese authorities as an important tool for identifying risk areas and rescuing survivors during similar events in the future.
Confidence intervals for the first crossing point of two hazard functions.
Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng
2009-12-01
The phenomenon of crossing hazard rates is common in clinical trials with time to event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing hazards alternative. However, relatively few approaches are available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazard modeling with Box-Cox transformation of the time to event, a nonparametric procedure using the kernel smoothing estimate of the hazard ratio is proposed. Both procedures are evaluated by Monte Carlo simulations and applied to two clinical trial datasets.
FRET and BRET-based biosensors in live cell compound screens.
Robinson, Katie Herbst; Yang, Jessica R; Zhang, Jin
2014-01-01
Live cell compound screening with genetically encoded fluorescence or bioluminescence-based biosensors offers a potentially powerful approach to identifying novel regulators of a signaling event of interest. In particular, compound screening in living cells has the added benefit that the entire signaling network remains intact, and thus the screen is not just against a single molecule of interest but against any molecule within the signaling network that may modulate the distinct signaling event reported by the biosensor in use. Furthermore, only molecules that are cell permeable or act at cell surface receptors will be identified as "hits," reducing the need for further optimization of the compound for cell penetration. Here we discuss a detailed protocol for using genetically encoded biosensors in living cells in a 96-well format for the execution of high throughput compound screens and the identification of small molecules that modulate a signaling event of interest.
Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought
NASA Astrophysics Data System (ADS)
Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.
2017-10-01
Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in recent years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
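The attribution statistic itself is compact; the sketch below (all numbers invented) computes the probability ratio PR = p_factual / p_counterfactual of exceeding an event threshold from two synthetic model ensembles.

```python
# Probability-ratio event attribution from two ensembles.
import numpy as np

rng = np.random.default_rng(7)
factual = rng.normal(-0.4, 1.0, 5000)        # e.g., summer moisture anomalies
counterfactual = rng.normal(0.0, 1.0, 5000)  # pre-industrial forcing
threshold = -1.5                             # observed event severity

p1 = (factual < threshold).mean()
p0 = (counterfactual < threshold).mean()
print(f"PR = {p1 / p0:.2f}")  # PR > 1: human influence raised the odds
```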
Statistical analysis of life history calendar data.
Eerola, Mervi; Helske, Satu
2016-04-01
The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis with the model-free data mining method, sequence analysis. In event history analysis, we estimate, instead of transition hazards, the cumulative prediction probabilities of life events over the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns while event history analysis is needed for causal inquiries. © The Author(s) 2012.
Sparse Event Modeling with Hierarchical Bayesian Kernel Methods
2016-01-05
The research objective of this proposal was to develop a predictive Bayesian kernel approach to model count data based on several predictive variables. Such an approach, which we refer to as the Poisson Bayesian kernel model, is able to model the rate of occurrence of ..., which adds specificity to the model and can make nonlinear data more manageable. Early results show that ...
Teaching nutritional biochemistry: an experimental approach using yeast.
Alonso, Manuel; Stella, Carlos A
2012-12-01
In this report, we present a practical approach to teaching several topics in nutrition to science students at the high school and college freshman levels. This approach uses baker's yeast (Saccharomyces cerevisiae) as a biological model system. The diameters of yeast colonies, which vary according to the nutrients present in the medium, can be observed, compared, and used to teach metabolic requirements. The experiments described in this report show simple macroscopic evidence of submicroscopic nutritional events. This can serve as a useful basis for an analogy to heterotrophic human cell nutrition.
FPGA-Based X-Ray Detection and Measurement for an X-Ray Polarimeter
NASA Technical Reports Server (NTRS)
Gregory, Kyle; Hill, Joanne; Black, Kevin; Baumgartner, Wayne
2013-01-01
This technology enables detection and measurement of x-rays in an x-ray polarimeter using a field-programmable gate array (FPGA). The technology was developed for the Gravitational and Extreme Magnetism Small Explorer (GEMS) mission. It performs precision energy and timing measurements, as well as rejection of non-x-ray events. It enables the GEMS polarimeter to detect precisely when an event has taken place so that additional measurements can be made. The technology also enables this function to be performed in an FPGA using limited resources so that mass and power can be minimized while reliability for a space application is maximized and precise real-time operation is achieved. This design requires a low-noise, charge-sensitive preamplifier; a high-speed analog-to-digital converter (ADC); and an x-ray detector with a cathode terminal. It functions by computing a sum of differences for time samples whose difference exceeds a programmable threshold. A state machine advances through states as a programmable number of consecutive samples exceeds or fails to exceed this threshold. The pulse height is recorded as the accumulated sum. The track length is also measured, based on the time from the start to the end of accumulation. For tracks longer than a certain length, the algorithm estimates the barycenter of the charge deposit by comparing the accumulator value at the midpoint to the final accumulator value. The design also employs a number of techniques for rejecting background events. This innovation enables the function to be performed in space, where it can operate autonomously with a rapid response time. This implementation combines advantages of computing-system-based approaches with those of pure analog approaches. The result is an implementation that is highly reliable, performs in real time, rejects background events, and consumes minimal power.
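A software sketch of the described detection logic, with assumed thresholds and confirmation widths: sample-to-sample differences above a programmable threshold are accumulated, the accumulated sum gives the pulse height, and the accumulation span gives the track length. The background-rejection and barycenter steps are omitted.

```python
# Sum-of-differences pulse detection with a simple confirmation state machine.
import numpy as np

def detect(samples, thresh=4.0, n_confirm=3):
    state, run, acc, start = "idle", 0, 0.0, None
    for i in range(1, len(samples)):
        d = samples[i] - samples[i - 1]
        if state == "idle":
            run = run + 1 if d > thresh else 0
            if run >= n_confirm:               # enough rising samples: event starts
                state, start, run = "accum", i - n_confirm + 1, 0
                acc = sum(samples[j] - samples[j - 1] for j in range(start, i + 1))
        else:  # accumulating
            if d > thresh:
                acc += d                       # sum of differences = pulse height
                run = 0
            else:
                run += 1
                if run >= n_confirm:           # enough flat samples: event ends
                    return acc, (i - n_confirm + 1) - start  # height, track length
    return None

sig = np.concatenate([np.zeros(20), np.cumsum(np.full(10, 6.0)), np.full(20, 60.0)])
print(detect(sig))  # -> (60.0, 10)
```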
Neural network classification of questionable EGRET events
NASA Astrophysics Data System (ADS)
Meetre, C. A.; Norris, J. P.
1992-02-01
High energy gamma rays (greater than 20 MeV) pair producing in the spark chamber of the Energetic Gamma Ray Telescope Experiment (EGRET) give rise to a characteristic but highly variable 3-D locus of spark sites, which must be processed to decide whether the event is to be included in the database. A significant fraction (about 15 percent, or 10^4 events/day) of the candidate events cannot be categorized (accept/reject) by an automated rule-based procedure; they are therefore tagged, and must be examined and classified manually by a team of expert analysts. We describe a feedforward, back-propagation neural network approach to the classification of the questionable events. The algorithm computes a set of coefficients using representative exemplars drawn from the preclassified set of questionable events. These coefficients map a given input event into a decision vector that, ideally, describes the correct disposition of the event. The net's accuracy is then tested using a different subset of preclassified events. Preliminary results demonstrate the net's ability to correctly classify a large proportion of the events for some categories of questionables. Current work includes the use of much larger training sets to improve the accuracy of the net.
Neural network classification of questionable EGRET events
NASA Technical Reports Server (NTRS)
Meetre, C. A.; Norris, J. P.
1992-01-01
High energy gamma rays (greater than 20 MeV) pair producing in the spark chamber of the Energetic Gamma Ray Telescope Experiment (EGRET) give rise to a characteristic but highly variable 3-D locus of spark sites, which must be processed to decide whether the event is to be included in the database. A significant fraction (about 15 percent or 10(exp 4) events/day) of the candidate events cannot be categorized (accept/reject) by an automated rule-based procedure; they are therefore tagged, and must be examined and classified manually by a team of expert analysts. We describe a feedforward, back-propagation neural network approach to the classification of the questionable events. The algorithm computes a set of coefficients using representative exemplars drawn from the preclassified set of questionable events. These coefficients map a given input event into a decision vector that, ideally, describes the correct disposition of the event. The net's accuracy is then tested using a different subset of preclassified events. Preliminary results demonstrate the net's ability to correctly classify a large proportion of the events for some categories of questionables. Current work includes the use of much larger training sets to improve the accuracy of the net.
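A generic sketch of the approach in modern NumPy, not the original 1992 implementation: a small feedforward network trained by backpropagation to map event features to an accept/reject output. Features and labels are synthetic.

```python
# Feedforward net with one hidden layer, trained by backpropagation.
import numpy as np

rng = np.random.default_rng(8)
X = rng.normal(size=(300, 10))                           # features of spark-site loci
y = (X[:, :2].sum(axis=1) > 0).astype(float)[:, None]    # synthetic accept/reject labels

W1, b1 = rng.normal(0, 0.5, (10, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

for _ in range(2000):
    h = sig(X @ W1 + b1)                   # forward pass
    out = sig(h @ W2 + b2)
    g_out = (out - y) / len(X)             # gradient of cross-entropy at output
    g_h = (g_out @ W2.T) * h * (1 - h)     # backpropagate through hidden layer
    W2 -= 0.5 * h.T @ g_out;  b2 -= 0.5 * g_out.sum(0)
    W1 -= 0.5 * X.T @ g_h;    b1 -= 0.5 * g_h.sum(0)

print(((out > 0.5) == y).mean())           # training accuracy
```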
Fault tree analysis of the causes of waterborne outbreaks.
Risebro, Helen L; Doria, Miguel F; Andersson, Yvonne; Medema, Gertjan; Osborn, Keith; Schlosser, Olivier; Hunter, Paul R
2007-01-01
Prevention and containment of outbreaks requires examination of the contribution and interrelation of outbreak causative events. An outbreak fault tree was developed and applied to 61 enteric outbreaks related to public drinking water supplies in the EU. A mean of 3.25 causative events per outbreak was identified; each event was assigned a score based on its percentage contribution to the outbreak. Source and treatment system causative events often occurred concurrently (in 34 outbreaks). Distribution system causative events occurred less frequently (19 outbreaks) but were often solitary events contributing heavily towards the outbreak (a mean % score of 87.42). Livestock and rainfall in the catchment with no/inadequate filtration of water sources contributed concurrently to 11 of 31 Cryptosporidium outbreaks. Of the 23 protozoan outbreaks experiencing at least one treatment causative event, 90% of these events were filtration deficiencies; by contrast, for bacterial, viral, gastroenteritis and mixed pathogen outbreaks, 75% of treatment events were disinfection deficiencies. Roughly equal numbers of groundwater and surface water outbreaks experienced at least one treatment causative event (18 and 17 outbreaks, respectively). Retrospective analysis of multiple outbreaks of enteric disease can be used to inform outbreak investigations, facilitate corrective measures, and further develop multi-barrier approaches.
Bayesian Approach for Flexible Modeling of Semicompeting Risks Data
Han, Baoguang; Yu, Menggang; Dignam, James J.; Rathouz, Paul J.
2016-01-01
Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. PMID:25274445
Modelling approaches: the case of schizophrenia.
Heeg, Bart M S; Damen, Joep; Buskens, Erik; Caleo, Sue; de Charro, Frank; van Hout, Ben A
2008-01-01
Schizophrenia is a chronic disease characterized by periods of relative stability interrupted by acute episodes (or relapses). The course of the disease may vary considerably between patients, and patient histories show considerable inter- and even intra-individual variability. We provide a critical assessment of the advantages and disadvantages of three modelling techniques that have been used in schizophrenia: decision trees, (cohort and micro-simulation) Markov models and discrete event simulation models. These modelling techniques are compared in terms of building time, data requirements, medico-scientific experience, simulation time, clinical representation, and their ability to deal with patient heterogeneity, the timing of events, prior events, patient interaction, interaction between co-variates and variability (first-order uncertainty). We note that, depending on the research question, the optimal modelling approach should be selected based on the expected differences between the comparators, the number of co-variates, the number of patient subgroups, the interactions between co-variates, and simulation time. Finally, it is argued that if micro-simulation is required for the cost-effectiveness analysis of schizophrenia treatments, a discrete event simulation model is best suited to accurately capture all of the relevant interdependencies in this chronic, highly heterogeneous disease with limited long-term follow-up data.
Investigating Montara platform oil spill accident by implementing RST-OIL approach.
NASA Astrophysics Data System (ADS)
Satriano, Valeria; Ciancia, Emanuele; Coviello, Irina; Di Polito, Carmine; Lacava, Teodosio; Pergola, Nicola; Tramutoli, Valerio
2016-04-01
Oil spills represent one of the most harmful events for marine ecosystems, and their timely detection is crucial for mitigation and management. The potential of satellite data for their detection and monitoring has been widely investigated. Traditional satellite techniques usually identify oil spills by applying a fixed-threshold scheme only after an event has occurred, which makes them poorly suited for prompt identification. The Robust Satellite Technique (RST) approach, in its oil spill detection version (RST-OIL), is based on comparing the latest satellite acquisition with a previously identified historical baseline, and thus allows automatic, near-real-time detection of events. This technique has already been successfully applied to data from different sources (AVHRR, the Advanced Very High Resolution Radiometer, and MODIS, the Moderate Resolution Imaging Spectroradiometer), showing excellent performance in detecting oil spills during both day-time and night-time conditions, with a high level of sensitivity (detecting even low-intensity events) and reliability (no false alarms on scene). In this paper, RST-OIL has been implemented on MODIS thermal infrared data for the analysis of the Montara platform (Timor Sea, Australia) oil spill disaster, which occurred in August 2009. Preliminary achievements are presented and discussed.
The chemistry side of AOP: implications for toxicity ...
An adverse outcome pathway (AOP) is a structured representation of the biological events that lead to adverse impacts following a molecular initiating event caused by chemical interaction with a macromolecule. AOPs have been proposed to facilitate toxicity extrapolation across species through understanding of species similarity in the sequence of molecular, cellular, organ and organismal level responses. However, AOPs are non-specific regarding the identity of the chemical initiators, and the range of structures for which an AOP is considered applicable has generally been poorly defined. Applicability domain has been widely understood in the field of QSAR as the response and chemical structure space in which the model makes predictions with a given reliability, and has been traditionally applied to define the similarity of query molecules within the training set. Three dimensional (3D) receptor modeling offers an approach to better define the applicability domain for selected AOPs through determination of the chemical space of the molecular initiating event. Universal 3D-QSAR models were developed for acetylcholinesterase inhibitors and estrogen receptor agonists and antagonists using a combination of fingerprint, molecular docking and structure-based pharmacophore approaches. The models were based on the critical molecular interactions within each receptor ligand binding domain, and included the key amino acid residues responsible for high binding affinity.
Autonomous control of production networks using a pheromone approach
NASA Astrophysics Data System (ADS)
Armbruster, D.; de Beer, C.; Freitag, M.; Jagalski, T.; Ringhofer, C.
2006-04-01
The flow of parts through a production network is usually pre-planned by a central control system. Such central control fails in the presence of highly fluctuating demand and/or unforeseen disturbances. To manage such dynamic networks for low work-in-progress and short throughput times, an autonomous control approach is proposed. Autonomous control means a decentralized routing of the autonomous parts themselves. The parts' decisions are based on backward-propagated information about the throughput times of finished parts on different routes, so routes with shorter throughput times attract parts to use them again. This process can be compared to ants leaving pheromones on their way to communicate with following ants. The paper focuses on a mathematical description of such autonomously controlled production networks. A fluid model with limited service rates in a general network topology is derived and compared to a discrete-event simulation model. Whereas the discrete-event simulation of production networks is straightforward, the formulation of the addressed scenario in terms of a fluid model is challenging. Here it is shown how several problems in the fluid model formulation (e.g., discontinuities) can be handled mathematically. Finally, some simulation results for the pheromone-based control with both the discrete-event simulation model and the fluid model are presented for a time-dependent influx.
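A toy sketch of the routing rule (all constants invented): finished parts deposit "pheromone" inversely proportional to their throughput time, old deposits evaporate, and route choice is proportional to the accumulated pheromone.

```python
# Pheromone-based autonomous routing between two candidate routes.
import random

pheromone = {"route_A": 1.0, "route_B": 1.0}   # attractiveness per route
service = {"route_A": 2.0, "route_B": 3.5}     # hidden mean processing times

random.seed(9)
for part in range(500):
    routes, weights = zip(*pheromone.items())
    choice = random.choices(routes, weights=weights)[0]
    throughput = random.expovariate(1 / service[choice])
    for r in pheromone:                        # evaporation of old reports
        pheromone[r] *= 0.99
    pheromone[choice] += 1.0 / throughput      # short times deposit more

print({r: round(p, 2) for r, p in pheromone.items()})  # route_A dominates
```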
Approaches to Interactive Video Anchors in Problem-based Science Learning
NASA Astrophysics Data System (ADS)
Kumar, David Devraj
2010-02-01
This paper is an invited adaptation of the IEEE Education Society Distinguished Lecture Approaches to Interactive Video Anchors in Problem-Based Science Learning. Interactive video anchors have a cognitive theory base, and they help to enlarge the context of learning with information-rich real-world situations. Carefully selected movie clips and custom-developed regular videos and virtual simulations have been successfully used as anchors in problem-based science learning. Examples discussed include a range of situations such as Indiana Jones tackling a trap, a teenager misrepresenting lead for gold, an agriculture inspection at the US border, counterintuitive events, analyzing a river ecosystem for pollution, and finding the cause of illness in a nineteenth century river city. Suggestions for teachers are provided.
Galper, Benjamin Z.; Wang, Y. Claire; Einstein, Andrew J.
2015-01-01
Background Several approaches have been proposed for risk-stratification and primary prevention of coronary heart disease (CHD), but their comparative effectiveness and cost-effectiveness are unknown. Methods We constructed a state-transition microsimulation model to compare multiple approaches to the primary prevention of CHD in a simulated cohort of men aged 45–75 and women aged 55–75. Risk-stratification strategies included the 2013 American College of Cardiology/American Heart Association (ACC/AHA) guidelines on the treatment of blood cholesterol, the Adult Treatment Panel (ATP) III guidelines, and approaches based on coronary artery calcium (CAC) scoring and C-reactive protein (CRP). Additionally we assessed a treat-all strategy in which all individuals were prescribed either moderate-dose or high-dose statins and all males received low-dose aspirin. Outcome measures included CHD events, costs, medication-related side effects, radiation-attributable cancers, and quality-adjusted-life-years (QALYs) over a 30-year timeframe. Results Treat-all with high-dose statins dominated all other strategies for both men and women, gaining 15.7 million QALYs, preventing 7.3 million myocardial infarctions, and saving over $238 billion, compared to the status quo, far outweighing its associated adverse events including bleeding, hepatitis, myopathy, and new-onset diabetes. ACC/AHA guidelines were more cost-effective than ATP III guidelines for both men and women despite placing 8.7 million more people on statins. For women at low CHD risk, treat-all with high-dose statins was more likely to cause a statin-related adverse event than to prevent a CHD event. Conclusions Despite leading to a greater proportion of the population placed on statin therapy, the ACC/AHA guidelines are more cost-effective than ATP III. Even so, at generic prices, treating all men and women with statins and all men with low-dose aspirin appears to be more cost-effective than all risk-stratification approaches for the primary prevention of CHD. Especially for low-CHD risk women, decisions on the appropriate primary prevention strategy should be based on shared decision making between patients and healthcare providers. PMID:26422204
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. It must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. Significance of multi-hazard risk is also illustrated for uncorrelated hazards of earthquakes and high winds which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
A temporal discriminability account of children's eyewitness suggestibility.
Bright-Paul, Alexandra; Jarrold, Christopher
2009-07-01
Children's suggestibility is typically measured using a three-stage 'event-misinformation-test' procedure. We examined whether suggestibility is influenced by the time delays imposed between these stages, and in particular whether the temporal discriminability of sources (event and misinformation) predicts performance. In a novel approach, the degree of source discriminability was calculated as the relative magnitude of two intervals (the ratio of event-misinformation and misinformation-test intervals), based on an adaptation of existing 'ratio-rule' accounts of memory. Five-year-olds (n = 150) watched an event, and were exposed to misinformation, before memory for source was tested. The absolute event-test delay (12 versus 24 days) and the 'ratio' of event-misinformation/misinformation-test intervals (11:1, 3:1, 1:1, 1:3 and 1:11) were manipulated across participants. The temporal discriminability of sources, measured by the ratio, was indeed a strong predictor of suggestibility. Most importantly, if the ratio was constant (e.g. 18/6 versus 9/3 days), performance was remarkably similar despite variations in absolute delay (e.g. 24 versus 12 days). This intriguing finding not only extends the ratio-rule of distinctiveness to misinformation paradigms, but also serves to illustrate a new empirical means of differentiating between explanations of suggestibility based on interference between sources and disintegration of source information over time.
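The ratio measure is simple arithmetic; the sketch below (not the authors' code) shows why 18/6 and 9/3 day designs are predicted to yield similar suggestibility despite different absolute delays.

```python
# Source discriminability as the ratio of the two intervals.
def discriminability(event_to_misinfo_days, misinfo_to_test_days):
    return event_to_misinfo_days / misinfo_to_test_days

# Both designs share a 3:1 ratio, so the ratio rule predicts similar
# performance despite total delays of 24 vs. 12 days.
print(discriminability(18, 6), discriminability(9, 3))  # 3.0 3.0
```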
Analysis and suppression of passive noise in surface microseismic data
NASA Astrophysics Data System (ADS)
Forghani-Arani, Farnoush
Surface microseismic surveys are gaining popularity in monitoring the hydraulic fracturing process. The effectiveness of these surveys, however, is strongly dependent on the signal-to-noise ratio of the acquired data. Cultural and industrial noise generated during hydraulic fracturing operations usually dominates the data, thereby decreasing the effectiveness of using these data in identifying and locating microseismic events. Hence, noise suppression is a critical step in surface microseismic monitoring. In this thesis, I focus on two important aspects of using surface-recorded microseismic data: first, I take advantage of the unwanted surface noise to understand its characteristics and to extract information about the propagation medium from the noise; second, I propose effective techniques to suppress the surface noise while preserving the waveforms that contain information about the source of microseisms. Automated event identification on passive seismic data using only a few receivers is challenging, especially when the records span long durations. I introduce an automatic event identification algorithm that is designed specifically for detecting events in passive data acquired with a small number of receivers. I demonstrate that the conventional STA/LTA (Short-Term Average/Long-Term Average) algorithm is not sufficiently effective for event detection in the common case of low signal-to-noise ratio. With a cross-correlation based method as an extension of the STA/LTA algorithm, even low signal-to-noise events (that were not detectable with conventional STA/LTA) were revealed. Surface microseismic data contain surface waves (generated primarily by hydraulic fracturing activities) and body waves in the form of microseismic events. It is challenging to analyze the surface waves in the recorded data directly because of the randomness of their sources and their unknown source signatures. I use seismic interferometry to extract the surface-wave arrivals. Interferometry is a powerful tool to extract waves (including body waves and surface waves) that propagate from any receiver in the array (called a pseudo source) to the other receivers across the array. Since most of the noise sources in surface microseismic data lie on the surface, seismic interferometry yields pseudo-source gathers dominated by surface-wave energy. The dispersive characteristics of these surface waves are important properties that can be used to extract the information necessary for suppressing these waves. I demonstrate the application of interferometry to surface passive data recorded during the hydraulic fracturing operation of a tight gas reservoir and extract the dispersion properties of surface waves corresponding to a pseudo-shot gather. Comparison of the dispersion characteristics of the surface waves from the pseudo-shot gather with those of an active shot gather shows interesting similarities and differences. The dispersion character (e.g. velocity change with frequency) of the fundamental mode was observed to be the same for both the active and passive data. However, for the higher-mode surface waves, the dispersion properties are extracted at different frequency ranges.
Conventional noise suppression techniques for passive data are mostly stacking-based: they rely on enhancing the amplitude of the signal by stacking the waveforms at the receivers, and are unable to preserve the waveforms at the individual receivers necessary for estimating the microseismic source location and source mechanism. Here, I introduce a technique based on the tau-p transform that effectively identifies and separates microseismic events from surface-wave noise in the tau-p domain. This technique is superior to conventional stacking-based noise suppression techniques because it preserves the waveforms at individual receivers. Application of this methodology to microseismic events with isotropic and double-couple source mechanisms shows substantial improvement in the signal-to-noise ratio. Imaging of the processed field data also shows improved imaging of the hypocenter location of the microseismic source. In the case of a double-couple source mechanism, I suggest two approaches for unifying the polarities at the receivers: a cross-correlation approach and a semblance-based prediction approach. The semblance-based approach is more effective at unifying the polarities, especially for low signal-to-noise ratio data.
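As a minimal sketch of the STA/LTA detector discussed above (not the thesis code): the ratio of a short-term to a long-term average of signal energy is compared against a trigger threshold; window lengths and the threshold are assumptions.

```python
# Classic STA/LTA event detection on a single synthetic trace.
import numpy as np

def sta_lta(trace, n_sta=20, n_lta=200):
    e = trace**2
    sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)

rng = np.random.default_rng(10)
trace = rng.normal(0, 1.0, 5000)
trace[3000:3100] += 4 * np.sin(np.linspace(0, 20 * np.pi, 100))  # buried event
ratio = sta_lta(trace)
print(np.flatnonzero(ratio > 3.0)[:5])   # first samples above the trigger
```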
A Measurement Plane for Optical Networks to Manage Emergency Events
NASA Astrophysics Data System (ADS)
Tego, E.; Carciofi, C.; Grazioso, P.; Petrini, V.; Pompei, S.; Matera, F.; Attanasio, V.; Nastri, E.; Restuccia, E.
2017-11-01
In this work, we show a wide geographical area optical network test bed adopting the mPlane measurement plane for monitoring its performance and for managing software-defined network approaches, with specific tests and procedures dedicated to responding to disaster events and supporting emergency networks. The test bed includes FTTX accesses and is currently implemented to support future 5G wireless services with slicing procedures based on Carrier Ethernet. The characteristics of this platform have been experimentally tested in the case of a damage-causing link failure and traffic congestion, showing fast reactions to these disastrous events and allowing the user to restore the initial QoS parameters.
Hartog, Iris; Scherer-Rath, Michael; Kruizinga, Renske; Netjes, Justine; Henriques, José; Nieuwkerk, Pythia; Sprangers, Mirjam; van Laarhoven, Hanneke
2017-09-01
Falling seriously ill is often experienced as a life event that causes conflict with people's personal goals and expectations in life and evokes existential questions. This article presents a new humanities approach to the way people make meaning of such events and how this influences their quality of life. Incorporating theories on contingency, narrative identity, and quality of life, we developed a theoretical model entailing the concepts life event, worldview, ultimate life goals, experience of contingency, narrative meaning making, narrative integration, and quality of life. We formulate testable hypotheses and describe the self-report questionnaire that was developed based on the model.
NASA Astrophysics Data System (ADS)
Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne
2017-04-01
The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on regional and continental scales. Identifying the major aerosol constituents for basic source apportionment and air quality issues demands a substantial analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on a time scale of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time, and the related sample amount, necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single-particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complementary technique to bulk analytical methods, chemical imaging offers a new way to study air pollution events by obtaining major aerosol constituents, together with single-particle features, at high temporal resolution and with small sample volumes. The analysis of chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging, or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy-dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (time-of-flight secondary ion mass spectrometry), with subsequent combined multivariate analytics. Combined chemical imaging of precipitated aerosol particles will be demonstrated with the following examples of air pollution events in Vienna. Exceptional episodic events such as the transformation of Saharan dust by the impact of the city of Vienna will be discussed and compared to samples obtained at a high alpine background site (Sonnblick Observatory, Saharan dust event of April 2016). Further, chemical imaging of biological aerosol constituents of an autumnal pollen outbreak in Vienna in November 2016, with background samples from nearby locations, will demonstrate the advantages of the chemical imaging approach. Additionally, the chemical fingerprint of an exceptional air pollution event from a local emission source, caused by the demolition of a building in Vienna, will illustrate the need for multisensor imaging, and especially for the combined approach. The chemical images obtained will be correlated with bulk analytical results, and the benefits of the overall methodological approach, combining bulk analytics and combined chemical imaging of exceptional episodic air pollution events, will be discussed.
A risk-based multi-objective model for optimal placement of sensors in water distribution system
NASA Astrophysics Data System (ADS)
Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein
2018-02-01
In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for the optimal placement of sensors in a water distribution system (WDS). The model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using the CVaR approach. The CVaR accounts for the uncertainties of contamination injection in the form of a probability distribution function and captures low-probability extreme events, i.e., the extreme losses occurring in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses from contamination injection (through the CVaR of the affected population and of the detection time) as well as the two other main criteria for optimal sensor placement: the probability of undetected events and the cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also performed to investigate the importance of each criterion for the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in the southwestern part of Iran. PROMETHEE suggests 6 sensors with a suitable distribution that covers approximately all regions of the WDS. For the best solution, the optimal values of the CVaR of the affected population and of the detection time, and the probability of undetected events, are 17,055 persons, 31 minutes, and 0.045%, respectively. The results obtained for the Lamerd WDS show the applicability of the CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme losses in a WDS.
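As a rough illustration of the risk measure at the heart of this model, the sketch below computes CVaR from a Monte Carlo sample of losses; the loss distribution and confidence level are hypothetical stand-ins for the paper's affected-population losses.

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Conditional Value at Risk: the mean loss in the worst (1 - alpha) tail."""
    var = np.quantile(losses, alpha)       # Value at Risk (tail threshold)
    return losses[losses >= var].mean()

# Hypothetical Monte Carlo losses, e.g. affected population over sampled
# multi-point contamination-injection scenarios
rng = np.random.default_rng(1)
losses = rng.lognormal(mean=8.0, sigma=1.0, size=10_000)
print(f"CVaR(95%) = {cvar(losses):.0f} persons")
```

In the full model this quantity would serve as one objective among the four evaluated inside the NSGA-II loop.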
A Cross-Layer User Centric Vertical Handover Decision Approach Based on MIH Local Triggers
NASA Astrophysics Data System (ADS)
Rehan, Maaz; Yousaf, Muhammad; Qayyum, Amir; Malik, Shahzad
Vertical handover decision algorithms that are based on user preferences and coupled with Media Independent Handover (MIH) local triggers have not been explored much in the literature. We have developed a comprehensive cross-layer solution, called the Vertical Handover Decision (VHOD) approach, which consists of three parts: a mechanism for collecting and storing user preferences, the VHOD algorithm, and the MIH Function (MIHF). The MIHF triggers the VHOD algorithm, which operates on user preferences to issue handover commands to the mobility management protocol. The VHOD algorithm is an MIH User and therefore needs to subscribe to events and configure thresholds for receiving triggers from the MIHF. In this regard, we have performed experiments in WLAN to suggest thresholds for the Link Going Down trigger. We have also critically evaluated the handover decision process, proposed a just-in-time interface activation technique, compared our proposed approach with prominent user-centric approaches, and analyzed our approach from different aspects.
Chai, C T; Putuhena, F J; Selaman, O S
2017-12-01
The influence of climate on the retention capability of green roofs has been widely discussed in the existing literature. However, knowledge of how the retention capability of green roofs is affected by tropical climate is limited. This paper highlights the retention performance of a green roof situated in Kuching under hot-humid tropical climatic conditions. Using a green roof water balance modelling approach, this study simulated the hourly runoff generated from a virtual green roof from November 2012 to October 2013 based on past meteorological data. The results showed that the overall retention performance was satisfactory, with a mean retention rate of 72.5% across 380 analysed rainfall events, but this reduced to only 12.0% for the events that could potentially trigger flash flooding. Spearman's rank correlation analysis showed that rainfall depth and mean rainfall intensity each had a strong negative correlation with the event retention rate, i.e., the retention rate increases with decreasing rainfall depth. The expected direct relationship between retention rate and antecedent dry weather period was found to be event-size dependent.
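The correlation step reported above can be reproduced with scipy; the sketch below uses synthetic per-event data in place of the Kuching record, so the numbers are purely illustrative.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-event records: rainfall depth (mm) and retention rate (%)
rng = np.random.default_rng(0)
depth = rng.gamma(shape=2.0, scale=10.0, size=380)
retention = np.clip(100.0 - 1.5 * depth + rng.normal(0.0, 8.0, 380), 0.0, 100.0)

rho, p_value = spearmanr(depth, retention)   # rank-based, robust to nonlinearity
print(f"Spearman rho = {rho:.2f}, p = {p_value:.1e}")  # strong negative rho expected
```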
Figueiro, Ana Claudia; de Araújo Oliveira, Sydia Rosana; Hartz, Zulmira; Couturier, Yves; Bernier, Jocelyne; do Socorro Machado Freire, Maria; Samico, Isabella; Medina, Maria Guadalupe; de Sa, Ronice Franco; Potvin, Louise
2017-03-01
Public health interventions are increasingly represented as complex systems. Research tools for capturing the dynamics of intervention processes, however, are practically non-existent. This paper describes the development and proof-of-concept process of an analytical tool, the critical event card (CEC), which supports the representation and analysis of the evolution of complex interventions based on critical events. Drawing on actor-network theory (ANT), we developed and field-tested the tool using three innovative health interventions in northeastern Brazil. The interventions aimed to promote health equity through intersectoral approaches, were engaged in participatory evaluation, and were linked to professional training programs. The development of the CEC involved practitioners and researchers from the projects. Proof of concept was based on document analysis, face-to-face interviews, and focus groups. The CEC's analytical categories allow critical events to be identified and described as milestones in the evolution of complex interventions. The categories are (1) event description; (2) actants (human and non-human) involved; (3) interactions between actants; (4) mediations performed; (5) actions performed; (6) inscriptions produced; and (7) consequences for the intervention. The CEC provides a tool to analyze and represent the complex and dynamic evolution of intersectoral interventions.
NASA Astrophysics Data System (ADS)
Gu, Chen; Marzouk, Youssef M.; Toksöz, M. Nafi
2018-03-01
Small earthquakes occur due to natural tectonic motion and are also induced by oil and gas production processes. In many oil/gas fields and hydrofracking operations, induced earthquakes result from fluid extraction or injection. The locations and source mechanisms of these earthquakes provide valuable information about the reservoirs. Analysis of induced seismic events has mostly assumed a double-couple source mechanism. However, recent studies assuming a full moment tensor source mechanism have shown a non-negligible percentage of non-double-couple components in the source moment tensors of hydraulic fracturing events. Without uncertainty quantification of the moment tensor solution, it is difficult to determine the reliability of these source models. This study develops a Bayesian method to perform waveform-based full moment tensor inversion and uncertainty quantification for induced seismic events, accounting for both location and velocity model uncertainties. We conduct tests with synthetic events to validate the method, and then apply our newly developed Bayesian inversion approach to real induced seismicity in an oil/gas field in the Sultanate of Oman, determining the uncertainties in the source mechanism and in the location of that event.
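The sampling machinery behind such a Bayesian inversion can be sketched with a random-walk Metropolis chain; here a toy linear forward operator stands in for the waveform synthetics computed from Green's functions, and all dimensions and noise levels are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear forward model d = G @ m standing in for synthetic waveforms;
# in a real inversion m would hold the six moment-tensor components and G
# would encode Green's functions (plus location/velocity-model effects).
G = rng.normal(size=(50, 6))
m_true = np.array([1.0, 0.8, -0.5, 0.2, 0.0, -0.3])
sigma = 0.1
d_obs = G @ m_true + rng.normal(0.0, sigma, 50)

def log_post(m):
    r = d_obs - G @ m
    return -0.5 * np.sum(r ** 2) / sigma ** 2    # Gaussian likelihood, flat prior

m, lp = np.zeros(6), log_post(np.zeros(6))
samples = []
for _ in range(20_000):                          # random-walk Metropolis
    prop = m + rng.normal(0.0, 0.05, 6)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        m, lp = prop, lp_prop
    samples.append(m.copy())

post = np.array(samples[5_000:])                 # discard burn-in
print(post.mean(axis=0))                         # posterior mean mechanism
print(post.std(axis=0))                          # its uncertainty
```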
Siddiqui, Mohd Maroof; Srivastava, Geetika; Saeed, Syed Hasan
2016-01-01
Insomnia is a sleep disorder in which the subject encounters problems in sleeping. The aim of this study is to identify insomnia events in normal and affected persons using a time-frequency power spectral density (PSD) analysis applied to EEG signals from the ROC-LOC channel. In this research article, the attributes and waveforms of human EEG signals are examined. The study draws its results from spectral analysis of the signal changes across the different stages of sleep. The PSD is analyzed and computed for each EEG segment in all stages of sleep. The results indicate the possibility of recognizing insomnia events based on the delta, theta, alpha, and beta segments of the EEG signals.
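A minimal version of the band-power computation implied here is shown below using scipy's Welch estimator; the sampling rate, segment length, and synthetic signal are assumptions, and a real study would use the recorded ROC-LOC channel.

```python
import numpy as np
from scipy.signal import welch

fs = 100.0                                    # assumed EEG sampling rate (Hz)
rng = np.random.default_rng(0)
eeg = rng.normal(size=int(30 * fs))           # stand-in for a 30 s EEG segment

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = np.trapz(psd[mask], freqs[mask])  # integrate the PSD over the band
    print(f"{name}: {power:.3g}")
```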
Experimental Learning Enhancing Improvisation Skills
ERIC Educational Resources Information Center
Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa
2016-01-01
Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…
DOT National Transportation Integrated Search
2011-06-14
This paper presents a novel analytical approach to and techniques for translating characteristics of uncertainty in predicting sector entry times and times in sector for individual flights into characteristics of uncertainty in predicting one-minute ...
The chordate proteome history database.
Levasseur, Anthony; Paganini, Julien; Dainat, Jacques; Thompson, Julie D; Poch, Olivier; Pontarotti, Pierre; Gouret, Philippe
2012-01-01
The chordate proteome history database (http://ioda.univ-provence.fr) comprises some 20,000 evolutionary analyses of proteins from chordate species. Our main objective was to characterize and study the evolutionary histories of the chordate proteome, and in particular to detect genomic events and enable automatic functional searches. Firstly, phylogenetic analyses based on high-quality multiple sequence alignments and a robust phylogenetic pipeline were performed for the whole protein and for each individual domain. Novel approaches were developed to identify orthologs/paralogs and to predict gene duplication/gain/loss events and the occurrence of new protein architectures (domain gains, losses and shuffling). These important genetic events were localized on the phylogenetic trees and on the genomic sequence. Secondly, the phylogenetic trees were enhanced by the creation of phylogroups, whereby groups of orthologous sequences created using OrthoMCL were corrected based on the phylogenetic trees; gene family size and gene gain/loss in a given lineage could be deduced from the phylogroups. For each ortholog group obtained from the phylogenetic or the phylogroup analysis, functional information and expression data can be retrieved. Database searches can be performed easily using biological objects (protein identifier, keyword or domain), but can also be based on events; e.g., domain exchange events can be retrieved. To our knowledge, this is the first database that links group clustering, phylogeny and automatic functional searches with the detection of important events occurring during genome evolution, such as the appearance of a new domain architecture.
Butler, Troy; Wildey, Timothy
2018-01-01
In this study, we develop a procedure that utilizes error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy of estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of the reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluating the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods, at a lower computational cost.
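The core decision rule, using the surrogate only where its error bound cannot flip the event indicator, can be sketched as follows; the toy models and the error estimator are hypothetical stand-ins for the paper's adjoint-based error estimates.

```python
import numpy as np

def event_probability(samples, surrogate, error_bound, high_fidelity, threshold):
    """Use the surrogate where |q - threshold| exceeds the error bound
    ('reliable' samples); otherwise fall back to the high-fidelity model."""
    count, hf_calls = 0, 0
    for x in samples:
        q, err = surrogate(x), error_bound(x)
        if abs(q - threshold) > err:          # bound excludes an indicator flip
            count += q > threshold
        else:                                 # unreliable: resolve exactly
            count += high_fidelity(x) > threshold
            hf_calls += 1
    return count / len(samples), hf_calls

# Toy stand-ins (illustrative only): a crude surrogate with a known error
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(10_000, 2))
hf = lambda x: np.sin(3 * x[0]) + x[1] ** 2
sur = lambda x: 3 * x[0] + x[1] ** 2
err = lambda x: abs(3 * x[0] - np.sin(3 * x[0]))
p, calls = event_probability(samples, sur, err, hf, threshold=0.5)
print(f"P(event) = {p:.3f} using only {calls} high-fidelity evaluations")
```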
Air Traffic Controllers' Control Strategies in the Terminal Area Under Off-Nominal Conditions
NASA Technical Reports Server (NTRS)
Martin, Lynne; Mercer, Joey; Callantine, Todd; Kupfer, Michael; Cabrall, Christopher
2012-01-01
A human-in-the-loop simulation investigated the robustness of a schedule-based terminal-area air traffic management concept, and its supporting controller tools, to off-nominal events: events that led to situations in which runway arrival schedules required adjustments and controllers could no longer use speed control alone to impose the necessary delays. The main research question was exploratory: to assess whether controllers could safely resolve and control the traffic during off-nominal events. A particular focus was the role of the supervisor: how he managed the schedules, how he assisted the controllers, what strategies he used, and which combinations of tools he used. Observations and questionnaire responses revealed that supervisor strategies for resolving events followed a similar pattern: a standard approach specific to each type of event often brought it to a smooth conclusion. However, due to the range of factors influencing an event (e.g., environmental conditions, aircraft density on the schedule), the plan sometimes required revision, and actions had wide-ranging effects.
Constructing regional climate networks in the Amazonia during recent drought events.
Guo, Heng; Ramos, Antônio M T; Macau, Elbert E N; Zou, Yong; Guan, Shuguang
2017-01-01
Climate networks are powerful approaches for disclosing teleconnections in climate systems and for predicting severe climate events. Here we construct regional climate networks from precipitation data in the Amazonian region and focus on network properties during the recent drought events of 2005 and 2010. Both the networks of the entire Amazon region and the extreme networks resulting from locations severely affected by drought suggest that network characteristics differ only slightly between the two drought events. Based on the network degrees of extreme drought events and of periods without drought conditions, we identify regions of interest that are correlated with longer expected drought period lengths. Moreover, we show that the spatial correlation length to the regions of interest decayed much faster in 2010 than in 2005, because of the dual roles played by the Pacific and Atlantic oceans. The results suggest that hub nodes in the regional climate network of Amazonia had fewer long-range connections when the more severe drought conditions appeared in 2010 than in 2005.
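The basic construction of such a network, thresholding pairwise correlations of precipitation series and reading off node degrees, is sketched below on random data; grid size, record length, and threshold are all illustrative.

```python
import numpy as np

# Hypothetical gridded precipitation anomalies: 200 grid cells x 120 months
rng = np.random.default_rng(0)
precip = rng.normal(size=(200, 120))

corr = np.corrcoef(precip)              # pairwise correlations between cells
np.fill_diagonal(corr, 0.0)

threshold = 0.5                         # illustrative link threshold
adjacency = np.abs(corr) > threshold
degree = adjacency.sum(axis=1)          # degree of each grid cell
hubs = np.argsort(degree)[-10:]         # cells with the most connections
print(degree.max(), hubs)
```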
Bednarek, Piotr T; Orłowska, Renata; Niedziela, Agnieszka
2017-04-21
We present a new methylation-sensitive amplified polymorphism (MSAP) approach for the evaluation of relative quantitative characteristics such as demethylation, de novo methylation, and preservation of the methylation status of CCGG sequences, which are recognized by the isoschizomers HpaII and MspI. We applied the technique to analyze aluminum (Al)-tolerant and non-tolerant control and Al-stressed inbred triticale lines. The approach is based on a detailed analysis of events affecting HpaII and MspI restriction sites in control and stressed samples, and takes advantage of the molecular marker profiles generated by EcoRI/HpaII and EcoRI/MspI MSAP platforms. Five Al-tolerant and five non-tolerant triticale lines were exposed to aluminum stress using a physiological test. Total genomic DNA was isolated from the root tips of all tolerant and non-tolerant lines before and after Al stress, following the metAFLP and MSAP approaches. Based on codes reflecting the events affecting cytosines within a given restriction site recognized by HpaII and MspI in control and stressed samples, demethylation (DM), de novo methylation (DNM), preservation of methylated sites (MSP), and preservation of non-methylated sites (NMSP) were evaluated. The MSAP profiles were used for agglomerative hierarchical clustering (AHC) based on the squared Euclidean distance and Ward's agglomeration method, whereas the MSAP characteristics were analyzed by ANOVA. The relative quantitative MSAP analysis revealed that both Al-tolerant and non-tolerant triticale lines subjected to Al stress underwent demethylation, with demethylation of CG predominating over CHG. The rate of de novo methylation in the CG context was ~3-fold lower than demethylation, whereas de novo methylation of CHG was observed only in Al-tolerant lines. Our relative quantitative MSAP approach, based on methylation events affecting cytosines within HpaII-MspI recognition sequences, was capable of quantifying de novo methylation, demethylation, methylation, and non-methylated status in control and stressed Al-tolerant and non-tolerant triticale inbred lines. The method could also be used to analyze methylation events affecting the CG and CHG contexts, which were differentially methylated under Al stress. We cannot exclude that the methylation changes revealed among lines, as well as between the Al-tolerant and non-tolerant groups of lines, were due to experimental errors, or that the number of lines was too small for ANOVA to prove the influence of Al stress. Nevertheless, we suspect that Al tolerance in triticale could be partly regulated by epigenetic factors acting at the level of DNA methylation. This method provides a valuable tool for studies of abiotic stresses in plants.
Radar rainfall estimation in the context of post-event analysis of flash-flood events
NASA Astrophysics Data System (ADS)
Delrieu, G.; Bouilloud, L.; Boudevillain, B.; Kirstetter, P.-E.; Borga, M.
2009-09-01
This communication describes a methodology for radar rainfall estimation in the context of post-event analysis of flash-flood events, developed within the HYDRATE project. For such extreme events, some raingauge observations (operational, amateur) are available at the event time scale, while few raingauge time series are generally available at the hydrologic time steps. Radar data are therefore the only way to access the space-time organization of the rainfall, but the quality of the radar data may be highly variable as a function of (1) the relative locations of the event and the radar(s) and (2) the radar operating protocol(s) and maintenance. A positive point is that heavy rainfall is associated with convection, implying better visibility and less bright-band contamination than in more common situations. In parallel with the development of a regionalized and adaptive radar data processing system (TRADHy; Delrieu et al. 2009), a pragmatic approach is proposed here to make the best use of the available radar and raingauge data for a given flash-flood event by: (1) identifying and removing residual ground clutter; (2) applying the "hydrologic visibility" concept (Pellarin et al. 2002) to correct for range-dependent errors (screening and VPR effects for non-attenuating wavelengths); and (3) estimating an effective Z-R relationship through a radar-raingauge optimization approach to remove the mean field bias (Dinku et al. 2002). A sensitivity study, based on the high-quality volume radar datasets collected during two intense rainfall events of the Bollène 2002 experiment (Delrieu et al. 2009), is first presented. The method is then implemented for two other historical events that occurred in France (Avène 1997 and Aude 1999), with datasets of lesser quality. References: Delrieu, G., B. Boudevillain, J. Nicol, B. Chapon, P.-E. Kirstetter, H. Andrieu, and D. Faure, 2009: Bollène 2002 experiment: radar rainfall estimation in the Cévennes-Vivarais region, France. Journal of Applied Meteorology and Climatology, in press. Dinku, T., E. N. Anagnostou, and M. Borga, 2002: Improving radar-based estimation of rainfall over complex terrain. J. Appl. Meteor., 41, 1163-1178. Pellarin, T., G. Delrieu, G. M. Saulnier, H. Andrieu, B. Vignal, and J. D. Creutin, 2002: Hydrologic visibility of weather radar systems operating in mountainous regions: case study for the Ardèche catchment (France). Journal of Hydrometeorology, 3, 539-555.
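Step (3), the radar-raingauge optimization, can be illustrated with a one-parameter fit that removes the mean field bias; the synthetic data, the fixed exponent b, and the search bounds are assumptions for the sketch.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical collocated data: reflectivity Z (mm^6 m^-3), gauge rain R (mm/h)
rng = np.random.default_rng(0)
R_gauge = rng.gamma(2.0, 5.0, size=200)
Z = 300.0 * R_gauge ** 1.4 * rng.lognormal(0.0, 0.3, size=200)

b = 1.4                                       # exponent held fixed (common choice)
def bias(a):
    R_radar = (Z / a) ** (1.0 / b)            # invert Z = a * R^b
    return abs(np.log(R_radar.sum() / R_gauge.sum()))   # mean field bias

res = minimize_scalar(bias, bounds=(50.0, 1000.0), method='bounded')
print(f"effective Z-R prefactor a = {res.x:.0f}")
```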
Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E
To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events within the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments over a period of 21 months, three misses that went undetected until treatment initiation were recorded, and root-cause analysis was performed to determine the causes of each incident and the opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, following two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani
2016-01-01
This paper presents a novel adaptive neural network (NN) control of single-input and single-output uncertain nonlinear discrete-time systems with event-sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned, in an aperiodic manner at the event-sampled instants. After reviewing the NN approximation property with event-sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event-sampled context. The SE is viewed as a model, and its approximated dynamics and the state vector, during any two events, are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event-sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time, with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
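The flavor of event-triggered feedback can be conveyed with a scalar toy loop; the plant, gain, and relative-threshold trigger below are illustrative simplifications of the paper's NN-based, dead-zone trigger condition.

```python
import numpy as np

# Event-triggered state feedback on a toy scalar plant x+ = a*x + b*u: the
# state is transmitted (and the control recomputed) only at trigger instants.
a, b, K = 1.05, 1.0, 0.6          # assumed plant and stabilizing gain
x, x_last = 1.0, 1.0              # true state and last transmitted state
events = 0
for k in range(200):
    if abs(x - x_last) > 0.05 * abs(x) + 1e-3:   # assumed trigger condition
        x_last = x                                # event: transmit the state
        events += 1
    u = -K * x_last               # controller holds the last transmitted state
    x = a * x + b * u
print(f"{events} events in 200 steps, final state {x:.2e}")
```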
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J.A Forester; V.N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA), based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e., those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
NASA Astrophysics Data System (ADS)
Ward, D.; Henderson, S.; Newman, S. J.
2012-12-01
Citizen science projects in ecology are in a unique position to address the needs of both the science and education communities. Such projects can provide needed data to further the understanding of ecological processes at multiple spatial scales while also increasing public understanding of the importance of the ecological sciences. To balance the needs of both communities, it is important that citizen science programs provide different 'entry' points to appeal to diverse segments of society. In the case of NEON's Project BudBurst, a national plant phenology citizen science program, two approaches were developed to address the ongoing challenge of recruiting and retaining participants. Initially, Project BudBurst was designed to be an event-based phenology program: participants were asked to identify a plant and report on the timing of specific pheno-events throughout the year. This approach requires a certain level of commitment, which, while yielding useful results, does not appeal to the broadest possible audience. To broaden participation, in 2011 and 2012 Project BudBurst added campaigns targeted at engaging individuals in making simple status-based reports on a plant of their choice. Three targeted field campaigns, using simple status-based protocols, were designed to take advantage of times when people notice changes to plants in their environment: Fall Into Phenology, Cherry Blossom Blitz, and Summer Solstice Snapshot. The interest and participation in these single-report, status-based campaigns exceeded initial expectations. For example, Fall Into Phenology attracted individuals who otherwise had not considered participating in an ongoing field campaign. In the past, observations of fall phenology events submitted to Project BudBurst had been limited; by providing the opportunity to submit simple, single reports, the number of both new participants and submitted observations increased significantly.
Peng, Ting; Sun, Xiaochun; Mumm, Rita H
2014-01-01
Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in the light of the overall goal of MTI of recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Using the results to optimize single event introgression (Peng et al. Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression. Mol Breed, 2013) which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm with ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps in MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding of eight events into the female RP and seven in the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP. Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity as well as seed chipping and tissue sampling approaches to facilitate genotyping. With selfing approaches, two generations of selfing rather than one for trait fixation (i.e. 'F2 enrichment' as per Bonnett et al. in Strategies for efficient implementation of molecular markers in wheat breeding. Mol Breed 15:75-85, 2005) were utilized to eliminate bottlenecking due to extremely low frequencies of desired genotypes in the population. The efficiency indicators such as total number of plants grown across generations, total number of marker data points, total number of generations, number of seeds sampled by seed chipping, number of plants requiring tissue sampling, and number of pollinations (i.e. selfing and crossing) were considered in comparisons of breeding strategies. A breeding strategy involving seed chipping and a two-generation selfing approach (SC + SELF) was determined to be the most efficient breeding strategy in terms of time to market and resource requirements. Doubled haploidy may have limited utility in trait fixation for MTI under the defined breeding scenario. This outcome paves the way for optimizing the last step in the MTI process, version testing, which involves hybridization of female and male RP conversions to create versions of the converted hybrid for performance evaluation and possible commercial release.
A new modeling and inference approach for the Systolic Blood Pressure Intervention Trial outcomes.
Yang, Song; Ambrosius, Walter T; Fine, Lawrence J; Bress, Adam P; Cushman, William C; Raj, Dominic S; Rehman, Shakaib; Tamariz, Leonardo
2018-06-01
Background/aims: In clinical trials with time-to-event outcomes, the significance tests and confidence intervals are usually based on a proportional hazards model. Thus, the temporal pattern of the treatment effect is not directly considered. This could be problematic if the proportional hazards assumption is violated, as such violation could impact both interim and final estimates of the treatment effect. Methods: We describe the application of inference procedures developed recently in the literature for time-to-event outcomes when the treatment effect may or may not be time-dependent. The inference procedures are based on a new model which contains the proportional hazards model as a sub-model. The temporal pattern of the treatment effect can then be expressed and displayed. The average hazard ratio is used as the summary measure of the treatment effect. The test of the null hypothesis uses adaptive weights that often lead to an improvement in power over the log-rank test. Results: Without needing to assume proportional hazards, the new approach yields results consistent with previously published findings in the Systolic Blood Pressure Intervention Trial. It provides a visual display of the time course of the treatment effect. At four of the five scheduled interim looks, the new approach yields smaller p values than the log-rank test. The average hazard ratio and its confidence interval indicate a treatment effect nearly a year earlier than a restricted mean survival time-based approach. Conclusion: When the hazards are proportional between the comparison groups, the new methods yield results very close to the traditional approaches. When the proportional hazards assumption is violated, the new methods continue to be applicable and can potentially be more sensitive to departure from the null hypothesis.
Fortwaengler, Kurt; Parkin, Christopher G.; Neeser, Kurt; Neumann, Monika; Mast, Oliver
2017-01-01
The modeling approach described here is designed to support the development of spreadsheet-based simple predictive models. It is based on 3 pillars: the association of complications with HbA1c changes, the incidence of the complications, and the average cost per complication event. For each pillar, the goal of the analysis was (1) to find results for a large diversity of populations, with a focus on country/region, diabetes type, age, diabetes duration, baseline HbA1c value, and gender; and (2) to assess the range of incidences and associations previously reported. Unlike simple predictive models, which are mostly based on only one source of information for each pillar, we conducted a comprehensive, systematic literature review. Each source found was thoroughly reviewed, and only sources meeting quality expectations were considered. This approach helps avoid the unintended use of extreme data. The user can utilize (1) one of the sources found, (2) the range found as validation for the chosen figures, or (3) the average of all found publications for an expedited estimate. The modeling approach is intended for use in average insulin-treated diabetes populations in which the baseline HbA1c values are within an average range (6.5% to 11.5%); it is not intended for use in individuals or unique diabetes populations (e.g., gestational diabetes). Because the modeling approach only considers diabetes-related complications that are positively associated with HbA1c decreases, the costs of negatively associated complications (e.g., severe hypoglycemic events) must be calculated separately. PMID:27510441
Evolution of risk assessment strategies for food and feed uses of stacked GM events.
Kramer, Catherine; Brune, Phil; McDonald, Justin; Nesbitt, Monique; Sauve, Alaina; Storck-Weyhermueller, Sabine
2016-09-01
Data requirements are not harmonized globally for the regulation of food and feed derived from stacked genetically modified (GM) events, produced by combining individual GM events through conventional breeding. The data required by some regulatory agencies have increased despite the absence of substantiated adverse effects to animals or humans from the consumption of GM crops. Data from studies conducted over a 15-year period for several stacked GM event maize (Zea mays L.) products (Bt11 × GA21, Bt11 × MIR604, MIR604 × GA21, Bt11 × MIR604 × GA21, Bt11 × MIR162 × GA21 and Bt11 × MIR604 × MIR162 × GA21), together with their component single events, are presented. These data provide evidence that no substantial changes in composition, protein expression or insert stability have occurred after combining the single events through conventional breeding. An alternative food and feed risk assessment strategy for stacked GM events is suggested based on a problem formulation approach that utilizes (i) the outcome of the single event risk assessments, and (ii) the potential for interactions in the stack, based on an understanding of the mode of action of the transgenes and their products. © 2016 The Authors. Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.
Paivio, Allan; Sadoski, Mark
2011-01-01
Elman (2009) proposed that the traditional role of the mental lexicon in language processing can largely be replaced by a theoretical model of schematic event knowledge founded on dynamic context-dependent variables. We evaluate Elman's approach and propose an alternative view based on dual coding theory and on evidence that modality-specific cognitive representations contribute strongly to word meaning and language performance across diverse contexts, with effects predictable from dual coding theory. Copyright © 2010 Cognitive Science Society, Inc.
Advanced Microsystems for Automotive Applications 2005
NASA Astrophysics Data System (ADS)
Valldorf, Jürgen; Gessner, Wolfgang
Since 1995 the annual international forum on Advanced Microsystems for Automotive Applications (AMAA) has been held in Berlin. The event offers a unique opportunity for microsystems component developers, system suppliers and car manufacturers to present and discuss competing technological approaches to microsystems-based solutions in vehicles. The book accompanying the event has proved to be an efficient instrument for the diffusion of new concepts and technology results. The present volume, including the papers of AMAA 2005, gives an overview of the state of the art and outlines imminent and mid-term R&D perspectives.
Tang, Zhongwen
2015-01-01
An analytical way to compute the predictive probability of success (PPOS), together with a credible interval, at an interim analysis (IA) is developed for large clinical trials with time-to-event endpoints. The method takes into account the fixed data up to the IA, the amount of uncertainty in the future data, and the uncertainty about the parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding the optimal combination of analysis time and futility cutoff based on PPOS criteria.
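Tang's analytical derivation is not reproduced here, but a normal-approximation Monte Carlo sketch conveys what a PPOS computation involves; the event counts, observed log hazard ratio, and flat prior are all assumptions.

```python
import numpy as np

# Monte Carlo predictive power at an interim analysis, using the standard
# normal approximation for a log hazard ratio (information ~ events / 4).
d_interim, d_final = 150, 300      # assumed event counts at IA and final
theta_hat = -0.25                  # assumed observed log HR at IA (benefit < 0)
I1, I2 = d_interim / 4.0, d_final / 4.0
z_crit = 1.96

rng = np.random.default_rng(0)
n_sim = 100_000
theta = rng.normal(theta_hat, np.sqrt(1.0 / I1), n_sim)     # posterior draw
theta_fut = rng.normal(theta, np.sqrt(1.0 / (I2 - I1)))     # future increment
theta_final = (I1 * theta_hat + (I2 - I1) * theta_fut) / I2 # pooled estimate
z_final = theta_final * np.sqrt(I2)
ppos = np.mean(z_final < -z_crit)  # success = significant benefit at final
print(f"predictive power ~ {ppos:.2f}")
```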
Modeling of Aerosol Optical Depth Variability during the 1998 Canadian Forest Fire Smoke Event
NASA Astrophysics Data System (ADS)
Aubé, M.; O`Neill, N. T.; Royer, A.; Lavoué, D.
2003-04-01
Monitoring of aerosol optical depth (AOD) is of particular importance due to the significant role of aerosols in the atmospheric radiative budget. Up to now, the two standard techniques for retrieving AOD have been (i) sun photometry, which provides measurements of high temporal frequency and sparse spatial coverage, and (ii) satellite-based approaches such as DDV (Dense Dark Vegetation) inversion algorithms, which extract AOD over dark targets in remotely sensed imagery. Although the latter techniques allow AOD retrieval over appreciable spatial domains, the irregular spatial pattern of dark targets and the typically low repeat frequencies of imaging satellites exclude the acquisition of AOD databases on a continuous spatio-temporal basis. We attempt to fill gaps in spatio-temporal AOD measurements using a new methodology that links AOD measurements and a particulate matter transport model through a data assimilation approach. This modelling package (AODSEM, for Aerosol Optical Depth Spatio-temporal Evolution Model) uses a size- and aerosol-type-segregated semi-Lagrangian-Eulerian trajectory algorithm driven by analysed meteorological data. Its novelty resides in the fact that the model evolution is tied to both ground-based and satellite-level AOD measurements, and all physical processes have been optimized to track this important but crude parameter. We applied this methodology to a significant smoke event that occurred over Canada in August 1998. The results show the potential of this approach inasmuch as the residuals between the AODSEM assimilated analysis and the measurements are smaller than the typical errors associated with remotely sensed AOD (satellite or ground based). The AODSEM assimilation approach also gives better results than classical interpolation techniques. This improvement is especially evident when the available number of AOD measurements is small.
New Approach for Investigating Reaction Dynamics and Rates with Ab Initio Calculations.
Fleming, Kelly L; Tiwary, Pratyush; Pfaendtner, Jim
2016-01-21
Herein, we demonstrate a convenient approach to systematically investigate chemical reaction dynamics using the metadynamics (MetaD) family of enhanced sampling methods. Using a symmetric SN2 reaction as a model system, we applied infrequent metadynamics, a theoretical framework based on acceleration factors, to quantitatively estimate the rate of reaction from biased and unbiased simulations. A systematic study of the algorithm and its application to chemical reactions was performed by sampling over 5000 independent reaction events. Additionally, we quantitatively reweighted exhaustive free-energy calculations to obtain the reaction potential-energy surface and showed that infrequent metadynamics works to effectively determine Arrhenius-like activation energies. Exact agreement with unbiased high-temperature kinetics is also shown. The feasibility of using the approach in actual ab initio molecular dynamics calculations is then presented by using Car-Parrinello MD+MetaD to sample the same reaction using only 10-20 calculations of the rare event. Owing to its ease of use and comparatively low computational cost, the approach has extensive potential applications in catalysis, combustion, pyrolysis, and enzymology.
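The acceleration-factor bookkeeping at the center of infrequent metadynamics is simple to state in code: the physical transition time is the MD time rescaled by the running average of exp(V/kT) along the biased trajectory. The bias record below is synthetic and the units are assumptions.

```python
import numpy as np

def rescaled_time(bias_on_path, dt, kT):
    """Infrequent metadynamics: physical time = MD time x mean exp(V / kT)."""
    alpha = np.mean(np.exp(bias_on_path / kT))  # acceleration factor
    return alpha * dt * len(bias_on_path), alpha

# Hypothetical record of the bias (kJ/mol) felt along one reactive trajectory
rng = np.random.default_rng(0)
bias = np.abs(rng.normal(20.0, 5.0, size=50_000))
t_phys, alpha = rescaled_time(bias, dt=0.002, kT=2.5)   # kT ~ 2.5 kJ/mol at 300 K
print(f"acceleration ~ {alpha:.3g}, physical transition time ~ {t_phys:.3g} ps")
```

Collecting such rescaled times over many independent events, as the authors do for over 5000 of them, allows a rate to be fit and its Poissonian character to be checked.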
Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs
NASA Astrophysics Data System (ADS)
Purba, J. H.
2018-02-01
Initiating events of a nuclear power plant being evaluated need to be identified before probabilistic safety assessment can be applied to that plant. Various types of master logic diagrams (MLDs) have been proposed for searching for the initiating events of the next generation of nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis for their development. This study proposes another type of MLD approach for finding high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps, starting from the top event, representing the final objective of the safety functions, down to the basic event, representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to the search for two HTGR initiating events, i.e., power turbine generator trip and loss of offsite power, is presented. The results confirm that the proposed MLD is feasible for finding HTGR initiating events.
Wang, Xuesong; Xing, Yilun; Luo, Lian; Yu, Rongjie
2018-08-01
Risky driving behavior is one of the main causes of commercial vehicle related crashes. In order to achieve safer vehicle operation, safety education for drivers is often provided. However, the education programs vary in quality and may not always be successful in reducing crash rates. Behavior-Based Safety (BBS) education is a popular approach found effective by numerous studies, but even this approach varies as to the combination of frequency, mode and content used by different education providers. This study therefore evaluates and compares the effectiveness of BBS education methods. Thirty-five drivers in Shanghai, China, were coached with one of three different BBS education methods for 13 weeks following a 13-week baseline phase with no education. A random-effects negative binomial (NB) model was built and calibrated to investigate the relationship between BBS education and the driver at-fault safety-related event rate. Based on the results of the random-effects NB model, event modification factors (EMF) were calculated to evaluate and compare the effectiveness of the methods. Results show that (1) BBS education was confirmed to be effective in safety-related event reduction; (2) the most effective method among the three applied monthly face-to-face coaching, including feedback with video and statistical data, and training on strategies to avoid driver-specific unsafe behaviors; (3) weekly telephone coaching using statistics and strategies was rated by drivers as the most convenient delivery mode, and was also significantly effective. Copyright © 2018 Elsevier Ltd. All rights reserved.
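To make the EMF concept concrete, the sketch below fits a fixed-effects negative binomial regression with statsmodels and exponentiates the coefficients; the study itself used a random-effects NB model, and the simulated driver data here are entirely hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical per-driver data: education method (0 = none, 1 = face-to-face,
# 2 = telephone), weeks of exposure, and counts of safety-related events.
rng = np.random.default_rng(0)
n = 70
method = rng.integers(0, 3, n)
weeks = np.full(n, 13.0)
true_rate = np.array([1.0, 0.5, 0.7])[method]    # assumed events per week
events = rng.poisson(true_rate * weeks)

X = sm.add_constant(np.column_stack([method == 1, method == 2]).astype(float))
fit = sm.GLM(events, X, family=sm.families.NegativeBinomial(alpha=0.5),
             exposure=weeks).fit()
emf = np.exp(fit.params[1:])   # event modification factors vs. no education
print("EMFs:", emf)            # values < 1 indicate an event-rate reduction
```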
NASA Astrophysics Data System (ADS)
Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.
2005-03-01
Standard practice of wave-height hazard analysis often pays little attention to the uncertainty of the assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. It is thus possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The time-occurrence of events is assumed to be Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for the physical features of ocean waves in the Mediterranean sea and for experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived parameters such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
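For orientation, the maximum-likelihood analogue of this Poisson-GPD hazard model fits a GPD to threshold excesses and converts the fit into return levels; the synthetic sample below replaces the Spanish coastal data, and a fully Bayesian treatment would instead put priors on the three parameters and sample their posterior.

```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical peaks-over-threshold sample of wave heights (m) above u
rng = np.random.default_rng(0)
u, years = 3.0, 20.0
excesses = genpareto.rvs(c=0.1, scale=0.8, size=60, random_state=rng)
lam = len(excesses) / years                # Poisson occurrence rate (events/yr)

xi, loc, sigma = genpareto.fit(excesses, floc=0.0)   # ML fit of the GPD tail

def return_level(T):
    """Wave height exceeded on average once every T years (Poisson-GPD)."""
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

print(f"50-year wave height ~ {return_level(50.0):.2f} m")
```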
Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling
NASA Astrophysics Data System (ADS)
Tramblay, Yves; Bouvier, Christophe; Martin, Claude; Didon-Lescot, Jean-François; Todorovik, Dragana; Domergue, Jean-Marc
2010-06-01
Flash floods are the most destructive natural hazards in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the estimation of initial moisture conditions prior to a flood event. This paper compares several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture, soil moisture modelled through the Interaction-Sol-Biosphère-Atmosphère (ISBA) component of the SIM model (Météo-France), antecedent precipitation, and base flow. A modelling approach based on the Soil Conservation Service Curve Number method (SCS-CN) is used to simulate the flood events in a small headwater catchment in the Cévennes region (France). The model involves two parameters: one for the runoff production, S, and one for the routing component, K. The S parameter can be interpreted as the maximal water retention capacity and acts as the initial condition of the model, depending on the antecedent moisture conditions. The model was calibrated on a 20-flood sample and led to a median Nash value of 0.9. The local TDR measurements in the deepest soil layers (80-140 cm) were found to be the best predictors of the S parameter. TDR measurements averaged over the whole soil profile, outputs of the SIM model, and the logarithm of the base flow also proved to be good predictors, whereas antecedent precipitation was found to be less efficient. The good correlations observed between the TDR predictors and the calibrated S values indicate that monitoring soil moisture could help set the initial conditions of simplified event-based models in small basins.
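The runoff-production component referred to here follows the standard SCS-CN relation Q = (P - Ia)^2 / (P - Ia + S) with Ia = 0.2 S; the sketch below evaluates it for two assumed retention capacities to show how wetter initial conditions (smaller S) raise runoff.

```python
import numpy as np

def scs_cn_runoff(P, S, ia_ratio=0.2):
    """SCS-CN direct runoff depth (mm) for event rainfall P and retention S."""
    Ia = ia_ratio * S                      # initial abstraction
    Pe = np.maximum(P - Ia, 0.0)           # effective rainfall
    return Pe ** 2 / (Pe + S + 1e-12)

# Illustrative values: an 80 mm event under wet vs. dry initial conditions
for S in (50.0, 120.0):
    print(f"S = {S:5.1f} mm  ->  Q = {scs_cn_runoff(80.0, S):.1f} mm")
```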
Meta-Analysis of Rare Binary Adverse Event Data
Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.
2013-01-01
We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for the analysis of binary adverse event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study the estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and to detection of the heterogeneity of the treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and the new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased, and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
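The flavor of the per-study computation is sketched below: continuity-corrected log odds ratios, the simple unweighted average effect, and Cochran's Q as the traditional heterogeneity statistic. The counts are invented, and the paper's new heterogeneity estimator and parametric bootstrap test are not reproduced here.

```python
import numpy as np

def log_odds_ratios(et, nt, ec, nc, cc=0.5):
    """Per-study log odds ratios with a continuity correction for zero cells."""
    a, b = et + cc, nt - et + cc
    c, d = ec + cc, nc - ec + cc
    theta = np.log(a * d / (b * c))
    var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d
    return theta, var

# Hypothetical rare-event counts from five studies (treatment vs. control)
et = np.array([1, 0, 2, 1, 0]); nt = np.array([200, 150, 300, 250, 180])
ec = np.array([0, 1, 0, 3, 1]); nc = np.array([195, 160, 290, 240, 175])

theta, var = log_odds_ratios(et, nt, ec, nc)
simple_avg = theta.mean()                       # unweighted average effect
mu_iv = np.average(theta, weights=1.0 / var)    # inverse-variance pooled effect
Q = np.sum((theta - mu_iv) ** 2 / var)          # Cochran's Q heterogeneity stat
print(simple_avg, Q)
```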
Capturing rogue waves by multi-point statistics
NASA Astrophysics Data System (ADS)
Hadjihosseini, A.; Wächter, Matthias; Hoffmann, N. P.; Peinke, J.
2016-01-01
As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which, for the first time, allows extreme rogue wave events to be grasped in a highly satisfactory statistical manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities, as well as the Fokker-Planck equation itself, can be estimated directly from the available observational data. With this stochastic description, surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and of extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics.
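The empirical route from increment statistics to a Fokker-Planck description can be illustrated by estimating the first two Kramers-Moyal coefficients (drift and diffusion) through conditional binning; the Ornstein-Uhlenbeck series below stands in for the wave-height increment data, with true drift -x and diffusion 0.5.

```python
import numpy as np

def kramers_moyal(x, dt, nbins=40, min_count=50):
    """Drift D1 and diffusion D2 from conditional moments of increments."""
    dx = x[1:] - x[:-1]
    bins = np.linspace(x.min(), x.max(), nbins + 1)
    idx = np.digitize(x[:-1], bins) - 1
    centers = 0.5 * (bins[1:] + bins[:-1])
    D1, D2 = np.full(nbins, np.nan), np.full(nbins, np.nan)
    for i in range(nbins):
        m = idx == i
        if m.sum() >= min_count:               # require enough samples per bin
            D1[i] = dx[m].mean() / dt
            D2[i] = (dx[m] ** 2).mean() / (2.0 * dt)
    return centers, D1, D2

# Synthetic Ornstein-Uhlenbeck series as a stand-in for increment data
rng = np.random.default_rng(0)
dt = 0.01
x = np.zeros(200_000)
for k in range(1, x.size):
    x[k] = x[k - 1] - x[k - 1] * dt + np.sqrt(dt) * rng.normal()
centers, D1, D2 = kramers_moyal(x, dt)         # expect D1 ~ -x and D2 ~ 0.5
```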
NASA Technical Reports Server (NTRS)
Hejduk, M. D.; Pachura, D. A.
2017-01-01
Conjunction Assessment screening volumes used in the protection of NASA satellites are constructed as geometric volumes about these satellites, of a size expected to capture a certain percentage of the serious conjunction events by a certain time before closest approach. However, the analyses that established these sizes were grounded on covariance-based projections rather than empirical screening results, did not tailor the volume sizes to ensure operational actionability of those results, and did not consider the adjunct ability to produce data that could provide prevenient assistance for maneuver planning. The present study effort seeks to reconsider these questions based on a six-month dataset of empirical screening results using an extremely large screening volume. The results, pursued here for a highly-populated orbit regime near 700 km altitude, identify theoretical limits of screening volume performance, explore volume configuration to facilitate both maneuver remediation planning as well as basic asset protection, and recommend sizing principles that maximize volume performance while minimizing the capture of "chaff" conjunctions that are unlikely ever to become serious events.
BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods
NASA Astrophysics Data System (ADS)
Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.
2017-12-01
Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with rainfall rates well above the design levels of urban drainage systems, which lead to the inundation of streets and buildings. A projected increase in the frequency and intensity of heavy rainfall events in many areas, together with ongoing urbanization, may increase pluvial flood losses in the future. Efficient risk assessment and adaptation to pluvial floods require a quantification of the flood risk. Few loss models have been developed specifically for pluvial floods; these models usually use simple water-level- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage to private households. The data were gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss-influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss-influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. Because they can cope with incomplete information, make use of expert knowledge, and inherently provide quantitative uncertainty information, loss models based on BNs are shown to be superior to deterministic approaches for pluvial flood risk assessment.
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and follow-up decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real-time data mining, with representative cases of astronomical applications and event detection in high-volume data streams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false-alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive follow-up measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled follow-up decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
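A minimal stand-in for the "semi-supervised novelty detection" stage is sketched below: score incoming feature vectors against a model fitted to previously seen events and flag statistical outliers. The features and data are synthetic; the actual VLBA and Parkes pipelines use domain-specific signal features and known false-alarm examples.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
archive = rng.normal(size=(5000, 8))   # feature vectors of previously seen events
detector = IsolationForest(contamination=0.01, random_state=0).fit(archive)

chunk = rng.normal(size=(100, 8))      # next chunk of the data stream
chunk[0] += 6.0                        # inject one anomalous, "peryton-like" event
novel = detector.predict(chunk) == -1  # -1 marks statistical outliers
print("candidate novel events:", np.flatnonzero(novel))
```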
A defense in depth approach for nuclear power plant accident management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chih-Yao Hsieh; Hwai-Pwu Chou
2015-07-01
An initiating event may lead to a severe accident if the plant safety functions have been challenged or operators do not follow the appropriate accident management procedures. Beyond-design-basis accidents correspond to events of very low occurrence probability, but such accidents may lead to significant consequences. The defense in depth approach is important to assure nuclear safety even in a severe accident. Plant Damage States (PDS) can be defined by the combination of the possible values for each of the PDS parameters, which are shown on the nuclear power plant simulator. PDS are used to identify what the initiating event is, and also give information on the status of the safety systems, i.e., whether they are bypassed or inoperable. The initiating event and the safety systems' status are used in the construction of a Containment Event Tree (CET) to determine containment failure modes using probabilistic risk assessment (PRA) techniques. Different initiating events will correspond to different CETs. With these CETs, the core melt frequency of an initiating event can be found. The use of Plant Damage States (PDS) is a symptom-oriented approach, whereas the use of Containment Event Trees (CET) is an event-oriented approach. In this study, Taiwan's fourth nuclear power plant, the Lungmen nuclear power station (LNPS), an advanced boiling water reactor (ABWR) with a fully digitized instrumentation and control (I&C) system, is chosen as the target plant. The LNPS full scope engineering simulator is used to generate the testing data for method development. The following common initiating events are considered in this study: loss of coolant accidents (LOCA), total loss of feedwater (TLOFW), loss of offsite power (LOOP), and station blackout (SBO). Studies have indicated that the combination of the symptom-oriented approach and the event-oriented approach can be helpful in finding mitigation strategies and is useful for accident management. (authors)
Comparing flood loss models of different complexity
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Riggelsen, Carsten; Scherbaum, Frank; Merz, Bruno
2013-04-01
Any deliberation on flood risk requires the consideration of potential flood losses. In particular, reliable flood loss models are needed to evaluate the cost-effectiveness of mitigation measures, to assess vulnerability, and for comparative risk analysis and financial appraisal during and after floods. In recent years, considerable improvements have been made both concerning the data basis and the methodological approaches used for the development of flood loss models. Despite that, flood loss models remain an important source of uncertainty. Likewise, the temporal and spatial transferability of flood loss models is still limited. This contribution investigates the predictive capability of different flood loss models in a split-sample, cross-regional validation approach. For this purpose, flood loss models of different complexity, i.e. based on different numbers of explanatory variables, are learned from a set of damage records obtained from a survey after the Elbe flood in 2002. The validation of model predictions is carried out for different flood events in the Elbe and Danube river basins in 2002, 2005 and 2006, for which damage records are available from post-flood surveys. The models investigated are a stage-damage model, the rule-based model FLEMOps+r, as well as novel model approaches derived using the data mining techniques of regression trees and Bayesian networks. The Bayesian network approach to flood loss modelling provides attractive additional information concerning the probability distribution of both model predictions and explanatory variables.
Identification of Flood Reactivity Regions via the Functional Clustering of Hydrographs
NASA Astrophysics Data System (ADS)
Brunner, Manuela I.; Viviroli, Daniel; Furrer, Reinhard; Seibert, Jan; Favre, Anne-Catherine
2018-03-01
Flood hydrograph shapes contain valuable information on the flood-generation mechanisms of a catchment. To make good use of this information, we express flood hydrograph shapes as continuous functions using a functional data approach. We propose a clustering approach based on functional data for flood hydrograph shapes to identify a set of representative hydrograph shapes at the catchment scale, and use these catchment-specific sets of representative hydrographs to establish regions of catchments with similar flood reactivity at the regional scale. We applied this approach to flood samples of 163 medium-size Swiss catchments. The results indicate that three representative hydrograph shapes sufficiently describe the hydrograph shape variability within a catchment and can therefore be used as a proxy for the flood behavior of a catchment. These catchment-specific sets of three hydrographs were used to group the catchments into three reactivity regions of similar flood behavior. These regions were characterized not only by similar hydrograph shapes and reactivity but also by event magnitudes and triggering event conditions. We envision these regions to be useful in regionalization studies and regional flood frequency analyses, and to allow for the construction of synthetic design hydrographs in ungauged catchments. The clustering approach based on functional data which establishes these regions is very flexible and has the potential to be extended to other geographical regions or toward use in climate impact studies.
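The sketch below is a schematic stand-in, not the paper's method: it brings each hydrograph onto a common normalized time axis, scales it by its peak so that only the shape remains, and clusters the curves into three representative shapes, in place of the proper functional (basis-expansion) representation used in the study.

```python
import numpy as np
from sklearn.cluster import KMeans

def shape_vector(t, q, n=50):
    """Peak-normalized hydrograph resampled to n points on [0, 1]."""
    tt = (t - t[0]) / (t[-1] - t[0])
    return np.interp(np.linspace(0, 1, n), tt, q / q.max())

# Synthetic gamma-type hydrographs: three distinct shapes, two events each
t = np.arange(1.0, 61.0)
hydrographs = [(t, t**k * np.exp(-t / s))
               for k, s in [(2, 5), (2, 5), (6, 3), (6, 3), (1, 12), (1, 12)]]
X = np.vstack([shape_vector(ti, qi) for ti, qi in hydrographs])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # events with the same underlying shape fall into the same cluster
```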
Embodied Perspective Taking in Learning about Complex Systems
ERIC Educational Resources Information Center
Soylu, Firat; Holbert, Nathan; Brady, Corey; Wilensky, Uri
2017-01-01
In this paper we present a learning design approach that leverages perspective-taking to help students learn about complex systems. We define perspective-taking as projecting one's identity onto external entities (both animate and inanimate) in an effort to predict and anticipate events based on ecological cues, to automatically sense the…
Academic and Artistic Freedom.
ERIC Educational Resources Information Center
Strossen, Nadine
1992-01-01
Issues and recent events concerning censorship of the arts in the United States are examined, and the threat to artistic freedom posed by recent Supreme Court decisions is examined. Focus is on erosion of the actual or imminent harm requirement of the law and on the court's class-based approach to free speech. (MSE)
Development of high-throughput assays for chemical screening and hazard identification is a pressing priority worldwide. One approach uses in vitro, cell-based assays which recapitulate biological events observed in vivo. Neurite outgrowth is one such critical cellular process un...
Hua, Wei; Sun, Guoying; Dodd, Caitlin N; Romio, Silvana A; Whitaker, Heather J; Izurieta, Hector S; Black, Steven; Sturkenboom, Miriam C J M; Davis, Robert L; Deceuninck, Genevieve; Andrews, N J
2013-08-01
The assumption that the occurrence of the outcome event must not alter the subsequent exposure probability is critical for preserving the validity of the self-controlled case series (SCCS) method. This assumption is violated in scenarios in which the event constitutes a contraindication for exposure. In this simulation study, we compared the performance of the standard SCCS approach and two alternative approaches when the event-independent exposure assumption was violated. Using the 2009 H1N1 and seasonal influenza vaccines and Guillain-Barré syndrome as a model, we simulated a scenario in which an individual may encounter multiple unordered exposures and each exposure may be contraindicated by the occurrence of the outcome event. The degree of contraindication was varied at 0%, 50%, and 100%. The first alternative approach used only cases occurring after exposure, with follow-up time starting from exposure. The second used a pseudo-likelihood method. When the event-independent exposure assumption was satisfied, the standard SCCS approach produced nearly unbiased relative incidence estimates. When this assumption was partially or completely violated, two alternative SCCS approaches could be used. While the post-exposure-cases-only approach could handle only one exposure, the pseudo-likelihood approach was able to correct bias for both exposures. Violation of the event-independent exposure assumption leads to an overestimation of the relative incidence, which can be corrected by alternative SCCS approaches. In multiple-exposure situations, the pseudo-likelihood approach is optimal; the post-exposure-cases-only approach is limited in handling a second exposure and may introduce additional bias, and thus should be used with caution. Copyright © 2013 John Wiley & Sons, Ltd.
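The toy simulation below reproduces the bias mechanism in its simplest form: when an event rules out later exposure, the included cases have too few baseline events before exposure, and a naive rate-ratio version of the SCCS estimate is pushed above the true value of 1. All rates and window lengths are illustrative, and the crude estimator stands in for the proper conditional-likelihood SCCS fit.

```python
import numpy as np

rng = np.random.default_rng(0)
T, risk_len, base = 365, 42, 1e-3  # observation days, risk window, daily event rate

def naive_sccs_ri(contraindication):
    risk_events = risk_time = base_events = base_time = 0
    for _ in range(20000):
        vax = int(rng.integers(0, T))                  # planned exposure day
        events = np.flatnonzero(rng.random(T) < base)  # true relative incidence = 1
        if contraindication and events.size and events[0] < vax:
            continue  # event before exposure -> never exposed, case drops out
        in_risk = (events >= vax) & (events < vax + risk_len)
        risk_events += in_risk.sum(); base_events += (~in_risk).sum()
        rt = min(risk_len, T - vax)
        risk_time += rt; base_time += T - rt
    return (risk_events / risk_time) / (base_events / base_time)

print("no contraindication  :", round(naive_sccs_ri(False), 2))  # ~1.0, unbiased
print("100% contraindication:", round(naive_sccs_ri(True), 2))   # > 1, biased upward
```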
Evidence for spin correlation in tt̄ production
Abazov, Victor Mukhamedovich
2012-01-19
We present a measurement of the ratio of events with correlated t and t̄ spins to the total number of tt̄ events. This ratio f is evaluated using a matrix-element-based approach in 729 tt̄ candidate events with a single lepton ℓ (electron or muon) and at least four jets. The analyzed pp̄ collision data correspond to an integrated luminosity of 5.3 fb⁻¹ and were collected with the D0 detector at the Fermilab Tevatron collider operating at a center-of-mass energy \(\sqrt{s}=1.96\) TeV. Combining this result with a recent measurement of f in dileptonic final states, we find f in agreement with the standard model. In addition, the combination provides evidence for the presence of spin correlation in tt̄ events with a significance of more than 3 standard deviations.
Ding, Fangyu; Ge, Quansheng; Jiang, Dong; Fu, Jingying; Hao, Mengmeng
2017-01-01
Terror events can cause profound consequences for the whole society. Finding out the regularity of terrorist attacks has important meaning for the global counter-terrorism strategy. In the present study, we demonstrate a novel method using relatively popular and robust machine learning methods to simulate the risk of terrorist attacks at a global scale based on multiple resources, long time series and globally distributed datasets. Historical data from 1970 to 2015 was adopted to train and evaluate machine learning models. The model performed fairly well in predicting the places where terror events might occur in 2015, with a success rate of 96.6%. Moreover, it is noteworthy that the model with optimized tuning parameter values successfully predicted 2,037 terrorism event locations where a terrorist attack had never happened before. PMID:28591138
Formal analysis of imprecise system requirements with Event-B.
Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan
2016-01-01
Formal analysis of functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for the specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of a crane controller.
Near real-time vaccine safety surveillance with partially accrued data.
Greene, Sharon K; Kulldorff, Martin; Yin, Ruihua; Yih, W Katherine; Lieu, Tracy A; Weintraub, Eric S; Lee, Grace M
2011-06-01
The Vaccine Safety Datalink (VSD) Project conducts near real-time vaccine safety surveillance using sequential analytic methods. Timely surveillance is critical in identifying potential safety problems and preventing additional exposure before most vaccines are administered. For vaccines that are administered during a short period, such as influenza vaccines, timeliness can be improved by undertaking analyses while risk windows following vaccination are ongoing and by accommodating predictable and unpredictable data accrual delays. We describe practical solutions to these challenges, which were adopted by the VSD Project during pandemic and seasonal influenza vaccine safety surveillance in 2009/2010. Adjustments were made to two sequential analytic approaches. The Poisson-based approach compared the number of pre-defined adverse events observed following vaccination with the number expected using historical data. The expected number was adjusted for the proportion of the risk window elapsed and the proportion of inpatient data estimated to have accrued. The binomial-based approach used a self-controlled design, comparing the observed numbers of events in risk versus comparison windows. Events were included in analysis only if they occurred during a week that had already passed for both windows. Analyzing data before risk windows fully elapsed improved the timeliness of safety surveillance. Adjustments for data accrual lags were tailored to each data source and avoided biasing analyses away from detecting a potential safety problem, particularly early during surveillance. The timeliness of vaccine and drug safety surveillance can be improved by properly accounting for partially elapsed windows and data accrual delays. Copyright © 2011 John Wiley & Sons, Ltd.
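The core of the adjusted Poisson comparison can be written in a few lines. Everything numeric below is invented for illustration, and the actual VSD analyses use sequential testing boundaries rather than a single one-off test.

```python
from scipy.stats import poisson

hist_rate_per_dose  = 2.0e-5   # historical events per dose over a full risk window
doses_given         = 400_000
frac_window_elapsed = 0.60     # average fraction of risk windows already elapsed
frac_data_accrued   = 0.75     # estimated completeness of inpatient data so far

expected = (hist_rate_per_dose * doses_given
            * frac_window_elapsed * frac_data_accrued)
observed = 11

p_value = poisson.sf(observed - 1, expected)  # P(X >= observed | expected)
print(f"expected={expected:.2f}, observed={observed}, one-sided p={p_value:.4f}")
```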
Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments
NASA Astrophysics Data System (ADS)
Berk, Mario; Špačková, Olga; Straub, Daniel
2017-12-01
The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
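A minimal FORM sketch in the spirit of the method is given below, under invented marginals and a toy rational-method runoff model in place of the study's event-based model: two standard-normal variables are mapped to rainfall intensity and a runoff coefficient, and the design point is the point on the limit-state surface closest to the origin in standard-normal space.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm, norm

def x_of_u(u):
    i = lognorm(s=0.5, scale=20.0).ppf(norm.cdf(u[0]))  # rainfall intensity [mm/h]
    c = 0.2 + 0.6 * norm.cdf(u[1])                      # runoff coefficient in [0.2, 0.8]
    return i, c

def peak_flow(u, area_km2=50.0):
    i, c = x_of_u(u)
    return 0.278 * c * i * area_km2   # rational method, Q in m^3/s

q_design = 300.0                                   # target peak flow
g = lambda u: peak_flow(u) - q_design              # limit state g(u) = 0

res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
               constraints={"type": "eq", "fun": g}, method="SLSQP")
beta = np.sqrt(res.fun)                            # reliability index
print("design point (intensity, runoff coeff):", x_of_u(res.x))
print(f"beta = {beta:.2f}, exceedance probability ~ {norm.cdf(-beta):.2e}")
```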
Johansson, Kerstin; Strömbergsson, Sofia; Robieux, Camille; McAllister, Anita
2017-01-01
Reduced respiratory function following lower cervical spinal cord injuries (CSCIs) may indirectly result in vocal dysfunction. Although self-reports indicate voice change and limitations following CSCI, earlier efforts using global perceptual ratings to distinguish speakers with CSCI from noninjured speakers have not been very successful. We investigate the use of an audience response system-based approach to distinguish speakers with CSCI from noninjured speakers, and explore whether specific vocal traits can be identified as characteristic of speakers with CSCI. Fourteen speech-language pathologists participated in a web-based perceptual task, where their overt reactions to vocal dysfunction were registered during the continuous playback of recordings of 36 speakers (18 with CSCI, and 18 matched controls). Dysphonic events were identified through manual perceptual analysis, to allow the exploration of connections between dysphonic events and listener reactions. More dysphonic events, and more listener reactions, were registered for speakers with CSCI than for noninjured speakers. Strain (particularly in phrase-final position) and creak (particularly in nonphrase-final position) distinguish speakers with CSCI from noninjured speakers. For the identification of intermittent and subtle signs of vocal dysfunction, an approach in which the temporal distribution of symptoms is registered offers a viable means to distinguish speakers affected by voice dysfunction from non-affected speakers. In speakers with CSCI, clinicians should listen for the presence of final strain and nonfinal creak, and pay attention to self-reported voice function and voice problems, to identify individuals in need of clinical assessment and intervention. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Climate and its change over the Tibetan Plateau and its Surroundings in 1963-2015
NASA Astrophysics Data System (ADS)
Ding, J.; Cuo, L.
2017-12-01
The Tibetan Plateau and its surroundings (TPS, 23°-43°N, 73°-106°E) lie in southwestern China and include the Tibet Autonomous Region, Qinghai Province, southern Xinjiang Uygur Autonomous Region, part of Gansu Province, western Sichuan Province, and northern Yunnan Province. The region is of strategic importance for water resources because it is the headwater of ten large rivers that support a population of more than 1.6 billion. In this study, we use daily maximum and minimum temperature, precipitation and wind speed for 1963-2015, obtained from the Climate Data Center of the China Meteorological Administration and the Qinghai Meteorological Bureau, to investigate extreme climate conditions and their changes over the TPS. The extreme events are selected based on annual extreme values and percentiles. The annual extreme value approach produces one value each year for each variable, which enables us to examine the magnitude of extreme events, whereas the percentile approach selects extreme values by setting the 95th percentile as the threshold for maximum temperature, precipitation and wind speed, and the 5th percentile for minimum temperature. The percentile approach enables us to investigate not only the magnitude but also the frequency of the extreme events. In addition, Mann-Kendall trend and mutation analyses were applied to analyze the changes in mean and extreme conditions. The results will help us understand more about the extreme events of the past five decades on the TPS and will provide valuable information for the upcoming IPCC reports on climate change.
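The two selection rules are simple to state in code. The sketch below applies the 95th-percentile rule and a Kendall-tau trend test (the correlation core of the Mann-Kendall test) to a synthetic daily maximum-temperature series standing in for the station records.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
years = np.arange(1963, 2016)
tmax = rng.normal(15, 8, size=(years.size, 365))  # synthetic daily Tmax [deg C]

threshold = np.percentile(tmax, 95)               # percentile approach: fixed threshold
hot_days_per_year = (tmax > threshold).sum(axis=1)  # frequency of extremes per year
annual_max = tmax.max(axis=1)                     # annual extreme-value series (magnitude)

tau, p = kendalltau(years, hot_days_per_year)     # Mann-Kendall-type trend test
print(f"threshold = {threshold:.1f} C, tau = {tau:.3f}, p = {p:.3f}")
```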
Projecting adverse event incidence rates using empirical Bayes methodology.
Ma, Guoguang Julie; Ganju, Jitendra; Huang, Jing
2016-08-01
Although there is considerable interest in adverse events observed in clinical trials, projecting adverse event incidence rates over an extended period can be of interest when the trial duration is limited compared to clinical practice. A naïve method for making projections might involve modeling the observed rates into the future for each adverse event. However, such an approach overlooks the information that can be borrowed across all the adverse event data. We propose a method that weights each projection using a shrinkage factor; the adverse event-specific shrinkage is a probability, based on empirical Bayes methodology, estimated from all the adverse event data, reflecting the evidence in support of the null or non-null hypotheses. Also proposed is a technique to estimate the proportion of true nulls, called the common area under the density curves, which is a critical step in arriving at the shrinkage factor. The performance of the method is evaluated by projecting from interim data and then comparing the projected results with observed results. The method is illustrated on two data sets. © The Author(s) 2013.
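A heavily simplified two-groups version of the shrinkage idea is sketched below: each adverse event's projection is weighted by an estimated probability of being truly non-null. For brevity, the paper's "common area under the density curves" estimator is replaced by a fixed assumed proportion of true nulls, and all inputs are invented.

```python
import numpy as np
from scipy.stats import norm

z = np.array([0.2, 0.8, 1.1, 2.9, 3.4])  # per-AE test statistics (hypothetical)
pi0 = 0.8                                 # assumed proportion of true nulls

f0 = norm.pdf(z)                          # density under the null
f1 = norm.pdf(z, loc=3.0)                 # assumed density under the alternative
w = (1 - pi0) * f1 / (pi0 * f0 + (1 - pi0) * f1)  # P(non-null | z): shrinkage factor

naive_proj = np.array([1.0, 2.0, 1.5, 6.0, 8.0])  # naive projected rates per 100 PY
null_proj = 0.5                                    # common projection under the null
shrunk = w * naive_proj + (1 - w) * null_proj
print(np.round(w, 2), np.round(shrunk, 2))
```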
An Efficient Pattern Mining Approach for Event Detection in Multivariate Temporal Data
Batal, Iyad; Cooper, Gregory; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
This work proposes a pattern mining approach to learn event detection models from complex multivariate temporal data, such as electronic health records. We present Recent Temporal Pattern mining, a novel approach for efficiently finding predictive patterns for event detection problems. This approach first converts the time series data into time-interval sequences of temporal abstractions. It then constructs more complex time-interval patterns backward in time using temporal operators. We also present the Minimal Predictive Recent Temporal Patterns framework for selecting a small set of predictive and non-spurious patterns. We apply our methods for predicting adverse medical events in real-world clinical data. The results demonstrate the benefits of our methods in learning accurate event detection models, which is a key step for developing intelligent patient monitoring and decision support systems. PMID:26752800
A Hot-Deck Multiple Imputation Procedure for Gaps in Longitudinal Recurrent Event Histories
Wang, Chia-Ning; Little, Roderick; Nan, Bin; Harlow, Siobán D.
2012-01-01
We propose a regression-based hot deck multiple imputation method for gaps of missing data in longitudinal studies, where subjects experience a recurrent event process and a terminal event. Examples are repeated asthma episodes and death, or menstrual periods and the menopause, as in our motivating application. Research interest concerns the onset time of a marker event, defined by the recurrent-event process, or the duration from this marker event to the final event. Gaps in the recorded event history make it difficult to determine the onset time of the marker event, and hence, the duration from onset to the final event. Simple approaches such as jumping gap times or dropping cases with gaps have obvious limitations. We propose a procedure for imputing information in the gaps by substituting information in the gap from a matched individual with a completely recorded history in the corresponding interval. Predictive Mean Matching is used to incorporate information on longitudinal characteristics of the repeated process and the final event time. Multiple imputation is used to propagate imputation uncertainty. The procedure is applied to an important data set for assessing the timing and duration of the menopausal transition. The performance of the proposed method is assessed by a simulation study. PMID:21361886
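The matching step at the heart of the procedure can be sketched in a few lines: predict the quantity of interest for the record with a gap, find the complete records with the closest predicted means, and copy a randomly drawn donor's value. The longitudinal details of matching whole gap intervals are omitted and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X_complete = rng.normal(size=(200, 3))  # covariates of completely recorded histories
y_complete = X_complete @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 1, 200)

model = LinearRegression().fit(X_complete, y_complete)

x_gap = rng.normal(size=(1, 3))                    # record with a gap
dist = np.abs(model.predict(X_complete) - model.predict(x_gap))
donors = np.argsort(dist)[:5]                      # 5 closest predicted means
imputed = y_complete[rng.choice(donors)]           # random donor draw; repeat per imputation
print("imputed value:", round(float(imputed), 2))
```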
Real-time Social Internet Data to Guide Forecasting Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Valle, Sara Y.
Our goal is to improve decision support by monitoring and forecasting events using social media, mathematical models, and quantified model uncertainty. Our approach is real-time, data-driven forecasting with quantified uncertainty: not just for weather anymore. Information flowing from human observations of events through an Internet system and classification algorithms is used to produce quantitatively uncertain forecasts. In summary, we want to develop new tools to extract useful information from Internet data streams, develop new approaches to assimilate real-time information into predictive models, and validate these approaches by forecasting events; our ultimate goal is to develop an event forecasting system using mathematical approaches and heterogeneous data streams.
Tamblyn, Robyn; Huang, Allen R; Meguerditchian, Ari N; Winslade, Nancy E; Rochefort, Christian; Forster, Alan; Eguale, Tewodros; Buckeridge, David; Jacques, André; Naicker, Kiyuri; Reidel, Kristen E
2012-08-27
Adverse drug events are responsible for up to 7% of all admissions to acute care hospitals. At least 58% of these are preventable, resulting from incomplete drug information, prescribing or dispensing errors, and overuse or underuse of medications. Effective implementation of medication reconciliation is considered essential to reduce preventable adverse drug events occurring at transitions between community and hospital care. An electronically enabled discharge reconciliation process represents an innovative approach to this problem. Participants will be recruited in Quebec and are eligible for inclusion if they are using prescription medication at admission, covered by the Quebec drug insurance plan, admitted from the community, 18 years or older, admitted to a general or intensive care medical or surgical unit, and discharged alive. A sample size of 3,714 will be required to detect a 5% reduction in adverse drug events. The intervention will comprise electronic retrieval of the community drug list, combined with an electronic discharge reconciliation module and an electronic discharge communication module. The primary outcomes will be adverse drug events occurring 30 days post-discharge, identified by a combination of patient self-report and chart abstraction. All emergency room visits and hospital readmission during this period will be measured as secondary outcomes. A cluster randomization approach will be used to allocate 16 medical and 10 surgical units to electronic discharge reconciliation and communication versus usual care. An intention-to-treat approach will be used to analyse data. Logistic regression will be undertaken within a generalized estimating equation framework to account for clustering within units. The goal of this prospective trial is to determine if electronically enabled discharge reconciliation will reduce the risk of adverse drug events, emergency room visits and readmissions 30 days post-discharge compared with usual care. We expect that this intervention will improve adherence to medication reconciliation at discharge, the accuracy of the community-based drug history and effective communication of hospital-based treatment changes to community care providers. The results may support policy-directed investments in computerizing and training of hospital staff, generate key requirements for future hospital accreditation standards, and highlight functional requirements for software vendors. NCT01179867.
Probabilistic clustering of rainfall condition for landslide triggering
NASA Astrophysics Data System (ADS)
Rossi, Mauro; Luciani, Silvia; Cesare Mondini, Alessandro; Kirschbaum, Dalia; Valigi, Daniela; Guzzetti, Fausto
2013-04-01
Landslides are widespread natural and man-made phenomena. They are triggered by earthquakes, rapid snow melting and human activities, but mostly by typhoons and intense or prolonged rainfall. In Italy, they are mostly triggered by intense precipitation. The prediction of landslides triggered by rainfall over large areas is commonly based on empirical models. Empirical landslide rainfall thresholds are used to identify rainfall conditions for possible landslide initiation. It is common practice to define rainfall thresholds by assuming a power-law lower boundary in the rainfall intensity-duration or cumulative rainfall-duration space above which landslides can occur. The boundary is defined considering rainfall conditions associated with landslide phenomena using heuristic approaches, and does not consider rainfall events that did not cause landslides. Here we present a new, fully automatic method to identify the probability of landslide occurrence associated with rainfall conditions characterized by measures of intensity or cumulative rainfall and rainfall duration. The method splits the rainfall events of the past into two groups, a group of events causing landslides and its complement, and estimates their probabilistic distributions. Next, the probabilistic membership of a new event in one of the two clusters is estimated. The method does not assume any threshold model a priori, but simply exploits the empirical distribution of rainfall events. The approach was applied in the Umbria region, Central Italy, where a catalogue of landslide timings was obtained by searching chronicles, blogs and other sources of information for the period 2002-2012. The approach was tested using rain gauge measurements and satellite rainfall estimates (NASA TRMM-v6), in both cases allowing the identification of the rainfall conditions triggering landslides in the region. Compared to other existing threshold definition methods, the proposed one (i) largely reduces the subjectivity in the choice of the threshold model and in how it is calculated, and (ii) can be set up more easily in other study areas. The proposed approach can be conveniently integrated into existing early-warning systems to improve the accuracy of the estimated landslide occurrence probability associated with rainfall events and its uncertainty.
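A compact version of the two-group probabilistic classification is sketched below: one density is fitted to the (duration, intensity) pairs of rainfall events that triggered landslides, one to those that did not, and Bayes' rule gives the posterior triggering probability of a new event. The catalogues here are synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Columns: log10(duration [h]), log10(intensity [mm/h]) -- hypothetical catalogues
trig    = rng.normal([1.2, 0.6], 0.3, size=(80, 2)).T    # events with landslides
no_trig = rng.normal([0.8, 0.1], 0.4, size=(400, 2)).T   # events without landslides

f1, f0 = gaussian_kde(trig), gaussian_kde(no_trig)
p1 = trig.shape[1] / (trig.shape[1] + no_trig.shape[1])  # prior from catalogue sizes

def p_landslide(log_d, log_i):
    x = np.array([[log_d], [log_i]])
    num = p1 * f1(x)
    return float(num / (num + (1 - p1) * f0(x)))

print(p_landslide(1.1, 0.5))  # posterior triggering probability of a new event
```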
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
NASA Astrophysics Data System (ADS)
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (e.g., floods, droughts, extreme precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomena. In these forecasts, the probability of the event, over some lead time, is estimated based on model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give a false impression of the actual risk, leading to improper decision making when protecting resources from extreme events. Given this requirement for reliable forecasts in effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier score, reliability diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and an unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and an unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided by the use of the Poisson-Binomial distribution.
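The exact Poisson-Binomial check advocated above is easy to implement: under a reliable forecast, the number of events that occur should look like a draw from the Poisson-Binomial distribution with the issued probabilities as parameters. The sketch computes its PMF by convolving one Bernoulli at a time; the probabilities and observation are invented.

```python
import numpy as np

def poisson_binomial_pmf(p):
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])  # add one Bernoulli trial at a time
    return pmf

p = np.array([0.1, 0.3, 0.3, 0.6, 0.8, 0.9])  # issued event probabilities
observed = 5                                   # events that actually occurred

pmf = poisson_binomial_pmf(p)
p_value = pmf[pmf <= pmf[observed]].sum()      # exact two-sided test
print(f"P(K={observed}) = {pmf[observed]:.3f}, two-sided p = {p_value:.3f}")
```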
Cold black holes in the Harlow–Hayden approach to firewalls
Ong, Yen Chin; McInnes, Brett; Chen, Pisin
2014-12-31
Firewalls are controversial principally because they seem to imply departures from general relativistic expectations in regions of spacetime where the curvature need not be particularly large. One of the virtues of the Harlow–Hayden approach to the firewall paradox, concerning the time available for decoding of Hawking radiation emanating from charged AdS black holes, is precisely that it operates in the context of cold black holes, which are not strongly curved outside the event horizon. Here we clarify this point. The approach is based on ideas borrowed from applications of the AdS/CFT correspondence to the quark–gluon plasma. Firewalls aside, our work presents a detailed analysis of the thermodynamics and evolution of evaporating charged AdS black holes with flat event horizons. We show that, in one way or another, these black holes are always eventually destroyed in a time which, while long by normal standards, is short relative to the decoding time of Hawking radiation.
The Earthquake Early Warning System In Southern Italy: Performance Tests And Next Developments
NASA Astrophysics Data System (ADS)
Zollo, A.; Elia, L.; Martino, C.; Colombelli, S.; Emolo, A.; Festa, G.; Iannaccone, G.
2011-12-01
PRESTo (PRobabilistic and Evolutionary early warning SysTem) is the software platform for Earthquake Early Warning (EEW) in Southern Italy, integrating recent algorithms for real-time earthquake location, magnitude estimation and damage assessment into a highly configurable and easily portable package. The system is under active experimentation based on the Irpinia Seismic Network (ISNet). PRESTo processes the live streams of 3C acceleration data for P-wave arrival detection and, while an event is occurring, promptly performs event detection and provides location and magnitude estimates and peak ground shaking predictions at target sites. The earthquake location is obtained by an evolutionary, real-time probabilistic approach based on an equal differential time formulation. At each time step, it uses information from both triggered and not-yet-triggered stations. Magnitude estimation exploits an empirical relationship that correlates it to the filtered peak displacement (Pd), measured over the first 2-4 s of P-signal. Peak ground-motion parameters at any distance can finally be estimated by ground motion prediction equations. Alarm messages containing the updated estimates of these parameters can thus reach target sites before the destructive waves, enabling automatic safety procedures. Using the real-time data streaming from the ISNet network, PRESTo has produced a bulletin for about a hundred low-magnitude events that occurred during the last two years. Meanwhile, the performance of the EEW system was assessed off-line by playing back the records of moderate and large events from Italy, Spain and Japan, and synthetic waveforms for large historical events in Italy. These tests have shown that, when a dense seismic network is deployed in the fault area, PRESTo produces reliable estimates of earthquake location and size within 5-6 s from the event origin time (To). Estimates are provided as probability density functions whose uncertainty typically decreases with time, yielding a stable solution within 10 s from To. The regional approach was recently integrated with a threshold-based early warning method for the definition of alert levels and the estimation of the Potential Damaged Zone (PDZ), in which the highest intensity levels are expected. The dominant period (tau_c) and the peak displacement (Pd) are simultaneously measured in a 3 s window after the first P-arrival time. Pd and tau_c are then compared with threshold values, previously established through an empirical regression analysis, that define a decisional table with four alert levels. According to the real-time measured values of Pd and tau_c, each station provides a local alert level that can be used to warn distant sites and to define the extent of the PDZ. Because only low-magnitude events currently occur in Irpinia, the integrated system was validated off-line for the M6.3 2009 Central Italy earthquake and ten large Japanese events. The results confirmed the feasibility and robustness of such an approach, providing reliable predictions of the earthquake damaging effects, which is relevant information for the efficient planning of rescue operations in the immediate post-event emergency phase.
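The station-level alert logic reduces to a small decision table. The sketch below uses placeholder thresholds, not the calibrated PRESTo values, and an assumed mapping of the four combinations to alert levels.

```python
PD_THRESH, TAUC_THRESH = 0.2, 1.0  # [cm], [s]: placeholders, not calibrated values

def alert_level(pd_cm: float, tau_c_s: float) -> int:
    """2x2 decisional table: both measures high -> highest alert (near and strong)."""
    big_event  = tau_c_s >= TAUC_THRESH  # large magnitude expected
    near_field = pd_cm   >= PD_THRESH    # strong local shaking expected
    return {(False, False): 1, (True, False): 2,
            (False, True): 3, (True, True): 4}[(big_event, near_field)]

print(alert_level(0.35, 1.4))  # -> 4: station lies inside the likely PDZ
```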
Meier-Hirmer, Carolina; Schumacher, Martin
2013-06-20
The aim of this article is to propose several methods that make it possible to investigate how and whether the shape of the hazard ratio after an intermediate event depends on the waiting time to occurrence of this event and/or the sojourn time in this state. A simple multi-state model, the illness-death model, is used as a framework to investigate the occurrence of this intermediate event. Several approaches are shown and their advantages and disadvantages are discussed. All these approaches are based on Cox regression. As different time-scales are used, these models go beyond Markov models. Different estimation methods for the transition hazards are presented. Additionally, time-varying covariates are included in the model using an approach based on fractional polynomials. The different methods of this article are then applied to a dataset consisting of four studies conducted by the German Breast Cancer Study Group (GBSG). The occurrence of the first isolated locoregional recurrence (ILRR) is studied. The results contribute to the debate on the role of the ILRR with respect to the course of the breast cancer disease and the resulting prognosis. We have investigated different modelling strategies for the transition hazard after ILRR or, in general, after an intermediate event. Including time-dependent structures altered the resulting hazard functions considerably, and it was shown that this time-dependent structure has to be taken into account in the case of our breast cancer dataset. The results indicate that an early recurrence increases the risk of death. A late ILRR increases the hazard function much less, and after the successful removal of the second tumour the risk of death is almost the same as before the recurrence. With respect to distant disease, the appearance of the ILRR only slightly increases the risk of death if the recurrence was treated successfully. It is important to realize that there are several modelling strategies for the intermediate event and that each of these strategies has restrictions and may lead to different results. In the medical literature on breast cancer development especially, time-dependency is often neglected in the statistical analyses. We show that the time-varying variables cannot be neglected in the case of ILRR and that fractional polynomials are a useful tool for finding the functional form of these time-varying variables.
Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin
2010-09-01
The value of near-miss and error reporting processes in many industries is well appreciated and typically can be supported with data that have been collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in a RT department and compare it to the paper-based reporting system it replaced. A purpose-built web-based system for reporting individual events in RT was clinically implemented in 2007. An event was defined as any occurrence that could have, or had, resulted in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so individual events could be quickly and easily reported without disrupting clinical work; this was very important because use of the system was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas that needed process and safety improvements. The reported data were also useful for the evaluation of corrective measures and the recognition of ineffective measures and efforts. The electronic system was relatively well accepted by personnel and resulted in minimal disruption of clinical work. Although reporting was voluntary, the quarter with the fewest reported events under the electronic system still had almost four times as many reports as the busiest quarter under the paper-based system, and reporting remained consistent from the inception of the process through the date of this report. However, the acceptance was not universal, validating the need for improved education regarding reporting processes and systematic approaches to developing a reporting culture. Specially designed electronic event reporting systems in a radiotherapy setting can provide valuable data for process and patient safety improvement and are more effective reporting mechanisms than paper-based systems. Additional work is needed to develop methods that can more effectively utilize reported data for process improvement, including the development of a standardized event taxonomy and a classification system for RT.
Data science approaches to pharmacogenetics.
Penrod, N M; Moore, J H
2014-01-01
Pharmacogenetic studies rely on applied statistics to evaluate genetic data describing natural variation in response to pharmacotherapeutics such as drugs and vaccines. In the beginning, these studies were based on candidate gene approaches that specifically focused on efficacy or adverse events correlated with variants of single genes. This hypothesis-driven method required the researcher to have a priori knowledge of which genes or gene sets to investigate. According to rational design, the focus of these studies has been on drug metabolizing enzymes, drug transporters, and drug targets. As technology has progressed, these studies have transitioned to hypothesis-free explorations where markers across the entire genome can be measured in large-scale, population-based, genome-wide association studies (GWAS). This enables the identification of novel genetic biomarkers and therapeutic targets, and the analysis of gene-gene interactions, which may reveal molecular mechanisms of drug activities. Ultimately, the challenge is to utilize gene-drug associations to create dosing algorithms based on individual genotypes, which will guide physicians and ensure they prescribe the correct dose of the correct drug the first time, eliminating trial-and-error and adverse events. We review here basic concepts and applications of data science in the genetic analysis of pharmacologic outcomes.
Herrando-Grabulosa, Mireia; Mulet, Roger; Pujol, Albert; Mas, José Manuel; Navarro, Xavier; Aloy, Patrick; Coma, Mireia; Casas, Caty
2016-01-01
Amyotrophic Lateral Sclerosis is a fatal, progressive neurodegenerative disease characterized by loss of motor neuron function for which there is no effective treatment. One of the main difficulties in developing new therapies lies in the multiple events that contribute to motor neuron death in amyotrophic lateral sclerosis. Several pathological mechanisms have been identified as underlying events of the disease process, including excitotoxicity, mitochondrial dysfunction, oxidative stress, altered axonal transport, proteasome dysfunction, synaptic deficits, glial cell contribution, and disrupted clearance of misfolded proteins. Our approach in this study was based on a holistic vision of these mechanisms and the use of computational tools to identify polypharmacology for targeting multiple etiopathogenic pathways. By using a repositioning analysis based on a systems biology approach (TPMS technology), we identified and validated the neuroprotective potential of two new drug combinations: Aliretinoin and Pranlukast, and Aliretinoin and Mefloquine. In addition, we estimated their molecular mechanisms of action in silico and validated some of these results in a well-established in vitro model of amyotrophic lateral sclerosis based on cultured spinal cord slices. The results verified that Aliretinoin and Pranlukast, and Aliretinoin and Mefloquine promote neuroprotection of motor neurons and reduce microgliosis. PMID:26807587
Pixel-based flood mapping from SAR imagery: a comparison of approaches
NASA Astrophysics Data System (ADS)
Landuyt, Lisa; Van Wesemael, Alexandra; Van Coillie, Frieke M. B.; Verhoest, Niko E. C.
2017-04-01
Due to their all-weather, day-and-night capabilities, SAR sensors have been shown to be particularly suitable for flood mapping applications. Thus, they can provide spatially-distributed flood extent data which are valuable for calibrating, validating and updating flood inundation models. These models are an invaluable tool for water managers for taking appropriate measures in times of high water levels. Image analysis approaches to delineate flood extent on SAR imagery are numerous. They can be classified into two categories, i.e. pixel-based and object-based approaches. Pixel-based approaches, e.g. thresholding, are abundant and in general computationally inexpensive. However, large discrepancies between these techniques exist, and subjective user intervention is often needed. Object-based approaches require more processing but allow for the integration of additional object characteristics, like contextual information and object geometry, and thus have significant potential to provide an improved classification result. As a benchmark, a selection of pixel-based techniques is applied to an ERS-2 SAR image of the 2006 flood event of the River Dee, United Kingdom. This selection comprises Otsu thresholding, Kittler & Illingworth thresholding, the Fine To Coarse segmentation algorithm and active contour modelling. The different classification results are evaluated and compared by means of several accuracy measures, including binary performance measures.
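Of the techniques compared, Otsu thresholding is the simplest to reproduce: smooth open water appears dark in SAR backscatter, so pixels below the threshold that maximizes the between-class variance are labelled as flooded. The sketch below is self-contained, with a synthetic bimodal image standing in for the calibrated ERS-2 scene.

```python
import numpy as np

def otsu_threshold(img, bins=256):
    hist, edges = np.histogram(img, bins=bins)
    mids = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist)                   # pixels at or below each candidate cut
    w1 = w0[-1] - w0                       # pixels above each candidate cut
    csum = np.cumsum(hist * mids)
    m0 = csum / np.maximum(w0, 1)          # class mean below the cut
    m1 = (csum[-1] - csum) / np.maximum(w1, 1)  # class mean above the cut
    between = w0 * w1 * (m0 - m1) ** 2     # between-class variance
    return mids[np.argmax(between[:-1])]

rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(-18, 1.5, 7000),    # water backscatter [dB]
                      rng.normal(-8, 2.5, 30000)])   # land backscatter [dB]
t = otsu_threshold(img)
print(f"Otsu threshold = {t:.1f} dB, flooded fraction = {(img < t).mean():.2f}")
```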
Massé, Fabien; Gonzenbach, Roman R; Arami, Arash; Paraschiv-Ionescu, Anisoara; Luft, Andreas R; Aminian, Kamiar
2015-08-25
Stroke survivors often suffer from mobility deficits. Current clinical evaluation methods, including questionnaires and motor function tests, cannot provide an objective measure of the patients' mobility in daily life. Physical activity performance in daily life can be assessed using unobtrusive monitoring, for example with a single sensor module fixed on the trunk. Existing approaches based on inertial sensors have limited performance, particularly in detecting transitions between different activities and postures, due to the inherent inter-patient variability of kinematic patterns. To overcome these limitations, one possibility is to use additional information from a barometric pressure (BP) sensor. Our study aims at integrating BP and inertial sensor data into an activity classifier in order to improve the activity (sitting, standing, walking, lying) recognition and the corresponding body elevation (during climbing stairs or when taking an elevator). Taking into account the trunk elevation changes during postural transitions (sit-to-stand, stand-to-sit), we devised an event-driven activity classifier based on fuzzy logic. Data were acquired from 12 stroke patients with impaired mobility, using a trunk-worn inertial and BP sensor. Events, including walking and lying periods and potential postural transitions, were first extracted. These events were then fed into a double-stage hierarchical Fuzzy Inference System (H-FIS). The first stage processed the events to infer activities and the second stage improved activity recognition by applying behavioral constraints. Finally, the body elevation was estimated using a pattern-enhancing algorithm applied on BP. The patients were videotaped for reference. The performance of the algorithm was estimated using the Correct Classification Rate (CCR) and F-score. The BP-based classification approach was benchmarked against a previously-published fuzzy-logic classifier (FIS-IMU) and a conventional epoch-based classifier (EPOCH). The algorithm performance for posture/activity detection, in terms of CCR, was 90.4%, with 3.3% and 5.6% improvements against FIS-IMU and EPOCH, respectively. The proposed classifier essentially benefits from a better recognition of standing activity (70.3% versus 61.5% [FIS-IMU] and 42.5% [EPOCH]) with 98.2% CCR for body elevation estimation. The monitoring and recognition of daily activities in mobility-impaired stroke patients can be significantly improved using a trunk-fixed sensor that integrates BP, inertial sensors, and an event-based activity classifier.
Schilirò, Luca; Montrasio, Lorella; Scarascia Mugnozza, Gabriele
2016-11-01
In recent years, physically-based numerical models have frequently been used in the framework of early-warning systems devoted to rainfall-induced landslide hazard monitoring and mitigation. For this reason, in this work we describe the potential of SLIP (Shallow Landslides Instability Prediction), a simplified physically-based model for the analysis of shallow landslide occurrence. To test the reliability of this model, a back analysis of recent landslide events that occurred in the study area (located SW of Messina, northeastern Sicily, Italy) on October 1st, 2009 was performed. The simulation results have been compared with those obtained for the same event using TRIGRS, another well-established model for shallow landslide prediction. Afterwards, a simulation over a 2-year period was performed for the same area, with the aim of evaluating the performance of SLIP as an early-warning tool. The results confirm the good predictive capability of the model, in terms of both spatial and temporal prediction of the instability phenomena. For this reason, we recommend an operating procedure for the real-time definition of shallow landslide triggering scenarios at the catchment scale, based on the use of SLIP calibrated through a specific multi-methodological approach. Copyright © 2016 Elsevier B.V. All rights reserved.
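SLIP's stability core descends from the classic infinite-slope limit equilibrium. The textbook factor of safety is shown below as a generic sketch only; SLIP's actual formulation adds, among other things, a partial-saturation term driven by the rainfall history.

```python
import math

def factor_of_safety(c_kpa, phi_deg, gamma_kn_m3, z_m, beta_deg, u_kpa):
    """Infinite slope: FS = [c' + (gamma*z*cos^2(b) - u)*tan(phi')] / [gamma*z*sin(b)*cos(b)]."""
    b, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c_kpa + (gamma_kn_m3 * z_m * math.cos(b) ** 2 - u_kpa) * math.tan(phi)
    driving = gamma_kn_m3 * z_m * math.sin(b) * math.cos(b)
    return resisting / driving

# 1.5 m soil column on a 35-degree slope; pore pressure u rises after rainfall
print(round(factor_of_safety(5, 33, 19, 1.5, 35, u_kpa=0), 2))   # ~1.30: stable
print(round(factor_of_safety(5, 33, 19, 1.5, 35, u_kpa=10), 2))  # ~0.82: failure predicted
```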
LINEBACKER: LINE-speed Bio-inspired Analysis and Characterization for Event Recognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oehmen, Christopher S.; Bruillard, Paul J.; Matzke, Brett D.
2016-08-04
The cyber world is a complex domain, with digital systems mediating a wide spectrum of human and machine behaviors. While this is enabling a revolution in the way humans interact with each other and data, it is also exposing previously unreachable infrastructure to a worldwide set of actors. Existing signature-focused solutions for intrusion detection and prevention typically seek to detect anomalous and/or malicious activity for the sake of preventing or mitigating negative impacts. But a growing interest in behavior-based detection is driving new forms of analysis that move the emphasis from static indicators (e.g., rule-based alarms or tripwires) to behavioral indicators that accommodate a wider contextual perspective. Similar to cyber systems, biosystems have always existed in resource-constrained hostile environments where behaviors are tuned by context, so we look to biosystems as an inspiration for addressing behavior-based cyber challenges. In this paper, we introduce LINEBACKER, a behavior-model-based approach to recognizing anomalous events in network traffic, and present the design of this approach of bio-inspired and statistical models working in tandem to produce individualized alerting for a collection of systems. Preliminary results of these models operating on historic data are presented, along with a plugin to support real-world cyber operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Prescott, Steven; Coleman, Justin
This report describes the current progress and status of Industry Application #2, focusing on External Hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that will evaluate external hazards such as flooding and earthquakes in order to: identify, model and analyze the appropriate physics that needs to be included to determine plant vulnerabilities related to external events; manage the communication and interactions between different physics modeling and analysis technologies; and develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will couple the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.
Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G
2011-01-01
A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We propose a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with Semantic Web technology. The framework comprises four primary modules: 1) an XML2RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype was successfully implemented to demonstrate the capability of the system to integrate multiple drug data and ontology resources and open web services for ADE data standardization. A preliminary evaluation demonstrates the usefulness of the system, including the performance of the NCBO Annotator. In conclusion, Semantic Web technology provides a highly scalable framework for ADE data source integration and standard query services.
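As a rough illustration of the first module's role, an XML-to-RDF transformation can be sketched in Python with rdflib. The record layout and the adepedia namespace below are hypothetical, not the paper's actual schema.

import xml.etree.ElementTree as ET
from rdflib import Graph, Literal, Namespace, RDF

ADE = Namespace("http://example.org/adepedia/")   # hypothetical namespace
xml_record = "<ade><drug>warfarin</drug><event>hemorrhage</event></ade>"

root = ET.fromstring(xml_record)
g = Graph()
node = ADE["record/1"]
g.add((node, RDF.type, ADE.AdverseDrugEvent))
g.add((node, ADE.drug, Literal(root.findtext("drug"))))
g.add((node, ADE.event, Literal(root.findtext("event"))))
print(g.serialize(format="turtle"))

In the full framework, the resulting triples would then be normalized against drug ontologies and persisted in the RDF store.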
Investigation of persistent Multiplets at the EGS reservoir of Soultz-Sous-Forêts, France
NASA Astrophysics Data System (ADS)
Lengliné, O.; Cauchie, L.; Schmittbuhl, J.
2017-12-01
During the exploitation of geothermal reservoirs, abundant seismicity is generally observed, especially during phases of hydraulic stimulation. The induced seismicity at the Enhanced Geothermal System of Soultz-Sous-Forêts in France has been thoroughly studied over the years of exploitation. The mechanism at its origin has been related both to fluid pressure increases during stimulation and to aseismic creeping movements. The fluid-induced seismic events often exhibit a high degree of similarity, and the mechanism behind these repeated events is thought to be associated with a slow slip process in which asperities on the rupture zone act several times. To better understand the mechanisms associated with such events and the damaged zones involved during the hydraulic stimulations, we investigate the behavior of the multiplets and their persistent nature over several water injection intervals. For this purpose, we analyzed large datasets recorded by a borehole seismic network during several water injection periods (1993, 2000); for each stimulation interval, thousands of events were recorded at depth. We detected the events using an STA/LTA approach and classified them into families of comparable waveforms using cross-correlation analysis. The classification of the seismic events is then refined according to their location within the multiplets: inter-event distances within multiplets are determined from cross-correlation analysis between pairs of events and compared to the source dimensions derived from estimates of the corner frequencies. The multiplet properties (location, event size) are then investigated within and across several hydraulic tests. These steps should improve our knowledge of the repetitive nature of these events, and the investigation of their persistence will outline the heterogeneities of the structures (regional stress perturbations, fluid flow channeling) repeatedly involved in the different stimulations.
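A minimal Python sketch of the cross-correlation classification step might look as follows; the greedy single-link grouping against each family's first event and the 0.9 similarity threshold are simplifying assumptions, not the authors' exact procedure.

import numpy as np

def corrcoef_max(a, b):
    # Maximum (approximately) normalized cross-correlation over all lags.
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full").max()

def group_multiplets(waveforms, cc_min=0.9):
    families = []              # each family is a list of event indices
    for i, w in enumerate(waveforms):
        for fam in families:
            if corrcoef_max(waveforms[fam[0]], w) >= cc_min:
                fam.append(i)  # similar to the family's reference event
                break
        else:
            families.append([i])
    return families

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 20, 200))
events = [template + 0.05 * rng.standard_normal(200) for _ in range(5)]
events.append(rng.standard_normal(200))   # one unrelated event
print(group_multiplets(events))           # typically [[0, 1, 2, 3, 4], [5]]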
Pick- and waveform-based techniques for real-time detection of induced seismicity
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Scarabello, Luca; Böse, Maren; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2018-05-01
The monitoring of induced seismicity is a common operation in many industrial activities, such as conventional and non-conventional hydrocarbon production or mining and geothermal energy exploitation, to cite a few. During such operations, we generally collect very large and strongly noise-contaminated data sets that require robust and automated analysis procedures. Induced seismicity data sets are often characterized by sequences of multiple events with short interevent times or overlapping events; in these cases, pick-based location methods may struggle to correctly assign picks to phases and events, and errors can lead to missed detections and/or reduced location resolution and incorrect magnitudes, which can have significant consequences if real-time seismicity information is used in risk assessment frameworks. To overcome these issues, different waveform-based methods for the detection and location of microseismicity have been proposed. The main advantage of waveform-based methods is that they appear to perform better and can simultaneously detect and locate seismic events, providing high-quality locations in a single step; the main disadvantage is that they are computationally expensive. Although these methods have been applied to different induced seismicity data sets, an extensive comparison with sophisticated pick-based detection methods is still missing. In this work, we introduce our improved waveform-based detector and compare its performance with two pick-based detectors implemented within the SeisComP3 software suite. We test the performance of these three approaches on both synthetic and real data sets related to the induced seismicity sequence at the deep geothermal project in the vicinity of the city of St. Gallen, Switzerland.
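For reference, the classic STA/LTA trigger at the heart of many pick-based detectors fits in a few lines of Python; this is a toy version, and the SeisComP3 detectors compared in the paper are far more elaborate.

import numpy as np

def sta_lta(trace, n_sta, n_lta):
    # Ratio of short-term to long-term average energy, both in trailing
    # windows ending at the same sample; computed via cumulative sums.
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    return sta[n_lta - n_sta:] / (lta + 1e-12)

rng = np.random.default_rng(2)
trace = rng.standard_normal(2000)
trace[1200:1260] += 8 * np.sin(np.linspace(0, 30, 60))   # synthetic arrival
ratio = sta_lta(trace, n_sta=20, n_lta=200)
print("trigger near sample", int(np.argmax(ratio > 4)) + 200 - 1)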
Minian, Nadia; Noormohamed, Aliya; Zawertailo, Laurie; Baliunas, Dolly; Giesbrecht, Norman; Le Foll, Bernard; Rehm, Jürgen; Samokhvalov, Andriy; Selby, Peter L
2018-01-01
The purpose of this paper is to describe a patient engagement event designed to create an educational workbook with smokers who drink alcohol at harmful levels. The goal was to create a workbook that combined scientific evidence with patients' values, preferences, and needs. Fourteen adult smokers who drink alcohol were invited to the Centre for Addiction and Mental Health (CAMH) to take part in a four-hour event to help design the workbook with the CAMH research team. Participants provided their opinions and ideas to create an outline for the workbook, including activities, images, and titles. The workbook, called Self-Awareness, is currently being offered in a smoking cessation program in 221 primary care clinics across Ontario to help smokers quit or reduce their harmful alcohol use. The patient engagement event was a useful way to co-create educational materials that incorporate both scientific research and patient needs.
Background: Evidence-based medicine is the integration of best research evidence with clinical expertise and patient values. There are few methodologies on how to design evidence-based programs and resources to include patient values. The latter is an important aspect of patient-centered care, and is essential for patients to trust the recommendations and to empower them as consumers to make informed choices. This manuscript describes a participatory research approach to designing patient-facing educational materials that incorporate both evidence-based and community-sensitive principles. These materials are intended to support smokers to reduce or stop harmful alcohol consumption.
Methods: Adult smokers who report consuming alcohol were invited to a co-creation meeting at the Centre for Addiction and Mental Health's Nicotine Dependence Service to guide the adaptation of evidence-based materials. The four-hour event consisted of individual reflections, group discussions, and consensus-building interactions. Detailed notes were taken and then incorporated into the material.
Results: Fourteen individuals participated in the event. The end product was a descriptive outline of an educational resource, entitled Self-Awareness, incorporating material from evidence-based workbooks and patient-driven features. Participants collaboratively selected the resource's content, structure, and titles.
Conclusions: This model describes a participatory research method that emphasizes the value of the patient perspective; preliminary evidence finds this adaptation approach can increase the adoption of resources. The process described in this article could be replicated in other settings to co-create evidence-based resources, interventions, and programs that reflect the needs of the community.
Trial registration: ClinicalTrials.gov NCT03108144. Retrospectively registered 11 April 2017.
Avoiding your greatest fear--malpractice.
Coy, Kenneth; Stratton, Russell
2002-01-01
This article discusses ten clinically based behavioral approaches to minimizing the risk of a malpractice claim. Suggestions are stated in both a positive and negative way and ranked from least significant to most significant. Recommendations include the need to develop effective listening skills; learning to communicate with patients verbally and in writing; keeping patient expectations realistic; being thorough when examining and diagnosing; and knowing one's limitations. Also included is the need to inform patients concerning adverse events; keeping written records of what was said and done; discussing alternatives, risks, complications, and fees in advance; and developing a relationship with patients based on mutual respect and trust. Case examples are presented for each approach.
Toward a highly integrated probe for improving wireless network quality
NASA Astrophysics Data System (ADS)
Ding, Fei; Song, Aiguo; Wu, Zhenyang; Pan, Zhiwen; You, Xiaohu
2016-10-01
Quality of service and customer perception are the focus of the telecommunications industry. This paper proposes a low-cost approach to the acquisition of terminal data collected from LTE networks, based on a software probe ("soft probe") written in the Java language. The soft probe includes support for fast calls in the form of a referenced library and can be integrated into various Android-based applications to automatically monitor any exception event in the network. Soft-probe-based acquisition of terminal data has the advantage of low cost and can be applied at large scale. Experiments show that a soft probe can efficiently obtain terminal network data. With this method, the quality of service of LTE networks can be determined from the acquired wireless data. This work contributes to efficient network optimization and the analysis of abnormal network events.
Monitoring of waste disposal in deep geological formations
NASA Astrophysics Data System (ADS)
German, V.; Mansurov, V.
2003-04-01
This paper advances a kinetic approach to describing the rock-failure process and to microseismic monitoring of waste disposal sites. On the basis of a two-stage model of the failure process, the predictability of rock fracture is demonstrated. Requirements on the monitoring system, such as real-time registration and processing of data and its precision range, are formulated. A method for delineating failure nuclei in a rock mass is presented; it is implemented in a software program for forecasting strong seismic events and is based on direct use of the fracture concentration criterion. The method is applied to the microseismic event database of the North Ural Bauxite Mine. The results of this application, including efficiency, stability, and the possibility of forecasting rockbursts, are discussed.
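A toy Python version of a fracture-concentration computation of this kind might look as follows; the exact form of the criterion and the often-cited empirical threshold of about 3 are assumptions here, not taken from the paper.

import numpy as np

def concentration_parameter(xyz, source_sizes):
    # Characteristic inter-event spacing (from event density in the
    # monitored volume) divided by mean source size; small values mean
    # cracks are close enough, relative to their size, to coalesce.
    volume = np.prod(xyz.max(axis=0) - xyz.min(axis=0))
    spacing = (len(xyz) / volume) ** (-1.0 / 3.0)
    return spacing / np.mean(source_sizes)

rng = np.random.default_rng(3)
xyz = rng.uniform(0, 100, size=(500, 3))    # hypocentres in a 100 m cube
sizes = rng.uniform(1.0, 3.0, size=500)     # source radii [m]
K = concentration_parameter(xyz, sizes)
print("K =", round(K, 2), "(a failure nucleus would be suspected for small K)")

In a monitoring setting, the parameter would be evaluated in a moving window over space and time, so that failure nuclei show up as localized regions where K drops.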
Detection of dominant flow and abnormal events in surveillance video
NASA Astrophysics Data System (ADS)
Kwak, Sooyeong; Byun, Hyeran
2011-02-01
We propose an algorithm for abnormal event detection in surveillance video. The proposed algorithm is based on a semi-unsupervised learning method, a feature-based approach that does not detect and track each moving object individually. It identifies the dominant flow in crowded environments, without individual object tracking, using a latent Dirichlet allocation model, and it can automatically detect and localize an abnormally moving object in real-life video. Performance tests on several real-life databases show that the proposed algorithm can efficiently detect abnormally moving objects in real time. The algorithm can be applied to any situation in which abnormal directions or abnormal speeds must be detected.
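As a sketch of the general idea, motion can be quantized into "visual words" per clip and an LDA model can recover dominant-flow topics; the following scikit-learn example uses purely synthetic word counts and is not the authors' implementation.

import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Each "document" is a clip; each "word" is a quantized (grid cell,
# flow direction) pair, e.g. a 10x10 grid x 8 directions = 800 words.
n_vocab, n_clips = 800, 60
rng = np.random.default_rng(4)

modes = rng.dirichlet(np.full(n_vocab, 0.05), size=2)   # two flow patterns
counts = np.stack([rng.multinomial(300, modes[i % 2]) for i in range(n_clips)])

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.transform(counts).argmax(axis=1))   # recovered dominant flow per clip

A clip whose word histogram is unlikely under every learned topic would then be a candidate abnormal event.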
Chaos in Practice: Techniques for Career Counsellors
ERIC Educational Resources Information Center
Pryor, Robert G. L.; Bright, Jim
2005-01-01
The chaos theory of careers emphasises continual change, the centrality and importance of chance events, the potential of minor events to have disproportionately large impacts on subsequent events, and the capacity for dramatic phase shifts in career behaviour. This approach challenges traditional approaches to career counselling, assumptions…
Vertically Integrated Seismological Analysis I : Modeling
NASA Astrophysics Data System (ADS)
Russell, S.; Arora, N. S.; Jordan, M. I.; Sudderth, E.
2009-12-01
As part of its CTBT verification efforts, the International Data Centre (IDC) analyzes seismic and other signals collected from hundreds of stations around the world. Current processing at the IDC proceeds in a series of pipelined stages, from station processing to network processing, and each decision is made on the basis of local information. This has the advantage of efficiency and simplifies the structure of software implementations. However, this approach may reduce accuracy in the detection and phase classification of arrivals, the association of detections to hypothesized events, and the localization of small-magnitude events. In our work, we approach such detection and association problems as ones of probabilistic inference. In simple terms, let X be a random variable ranging over all possible collections of events, with each event defined by time, location, magnitude, and type (natural or man-made). Let Y range over all possible waveform signal recordings at all detection stations. Then Pθ(X) describes a parameterized generative prior over events, and Pφ(Y | X) describes how the signal is propagated and measured (including travel time, selective absorption and scattering, noise, artifacts, sensor bias, sensor failures, etc.). Given observed recordings Y = y, we are interested in the posterior P(X | Y = y), and perhaps in the value of X that maximizes it, i.e., the most likely explanation for all the sensor readings. As detailed below, an additional focus of our work is to robustly learn appropriate model parameters θ and φ from historical data. The primary advantage we expect is that decisions about arrivals, phase classifications, and associations are made with the benefit of all available evidence, not just the local signal or predefined recipes. Important phenomena, such as the successful detection of sub-threshold signals, the correction of phase classifications using arrival information at other stations, and the removal of false events based on the absence of signals, should all fall out of our probabilistic framework without the need for special processing rules. In our baseline model, natural events occur according to a spatially inhomogeneous Poisson process; complex events (swarms and aftershocks) may then be captured via temporally inhomogeneous extensions. Man-made events have a uniform probability of occurring anywhere on the earth, with a tendency to occur closer to the surface. Phases are modelled via their amplitude, frequency distribution, and origin. In the simplest case, transmission times are characterized via the one-dimensional IASPEI-91 model, accounting for model errors with Gaussian uncertainty. Such homogeneous, approximate physical models can be further refined via historical data and previously developed corrections. Signal measurements are captured by station-specific models, based on sensor types and geometries, local frequency absorption characteristics, and time-varying noise models. At the conference, we expect to be able to quantitatively demonstrate the advantages of our approach, at least for simulated data. When reporting their findings, such systems can easily flag low-confidence events, unexplained arrivals, and ambiguous classifications to focus the efforts of expert analysts.
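A drastically reduced Python sketch of this generative view, with a single event, a uniform location prior, a constant-velocity travel-time model in place of IASPEI-91, Gaussian measurement error, and a known origin time, might look as follows (all values invented):

import numpy as np

stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
v, sigma = 6.0, 0.3          # wave speed [km/s], travel-time error [s]
true_event = np.array([62.0, 41.0])

rng = np.random.default_rng(5)
obs = np.linalg.norm(stations - true_event, axis=1) / v + rng.normal(0, sigma, 4)

# Posterior on a grid: uniform prior Ptheta(X) times the Gaussian
# likelihood Pphi(Y | X) of the four observed arrival times.
xs = np.linspace(0, 100, 201)
gx, gy = np.meshgrid(xs, xs)
grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
pred = np.linalg.norm(grid[:, None, :] - stations[None, :, :], axis=2) / v
loglik = -0.5 * (((obs - pred) / sigma) ** 2).sum(axis=1)
post = np.exp(loglik - loglik.max())
print("MAP location:", grid[post.argmax()], "true:", true_event)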
Acquisition of Geometrical Data of Small Rivers with an Unmanned Water Vehicle
NASA Astrophysics Data System (ADS)
Sardemann, H.; Eltner, A.; Maas, H.-G.
2018-05-01
Rivers with small and medium-scale catchments have increasingly been affected by extreme events, i.e. flash floods, in recent years. New methods to describe and predict these events are being developed in the interdisciplinary research project EXTRUSO. Flash flood events happen on small temporal and spatial scales, stressing the necessity of high-resolution input data for hydrological and hydrodynamic modelling. Among others, the benefit of high-resolution digital terrain models (DTMs) will be evaluated in the project. This article introduces a boat-based approach for the acquisition of geometrical and morphological data of small rivers and their banks. An unmanned water vehicle (UWV) is used as a multi-sensor platform to collect 3D point clouds of the riverbanks, as well as bathymetric measurements of water depth and river morphology. The UWV is equipped with a mobile lidar, a panorama camera, an echo sounder, and a positioning unit. Whole (sub-)catchments of small rivers can be digitized and provided for hydrological modelling when UWV-based and UAV (unmanned aerial vehicle) based point clouds are fused.
Intensity-Duration-Frequency Curves for U.S. Cities in a Warming Climate
NASA Astrophysics Data System (ADS)
Ragno, Elisa; AghaKouchak, Amir; Love, Charlotte; Vahedifard, Farshid; Cheng, Linyin; Lima, Carlos
2017-04-01
Current infrastructure design procedures rely on Intensity-Duration-Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that the occurrence of extreme events is expected to be time invariant. However, numerous studies have observed more severe extreme events over time, so the stationarity assumption may not be appropriate for extreme-value analysis in a warming climate. This issue raises concerns regarding the safety and resilience of infrastructure and natural slopes. Here we employ daily precipitation data from historical and projected (RCP 8.5) CMIP5 runs to investigate the IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on future climate model projections. We show that, based on CMIP5 simulations, U.S. cities may experience extreme precipitation events up to 20% more intense and twice as frequent, relative to historical records, despite the expectation of unchanged annual mean precipitation.
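For a flavor of the underlying extreme-value machinery, a stationary GEV fit to annual precipitation maxima and the implied 100-year return level can be sketched with scipy; the data here are synthetic, and the paper's non-stationary Bayesian analysis is substantially more involved.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(6)
# 60 synthetic annual daily-precipitation maxima [mm]
annual_max = genextreme.rvs(c=-0.1, loc=50, scale=12, size=60, random_state=rng)

# Fit the GEV (note: scipy's shape c equals -xi in the usual convention)
# and read off the T-year return level as the (1 - 1/T) quantile.
shape, loc, scale = genextreme.fit(annual_max)
level_100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(f"estimated 100-year daily precipitation: {level_100:.1f} mm")

A non-stationary variant would let loc and scale depend on time or a climate covariate, which is where the Bayesian treatment becomes necessary.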
NASA Astrophysics Data System (ADS)
Obozov, A. A.; Serpik, I. N.; Mihalchenko, G. S.; Fedyaeva, G. A.
2017-01-01
This article examines the application of pattern recognition (a relatively young area of engineering cybernetics) to the analysis of complicated technical systems. It is shown that a statistical approach can be the most effective for situations that are hard to distinguish. The various recognition algorithms are based on the Bayes approach, which estimates the posterior probability of a certain event and the associated error. The statistical approach to pattern recognition can be applied to the technical diagnosis of complicated systems, in particular high-powered marine diesel engines.
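A minimal Python sketch of such a Bayes posterior computation for one diagnostic feature might look as follows; the engine states, the feature, and all numbers are invented for illustration.

import numpy as np

# Two hypothetical engine states described by a single diagnostic
# feature (e.g. peak cylinder pressure [bar]), modeled as Gaussians.
classes = {"normal": (165.0, 4.0), "injector fault": (150.0, 6.0)}  # mean, std
priors = {"normal": 0.95, "injector fault": 0.05}

def gauss(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def posterior(x):
    # Bayes rule: P(class | x) is proportional to P(x | class) P(class).
    joint = {c: gauss(x, mu, sd) * priors[c] for c, (mu, sd) in classes.items()}
    z = sum(joint.values())
    return {c: v / z for c, v in joint.items()}

print(posterior(155.0))   # posterior probabilities of the two states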
NASA Astrophysics Data System (ADS)
Vater, Stefan; Behrens, Jörn
2017-04-01
Simulations of historic tsunami events such as the 2004 Sumatra or the 2011 Tohoku event are usually initialized using earthquake sources obtained from inversion of seismic data; other data from ocean buoys etc. are sometimes also included in the derivation of the source model. The associated tsunami event can often be well simulated in this way, and the results show high correlation with measured data. However, it is unclear how the derived source model compares to the particular earthquake event. In this study we use the results of dynamic rupture simulations obtained with SeisSol, a software package based on an ADER-DG discretization that solves the spontaneous dynamic earthquake rupture problem with high-order accuracy in space and time. The tsunami model is based on a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme on triangular grids and features a robust wetting and drying scheme for the simulation of inundation events at the coast. Adaptive mesh refinement enables the efficient computation of large domains while allowing high local resolution and geometric accuracy. The results are compared to measured data and to results obtained using earthquake sources based on inversion. By using the output of actual dynamic rupture simulations, we can estimate the influence of different earthquake parameters. Furthermore, the comparison to other source models enables a thorough comparison and validation of important tsunami parameters, such as the runup at the coast. This work is part of the ASCETE (Advanced Simulation of Coupled Earthquake and Tsunami Events) project, which aims at an improved understanding of the coupling between an earthquake and the generated tsunami event.
Very early warning of next El Niño.
Ludescher, Josef; Gozolchiani, Avi; Bogachev, Mikhail I; Bunde, Armin; Havlin, Shlomo; Schellnhuber, Hans Joachim
2014-02-11
The most important driver of climate variability is the El Niño Southern Oscillation, which can trigger disasters in various parts of the globe. Despite its importance, conventional forecasting is still limited to about 6 months ahead. Recently, we developed an approach based on network analysis that allows projection of an El Niño event about 1 year ahead. Here we show that our method correctly predicted the absence of El Niño events in 2012 and 2013, and we announce that our approach indicated (already in September 2013) the return of El Niño in late 2014 with a 3-in-4 likelihood. We also discuss the relevance of the next El Niño to the question of global warming and the present hiatus in the global mean surface temperature.
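A toy Python sketch of the climate-network ingredient, a lagged cross-correlation "link strength" between two temperature grid nodes, might look as follows; the series are synthetic, and the published algorithm's windowing and thresholds differ.

import numpy as np

def link_strength(x, y, max_lag=200):
    # (max - mean) / std of the lagged cross-correlation curve: large
    # values indicate a strong, well-defined teleconnection link.
    cc = np.array([np.corrcoef(np.roll(x, k), y)[0, 1]
                   for k in range(-max_lag, max_lag + 1)])
    return (cc.max() - cc.mean()) / cc.std()

rng = np.random.default_rng(7)
common = rng.standard_normal(1000)                   # shared tropical signal
node_a = common + 0.5 * rng.standard_normal(1000)
node_b = np.roll(common, 30) + 0.5 * rng.standard_normal(1000)  # delayed copy
print(link_strength(node_a, node_b))

In the published scheme, an alarm is raised when the mean link strength between the El Niño basin and the rest of the equatorial Pacific crosses a decision threshold.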
NASA Astrophysics Data System (ADS)
Aad, G.; Abajyan, T.; Abbott, B.; Abdallah, J.; et al. (ATLAS Collaboration)
2014-08-01
Distributions sensitive to the underlying event in QCD jet events have been measured with the ATLAS detector at the LHC, based on 37 pb-1 of proton-proton collision data collected at a centre-of-mass energy of 7 TeV. Charged-particle mean pT and densities of all-particle ΣET and charged-particle multiplicity and ΣpT have been measured in regions azimuthally transverse to the hardest jet in each event. These are presented both as one-dimensional distributions and with their mean values as functions of the leading-jet transverse momentum from 20 GeV to 800 GeV. The correlation of charged-particle mean pT with charged-particle multiplicity is also studied, and the ΣET densities include the forward rapidity region; these features provide extra data constraints for Monte Carlo modelling of colour reconnection and beam-remnant effects respectively. For the first time, underlying event observables have been computed separately for inclusive jet and exclusive dijet event selections, allowing more detailed study of the interplay of multiple partonic scattering and QCD radiation contributions to the underlying event. Comparisons to the predictions of different Monte Carlo models show a need for further model tuning, but the standard approach is found to generally reproduce the features of the underlying event in both types of event selection.
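To illustrate the event-by-event construction the abstract describes, the following is a minimal Python sketch of the conventional underlying-event "transverse region" selection: charged particles whose azimuthal separation from the leading jet lies between 60 and 120 degrees. The angular window (pi/3 < |dphi| < 2pi/3), the |eta| < 2.5 acceptance, and all names (transverse_region_observables, the (pt, eta, phi) input format) are illustrative assumptions based on the standard convention, not the actual ATLAS analysis code.

    import math

    def delta_phi(phi1, phi2):
        """Wrap the azimuthal difference into [-pi, pi)."""
        return (phi1 - phi2 + math.pi) % (2.0 * math.pi) - math.pi

    def transverse_region_observables(particles, leading_jet_phi, eta_max=2.5):
        """Charged-particle number density (per unit eta-phi area) and
        mean pT in the transverse region, for a single event.

        `particles` is an iterable of (pt, eta, phi) tuples for charged
        particles passing the analysis selection (hypothetical format).
        """
        # Keep particles in the two azimuthal wedges transverse to the
        # leading jet: 60 deg < |dphi| < 120 deg.
        transverse = [
            (pt, eta, phi) for (pt, eta, phi) in particles
            if abs(eta) < eta_max
            and math.pi / 3.0 < abs(delta_phi(phi, leading_jet_phi)) < 2.0 * math.pi / 3.0
        ]
        # Total eta-phi area of both wedges: (2 * eta_max) * (2 * pi/3).
        area = 2.0 * eta_max * (2.0 * math.pi / 3.0)
        n_density = len(transverse) / area
        mean_pt = (sum(pt for (pt, _, _) in transverse) / len(transverse)
                   if transverse else 0.0)
        return n_density, mean_pt

Averaging such per-event quantities in bins of leading-jet transverse momentum would yield profiles of the kind reported in the measurement; the ΣpT and ΣET densities follow analogously by summing momenta or energies over the same region and dividing by the same area.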