Why Clinicians Don't Report Adverse Drug Events: Qualitative Study.
Hohl, Corinne M; Small, Serena S; Peddie, David; Badke, Katherin; Bailey, Chantelle; Balka, Ellen
2018-02-27
Adverse drug events are unintended and harmful events related to medications. Data on adverse drug events are important for patient care, quality improvement, drug safety research, and postmarketing surveillance, but these events are vastly underreported. Our objectives were to identify barriers to adverse drug event documentation and factors contributing to underreporting. This qualitative study was conducted in 1 ambulatory center and the emergency departments and inpatient wards of 3 acute care hospitals in British Columbia between March 2014 and December 2016. We completed workplace observations and focus groups with general practitioners, hospitalists, emergency physicians, and hospital and community pharmacists. We analyzed field notes by coding and iteratively analyzing our data to identify emerging concepts, generate thematic and event summaries, and create workflow diagrams. Clinicians validated emerging concepts by applying them to cases from their clinical practice. We completed 238 hours of observations during which clinicians investigated 65 suspect adverse drug events. The observed events were often complex and diagnosed over time, requiring the input of multiple providers. Providers documented adverse drug events in charts to support continuity of care but never reported them to external agencies. Providers faced time constraints, and reporting would have required duplication of documentation. Existing reporting systems are not suited to capturing the complex nature of adverse drug events, are not adapted to clinical workflow, and are simply not used by frontline clinicians. Systems that are integrated into electronic medical records, make use of existing data to avoid duplication of documentation, and generate alerts to improve safety may address the shortcomings of existing systems and generate robust adverse drug event data as a by-product of safer care.
Bounding the Resource Availability of Partially Ordered Events with Constant Resource Impact
NASA Technical Reports Server (NTRS)
Frank, Jeremy
2004-01-01
We compare existing techniques to bound the resource availability of partially ordered events. We first show that, contrary to intuition, two existing techniques, one due to Laborie and one due to Muscettola, are not strictly comparable in terms of the size of the search trees generated under chronological search with a fixed heuristic. We describe a generalization of these techniques, called the Flow Balance Constraint, that tightly bounds the amount of available resource for a set of partially ordered events with piecewise constant resource impact. We prove that the new technique generates smaller proof trees under chronological search with a fixed heuristic, at little increase in computational expense. We then show how to construct tighter resource bounds at increased computational cost.
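To make the flavor of such bounds concrete, here is a minimal sketch in Python (the language used for all code examples added to this listing). It is not Laborie's or Muscettola's technique, nor the Flow Balance Constraint itself, only the generic idea they refine: an optimistic upper bound on the resource level after an event counts every event necessarily before it plus every unordered producer. The data structures and example values are invented.

```python
# Hedged sketch (not the paper's algorithm): an optimistic upper bound on the
# resource level just after event `e`, given a partial order `before[x]` = set
# of events that must precede x, and constant resource impacts.

def optimistic_upper_bound(e, events, impact, before, initial=0):
    """Upper bound on resource availability after `e` over all linearizations."""
    level = initial + impact[e]
    for x in events:
        if x == e:
            continue
        if e in before[x]:          # x is necessarily after e: cannot contribute
            continue
        if x in before[e]:          # x is necessarily before e: must contribute
            level += impact[x]
        elif impact[x] > 0:         # unordered w.r.t. e: count only if it helps
            level += impact[x]
    return level

events = ["a", "b", "c"]
impact = {"a": +2, "b": -1, "c": +3}           # constant resource impacts
before = {"a": set(), "b": {"a"}, "c": set()}  # b must follow a; c unordered
print(optimistic_upper_bound("b", events, impact, before, initial=1))  # -> 5
```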
Expectations and Interpretations during Causal Learning
ERIC Educational Resources Information Center
Luhmann, Christian C.; Ahn, Woo-kyoung
2011-01-01
In existing models of causal induction, 4 types of covariation information (i.e., presence/absence of an event followed by presence/absence of another event) always exert identical influences on causal strength judgments (e.g., joint presence of events always suggests a generative causal relationship). In contrast, we suggest that, due to…
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by system components during their execution or interaction with external objects (e.g., users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism is an intrinsic component of the monitoring architecture that reduces the volume of event traffic flow in the system, and thereby the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application to obtain debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work makes a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture improves key aspects of event filtering.
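The core filtering idea, in which subscribers register predicates and only matching events are forwarded toward management applications, can be sketched as below. This is a toy illustration; the paper's distributed filter placement, reconfiguration, and instrumentation interfaces are not modeled, and the event fields are invented.

```python
# Minimal content-based event filter: subscribers register predicates, and
# only matching events are forwarded, reducing event traffic toward
# management applications such as debuggers.

class EventFilter:
    def __init__(self):
        self.subscriptions = []   # (predicate, callback) pairs

    def subscribe(self, predicate, callback):
        self.subscriptions.append((predicate, callback))

    def publish(self, event):
        for predicate, callback in self.subscriptions:
            if predicate(event):
                callback(event)   # forward only to interested consumers

f = EventFilter()
f.subscribe(lambda e: e["type"] == "error" and e["node"] == "n7",
            lambda e: print("debugger notified:", e))
f.publish({"type": "error", "node": "n7", "msg": "timeout"})   # forwarded
f.publish({"type": "heartbeat", "node": "n3"})                 # filtered out
```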
Women Faculty, Professional Identity, and Generational Disposition
ERIC Educational Resources Information Center
Marine, Susan B.; Martínez Alemán, Ana M.
2018-01-01
In an exploratory qualitative study, the generational dispositions of tenured women faculty from the Boomer Generation were examined. As pioneers and now senior members in the academic profession in the Golden Era of American higher education, they exist in a common historical location characterized by cultural forces and events that helped to…
An Improved Forwarding of Diverse Events with Mobile Sinks in Underwater Wireless Sensor Networks.
Raza, Waseem; Arshad, Farzana; Ahmed, Imran; Abdul, Wadood; Ghouzali, Sanaa; Niaz, Iftikhar Azim; Javaid, Nadeem
2016-11-04
In this paper, a novel routing strategy to address the energy consumption and delay sensitivity issues in deep underwater wireless sensor networks is proposed. This strategy is named ESDR: Event Segregation based Delay sensitive Routing. In this strategy, sensed events are segregated on the basis of their criticality and are forwarded to their respective destinations based on forwarding functions. These functions depend on different routing metrics: Signal Quality Index, Localization-free Signal to Noise Ratio, Energy Cost Function and Depth Dependent Function. The incomparable values of the previously defined forwarding functions cause uneven delays in the forwarding process; hence, the forwarding functions are redefined to ensure comparable values in different depth regions. The packet forwarding strategy is based on the event segregation approach, which forwards one third of the generated events (delay-sensitive events) to surface sinks, while the remaining two thirds (normal events) are forwarded to mobile sinks. The motion of mobile sinks is influenced by the relative distribution of normal nodes. We have also incorporated two different mobility patterns for mobile sinks: adaptive mobility and uniform mobility. The latter is implemented for collecting the packets generated by the normal nodes. These improvements ensure optimum holding time, uniform delay and in-time reporting of delay-sensitive events. The scheme is compared with existing schemes and outperforms them in terms of network lifetime, delay and throughput.
NASA Astrophysics Data System (ADS)
Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph
2018-07-01
To simulate the impacts of within-storm rainfall variability on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to the limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve the rainfall event statistics which are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities, as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last interval of a precipitation event to contain precipitation and (b) continuous sigmoid functions of the multiplicative weights to consider the scale dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated for its ability to disaggregate observed precipitation events in comparison to existing MRC models. For that, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed pronouncedly better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events). The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
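A bare-bones illustration of one microcanonical cascade disaggregation step may help: each interval's depth is split by multiplicative weights w and 1 - w, so event depth is conserved exactly, and the first and last intervals of an event are crudely kept wet, echoing (but not reproducing) the paper's constrained cascade. The weight distribution and the wetness bounds here are placeholders, not the calibrated model.

```python
# Simplified microcanonical random cascade disaggregation (illustrative only;
# the authors' calibrated weights and sigmoid scale-dependence are omitted).
import random

def disaggregate(depths, levels):
    """Halve each interval `levels` times; weights w and 1-w conserve the
    event depth exactly (the microcanonical property)."""
    for _ in range(levels):
        finer = []
        for i, d in enumerate(depths):
            if d == 0:
                finer += [0.0, 0.0]
                continue
            w = random.uniform(0.0, 1.0)
            # crude constraint: keep the event's first and last interval wet
            if i == 0:
                w = max(w, 0.05)
            if i == len(depths) - 1:
                w = min(w, 0.95)
            finer += [w * d, (1.0 - w) * d]
        depths = finer
    return depths

event = [12.0]                 # a single event totalling 12 mm
print(disaggregate(event, 3))  # eight sub-intervals still summing to 12 mm
```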
Sensor-Generated Time Series Events: A Definition Language
Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan
2012-01-01
There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an event definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device called a posturograph is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
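As a conceptual sketch only (the paper's language is far richer and domain-independent), an event definition can be thought of as a predicate over samples, with event instances being the maximal runs where the predicate holds:

```python
# Toy event locator: an "event" is a maximal run of samples satisfying a
# user-supplied predicate. Series values and the threshold are invented.

def find_events(series, predicate):
    events, start = [], None
    for i, x in enumerate(series):
        if predicate(x) and start is None:
            start = i                      # event opens
        elif not predicate(x) and start is not None:
            events.append((start, i - 1))  # event closes
            start = None
    if start is not None:
        events.append((start, len(series) - 1))
    return events

pressure = [0.1, 0.2, 1.5, 1.8, 1.7, 0.3, 0.1, 2.2, 2.0, 0.2]
print(find_events(pressure, lambda x: x > 1.0))  # [(2, 4), (7, 8)]
```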
Metusalem, Ross; Kutas, Marta; Urbach, Thomas P; Elman, Jeffrey L
2016-04-01
During incremental language comprehension, the brain activates knowledge of described events, including knowledge elements that constitute semantic anomalies in their linguistic context. The present study investigates hemispheric asymmetries in this process, with the aim of advancing our understanding of the neural basis and functional properties of event knowledge activation during incremental comprehension. In a visual half-field event-related brain potential (ERP) experiment, participants read brief discourses in which the third sentence contained a word that was either highly expected, semantically anomalous but related to the described event (Event-Related), or semantically anomalous but unrelated to the described event (Event-Unrelated). For both visual fields of target word presentation, semantically anomalous words elicited N400 ERP components of greater amplitude than did expected words. Crucially, Event-Related anomalous words elicited a reduced N400 relative to Event-Unrelated anomalous words only with left visual field/right hemisphere presentation. This result suggests that right hemisphere processes are critical to the activation of event knowledge elements that violate the linguistic context, and in doing so informs existing theories of hemispheric asymmetries in semantic processing during language comprehension. Additionally, this finding coincides with past research suggesting a crucial role for the right hemisphere in elaborative inference generation, raises interesting questions regarding hemispheric coordination in generating event-specific linguistic expectancies, and more generally highlights the possibility of functional dissociation of event knowledge activation for the generation of elaborative inferences and for linguistic expectancies.
The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next-generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Act...
1984-08-01
Mississippi River. In the event that the existing licensee, Ford Motor Company, or another non-Federal entity does not apply to FERC for rights to... interests. The existing hydropower plant and equipment are owned and operated by the Ford Motor Company. The existing four turbines generate a combined... Turbines (Ford Motor Company); Right (West) Abutment; Preliminary Screening of Alternatives; Conclusions of the Preliminary Comparative Review
Courses of action for effects based operations using evolutionary algorithms
NASA Astrophysics Data System (ADS)
Haider, Sajjad; Levis, Alexander H.
2006-05-01
This paper presents an Evolutionary Algorithms (EAs) based approach to identify effective courses of action (COAs) in Effects Based Operations. The approach uses Timed Influence Nets (TINs) as the underlying mathematical model to capture a dynamic uncertain situation. TINs provide a concise graph-theoretic probabilistic approach to specify the cause and effect relationships that exist among the variables of interest (actions, desired effects, and other uncertain events) in a problem domain. The purpose of building these TIN models is to identify and analyze several alternative courses of action. The current practice is to use trial-and-error-based techniques, which are not only labor intensive but also produce sub-optimal results and are not capable of modeling constraints among actionable events. The EA-based approach presented in this paper aims to overcome these limitations. The approach generates multiple COAs that are close to one another in terms of achieving the desired effect, the purpose being to give several alternatives to a decision maker. Moreover, the alternative COAs can be generalized based on the relationships that exist among the actions and their execution timings. The approach also allows a system analyst to capture certain types of constraints among actionable events.
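The evolutionary search can be sketched generically as follows; the Timed Influence Net evaluation is replaced by a toy fitness function (`effect_probability`) whose penalty term mimics a constraint between two actionable events. The encoding, operators, and parameters are illustrative assumptions, not the authors' implementation.

```python
# Generic GA over courses of action: a COA is a bit string saying which
# actionable events to execute; `effect_probability` stands in for real
# Timed Influence Net propagation.
import random

def effect_probability(coa):            # placeholder for TIN inference
    # toy trade-off: actions help, but actions 0 and 1 conflict
    return sum(coa) / len(coa) - 0.3 * (coa[0] and coa[1])

def evolve(n_actions=6, pop_size=20, generations=40):
    pop = [[random.randint(0, 1) for _ in range(n_actions)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=effect_probability, reverse=True)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_actions)     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                # bit-flip mutation
                i = random.randrange(n_actions)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    # return several near-equivalent COAs, as the paper advocates
    return sorted(pop, key=effect_probability, reverse=True)[:3]

for coa in evolve():
    print(coa, round(effect_probability(coa), 3))
```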
Social anxiety and interpersonal stress generation: the moderating role of interpersonal distress.
Siegel, David M; Burke, Taylor A; Hamilton, Jessica L; Piccirillo, Marilyn L; Scharff, Adela; Alloy, Lauren B
2018-06-01
Existing models of social anxiety scarcely account for interpersonal stress generation. These models also seldom include interpersonal factors that compound the effects of social anxiety. Given recent findings that two forms of interpersonal distress, perceived burdensomeness and thwarted belongingness, intensify social anxiety and cause interpersonal stress generation, these two constructs may be especially relevant to examining social anxiety and interpersonal stress generation together. The current study extended prior research by examining the role of social anxiety in the occurrence of negative and positive interpersonal events and evaluated whether interpersonal distress moderated these associations. Undergraduate students (N = 243; M = 20.46 years; 83% female) completed self-report measures of social anxiety, perceived burdensomeness, and thwarted belongingness, as well as a self-report measure and clinician-rated interview assessing negative and positive interpersonal events that occurred over the past six weeks. Higher levels of social anxiety were associated only with a higher occurrence of negative interpersonal dependent events, after controlling for depressive symptoms. This relationship was stronger among individuals who also reported higher levels of perceived burdensomeness, but not thwarted belongingness. It may be important to more strongly consider interpersonal stress generation in models of social anxiety.
Climate Change Impacts on Runoff Generation for the Design of Sustainable Stormwater Infrastructure
DOT National Transportation Integrated Search
2011-06-01
Climate change over the Pacific Northwest is expected to alter the hydrological cycle, with an increase in winter flooding potential due to more precipitation falling as rain rather than snow and more frequent rain-on-snow events. Existing infrastructure for storm...
AER synthetic generation in hardware for bio-inspired spiking systems
NASA Astrophysics Data System (ADS)
Linares-Barranco, Alejandro; Linares-Barranco, Bernabe; Jimenez-Moreno, Gabriel; Civit-Balcells, Anton
2005-06-01
Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate 'events' according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. This paper addresses the problem of converting, in a computer, a conventional frame-based video stream into the spike-event-based representation AER. Several software methods for synthetic generation of AER for bio-inspired systems have been proposed. This paper presents a hardware implementation of one such method, based on Linear-Feedback-Shift-Register (LFSR) pseudo-random number generation. The sequence of events generated by this hardware, which follows a Poisson distribution like a biological neuron, has been reconstructed using two AER integrator cells. The reconstruction error for a set of images that produce different event traffic loads on the AER bus is used as the evaluation criterion. A VHDL description of the method, which includes the Xilinx PCI Core, has been implemented and tested using a general-purpose PCI-AER board. This PCI-AER board has been developed by the authors and uses a Spartan II 200 FPGA. The system for AER synthetic generation is capable of transforming frames of 64x64 pixels, received through a standard computer PCI bus, at a frame rate of 25 frames per second, producing spike events at a peak rate of 10^7 events per second.
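A software analogue of the LFSR-based synthetic generation idea is sketched below: a 16-bit Galois LFSR serves as the hardware-friendly pseudo-random source, and each pixel's intensity sets the per-slot probability of emitting a spike event for that address, yielding Poisson-like trains whose rates track intensity. Addresses, slot counts, and the frame format are illustrative, not the PCI-AER board's actual interface.

```python
# LFSR-driven synthetic AER generation (conceptual software model).

def lfsr16(seed=0xACE1):
    """16-bit maximal-length Galois LFSR, taps 16,14,13,11 (mask 0xB400)."""
    state = seed
    while True:
        lsb = state & 1
        state >>= 1
        if lsb:
            state ^= 0xB400
        yield state

def frame_to_events(frame, slots=100):
    """frame: dict address -> intensity in [0, 1). Returns (slot, address)
    spike events; brighter pixels fire more often (Poisson-like trains)."""
    rng = lfsr16()
    events = []
    for t in range(slots):
        for addr, intensity in frame.items():
            if next(rng) / 65536.0 < intensity:
                events.append((t, addr))
    return events

frame = {(3, 5): 0.8, (10, 12): 0.1}   # two pixels of a 64x64 frame
ev = frame_to_events(frame)
print(len([e for e in ev if e[1] == (3, 5)]), "vs",
      len([e for e in ev if e[1] == (10, 12)]))   # roughly 8:1 event ratio
```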
Extreme Quantum Memory Advantage for Rare-Event Sampling
NASA Astrophysics Data System (ADS)
Aghamohammadi, Cina; Loomis, Samuel P.; Mahoney, John R.; Crutchfield, James P.
2018-02-01
We introduce a quantum algorithm for memory-efficient biased sampling of rare events generated by classical memoryful stochastic processes. Two efficiency metrics are used to compare quantum and classical resources for rare-event sampling. For a fixed stochastic process, the first is the classical-to-quantum ratio of required memory. We show for two example processes that there exists an infinite number of rare-event classes for which the memory ratio for sampling is larger than r, for any large real number r. Then, for a sequence of processes each labeled by an integer size N, we compare how the classical and quantum required memories scale with N. In this setting, since both memories can diverge as N → ∞, the efficiency metric tracks how fast they diverge. An extreme quantum memory advantage exists when the classical memory diverges in the limit N → ∞, but the quantum memory has a finite bound. We then show that finite-state Markov processes and spin chains exhibit memory advantage for sampling of almost all of their rare-event classes.
Search for 1st Generation Leptoquarks in the eejj channel with the DZero experiment (in French)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barfuss, Anne-Fleur
2008-09-12
Evidence of the existence of leptoquarks (LQ) would prove the validity of various extensions of the Standard Model of Particle Physics (SM). The search for first generation leptoquarks presented in this dissertation has been performed by analyzing a 1.02 fb^-1 sample of data collected by the D0 detector, selecting events with a final state comprising two light jets and two electrons. The absence of an excess of events in comparison to SM expectations leads to the exclusion of scalar LQ masses up to 292 GeV and vector LQ masses from 350 to 458 GeV, depending on the LQ-l-q coupling type. The great importance of a good jet energy measurement motivated the study of instrumental backgrounds related to the calorimeter, as well as studies of the hadronic shower energy resolution in γ + jets events.
NASA Astrophysics Data System (ADS)
Ushio, Toshimitsu; Takai, Shigemasa
Supervisory control is a general framework for the logical control of discrete event systems. A supervisor assigns a set of disabled controllable events based on observed events so that the controlled discrete event system generates specified languages. In conventional supervisory control, it is assumed that observed events are determined deterministically by internal events. But this assumption does not hold in a discrete event system with sensor errors or in a mobile system, where each observed event depends not only on an internal event but also on the state just before the occurrence of the internal event. In this paper, we model such a discrete event system by a Mealy automaton with a nondeterministic output function. We introduce two kinds of supervisors: one assigns each control action based on a permissive policy and the other based on an anti-permissive one. We show necessary and sufficient conditions for the existence of each supervisor. Moreover, we discuss the relationship between the supervisors in the case that the output function is deterministic.
Enabling Resiliency Operations across Multiple Microgrids with Grid Friendly Appliance Controllers
Schneider, Kevin P.; Tuffner, Frank K.; Elizondo, Marcelo A.; ...
2017-02-16
Changes in economic, technological, and environmental policies are resulting in a re-evaluation of the dependence on large central generation facilities and their associated transmission networks. Emerging concepts of smart communities/cities are examining the potential to leverage cleaner sources of generation, as well as integrating electricity generation with other municipal functions. When grid connected, these generation assets can supplement the existing interconnections with the bulk transmission system, and in the event of an extreme event, they can provide power via a collection of microgrids. To achieve the highest level of resiliency, it may be necessary to conduct switching operations to interconnect individual microgrids. While the interconnection of multiple microgrids can increase the resiliency of the system, the associated switching operations can cause large transients in low inertia microgrids. The combination of low system inertia and IEEE 1547 and 1547a-compliant inverters can prevent multiple microgrids from being interconnected during extreme weather events. This study will present a method of using end-use loads equipped with Grid Friendly™ Appliance controllers to facilitate the switching operations between multiple microgrids; operations that are necessary for optimal operations when islanded for resiliency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duchkov, A. A., E-mail: DuchkovAA@ipgg.sbras.ru; Novosibirsk State University, Novosibirsk, 630090; Stefanov, Yu. P., E-mail: stefanov@ispms.tsc.ru
2015-10-27
We have developed and illustrated an approach for geomechanical modeling of elastic wave generation (microseismic event occurrence) during incremental fracture growth. We then derived the properties of effective point seismic sources (radiation patterns) approximating the obtained wavefields. These results establish a connection between geomechanical models of hydraulic fracturing and microseismic monitoring. Thus, the results of the moment tensor inversion of microseismic data can be related to different geomechanical scenarios of hydraulic fracture growth. In the future, the results can be used for calibrating hydrofrac models. We carried out a series of numerical simulations and made some observations about wave generation during fracture growth. In particular, when the growing fracture hits a pre-existing crack, it generates a much stronger microseismic event than fracture growth in a homogeneous medium does (the radiation pattern is very close to the theoretical dipole-type source mechanism).
Stange, Jonathan P.; Kleiman, Evan M.; Hamlat, Elissa J.; Abramson, Lyn Y.; Alloy, Lauren B.
2013-01-01
Early pubertal timing has been found to confer risk for the occurrence of interpersonal stressful events during adolescence. However, pre-existing vulnerabilities may exacerbate the effects of early pubertal timing on the occurrence of stressors. Thus, the current study prospectively examined whether cognitive vulnerabilities amplified the effects of early pubertal timing on interpersonal stress generation. In a diverse sample of 310 adolescents (M age = 12.83 years, 55 % female; 53 % African American), early pubertal timing predicted higher levels of interpersonal dependent events among adolescents with more negative cognitive style and rumination, but not among adolescents with lower levels of these cognitive vulnerabilities. These findings suggest that cognitive vulnerabilities may heighten the risk of generating interpersonal stress for adolescents who undergo early pubertal maturation, which may subsequently place adolescents at greater risk for the development of psychopathology.
NASA Astrophysics Data System (ADS)
Mileo, Nicolas; de la Puente, Alejandro; Szynkman, Alejandro
2016-11-01
We study the production of scalar leptoquarks at IceCube, in particular a particle transforming as a triplet under the weak interaction. The existence of electroweak-triplet scalars is highly motivated by models of grand unification and also within radiative seesaw models for neutrino mass generation. In our framework, we extend the Standard Model by a single colored electroweak-triplet scalar leptoquark and analyze its implications for the excess of ultra-high energy neutrino events observed by the IceCube collaboration. We consider only couplings of the leptoquark to the first generation of quarks and the first and second generations of leptons, and carry out a statistical analysis to determine the parameters that best describe the IceCube data, as well as to set 95% CL upper bounds. We analyze whether this study remains consistent with the most up-to-date LHC data and various low energy observables.
Chen, Ming-Hui; Zeng, Donglin; Hu, Kuolung; Jia, Catherine
2014-01-01
In many biomedical studies, patients may experience the same type of recurrent event repeatedly over time, such as bleeding, multiple infections and disease. In this article, we propose a Bayesian design for a pivotal clinical trial in which lower risk myelodysplastic syndromes (MDS) patients are treated with MDS disease-modifying therapies. One of the key study objectives is to demonstrate the investigational product (treatment) effect on the reduction of platelet transfusion and bleeding events while patients receive MDS therapies. In this context, we propose a new Bayesian approach for the design of superiority clinical trials using recurrent events frailty regression models. Historical recurrent events data from an already completed phase 2 trial are incorporated into the Bayesian design via the partial borrowing power prior of Ibrahim et al. (2012, Biometrics 68, 578-586). An efficient Gibbs sampling algorithm, a predictive data generation algorithm, and a simulation-based algorithm are developed for sampling from the fitted posterior distribution, generating the predictive recurrent events data, and computing various design quantities such as the type I error rate and power, respectively. An extensive simulation study is conducted to compare the proposed method to existing frequentist methods and to investigate various operating characteristics of the proposed design.
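The simulation-based evaluation of operating characteristics can be caricatured as follows: recurrent event counts are drawn from a gamma-frailty Poisson model, and the rejection rate over many simulated trials estimates power (or type I error when the rate ratio is 1). This sketch deliberately omits the paper's actual machinery, the partial borrowing power prior and Gibbs sampling, and substitutes a crude large-sample test.

```python
# Heavily simplified sketch of design operating characteristics via simulation.
import numpy as np

rng = np.random.default_rng(1)

def simulate_trial(n_per_arm=100, base_rate=2.0, rate_ratio=0.6,
                   frailty_shape=2.0, follow_up=1.0):
    # subject-level gamma frailty induces within-subject event correlation
    frail = rng.gamma(frailty_shape, 1.0 / frailty_shape, size=2 * n_per_arm)
    rates = base_rate * frail
    rates[n_per_arm:] *= rate_ratio                  # treated arm
    counts = rng.poisson(rates * follow_up)
    c, t = counts[:n_per_arm], counts[n_per_arm:]
    # crude delta-method test on the log rate ratio (not the paper's method)
    lrr = np.log((t.mean() + 0.5) / (c.mean() + 0.5))
    se = np.sqrt(t.var() / (n_per_arm * t.mean() ** 2 + 1e-9) +
                 c.var() / (n_per_arm * c.mean() ** 2 + 1e-9))
    return abs(lrr / se) > 1.96

power = np.mean([simulate_trial() for _ in range(500)])
type1 = np.mean([simulate_trial(rate_ratio=1.0) for _ in range(500)])
print(f"power ~ {power:.2f}, type I error ~ {type1:.2f}")
```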
Real-Time Multimission Event Notification System for Mars Relay
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.
2013-01-01
As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.
10 CFR 1800.13 - Conditions for becoming an eligible party state.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., including all surcharges (except new surcharges imposed pursuant to Article V.f.3. of the Compact), shall... institutional control period as a result of the radioactive waste and waste management operations of any... from generators within the borders of the existing states. In the event of such unavailability, the new...
Off Our Lawns and out of Our Basements: How We (Mis)Understand the Millennial Generation
ERIC Educational Resources Information Center
Mechler, Heather
2013-01-01
In this article, the author explores the existing research on the characteristics of Millennials within historical, social, and economic contexts. While many researchers have made claims about Millennials, they fail to consider how parenting styles, economic factors, historical events, and shifts in educational priorities may have created the…
Initial Evaluation of Signal-Based Bayesian Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Russell, S.
2016-12-01
We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.
Howell, Lydia Pleotis; Joad, Jesse P; Callahan, Edward; Servis, Gregg; Bonham, Ann C
2009-08-01
Multigenerational teams are essential to the missions of academic health centers (AHCs). Generational forecasting using Strauss and Howe's predictive model, "the generational diagonal," can be useful for anticipating and addressing issues so that each generation is effective. Forecasts are based on the observation that cyclical historical events are experienced by all generations, but the response of each generation differs according to its phase of life and previous defining experiences. This article relates Strauss and Howe's generational forecasts to AHCs. Predicted issues such as work-life balance, indebtedness, and succession planning have existed previously, but they now have different causes or consequences because of the unique experiences and life stages of current generations. Efforts to address these issues at the authors' AHC include a work-life balance workgroup, expanded leave, and intramural grants.
NASA Astrophysics Data System (ADS)
Diaz, E.; Webb, F.; Green, D. S.; Stough, T.; Kirschbaum, D.; Goodman, H. M.; Molthan, A.
2015-12-01
In the hours following the magnitude 7.8 Gorkha, Nepal, earthquake on April 25, 2015, NASA and its partners began assessing their ability to provide actionable information to responders from a variety of space resources and scientific capabilities in support of the relief and humanitarian operations. Working with the USGS, NGA, ASI, and JAXA in the hours and days following the event, the team generated a number of scientific data products that were distributed to organizations responding to the event. The data included ground-based geodetic observations and optical and radar data from international and domestic partners, used to compile a variety of products, including "vulnerability maps," used to determine risks that may be present, and "damage proxy maps," used to determine the type and extent of existing damage. This talk will focus on the response process, highlighting some of the products generated and distributed, as well as lessons learned that would improve the effectiveness of such a broad, agency-wide response to future events.
Fog-Based Two-Phase Event Monitoring and Data Gathering in Vehicular Sensor Networks
Yang, Fan; Su, Jinsong; Zhou, Qifeng; Wang, Tian; Zhang, Lu; Xu, Yifan
2017-01-01
Vehicular nodes are equipped with more and more sensing units, and a large amount of sensing data is generated. Recently, more and more research has considered cooperative urban sensing as the heart of intelligent and green city traffic management. The key component of such a platform will be a combination of a pervasive vehicular sensing system and a central control and analysis system, in which data gathering is a fundamental building block. However, data gathering and monitoring are challenging issues in vehicular sensor networks because of the large amount of data and the dynamic nature of the network. In this paper, we propose an efficient continuous event-monitoring and data-gathering framework based on fog nodes in vehicular sensor networks. A fog-based two-level threshold strategy is adopted to suppress unnecessary data uploads and transmissions. In the monitoring phase, nodes sense the environment in a low-cost sensing mode and generate sensed data. When the estimated probability of an event is high and exceeds some threshold, nodes transfer to the event-checking phase, and some nodes are selected to switch to a deep sensing mode to generate more accurate data about the environment. Furthermore, the framework adaptively adjusts the threshold to upload a suitable amount of data for decision making while suppressing unnecessary message transmissions. Simulation results showed that the proposed scheme could reduce data transmissions by more than 84 percent compared with existing algorithms while still detecting the events and gathering the event data.
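The two-phase, two-threshold logic can be sketched as a small state machine: a node escalates from low-cost monitoring to deep sensing when the local event probability crosses a checking threshold, uploads only high-confidence samples, and adapts the reporting threshold to damp traffic. Threshold names, values, and the adaptation rule are placeholders rather than the paper's exact scheme.

```python
# Two-phase event monitoring with a two-level threshold (illustrative).

def monitor(readings, t_check=0.5, t_report=0.8):
    mode, uploads = "monitor", []
    for i, p in enumerate(readings):          # p: estimated event probability
        if mode == "monitor" and p >= t_check:
            mode = "deep"                     # switch to accurate sensing
        elif mode == "deep":
            if p >= t_report:
                uploads.append((i, p))        # forward event data to fog node
                t_report = min(0.95, t_report + 0.02)  # adapt: damp uploads
            elif p < t_check:
                mode = "monitor"              # event over, back to low cost
    return uploads

readings = [0.1, 0.2, 0.55, 0.7, 0.85, 0.9, 0.4, 0.1]
print(monitor(readings))   # only the high-confidence samples are uploaded
```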
Causing Factors for Extreme Precipitation in the Western Saudi-Arabian Peninsula
NASA Astrophysics Data System (ADS)
Alharbi, M. M.; Leckebusch, G. C.
2015-12-01
On the western coast of Saudi Arabia the climate is in general semi-arid, but extreme precipitation events occur on a regular basis: e.g., on 26th November 2009, 122 people were killed and 350 reported missing in Jeddah following more than 90 mm of rain in just four hours. Our investigation will (a) analyse the major drivers of the generation of extremes and (b) investigate the major modes of variability responsible for the occurrence of extremes. Firstly, we present a systematic analysis of station-based observations of the most relevant extreme events (1985-2013) for 5 stations (Gizan, Makkah, Jeddah, Yenbo and Wejh). Secondly, we investigate the responsible mechanisms on the synoptic to large scale leading to the generation of extremes and analyse factors governing the time variability of extreme event occurrence. Extreme events for each station are identified in the wet season (Nov-Jan): 122 events show intensity above the respective 90th percentile. The most extreme events are systematically investigated with respect to the responsible forcing conditions, which we identify as: the influence of the Sudan Low, active Red Sea Trough situations established via interactions with mid-latitude tropospheric wave activity, low pressure systems over the Mediterranean, the influence of the North Africa High, the Arabian Anticyclone and the influence of the Indian monsoon trough. We investigate the role of dynamical forcing factors like the subtropical jet (STJ) and the upper-troposphere geopotential conditions, and their relation to smaller local low-pressure systems. By means of an empirical orthogonal function (EOF) analysis based on MSLP, we investigate the possibility of objectively quantifying the influence of existing major variability modes and their role in the generation of extreme precipitation events.
The Generation of a Stochastic Flood Event Catalogue for Continental USA
NASA Astrophysics Data System (ADS)
Quinn, N.; Wing, O.; Smith, A.; Sampson, C. C.; Neal, J. C.; Bates, P. D.
2017-12-01
Recent advances in the acquisition of spatiotemporal environmental data and improvements in computational capabilities have enabled the generation of large-scale, even global, flood hazard layers, which serve as a critical decision-making tool for a range of end users. However, these datasets are designed to indicate only the probability and depth of inundation at a given location and are unable to describe the likelihood of concurrent flooding across multiple sites. Recent research has highlighted that although the estimation of large, widespread flood events is of great value to the flood mitigation and insurance industries, to date it has been difficult to deal with this spatial dependence structure in flood risk over relatively large scales. Many existing approaches have been restricted to empirical estimates of risk based on historic events, limiting their capability to assess risk over the full range of plausible scenarios. Therefore, this research utilises a recently developed model-based approach to describe the multisite joint distribution of extreme river flows across continental USA river gauges. Given an extreme event at a site, the model characterises the likelihood that neighbouring sites are also impacted. This information is used to simulate an ensemble of plausible synthetic extreme event footprints, from which flood depths are extracted from an existing global flood hazard catalogue. Expected economic losses are then estimated by overlaying flood depths with national datasets defining asset locations, characteristics and depth-damage functions. The ability of this approach to quantify probabilistic economic risk and rare threshold-exceeding events is expected to be of value to those interested in the flood mitigation and insurance sectors. This work describes the methodological steps taken to create the flood loss catalogue over a national scale; highlights the uncertainty in the expected annual economic vulnerability within the USA from extreme river flows; and presents future developments to the modelling approach.
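One simple way to impose spatial dependence on multisite extremes, purely for illustration, is a Gaussian copula: correlated normals are mapped to uniform marginals, and joint exceedances of each site's threshold define a synthetic event footprint. The authors' conditional model is more sophisticated, so treat this as a conceptual stand-in; the correlation matrix below is assumed, not estimated.

```python
# Gaussian-copula sketch of dependent multisite extremes (not the paper's model).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
R = np.array([[1.0, 0.7, 0.4],
              [0.7, 1.0, 0.6],
              [0.4, 0.6, 1.0]])          # assumed inter-gauge dependence
z = rng.multivariate_normal(np.zeros(3), R, size=100_000)
u = norm.cdf(z)                          # copula: uniform marginals, dependent

extreme = u > 0.99                       # exceed each site's 1% threshold
joint = (extreme.sum(axis=1) >= 2).mean()
print(f"P(>=2 sites extreme together) = {joint:.4f} "
      f"(vs ~{3 * 0.01**2:.4f} if independent)")
```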
Estimating evaporative vapor generation from automobiles based on parking activities.
Dong, Xinyi; Tschantz, Michael; Fu, Joshua S
2015-07-01
A new approach is proposed to quantify evaporative vapor generation based on real parking activity data. Compared to existing methods, two improvements are applied in this new approach to reduce uncertainties. First, evaporative vapor generation from diurnal parking events is usually calculated based on an estimated average parking duration for the whole fleet, while in this study the vapor generation rate is calculated based on the distribution of parking activities. Second, rather than using the daily temperature gradient, this study uses hourly temperature observations to derive hourly incremental vapor generation rates. The parking distribution and hourly incremental vapor generation rates are then combined with the Wade-Reddy equation to estimate the weighted average evaporative generation. We find that hourly incremental rates better describe the temporal variations of vapor generation, and that the weighted vapor generation rate is 5-8% less than that calculated without considering parking activity.
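The weighting idea reduces to a small computation: hourly incremental generation rates (in practice derived from hourly temperatures via the Wade-Reddy equation) are weighted by the observed distribution of parking durations instead of a single fleet-average duration. All numbers below are placeholders, not measured rates.

```python
# Parking-activity-weighted vapor generation (illustrative values only).

# grams of vapor generated during hour h of a parking event (placeholder;
# in the study these come from hourly temperatures via Wade-Reddy)
hourly_rate = [1.2, 1.0, 0.8, 0.7, 0.6, 0.6, 0.5, 0.5]

# fraction of parking events lasting d hours (placeholder distribution)
duration_freq = {1: 0.35, 2: 0.25, 4: 0.20, 8: 0.20}

weighted = sum(freq * sum(hourly_rate[:d])
               for d, freq in duration_freq.items())
print(f"weighted per-event generation ~ {weighted:.2f} g")
```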
Skyalert: a Platform for Event Understanding and Dissemination
NASA Astrophysics Data System (ADS)
Williams, Roy; Drake, A. J.; Djorgovski, S. G.; Donalek, C.; Graham, M. J.; Mahabal, A.
2010-01-01
Skyalert.org is an event repository, web interface, and event-oriented workflow architecture that can be used in many different ways for handling astronomical events that are encoded as VOEvent. It can be used as a remote application (events in the cloud) or installed locally. Some applications are: dissemination of events with sophisticated discrimination (trigger), using email, instant message, RSS, twitter, etc.; an authoring interface for survey-generated events, follow-up observations, and other event types; event streams can be put into the skyalert.org repository, either public or private, or into a local installation of Skyalert; event-driven software components to fetch archival data, for data-mining and classification of events; a human interface to events through wiki, comments, and circulars; use of the "notices and circulars" model, where machines make the notices in real time and people write the interpretation later; building trusted, automated decisions for automated follow-up observation, and the information infrastructure for automated follow-up with DC3 and HTN telescope schedulers; citizen science projects such as artifact detection and classification; query capability for past events, including correlations between different streams and correlations with existing source catalogs; and event metadata structures and connection to the global registry of the virtual observatory.
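The dissemination-with-discrimination idea can be sketched with a toy VOEvent trigger: parse the packet's <What> parameters and notify a subscriber when a user predicate matches. Real VOEvent packets carry much more structure (Who/WhereWhen/How/Why), and Skyalert's trigger language is richer; the packet and predicate here are invented.

```python
# Toy VOEvent trigger: fire a subscriber callback when <What> params match.
import xml.etree.ElementTree as ET

packet = """<VOEvent role="observation" ivorn="ivo://example/cat#42">
  <What>
    <Param name="magnitude" value="17.2"/>
    <Param name="classification" value="supernova-candidate"/>
  </What>
</VOEvent>"""

def params(voevent_xml):
    root = ET.fromstring(voevent_xml)
    return {p.get("name"): p.get("value") for p in root.iter("Param")}

def trigger(voevent_xml, predicate, notify):
    p = params(voevent_xml)
    if predicate(p):
        notify(p)

trigger(packet,
        lambda p: p.get("classification") == "supernova-candidate"
                  and float(p.get("magnitude", 99)) < 18,
        lambda p: print("alert subscriber:", p))
```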
Existing generating assets squeezed as new project starts slow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, R.B.; Tiffany, E.D.
Most forecasting reports concentrate on political or regulatory events to predict future industry trends. Frequently overlooked are the more empirical performance trends of the principal power generation technologies. Solomon and Associates queried its many power plant performance databases and crunched some numbers to identify those trends. Areas of investigation included reliability, utilization (net output factor and net capacity factor) and cost (operating costs). An in-depth analysis for North America and Europe is presented in this article, by region and by generation technology.
Tidal influence through LOD variations on the temporal distribution of earthquake occurrences
NASA Astrophysics Data System (ADS)
Varga, P.; Gambis, D.; Bizouard, Ch.; Bus, Z.; Kiszely, M.
2006-10-01
Stresses generated by the body tides are very small at the depth of crustal earthquakes (~10^2 N/m^2). The maximum value of the lunisolar stress within the depth range of earthquakes is 10^3 N/m^2 (at a depth of about 600 km). Surface loads due to oceanic tides in coastal areas are ~10^4 N/m^2. These influences are, however, too small to affect the outbreak time of seismic events. The authors show that the effect on the time distribution of seismic activity due to ΔLOD generated by zonal tides, in the case of the Mf, Mm, Ssa and Sa tidal constituents, can be much more effective in triggering earthquakes. According to this approach, we show that the tides do not directly trigger seismic events but act through the generated length-of-day variations. That is why a correlation between the lunisolar effect and seismic activity exists for zonal tides but not for the tesseral and sectorial tides.
NASA Astrophysics Data System (ADS)
He, Lixin; Li, Yang; Wang, Zhe; Zhang, Qingbin; Lan, Pengfei; Lu, Peixiang
2014-05-01
We have performed a quantum trajectory analysis of high-order-harmonic generation (HHG) with different driving laser wavelengths. By defining the ratio of the HHG yields of the Nth and first rescattering events (Y_N/Y_1), we quantitatively evaluate the HHG contributions from multiple rescatterings. The results show that the HHG yield ratio increases gradually with the laser wavelength, which demonstrates that high-order rescatterings provide increasing contributions to HHG at longer wavelengths. By calculating the classical electron trajectories, we find significant differences in the electron behavior between the first and high-order rescatterings. Further investigations demonstrate that the increasing HHG yield ratio is mainly attributable to the relatively smaller contributions from the short path of the first electron rescattering at longer laser wavelengths.
Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan
2017-01-01
In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with the traditional news media, social media services such as Twitter can provide more complete and timely information about real-world events. However, events are often like a puzzle, and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real time and often miss sub-events or pieces of the broader event puzzle entirely. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of newsworthy event content. In order to identify sub-events more comprehensively and accurately, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event content contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events, or pieces, allows us to get closer to solving the event puzzle.
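The burst-detection step can be sketched simply: divide the stream into fixed windows and flag windows whose message rate exceeds the running mean by k standard deviations. STRIM's actual stream division and scoring are more involved; the window counts and parameters below are illustrative.

```python
# Simple burst detector over per-window message counts (illustrative).

def detect_bursts(counts, k=2.0, warmup=5):
    """counts: messages per window. Returns indices of bursty windows."""
    bursts, history = [], []
    for i, c in enumerate(counts):
        if len(history) >= warmup:
            mean = sum(history) / len(history)
            var = sum((x - mean) ** 2 for x in history) / len(history)
            if c > mean + k * var ** 0.5:
                bursts.append(i)       # candidate sub-event window
        history.append(c)
    return bursts

per_minute = [12, 9, 11, 10, 13, 11, 12, 48, 52, 14, 11]
print(detect_bursts(per_minute))   # windows 7 and 8 stand out
```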
WRF simulation of downslope wind events in coastal Santa Barbara County
NASA Astrophysics Data System (ADS)
Cannon, Forest; Carvalho, Leila M. V.; Jones, Charles; Hall, Todd; Gomberg, David; Dumas, John; Jackson, Mark
2017-07-01
The National Weather Service (NWS) considers frequent gusty downslope winds, accompanied by rapid warming and decreased relative humidity, among the most significant weather events affecting southern California coastal areas in the vicinity of Santa Barbara (SB). These extreme conditions, commonly known as "sundowners", have affected the evolution of all major wildfires that impacted SB in recent years. Sundowners greatly increase fire, aviation and maritime navigation hazards and are thus a priority for regional forecasting. Currently, the NWS employs the Weather Research Forecasting (WRF) model at 2 km resolution to complement forecasts at regional-to-local scales. However, no systematic study has been performed to evaluate the skill of WRF in simulating sundowners. This research presents a case study of an 11-day period in spring 2004 during which sundowner events were observed on multiple nights. We perform sensitivity experiments for WRF using available observations for validation and demonstrate that WRF is skillful in representing the general mesoscale structure of these events, though important shortcomings exist. Furthermore, we discuss the generation and evolution of sundowners during the case study using the best performing configuration, and compare these results to hindcasts for two major SB fires. Unique, but similar, profiles of wind and stability are observed over SB between case studies despite considerable differences in large-scale circulation, indicating that common conditions may exist across all events. These findings aid in understanding the evolution of sundowner events and are potentially valuable for event prediction.
NASA Astrophysics Data System (ADS)
Kardos, Adam; Trócsányi, Zoltán
2015-05-01
We simulate the hadroproduction of a t-tbar pair in association with a hard photon at the LHC using the PowHel package. These events are almost fully inclusive with respect to the photon, allowing for any physically relevant isolation of the photon. We use the generated events, stored according to the Les Houches event format, to make predictions for differential distributions formally at next-to-leading order (NLO) accuracy, and we compare these to existing predictions accurate at NLO using the smooth isolation prescription of Frixione. Our fixed-order predictions include the direct-photon contribution only. We also make predictions for distributions after full parton shower and hadronization using the standard experimental cone-isolation of the photon.
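For readers unfamiliar with the smooth isolation prescription of Frixione mentioned above, the following sketch checks the standard smooth-cone criterion: for every radius r <= R0 around the photon, the hadronic transverse energy inside r must stay below eps * E_T(photon) * ((1 - cos r)/(1 - cos R0))^n. The parameter values are typical choices, not necessarily those used in the paper.

    import math

    def frixione_isolated(photon_et, hadrons, R0=0.4, eps=1.0, n=1.0):
        """Smooth-cone (Frixione) photon isolation check.

        hadrons: list of (et, delta_r) pairs, delta_r being the distance
        of each hadron from the photon in the (eta, phi) plane.
        """
        inside = sorted((dr, et) for et, dr in hadrons if dr <= R0)
        cumulative = 0.0
        for dr, et in inside:
            cumulative += et
            limit = eps * photon_et * ((1 - math.cos(dr)) / (1 - math.cos(R0))) ** n
            if cumulative > limit:
                return False
        return True

    # A photon with soft nearby hadrons passes; a hard collinear one fails.
    print(frixione_isolated(50.0, [(1.0, 0.1), (2.0, 0.3)]))  # True
    print(frixione_isolated(50.0, [(30.0, 0.05)]))            # False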
Historical tsunami in the Azores archipelago (Portugal)
NASA Astrophysics Data System (ADS)
Andrade, C.; Borges, P.; Freitas, M. C.
2006-08-01
Because of its exposed northern mid-Atlantic location, morphology and plate-tectonic setting, the Azores Archipelago is highly vulnerable to tsunami hazards associated with landslides and seismic or volcanic triggers, local or distal. Critical examination of available data - written accounts and geologic evidence - indicates that, since the settlement of the archipelago in the 15th century, at least 23 tsunami have struck Azorean coastal zones. Most of the recorded tsunami were generated by earthquakes. The highest known run-up (11-15 m) was recorded on 1 November 1755 at Terceira Island, corresponding to an event of intensity VII-VIII (damaging-heavily damaging) on the Papadopoulos-Imamura scale. To date, eruptive activity, while relatively frequent in the Azores, does not appear to have generated destructive tsunami. However, this apparent paucity of volcanogenic tsunami in the historical record may be misleading because of limited instrumental and documentary data, and the small source-volumes released during historical eruptions. The latter are in contrast with the geological record of massive pyroclastic flows and caldera explosions, predating settlement, with the potential to generate high-magnitude tsunami. In addition, limited evidence suggests that submarine landslides from unstable volcano flanks may have also triggered some damaging tsunamigenic floods that were perhaps erroneously attributed to intense storms. The lack of destructive tsunami since the mid-18th century has led to governmental complacency and public disinterest in the Azores, as demonstrated by the fact that existing emergency regulations concerning seismic events in the Azores Autonomous Region make no mention of tsunami and their attendant hazards. We suspect that the coastal fringe of the Azores may well preserve a sedimentary record of some past tsunamigenic flooding events. Geological field studies must be accelerated to expand the existing database to include prehistoric events, information essential for more precisely estimating the average tsunami recurrence rate for the Azores over a longer period. A present-day occurrence of a moderate to intense tsunami (i.e., the size of the 1755 event) would produce societal disruption and economic loss orders of magnitude greater than those of previous events in Azorean history. To reduce risk from future tsunami, comprehensive assessment of tsunami hazards and the preparation of hazard-zonation maps are needed to guide governmental decisions on issues of prudent land-use planning, public education and emergency management.
Optical rogue waves and stimulated supercontinuum generation
NASA Astrophysics Data System (ADS)
Solli, Daniel R.; Ropers, Claus; Jalali, Bahram
2010-06-01
Nonlinear action is known for its ability to create unusual phenomena and unexpected events. Optical rogue waves, freak pulses of broadband light arising in nonlinear fiber, testify to the fact that optical nonlinearities are no less capable of generating anomalous events than those in other physical contexts. In this paper, we review our work on optical rogue waves, an ultrafast counterpart to the freak waves known to roam the open oceans. We discuss the experimental observation of these rare events in real time and the measurement of their heavy-tailed statistical properties, a probabilistic form known to appear in a wide variety of other complex systems, from financial markets to genetics. The nonlinear Schrödinger equation predicts the existence of optical rogue waves, offering a means to study their origins with simulations. We also discuss the type of initial conditions behind optical rogue waves. Because a subtle but specific fluctuation leads to extreme waves, the rogue wave instability can be harnessed to produce these events on demand. By exploiting this property, it is possible to produce a new type of optical switch as well as a supercontinuum source that operates in the long-pulse regime but still achieves a stable, coherent output.
Applications of the Renewable Energy Network Optimization Tool
NASA Astrophysics Data System (ADS)
Alliss, R.; Link, R.; Apling, D.; Kiley, H.; Mason, M.; Darmenova, K.
2010-12-01
As the renewable energy industry continues to grow, so does the requirement for atmospheric modeling and analysis tools to maximize both wind and solar power. Renewable energy generation is variable, however, presenting challenges for electrical grid operation and requiring a variety of measures to adequately firm power. These measures include the production of non-renewable generation during times when renewables are not available. One strategy for minimizing the variability of renewable energy production is site diversity. Assuming that a network of renewable energy systems feeds a common electrical grid, site diversity ensures that when one system on the network has a reduction in generation, others on the same grid make up the difference. The site-diversity strategy can be used to mitigate the intermittency of alternative energy production systems while still maximizing saleable energy. The Renewable Energy Network Optimization Tool (ReNOT) has recently been developed to study the merits of site optimization for wind farms. The modeling system has a plug-in architecture that allows us to accommodate a wide variety of renewable energy system designs and performance metrics. The Weather Research and Forecasting (WRF) mesoscale model is applied to generate high-resolution wind databases to support the site selection of wind farms. These databases are generated on high-performance computing systems such as the Rocky Mountain Supercomputing Center (RMSC). The databases are then accessed by ReNOT and an optimized site selection is developed. We can accommodate numerous constraints (e.g., number of sites, geographic extent of the optimization, proximity to high-voltage transport lines, etc.). As part of our collaboration with RMSC and the State of Montana, a study was performed to estimate the optimal locations of a network of wind farms. Comparisons were made to four existing wind farm locations in Montana: Glacier, with a 210 MW nameplate capacity; Horseshoe Bend, with a total capacity of 9 MW; Diamond Willow, with a capacity of 20 MW; and Judith Gap, with a total capacity of 135 MW. The goal of this study was to see whether ReNOT could find a four-site network that made more effective use of the existing four-site network's 374 MW nameplate capacity. We developed three different metrics with which to pick sites. Metric 3 (M3) picks sites based on the previous day's mean power and accounts for short-term variability (i.e., 1 hour). M3 attempts to approximate usable power by minimizing ramping events, which are so important to industry. In addition, we investigated several performance metrics, including mean power, usable power, and ramping event frequency. A ramping event is defined as an increase or decrease in power production over the course of one hour. Of interest was the frequency of ramping events that exceeded 10% of the total capacity of the network. Networks with few ramping events are markedly superior to networks producing otherwise identical aggregate power. The optimization was run over a 15-year period of hub-height wind data (40 meters AGL). The ReNOT-derived network produces 58% more usable power than the four existing and operating wind farms. In addition, the optimized four-site network produces three times fewer significant ramping events.
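As a small illustration of the ramping-event metric defined above, the sketch below counts hour-to-hour changes in aggregate power exceeding 10% of nameplate capacity; the toy series is invented.

    def count_ramping_events(power_mw, capacity_mw, threshold=0.10):
        """Count hour-to-hour power changes exceeding a fraction of capacity.

        Mirrors the abstract's definition of a significant ramping event:
        a one-hour increase or decrease larger than 10% of nameplate capacity.
        """
        limit = threshold * capacity_mw
        return sum(
            1 for prev, cur in zip(power_mw, power_mw[1:])
            if abs(cur - prev) > limit
        )

    # Toy hourly series for a 374 MW network (the Montana study's capacity).
    series = [100, 120, 180, 175, 130, 128]
    print(count_ramping_events(series, capacity_mw=374))  # -> 2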
Predicting Pilot Performance in Off-Nominal Conditions: A Meta-Analysis and Model Validation
NASA Technical Reports Server (NTRS)
Wickens, C.D.; Hooey, B.L.; Gore, B.F.; Sebok, A.; Koenecke, C.; Salud, E.
2009-01-01
Pilot response to off-nominal (very rare) events represents a critical component to understanding the safety of next generation airspace technology and procedures. We describe a meta-analysis designed to integrate the existing data regarding pilot accuracy of detecting rare, unexpected events such as runway incursions in realistic flight simulations. Thirty-five studies were identified and pilot responses were categorized by expectancy, event location, and whether the pilot was flying with a highway-in-the-sky display. All three dichotomies produced large, significant effects on event miss rate. A model of human attention and noticing, N-SEEV, was then used to predict event noticing performance as a function of event salience and expectancy, and retinal eccentricity. Eccentricity is predicted from steady state scanning by the SEEV model of attention allocation. The model was used to predict miss rates for the expectancy, location and highway-in-the-sky (HITS) effects identified in the meta-analysis. The correlation between model-predicted results and data from the meta-analysis was 0.72.
NASA Astrophysics Data System (ADS)
Di Vittorio, Alan V.; Negrón-Juárez, Robinson I.; Higuchi, Niro; Chambers, Jeffrey Q.
2014-03-01
Debate continues over the adequacy of existing field plots to sufficiently capture Amazon forest dynamics to estimate regional forest carbon balance. Tree mortality dynamics are particularly uncertain due to the difficulty of observing large, infrequent disturbances. A recent paper (Chambers et al 2013 Proc. Natl Acad. Sci. 110 3949-54) reported that Central Amazon plots missed 9-17% of tree mortality, and here we address 'why' by elucidating two distinct mortality components: (1) variation in annual landscape-scale average mortality and (2) the frequency distribution of the size of clustered mortality events. Using a stochastic-empirical tree growth model we show that a power law distribution of event size (based on merged plot and satellite data) is required to generate spatial clustering of mortality that is consistent with forest gap observations. We conclude that existing plots do not sufficiently capture losses because their placement, size, and longevity assume spatially random mortality, while mortality is actually distributed among differently sized events (clusters of dead trees) that determine the spatial structure of forest canopies.
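As a hedged illustration of drawing mortality-event sizes from a power law, the sketch below uses inverse-transform sampling from a truncated power law; the exponent and bounds are placeholders, since the paper fits its own event-size distribution to merged plot and satellite data.

    import random

    def sample_event_sizes(n_events, alpha=1.4, size_min=1, size_max=5000, seed=1):
        """Draw tree-mortality event sizes (trees killed per event) from a
        truncated power law, p(x) ~ x**(-alpha) on [size_min, size_max],
        via inverse-transform sampling. Parameters are illustrative."""
        rng = random.Random(seed)
        a = 1.0 - alpha
        lo, hi = size_min ** a, size_max ** a
        sizes = []
        for _ in range(n_events):
            u = rng.random()
            sizes.append(int(round((lo + u * (hi - lo)) ** (1.0 / a))))
        return sizes

    sizes = sample_event_sizes(10000)
    # Largest sampled cluster and the share of single-tree events:
    print(max(sizes), sum(s == 1 for s in sizes) / len(sizes))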
NASA Astrophysics Data System (ADS)
Ikelle, Luc T.
2006-02-01
Here we describe one way of constructing internal multiples from surface seismic data only. The key feature of our construct of internal multiples is the introduction of the concept of virtual seismic events. Virtual events are events that are not directly recorded in standard seismic data acquisition, but whose existence allows us to construct internal multiples with scattering points at the sea surface; the standard construct of internal multiples does not include any scattering points at the sea surface. The mathematical and computational operations invoked in our construction of virtual events and internal multiples are similar to those encountered in the construction of free-surface multiples based on the Kirchhoff or Born scattering theory. For instance, our construct operates on one temporal frequency at a time, just like free-surface demultiple algorithms; other internal multiple constructs tend to require all frequencies for the computation of an internal multiple at a given frequency. It requires neither knowledge of the subsurface nor explicit knowledge of the specific interfaces that are responsible for the generation of internal multiples in seismic data. However, our construct requires that the data be divided into two, three or four windows to avoid generating primaries. This segmentation of the data also allows us to select the range of periods of internal multiples that one wishes to construct because, in the context of the attenuation of internal multiples, it is important to avoid generating short-period internal multiples that may constructively average to form primaries at the seismic scale.
Far from thunderstorm UV transient events in the atmosphere measured by Vernov satellite
NASA Astrophysics Data System (ADS)
Morozenko, Violetta; Klimov, Pavel; Khrenov, Boris; Garipov, Gali; Kaznacheeva, Margarita; Panasyuk, Mikhail; Svertilov, Sergei; Holzworth, Robert
2016-04-01
A stable, self-contained classification of events such as sprites, elves, and blue jets has emerged over the period of transient luminous event (TLE) observation. According to theories of TLE origin, a thunderstorm region in which lightning with large peak currents is generated must be present. However, some events far from thunderstorm regions have also been detected, pointing to other TLE generation mechanisms. To investigate the nature of TLEs, the Universitetsky-Tatiana-2 and Vernov satellites were equipped with ultraviolet (240-400 nm) and red-infrared (>610 nm) detectors. Both detectors were triggered by flashes in the UV band, where lightning emission is comparatively faint, independently of lightning activity. The lowered trigger threshold on the Vernov satellite allowed a large set of TLEs to be selected, including numerous examples of far-from-thunderstorm events. Such events were not associated with lightning activity measured by the World Wide Lightning Location Network (WWLLN) over a large area of approximately 10^7 km^2 for 30 minutes before and after the time of registration. The characteristic features of this type of event are the absence of a significant signal in the red-infrared channel and a relatively small number of photons (fewer than 5 x 10^21). A large number of flashes without lightning were detected at high latitudes over the ocean (30°S - 60°S). Lightning activity at the magnetically conjugate point was also analyzed, and no relationship between far-from-thunderstorm events and specific lightning discharges was confirmed. Far-from-thunderstorm events are thus a new type of transient phenomenon in the upper atmosphere that is not associated with thunderstorm activity. The mechanism of such discharges remains unclear, although a substantial body of experimental evidence for the existence of such flashes has accumulated. Based on Vernov satellite data, the temporal profile, duration, geographic location, and number of photons generated in far-from-thunderstorm atmospheric events have been analyzed, and discussion of the origin of these events is in progress.
Quantifying and Monetizing Renewable Energy Resiliency
Anderson, Kate H.; Laws, Nicholas D.; Marr, Spencer; ...
2018-03-23
Energy resiliency has been thrust to the forefront by recent severe weather events and natural disasters. Billions of dollars are lost each year due to power outages. This article highlights the unique value renewable energy hybrid systems (REHS), comprised of solar, energy storage, and generators, provide in increasing resiliency. We present a methodology to quantify the amount and value of resiliency provided by REHS, and ways to monetize this resiliency value through insurance premium discounts. A case study of buildings in New York City demonstrates how implementing REHS in place of traditional backup diesel generators can double the amount of outage survivability, with an added value of $781,200. For a Superstorm Sandy type event, results indicate that insurance premium reductions could support up to 4% of the capital cost of REHS, and the potential exists to prevent up to $2.5 billion in business interruption losses with increased REHS deployment.
Airborne exposure limits for chemical and biological warfare agents: is everything set and clear?
Sabelnikov, Alex; Zhukov, Vladimir; Kempf, C Ruth
2006-08-01
Emergency response strategies (guidelines) for biological, chemical, nuclear, or radiological terrorist events should be based on scientifically established exposure limits for all the agents or materials involved. In the case of a radiological terrorist event, emergency response guidelines (ERG) have been worked out. In the case of a terrorist event using chemical warfare (CW) agents, the situation is not as clear, though new guidelines and clean-up values are being generated based on re-evaluation of toxicological and risk data. For biological warfare (BW) agents, such guidelines do not yet exist. In this paper, the current status of airborne exposure limits (AELs) for chemical and biological warfare (CBW) agents is reviewed. Particular emphasis is put on BW agents, which lack such data. An efficient, temporary solution to bridge the gap in experimental infectious-dose data and to set provisional AELs for BW agents is suggested. It is based on mathematically generated risks of infection for BW agents grouped into three categories by their assumed ID50 values: low, intermediate, and high.
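One common way to generate infection risks mathematically from ID50 values, and plausibly close in spirit to the approach suggested above, is the exponential dose-response model, P = 1 - exp(-ln(2) * dose / ID50), so that a dose equal to ID50 yields a 50% risk. The sketch below, including the breathing rate and target risk, is our illustrative assumption, not the paper's actual procedure.

    import math

    def infection_risk(concentration, id50, exposure_min, breathing_l_min=20.0):
        """Exponential dose-response risk for an airborne BW agent.

        concentration: organisms per litre of air. The model form and the
        20 L/min breathing rate are common risk-assessment assumptions.
        """
        dose = concentration * breathing_l_min * exposure_min
        return 1.0 - math.exp(-math.log(2.0) * dose / id50)

    def provisional_ael(id50, exposure_min, target_risk=1e-4):
        """Concentration keeping risk below target over the exposure window
        (assumes the same 20 L/min breathing rate)."""
        max_dose = -id50 * math.log(1.0 - target_risk) / math.log(2.0)
        return max_dose / (20.0 * exposure_min)

    print(infection_risk(0.001, id50=10000, exposure_min=480))
    print(provisional_ael(id50=10000, exposure_min=480))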
NASA Astrophysics Data System (ADS)
Aucan, Jérôme; Vendé-Leclerc, Myriam; Dumas, Pascal; Bricquir, Marianne
2017-10-01
In the present study, we examine how waves may contribute to the morphological changes of islets in the New Caledonia lagoon. We collected in situ wave data to investigate their characteristics. Three types of waves are identified and quantified: (1) high-frequency waves generated within the lagoon, (2) low-frequency waves originating from swells in the Tasman Sea, and (3) infragravity waves. We found that high-frequency waves are the dominant forcing on the islets during typical wind events throughout the year, while infragravity waves, likely generated by the breaking of low-frequency waves, dominate during seasonal swell events. During swell events, low-frequency waves can also directly propagate to the islets through channels across the barrier reef, or be tidally modulated across the barrier reef before reaching the islets. Topographic surveys and beach profiles on one islet indicate areas with seasonal morphological changes and other areas with longer, interannual or decadal, erosion patterns. Although more data are needed to validate this hypothesis, we suspect that a relation exists between wave forcing and morphological changes of the islets.
NASA Technical Reports Server (NTRS)
Green, J.; Wys, J. Negusde; Zuppero, A.
1992-01-01
The availability of H2O on the lunar surface has long been identified as a high priority for the existence of a human colony, for mining activities and, more recently, for space fuel. Using the Earth as an analog, volcanic activity would suggest the generation of water during lunar history. Evidence of volcanism is found not only in present lunar morphology, but in over 400 locations of lunar transient events cataloged by Middlehurst and Kuiper in the 1960's. These events consisted of sightings since early history of vapor emissions and bright spots or flares. Later infrared scanning by Saari and Shorthill showed 'hot spots', many of which coincided with transient event sites. Many of the locations of Middlehurst and Kuiper were the sites of repeat events, leading to the conclusion that these were possibly volcanic in nature. The detection and use of H2O from the lunar surface is discussed.
Assessing hail risk for a building portfolio by generating stochastic events
NASA Astrophysics Data System (ADS)
Nicolet, Pierrick; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel; Nguyen, Liliane; Voumard, Jérémie
2015-04-01
Among the natural hazards affecting buildings, hail is one of the most costly and is nowadays a major concern for building insurance companies. In Switzerland, several costly events have been reported in recent years, among which the July 2011 event, which cost the Aargauer public insurance company (north-western Switzerland) around 125 million EUR. This study presents new developments in a stochastic model that aims at evaluating the risk for a building portfolio. Thanks to insurance and meteorological radar data from the 2011 Aargauer event, vulnerability curves are proposed by comparing the damage rate to the radar intensity (i.e. the maximum hailstone size reached during the event, deduced from the radar signal). From these data, vulnerability is defined by a two-step process. The first step defines the probability for a building to be affected (i.e. to claim damages), while the second, if the building is affected, attributes a damage rate to the building from a probability distribution specific to the intensity class. To assess the risk, stochastic events are then generated by summing a set of Gaussian functions with six random parameters (X and Y location, maximum hailstone size, standard deviation, eccentricity and orientation). The location of these functions is constrained by a general event shape and by the position of the previously defined functions of the same event. For each generated event, the total cost is calculated in order to obtain a distribution of event costs. The general event parameters (shape, size, ...) as well as the distribution of the Gaussian parameters are inferred from two radar intensity maps, namely that of the aforementioned event and a second from an event which occurred in 2009. After a large number of simulations, the hailstone size distribution obtained in different regions is compared to the distribution inferred from pre-existing hazard maps, built from a larger set of radar data. The simulation parameters are then adjusted by trial and error in order to best reproduce the expected distributions. The mean annual risk obtained using the model is also compared to the mean annual risk calculated directly from the hazard maps. According to the first results, the return period of an event inducing a total damage cost equal to or greater than 125 million EUR for the Aargauer insurance company would be around 10 to 40 years.
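As a rough sketch of the event generator described above, the code below sums anisotropic 2D Gaussian "cells", each with the six random parameters listed (X and Y location, maximum hailstone size, standard deviation, eccentricity, orientation). The functional form and parameter ranges are our assumptions, and the constraint tying cell locations to a general event shape is omitted.

    import math, random

    def gaussian_cell(x, y, cx, cy, amp, sigma, ecc, theta):
        """One anisotropic Gaussian hail cell: amp is the peak hailstone
        size, sigma the base spread, ecc the axis ratio, theta the
        orientation. Our reading of the paper's six-parameter description,
        not its exact implementation."""
        dx, dy = x - cx, y - cy
        u = dx * math.cos(theta) + dy * math.sin(theta)
        v = -dx * math.sin(theta) + dy * math.cos(theta)
        return amp * math.exp(-(u * u / (2 * sigma ** 2)
                                + v * v / (2 * (ecc * sigma) ** 2)))

    def hailstone_size(x, y, cells):
        """Event intensity at (x, y): summed contribution of all cells."""
        return sum(gaussian_cell(x, y, *c) for c in cells)

    rng = random.Random(42)
    cells = [(rng.uniform(0, 50), rng.uniform(0, 50),   # X, Y location (km)
              rng.uniform(1, 5),                        # max hailstone size (cm)
              rng.uniform(1, 4),                        # standard deviation (km)
              rng.uniform(0.3, 1.0),                    # eccentricity
              rng.uniform(0, math.pi))                  # orientation
             for _ in range(8)]
    print(round(hailstone_size(25.0, 25.0, cells), 2))  # hail size at centre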
NASA Astrophysics Data System (ADS)
Kawzenuk, B.; Sellars, S. L.; Nguyen, P.; Ralph, F. M.; Sorooshian, S.
2017-12-01
The CONNected objECT (CONNECT) algorithm is applied to Integrated Water Vapor Transport (IVT) data from NASA's Modern-Era Retrospective Analysis for Research and Applications, Version 2 reanalysis product for the period 1980 to 2016 to study water vapor transport globally. The algorithm generates life-cycle records, as statistical objects, of the time and space location of evolving strong vapor transport events. Global statistics are presented and used to investigate how climate variability impacts the events' location and frequency. Results show distinct water vapor object frequency and seasonal peaks during Northern and Southern Hemisphere winter. Moreover, a positive linear trend in the annual number of objects is reported, increasing by 3.58 objects year-over-year (with 95% confidence, +/- 1.39). In addition, we show five distinct regions where these events typically exist (the southeastern United States, eastern China, the South Pacific south of 25°S, eastern South America and off the southern tip of South Africa), and where they rarely exist (the eastern South Pacific Ocean and the central southern Atlantic Ocean between 5°N-25°S). The event frequency and geographical location are also shown to be related to the Arctic Oscillation, the Pacific-North American pattern, and the Quasi-Biennial Oscillation.
ERIC Educational Resources Information Center
Geiser, Saul
2017-01-01
Like Berkeley, the UC system as a whole is quickly running out of space to accommodate the next generation of Californians who will be reaching college age by mid-century. Even with the added capacity at UC Merced, the UC system will run out of space on existing campuses in the next decade. In the normal course of events, this would trigger…
NASA Astrophysics Data System (ADS)
Terry, James P.; Lau, A. Y. Annie
2018-02-01
We delimit nearshore storm waves generated by category-5 Tropical Cyclone Winston in February 2016 on the northern Fijian island of Taveuni. Wave magnitudes (heights and flow velocities) are hindcast by inverse modelling, based on the characteristics of large carbonate boulders (maximum 33.8 m^3, 60.9 metric tons) that were quarried from reef-front sources, transported and deposited on coral reef platforms during Winston and older extreme events. Results indicate that Winston's storm waves on the seaward margin of reefs fringing the southeastern coasts of Taveuni reached over 10 m in height and generated flow velocities of 14 m s^-1, thus coinciding with the scale of the biggest ancient storms as estimated from pre-existing boulder evidence. We conclude that although Winston tracked an uncommon path and was described as the most powerful storm on record to make landfall in the Fiji Islands, its coastal wave characteristics were not unprecedented on centennial timescales. At least seven events of comparable magnitude have occurred over the last 400 years.
Widespread alternative and aberrant splicing revealed by lariat sequencing
Stepankiw, Nicholas; Raghavan, Madhura; Fogarty, Elizabeth A.; Grimson, Andrew; Pleiss, Jeffrey A.
2015-01-01
Alternative splicing is an important and ancient feature of eukaryotic gene structure, the existence of which has likely facilitated eukaryotic proteome expansions. Here, we have used intron lariat sequencing to generate a comprehensive profile of splicing events in Schizosaccharomyces pombe, amongst the simplest organisms that possess mammalian-like splice site degeneracy. We reveal an unprecedented level of alternative splicing, including alternative splice site selection for over half of all annotated introns, hundreds of novel exon-skipping events, and thousands of novel introns. Moreover, the frequency of these events is far higher than previous estimates, with alternative splice sites on average activated at ∼3% the rate of canonical sites. Although a subset of alternative sites are conserved in related species, implying functional potential, the majority are not detectably conserved. Interestingly, the rate of aberrant splicing is inversely related to expression level, with lowly expressed genes more prone to erroneous splicing. Although we validate many events with RNAseq, the proportion of alternative splicing discovered with lariat sequencing is far greater, a difference we attribute to preferential decay of aberrantly spliced transcripts. Together, these data suggest the spliceosome possesses far lower fidelity than previously appreciated, highlighting the potential contributions of alternative splicing in generating novel gene structures. PMID:26261211
NASA Astrophysics Data System (ADS)
Weber, Juliane; Zachow, Christopher; Witthaut, Dirk
2018-03-01
Wind power generation exhibits a strong temporal variability, which is crucial for system integration in highly renewable power systems. Different methods exist to simulate wind power generation but they often cannot represent the crucial temporal fluctuations properly. We apply the concept of additive binary Markov chains to model a wind generation time series consisting of two states: periods of high and low wind generation. The only input parameter for this model is the empirical autocorrelation function. The two-state model is readily extended to stochastically reproduce the actual generation per period. To evaluate the additive binary Markov chain method, we introduce a coarse model of the electric power system to derive backup and storage needs. We find that the temporal correlations of wind power generation, the backup need as a function of the storage capacity, and the resting time distribution of high and low wind events for different shares of wind generation can be reconstructed.
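A minimal sketch of the two-state additive binary Markov chain described above: the conditional probability of a high-wind hour is the stationary probability plus a memory-weighted sum over past states. The paper derives the memory function from the empirical autocorrelation; here an exponentially decaying memory is assumed purely for illustration.

    import random

    def simulate_additive_binary_chain(p_high, memory, steps, seed=0):
        """Additive binary Markov chain for high/low wind generation states.

        P(X_t = 1 | past) = p_high + sum_k memory[k] * (X_{t-1-k} - p_high).
        memory[0] weights the most recent state.
        """
        rng = random.Random(seed)
        k = len(memory)
        past = [1 if rng.random() < p_high else 0 for _ in range(k)]
        series = []
        for _ in range(steps):
            p = p_high + sum(f * (x - p_high)
                             for f, x in zip(memory, reversed(past)))
            p = min(max(p, 0.0), 1.0)  # clip to a valid probability
            x = 1 if rng.random() < p else 0
            series.append(x)
            past = past[1:] + [x]
        return series

    memory = [0.3 * 0.7 ** k for k in range(24)]  # assumed, not fitted
    s = simulate_additive_binary_chain(p_high=0.35, memory=memory, steps=10000)
    print(sum(s) / len(s))  # empirical share of high-wind hours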
Pain Now or Later: An Outgrowth Account of Pain-Minimization
Chen, Shuai; Zhao, Dan; Rao, Li-Lin; Liang, Zhu-Yuan; Li, Shu
2015-01-01
The preference for immediate negative events contradicts the minimizing loss principle given that the value of a delayed negative event is discounted by the amount of time it is delayed. However, this preference is understandable if we assume that the value of a future outcome is not restricted to the discounted utility of the outcome per se but is complemented by an anticipated negative utility assigned to an unoffered dimension, which we termed the “outgrowth.” We conducted three studies to establish the existence of the outgrowth and empirically investigated the mechanism underlying the preference for immediate negative outcomes. Study 1 used a content analysis method to examine whether the outgrowth was generated in accompaniment with the delayed negative events. The results revealed that the investigated outgrowth was composed of two elements. The first component is the anticipated negative emotions elicited by the delayed negative event, and the other is the anticipated rumination during the waiting process, in which one cannot stop thinking about the negative event. Study 2 used a follow-up investigation to examine whether people actually experienced the negative emotions they anticipated in a real situation of waiting for a delayed negative event. The results showed that the participants actually experienced a number of negative emotions when waiting for a negative event. Study 3 examined whether the existence of the outgrowth could make the minimizing loss principle work. The results showed that the difference in pain anticipation between the immediate event and the delayed event could significantly predict the timing preference of the negative event. Our findings suggest that people’s preference for experiencing negative events sooner serves to minimize the overall negative utility, which is divided into two parts: the discounted utility of the outcome itself and an anticipated negative utility assigned to the outgrowth. PMID:25747461
NASA Astrophysics Data System (ADS)
Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum
2017-04-01
We analyze the relations among the parameters of the moving average method to enhance the event detectability of a phase-sensitive optical time domain reflectometer (OTDR). If an external event has a unique vibration frequency, the control parameters of the moving average method should be optimized to detect the event efficiently. A phase-sensitive OTDR was implemented with a pulsed light source, composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier and a fiber Bragg grating filter, and a light-receiving part consisting of a photo-detector and a high-speed data acquisition system. The moving average method is operated with three control parameters: the total number of raw traces, M; the number of averaged traces, N; and the step size of the moving window, n. Raw traces were obtained by the phase-sensitive OTDR with sound signals generated by a speaker. Using these trace data, the relations among the control parameters are analyzed. The results show that if the event signal has a single frequency, optimal values of N and n exist for detecting the event efficiently.
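A minimal sketch of the moving average step with the control parameters N and n (the total number of raw traces M is simply the array length): average N consecutive traces, slide by n, then difference the averaged traces to expose a vibrating location. The synthetic data and the differencing step are illustrative assumptions, not the paper's exact processing chain.

    import numpy as np

    def moving_average_traces(raw, N, n):
        """Average N consecutive phi-OTDR traces, sliding by n traces.

        raw: array of shape (M, samples) holding M raw backscatter traces.
        Returns ((M - N) // n + 1, samples) averaged traces.
        """
        M = raw.shape[0]
        starts = range(0, M - N + 1, n)
        return np.stack([raw[s:s + N].mean(axis=0) for s in starts])

    rng = np.random.default_rng(0)
    M, samples = 200, 1000
    raw = rng.normal(0.0, 0.2, (M, samples))
    # A point on the fiber vibrating slowly across successive traces:
    raw[:, 400] += 2.0 * np.sin(2 * np.pi * 0.01 * np.arange(M))
    avg = moving_average_traces(raw, N=10, n=5)
    event = np.abs(np.diff(avg, axis=0)).max(axis=0)
    print(int(event.argmax()))  # strongest change expected near sample 400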
NASA Astrophysics Data System (ADS)
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be of the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events used to fit the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. Where an informative prior is unavailable, for small sample sizes (~50) both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100) bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC, regardless of whether or not an informative prior exists.
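As an illustration of the bootstrapping branch of the comparison above, the sketch below (requiring numpy and scipy) computes a percentile bootstrap CI for the Clayton copula parameter, estimated by inverting Kendall's tau (theta = 2*tau/(1 - tau)). The estimator and sample generator are standard textbook choices, not necessarily those used in the study.

    import numpy as np
    from scipy.stats import kendalltau

    def sample_clayton(theta, n, rng):
        """Draw n pairs from a Clayton copula by conditional inversion."""
        u = rng.uniform(size=n)
        w = rng.uniform(size=n)
        v = (u ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
        return u, v

    def clayton_theta_hat(u, v):
        """Estimate theta by inverting Kendall's tau: theta = 2*tau/(1-tau)."""
        tau, _ = kendalltau(u, v)
        return 2 * tau / (1 - tau)

    def bootstrap_ci(u, v, n_boot=1000, alpha=0.05, seed=0):
        """Percentile bootstrap CI for the Clayton dependence parameter."""
        rng = np.random.default_rng(seed)
        n = len(u)
        estimates = []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)
            estimates.append(clayton_theta_hat(u[idx], v[idx]))
        return tuple(np.percentile(estimates,
                                   [100 * alpha / 2, 100 * (1 - alpha / 2)]))

    rng = np.random.default_rng(1)
    u, v = sample_clayton(theta=2.0, n=100, rng=rng)  # e.g. duration/severity ranks
    print(clayton_theta_hat(u, v), bootstrap_ci(u, v))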
Fine-Scale Event Location and Error Analysis in NET-VISA
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.
2016-12-01
NET-VISA is a generative probabilistic model of the occurrence of seismic, hydroacoustic, and atmospheric events, and of the propagation of energy from these events through various media and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and mis-detections when forming events, which allows it to make more accurate event hypotheses. It has been continuously evaluated since 2012, and in each year it achieves a roughly 60% reduction in the number of missed events without increasing the false event rate, as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analysis of recent important events.
Khalilzadeh, Omid; Baerlocher, Mark O; Shyn, Paul B; Connolly, Bairbre L; Devane, A Michael; Morris, Christopher S; Cohen, Alan M; Midia, Mehran; Thornton, Raymond H; Gross, Kathleen; Caplin, Drew M; Aeron, Gunjan; Misra, Sanjay; Patel, Nilesh H; Walker, T Gregory; Martinez-Salazar, Gloria; Silberzweig, James E; Nikolic, Boris
2017-10-01
To develop a new adverse event (AE) classification for interventional radiology (IR) procedures and evaluate its clinical, research, and educational value compared with the existing Society of Interventional Radiology (SIR) classification via an SIR member survey. A new AE classification was developed by members of the Standards of Practice Committee of the SIR. Subsequently, a survey was created by a group of 18 members from the SIR Standards of Practice Committee and Service Lines. Twelve clinical AE case scenarios were generated that encompassed a broad spectrum of IR procedures and potential AEs. Survey questions were designed to evaluate the following domains: educational and research value, accountability for intraprocedural challenges, consistency of AE reporting, unambiguity, and potential for incorporation into the existing quality-assurance framework. For each AE scenario, the survey participants were instructed to answer questions about the proposed and existing SIR classifications. SIR members were invited via online survey links; 68 of 140 surveyed members participated. Answers on the new and existing classifications were evaluated and compared statistically. Overall comparison between the two surveys was performed by generalized linear modeling. The proposed AE classification received superior evaluations in terms of consistency of reporting (P < .05) and potential for incorporation into the existing quality-assurance framework (P < .05). Respondents gave a higher overall rating to the educational and research value of the new classification compared with the existing one (P < .05). This study proposed an AE classification system that outperformed the existing SIR classification in the studied domains. Copyright © 2017 SIR. Published by Elsevier Inc. All rights reserved.
Source Characterization and Seismic Hazard Considerations for Hydraulic Fracture Induced Seismicity
NASA Astrophysics Data System (ADS)
Bosman, K.; Viegas, G. F.; Baig, A. M.; Urbancic, T.
2015-12-01
Large microseismic events (M>0) have been shown to be generated during hydraulic fracture treatments relatively frequently. These events are a concern both from public safety and engineering viewpoints. Recent microseismic monitoring projects in the Horn River Basin have utilized both downhole and surface sensors to record events associated with hydraulic fracturing. The resulting hybrid monitoring system has produced a large dataset with two distinct groups of events: large events recorded by the surface network (0
Spatiotemporal Interpolation Methods for Solar Event Trajectories
NASA Astrophysics Data System (ADS)
Filali Boubrahimi, Soukaina; Aydin, Berkay; Schuh, Michael A.; Kempton, Dustin; Angryk, Rafal A.; Ma, Ruizhe
2018-05-01
This paper introduces four spatiotemporal interpolation methods that enrich complex, evolving region trajectories that are reported from a variety of ground-based and space-based solar observatories every day. Our interpolation module takes an existing solar event trajectory as its input and generates an enriched trajectory with any number of additional time–geometry pairs created by the most appropriate method. To this end, we designed four different interpolation techniques: MBR-Interpolation (Minimum Bounding Rectangle Interpolation), CP-Interpolation (Complex Polygon Interpolation), FI-Interpolation (Filament Polygon Interpolation), and Areal-Interpolation, which are presented here in detail. These techniques leverage k-means clustering, centroid shape signature representation, dynamic time warping, linear interpolation, and shape buffering to generate the additional polygons of an enriched trajectory. Using ground-truth objects, interpolation effectiveness is evaluated through a variety of measures based on several important characteristics that include spatial distance, area overlap, and shape (boundary) similarity. To our knowledge, this is the first research effort of this kind that attempts to address the broad problem of spatiotemporal interpolation of solar event trajectories. We conclude with a brief outline of future research directions and opportunities for related work in this area.
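Of the four techniques, MBR-Interpolation admits the simplest illustration: linearly blending the corners of the bounding rectangles reported at two times. The sketch below is our simplified reading, not the authors' implementation.

    def interpolate_mbr(mbr_a, mbr_b, t_a, t_b, t):
        """Linearly interpolate a minimum bounding rectangle between two
        reported time-geometry pairs of a solar event trajectory.

        mbr = (x_min, y_min, x_max, y_max); requires t_a < t < t_b.
        """
        w = (t - t_a) / (t_b - t_a)
        return tuple((1 - w) * a + w * b for a, b in zip(mbr_a, mbr_b))

    # An active-region box drifting and growing between two observations:
    print(interpolate_mbr((10, 5, 20, 12), (16, 8, 30, 20), t_a=0, t_b=60, t=30))
    # -> (13.0, 6.5, 25.0, 16.0)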
Coupling Post-Event and Prospective Analyses for El Niño-related Risk Reduction in Peru
NASA Astrophysics Data System (ADS)
French, Adam; Keating, Adriana; Mechler, Reinhard; Szoenyi, Michael; Cisneros, Abel; Chuquisengo, Orlando; Etienne, Emilie; Ferradas, Pedro
2017-04-01
Analyses in the wake of natural disasters play an important role in identifying how ex ante risk reduction and ex post hazard response activities have both succeeded and fallen short in specific contexts, thereby contributing to recommendations for improving such measures in the future. Event analyses have particular relevance in settings where disasters are likely to reoccur, and especially where recurrence intervals are short. This paper applies the Post Event Review Capability (PERC) methodology to the context of frequently reoccurring El Niño Southern Oscillation (ENSO) events in the country of Peru, where over the last several decades ENSO impacts have generated high levels of damage and economic loss. Rather than analyzing the impacts of a single event, this study builds upon the existing PERC methodology by combining empirical event analysis with a critical examination of risk reduction and adaptation measures implemented both prior to and following several ENSO events in the late 20th and early 21st centuries. Additionally, the paper explores linking the empirical findings regarding the uptake and outcomes of particular risk reduction and adaptation strategies to a prospective, scenario-based approach for projecting risk several decades into the future.
Requirements-Driven Log Analysis Extended Abstract
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2012-01-01
Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario, it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately, the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
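As a small illustration of checking a log of field-name-to-value events against a specification, the sketch below verifies a response property (every dispatched command is eventually completed); the event kinds and fields are invented for illustration.

    def check_log(events):
        """Check a log (a sequence of field-name -> value records) against a
        simple response property: every 'dispatch' of a command must be
        followed, before the log ends, by a matching 'complete'."""
        pending = {}
        errors = []
        for i, e in enumerate(events):
            if e["kind"] == "dispatch":
                pending[e["cmd"]] = i
            elif e["kind"] == "complete":
                if e["cmd"] not in pending:
                    errors.append(f"line {i}: completion without dispatch: {e['cmd']}")
                else:
                    del pending[e["cmd"]]
        errors += [f"line {i}: never completed: {cmd}" for cmd, i in pending.items()]
        return errors

    log = [
        {"kind": "dispatch", "cmd": "TURN_ANTENNA"},
        {"kind": "complete", "cmd": "TURN_ANTENNA"},
        {"kind": "dispatch", "cmd": "OPEN_VALVE"},
    ]
    print(check_log(log))  # ['line 2: never completed: OPEN_VALVE']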
Visualization of Traffic Accidents
NASA Technical Reports Server (NTRS)
Wang, Jie; Shen, Yuzhong; Khattak, Asad
2010-01-01
Traffic accidents have tremendous impact on society. Annually, approximately 6.4 million vehicle accidents are reported by police in the US, and nearly half of them result in catastrophic injuries. Visualizations of traffic accidents using geographic information systems (GIS) greatly facilitate handling and analysis of traffic accidents in many respects. Environmental Systems Research Institute (ESRI), Inc. is the world leader in GIS research and development. ArcGIS, a software package developed by ESRI, has the capability to display events associated with a road network, such as accident locations and pavement quality. But when event locations related to a road network are processed, the existing algorithm used by ArcGIS does not utilize all the information related to the routes of the road network and produces erroneous visualization results for event locations. This software bug causes serious problems for applications in which accurate location information is critical for emergency responses, such as traffic accidents. This paper aims to address this problem and proposes an improved method that utilizes all relevant information about traffic accidents, namely route number, direction, and milepost, and extracts correct event locations for accurate traffic accident visualization and analysis. The proposed method generates a new shapefile for traffic accidents and displays them on top of the existing road network in ArcGIS. Visualization of traffic accidents along the Hampton Roads Bridge Tunnel is included to demonstrate the effectiveness of the proposed method.
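The core of the proposed correction is linear referencing: placing an event at a given milepost along the correct route in the correct direction. The sketch below shows only the basic distance-along-a-polyline computation; handling of route numbers and direction codes is omitted.

    import math

    def locate_milepost(route, milepost):
        """Place an event point at a given distance along a route polyline.

        route: list of (x, y) vertices ordered in the route's direction;
        milepost: distance along the route in the same units. A simplified
        stand-in for full linear referencing.
        """
        remaining = milepost
        for (x1, y1), (x2, y2) in zip(route, route[1:]):
            seg = math.hypot(x2 - x1, y2 - y1)
            if remaining <= seg:
                f = remaining / seg
                return (x1 + f * (x2 - x1), y1 + f * (y2 - y1))
            remaining -= seg
        return route[-1]  # milepost beyond route end: clamp to terminus

    print(locate_milepost([(0, 0), (3, 4), (3, 10)], 7.0))  # -> (3.0, 6.0)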
Integrated modeling for assessment of energy-water system resilience under changing climate
NASA Astrophysics Data System (ADS)
Yan, E.; Veselka, T.; Zhou, Z.; Koritarov, V.; Mahalik, M.; Qiu, F.; Mahat, V.; Betrie, G.; Clark, C.
2016-12-01
Energy and water systems are intrinsically interconnected. Due to an increase in climate variability and extreme weather events, the interdependency between these two systems has recently intensified, with significant impacts on both systems and on energy output. To address this challenge, an Integrated Water-Energy Systems Assessment Framework (IWESAF) is being developed to integrate multiple existing or newly developed models from various sectors. The IWESAF currently includes an extreme climate event generator for predicting future extreme weather events, hydrologic and reservoir models, a riverine temperature model, a power plant water use simulator, and a power grid operation and cost optimization model. The IWESAF can facilitate interaction among the modeling systems and provide insight into the sustainability and resilience of the energy-water system under extreme climate events and their economic consequences. A regional case demonstration in the Midwest will be presented. Detailed information on some of the individual modeling components will also be presented in several other abstracts submitted to AGU this year.
Swarms of repeating long-period earthquakes at Shishaldin Volcano, Alaska, 2001-2004
Petersen, Tanja
2007-01-01
During 2001–2004, a series of four periods of elevated long-period seismic activity, each lasting about 1–2 months, occurred at Shishaldin Volcano, Aleutian Islands, Alaska. The time periods are termed swarms of repeating events, reflecting an abundance of earthquakes with highly similar waveforms that indicate stable, non-destructive sources. These swarms are characterized by increased earthquake amplitudes, although the seismicity rate of one event every 0.5–5 min has remained more or less constant since Shishaldin last erupted in 1999. A method based on waveform cross-correlation is used to identify highly repetitive events, suggestive of spatially distinct source locations. The waveform analysis shows that several different families of similar events co-exist during a given swarm day, but generally only one large family dominates. A network of hydrothermal fractures may explain the events that do not belong to a dominant repeating event group, i.e. multiple sources at different locations exist next to a dominant source. The dominant waveforms exhibit systematic changes throughout each swarm, but some of these waveforms do reappear over the course of 4 years indicating repeatedly activated source locations. The choked flow model provides a plausible trigger mechanism for the repeating events observed at Shishaldin, explaining the gradual changes in waveforms over time by changes in pressure gradient across a constriction within the uppermost part of the conduit. The sustained generation of Shishaldin's long-period events may be attributed to complex dynamics of a multi-fractured hydrothermal system: the pressure gradient within the main conduit may be regulated by temporarily sealing and reopening of parallel flow pathways, by the amount of debris within the main conduit and/or by changing gas influx into the hydrothermal system. The observations suggest that Shishaldin's swarms of repeating events represent time periods during which a dominant source is activated.
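As a rough sketch of the waveform cross-correlation step used above to identify families of repeating events, the code below greedily groups waveforms whose peak normalized cross-correlation with a family's first member exceeds a threshold. The threshold and grouping rule are simplifications of multiplet analysis, offered for illustration only.

    import numpy as np

    def correlation(a, b):
        """Peak normalized cross-correlation between two waveforms."""
        a = a - a.mean()
        b = b - b.mean()
        a = a / (np.linalg.norm(a) + 1e-12)
        b = b / (np.linalg.norm(b) + 1e-12)
        return np.abs(np.correlate(a, b, mode="full")).max()

    def group_into_families(waveforms, threshold=0.8):
        """Greedy single-linkage grouping of events with similar waveforms."""
        families = []  # each family: list of indices into waveforms
        for i, w in enumerate(waveforms):
            for fam in families:
                if correlation(waveforms[fam[0]], w) >= threshold:
                    fam.append(i)
                    break
            else:
                families.append([i])
        return families

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 200)
    template = np.sin(9 * np.pi * t) * np.exp(-3 * t)
    waves = [template + 0.05 * rng.normal(size=t.size) for _ in range(4)]
    waves.append(rng.normal(size=t.size))  # one unrelated event
    print(group_into_families(waves))  # expect [[0, 1, 2, 3], [4]]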
Xu, Zhezhuang; Liu, Guanglun; Yan, Haotian; Cheng, Bin; Lin, Feilong
2017-10-27
In wireless sensor and actor networks, when an event is detected, the sensor node needs to transmit an event report to inform the actor. Since the actor moves in the network to execute missions, its location is always unavailable to the sensor nodes. A popular solution is a search strategy that can forward the data to a node without its location information. However, most existing works have not considered the mobility of the node, and thus generate significant energy consumption or transmission delay. In this paper, we propose the trail-based search (TS) strategy, which takes advantage of the actor's mobility to improve search efficiency. The main idea of TS is that, as the actor moves in the network, it leaves a trail composed of continuous footprints. The search packet with the event report is transmitted in the network to search for the actor or its footprints. Once an effective footprint is discovered, the packet is forwarded along the trail until it is received by the actor. Moreover, we derive the condition that guarantees trail connectivity (see the sketch below), and propose a redundancy reduction scheme based on TS (TS-R) to reduce the nontrivial transmission redundancy generated by the trail. Theoretical and numerical analyses are provided to prove the efficiency of TS. Compared with the well-known expanding ring search (ERS), TS significantly reduces energy consumption and search delay.
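Our reading of the trail-connectivity condition referenced above, stated as a sketch: footprints dropped every T seconds by an actor moving at speed v are spaced v*T apart, so the spacing must not exceed the range at which a footprint can be discovered and the next one reached. This is an assumption for illustration; the paper's derived condition may differ.

    def trail_is_connected(actor_speed, footprint_interval, footprint_range):
        """Sufficient condition (our assumption) for a connected TS trail:
        consecutive footprint spacing must stay within discovery range."""
        spacing = actor_speed * footprint_interval
        return spacing <= footprint_range

    # An actor at 2 m/s dropping footprints every 10 s, discoverable at 30 m:
    print(trail_is_connected(2.0, 10.0, 30.0))  # True: spacing 20 m <= 30 m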
A video event trigger for high frame rate, high resolution video technology
NASA Astrophysics Data System (ADS)
Williams, Glenn L.
1991-12-01
When video replaces film the digitized video data accumulates very rapidly, leading to a difficult and costly data storage problem. One solution exists for cases when the video images represent continuously repetitive 'static scenes' containing negligible activity, occasionally interrupted by short events of interest. Minutes or hours of redundant video frames can be ignored, and not stored, until activity begins. A new, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term or short term changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pretrigger and post-trigger storage techniques are then adaptable for archiving the digital stream from only the significant video images.
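A minimal sketch of the pretrigger/post-trigger storage idea described above: a ring buffer retains recent frames, and a crude frame-difference score stands in for the paper's fuzzy-logic change detector. The frame format and thresholds are invented.

    from collections import deque

    def triggered_archive(frames, threshold, pre=5, post=5):
        """Archive only frames around a detected video event.

        Activity is scored as the mean absolute difference between
        consecutive frames (both rising and falling changes trigger).
        A ring buffer preserves `pre` frames before the trigger; `post`
        frames after it are also kept.
        """
        ring = deque(maxlen=pre)
        archive, prev, hold = [], None, 0
        for frame in frames:
            activity = (
                0 if prev is None else
                sum(abs(a - b) for a, b in zip(frame, prev)) / len(frame)
            )
            if activity > threshold:
                archive.extend(ring)   # flush pretrigger history
                ring.clear()
                hold = post + 1        # keep this frame and `post` more
            if hold > 0:
                archive.append(frame)
                hold -= 1
            else:
                ring.append(frame)
            prev = frame
        return archive

    static = [[10, 10, 10]] * 20
    burst = [[10, 90, 10]] * 3
    # Archives only the frames surrounding the burst, not the static scene:
    print(len(triggered_archive(static + burst + static, threshold=5.0)))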
Potential implementation of reservoir computing models based on magnetic skyrmions
NASA Astrophysics Data System (ADS)
Bourianoff, George; Pinna, Daniele; Sitte, Matthias; Everschor-Sitte, Karin
2018-05-01
Reservoir Computing is a type of recursive neural network commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to generate a memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts to implement reservoir computing have focused on memristor techniques for realizing recursive neural networks. This paper examines the potential of magnetic skyrmion fabrics, and the complex current patterns which form in them, as an attractive physical instantiation for Reservoir Computing. We argue that the nonlinear dynamical interplay resulting from anisotropic magnetoresistance and spin-torque effects allows for effective and energy-efficient nonlinear processing of spatio-temporal events, with the aim of event recognition and prediction.
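For readers unfamiliar with the paradigm, a minimal echo-state-style sketch shows the training principle: the reservoir (here a random matrix standing in for a physical medium such as a skyrmion fabric) is fixed, and only the linear readout is trained. Sizes and scalings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200                                    # reservoir nodes
W_in = rng.uniform(-0.5, 0.5, N)           # fixed input weights
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius below 1 (memory)

def run_reservoir(u):
    x, states = np.zeros(N), []
    for u_t in u:                          # nonlinear recursive update
        x = np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Task: predict a noisy sine wave one step ahead.
u = np.sin(np.linspace(0, 60, 1500)) + 0.05 * rng.normal(size=1500)
X, y = run_reservoir(u[:-1]), u[1:]
W_out = np.linalg.lstsq(X, y, rcond=None)[0]   # train the readout only
print("test error:", np.mean((X[1200:] @ W_out - y[1200:]) ** 2))
```

The key design point, as the abstract notes, is that no knowledge of the internal weights is needed: any sufficiently rich nonlinear dynamical system can play the role of W.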
Gamma Ray Bursts as Cosmological Probes with EXIST
NASA Astrophysics Data System (ADS)
Hartmann, Dieter; EXIST Team
2006-12-01
The EXIST mission, studied as a Black Hole Finder Probe within NASA's Beyond Einstein Program, would, in its current design, trigger on 1000 Gamma Ray Bursts (GRBs) per year (Grindlay et al., this meeting). The redshift distribution of these GRBs, using results from Swift as a guide, would probe the z > 7 epoch at an event rate of > 50 per year. These bursts trace early cosmic star formation history, point to a first generation of stellar objects that reionize the universe, and provide bright beacons for absorption line studies with ground- and space-based observatories. We discuss how EXIST, in conjunction with other space missions and future large survey programs such as LSST, can be utilized to advance our understanding of cosmic chemical evolution, the structure and evolution of the baryonic cosmic web, and the formation of stars in low metallicity environments.
Weirather, Jason L.; Afshar, Pegah Tootoonchi; Clark, Tyson A.; Tseng, Elizabeth; Powers, Linda S.; Underwood, Jason G.; Zabner, Joseph; Korlach, Jonas; Wong, Wing Hung; Au, Kin Fai
2015-01-01
We developed an innovative hybrid sequencing approach, IDP-fusion, to detect fusion genes, determine fusion sites, and identify and quantify fusion isoforms. IDP-fusion is the first method to study gene fusion events by integrating Third Generation Sequencing long reads and Second Generation Sequencing short reads. We applied IDP-fusion to PacBio data and Illumina data from MCF-7 breast cancer cells. Compared with existing tools, IDP-fusion detects fusion genes with higher precision and a very low false-positive rate. The results show that IDP-fusion will be useful for unraveling the complexity of multiple fusion splices and fusion isoforms within tumorigenesis-relevant fusion genes. PMID:26040699
Spatial effects in discrete generation population models.
Carrillo, C; Fife, P
2005-02-01
A framework is developed for constructing a large class of discrete generation, continuous space models of evolving single species populations and finding their bifurcating patterned spatial distributions. Our models involve, in separate stages, the spatial redistribution (through movement laws) and local regulation of the population; and the fundamental properties of these events in a homogeneous environment are found. Emphasis is placed on the interaction of migrating individuals with the existing population through conspecific attraction (or repulsion), as well as on random dispersion. The nature of the competition of these two effects in a linearized scenario is clarified. The bifurcation of stationary spatially patterned population distributions is studied, with special attention given to the role played by that competition.
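The two-stage structure (local regulation, then spatial redistribution) is the skeleton of an integrodifference model, and a minimal sketch makes it concrete. The version below assumes a Ricker map for regulation and a Gaussian dispersal kernel applied by periodic convolution; the conspecific attraction and repulsion terms emphasized in the paper are omitted, and all parameters are illustrative.

```python
import numpy as np

L, n = 50.0, 512
x = np.linspace(0, L, n, endpoint=False)
dx = L / n

def generation(pop, r=2.2, sigma=1.0):
    grown = pop * np.exp(r * (1.0 - pop))        # local regulation (Ricker)
    k = np.exp(-0.5 * ((x - L / 2) / sigma) ** 2)
    k /= k.sum() * dx                            # normalised dispersal kernel
    # periodic convolution implements the movement law; mass is conserved
    k0 = np.roll(k, n // 2)                      # center the kernel at x = 0
    return np.real(np.fft.ifft(np.fft.fft(grown) * np.fft.fft(k0))) * dx

pop = 1.0 + 0.01 * np.cos(2 * np.pi * x / L * 5)  # small spatial perturbation
for _ in range(200):
    pop = generation(pop)
print("pattern amplitude after 200 generations:", pop.max() - pop.min())
```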
Stair negotiation made easier using novel interactive energy-recycling assistive stairs.
Song, Yun Seong; Ha, Sehoon; Hsu, Hsiang; Ting, Lena H; Liu, C Karen
2017-01-01
Here we show that novel, energy-recycling stairs reduce the amount of work required for humans to both ascend and descend stairs. Our low-power, interactive, and modular steps can be placed on existing staircases, storing energy during stair descent and returning that energy to the user during stair ascent. Energy is recycled through event-triggered latching and unlatching of passive springs without the use of powered actuators. When ascending the energy-recycling stairs, naive users generated 17.4 ± 6.9% less positive work with their leading legs compared to conventional stairs, with the knee joint positive work reduced by 37.7 ± 10.5%. Users also generated 21.9 ± 17.8% less negative work with their trailing legs during stair descent, with ankle joint negative work reduced by 26.0 ± 15.9%. Our low-power energy-recycling stairs have the potential to assist people with mobility impairments during stair negotiation on existing staircases.
Slow slip events and seismic tremor at circum-Pacific subduction zones
NASA Astrophysics Data System (ADS)
Schwartz, Susan Y.; Rokosky, Juliana M.
2007-09-01
It has been known for a long time that slip accompanying earthquakes accounts for only a fraction of plate tectonic displacements. However, only recently has a fuller spectrum of strain release processes, including normal, slow, and silent earthquakes (or slow slip events) and continuous and episodic slip, been observed and generated by numerical simulations of the earthquake cycle. Despite a profusion of observations and modeling studies, the physical mechanism of slow slip events remains elusive. The concurrence of seismic tremor with slow slip episodes in Cascadia and southwestern Japan provides insight into the process of slow slip. A perceived similarity between subduction zone and volcanic tremor has led to suggestions that slow slip involves fluid migration on or near the plate interface. Alternatively, evidence is accumulating to support the notion that tremor results from shear failure during slow slip. Global observations of the location, spatial extent, magnitude, duration, slip rate, and periodicity of these aseismic slip transients indicate significant variation that may be exploited to better understand their generation. Most slow slip events occur just downdip of the seismogenic zone, consistent with rate- and state-dependent frictional modeling that requires unstable to stable transitional properties for slow slip generation. At a few convergent margins the occurrence of slow slip events within the seismogenic zone makes it highly likely that transitions in frictional properties exist there and are the loci of slow slip nucleation. Slow slip events perturb the surrounding stress field and may either increase or relieve stress on a fault, bringing it closer to or farther from earthquake failure, respectively. This paper presents a review of slow slip events and related seismic tremor observed at plate boundaries worldwide, with a focus on circum-Pacific subduction zones. Trends in global observations of slow slip events suggest that (1) slow slip is a common phenomenon observed at almost all subduction zones with instrumentation capable of recording it, (2) different frictional properties likely control fast versus slow slip, (3) the depth range of slow slip may be related to the thermal properties of the plate interface, and (4) the equivalent seismic moment of slow slip events is proportional to their duration (M0 ∝ τ), different from the M0 ∝ τ³ scaling observed for earthquakes.
NASA Astrophysics Data System (ADS)
Soulis, K. X.; Valiantzas, J. D.; Dercas, N.; Londra, P. A.
2009-01-01
The Soil Conservation Service Curve Number (SCS-CN) method is widely used for predicting direct runoff volume for a given rainfall event. The applicability of the SCS-CN method and the runoff generation mechanism were thoroughly analysed in a Mediterranean experimental watershed in Greece. The region is characterized by a Mediterranean semi-arid climate. A detailed land cover and soil survey using remote sensing and GIS techniques showed that the watershed is dominated by coarse soils with high hydraulic conductivities, whereas a smaller part is covered with medium textured soils and impervious surfaces. The analysis indicated that the SCS-CN method fails to predict runoff for the storm events studied, and that there is a strong correlation between the CN values obtained from measured runoff and the rainfall depth. The hypothesis that this correlation could be attributed to the existence of an impermeable part in a very permeable watershed was examined in depth, by developing a numerical simulation water flow model for predicting surface runoff generated from each of the three soil types of the watershed. Numerical runs were performed using the HYDRUS-1D code. The results support the validity of this hypothesis for most of the events examined, where the linear runoff formula provides better results than the SCS-CN method. The runoff coefficient of this formula can be taken equal to the percentage of the impervious area. However, the linear formula should be applied with caution in the case of extreme events with very high rainfall intensities. In this case, the medium textured soils may significantly contribute to the total runoff and the linear formula may significantly underestimate the runoff produced.
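The two competing event-runoff formulas are easy to compare directly. Below is a short sketch with the standard SCS-CN equations in millimetres; the curve number and the impervious fraction are illustrative values, not the watershed's calibrated ones.

```python
def scs_cn_runoff(p_mm, cn):
    """Direct runoff Q from event rainfall P for a given curve number."""
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = 0.2 * s                  # initial abstraction
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

def linear_runoff(p_mm, impervious_fraction):
    """Runoff if only the impervious part of the watershed responds."""
    return impervious_fraction * p_mm

for p in (10.0, 30.0, 60.0):
    print(f"P={p:5.1f} mm  SCS-CN(75): {scs_cn_runoff(p, 75):6.2f} mm"
          f"  linear(10% impervious): {linear_runoff(p, 0.10):6.2f} mm")
```

The contrast mirrors the abstract's finding: the linear formula tracks small and moderate events on a mostly permeable watershed, while for large events the nonlinear SCS-CN response overtakes it.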
NASA Astrophysics Data System (ADS)
Soulis, K. X.; Valiantzas, J. D.; Dercas, N.; Londra, P. A.
2009-05-01
The Soil Conservation Service Curve Number (SCS-CN) method is widely used for predicting direct runoff volume for a given rainfall event. The applicability of the SCS-CN method and the direct runoff generation mechanism were thoroughly analysed in a Mediterranean experimental watershed in Greece. The region is characterized by a Mediterranean semi-arid climate. A detailed land cover and soil survey using remote sensing and GIS techniques showed that the watershed is dominated by coarse soils with high hydraulic conductivities, whereas a smaller part is covered with medium textured soils and impervious surfaces. The analysis indicated that the SCS-CN method fails to predict runoff for the storm events studied, and that there is a strong correlation between the CN values obtained from measured runoff and the rainfall depth. The hypothesis that this correlation could be attributed to the existence of an impermeable part in a very permeable watershed was examined in depth, by developing a numerical simulation water flow model for predicting surface runoff generated from each of the three soil types of the watershed. Numerical runs were performed using the HYDRUS-1D code. The results support the validity of this hypothesis for most of the events examined, where the linear runoff formula provides better results than the SCS-CN method. The runoff coefficient of this formula can be taken equal to the percentage of the impervious area. However, the linear formula should be applied with caution in the case of extreme events with very high rainfall intensities. In this case, the medium textured soils may significantly contribute to the total runoff and the linear formula may significantly underestimate the runoff produced.
Harnessing Scientific Literature Reports for Pharmacovigilance
Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-01-01
Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
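The disproportionality scores mentioned under Methods are typically variants of statistics such as the proportional reporting ratio (PRR). A sketch on hypothetical counts follows; the tool itself derives such 2x2 tables from MeSH-indexed citations rather than from the invented numbers used here.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table.
    a: drug & event, b: drug & other events,
    c: other drugs & event, d: other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for one candidate drug-adverse event pair.
score = prr(a=40, b=960, c=200, d=98800)
print(f"PRR = {score:.1f}")   # values well above 1 suggest a signal
```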
Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-03-22
We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
Long-Term Interactions of Streamflow Generation and River Basin Morphology
NASA Astrophysics Data System (ADS)
Huang, X.; Niemann, J.
2005-12-01
It is well known that the spatial patterns and dynamics of streamflow generation processes depend on river basin topography, but the impact of streamflow generation processes on the long-term evolution of river basins has not drawn as much attention. Fluvial erosion processes are driven by streamflow, which can be produced by Horton runoff, Dunne runoff, and groundwater discharge. In this analysis, we hypothesize that the dominant streamflow generation process in a basin affects the spatial patterns of fluvial erosion and that the nature of these patterns changes for storm events with differing return periods. Furthermore, we hypothesize that differences in the erosion patterns modify the topography over the long term in a way that promotes and/or inhibits the other streamflow generation mechanisms. In order to test these hypotheses, a detailed hydrologic model is embedded into an existing landscape evolution model. Precipitation events are simulated with a Poisson process and have random intensities and durations. The precipitation is partitioned between Horton runoff and infiltration to groundwater using a specified infiltration capacity. Groundwater flow is described by a two-dimensional Dupuit equation for a homogeneous, isotropic, unconfined aquifer with an irregular underlying impervious layer. Dunne runoff occurs when precipitation falls on locations where the water table reaches the land surface. The combined hydrologic/geomorphic model is applied to the WE-38 basin, an experimental watershed in Pennsylvania that has substantial available hydrologic data. First, the hydrologic model is calibrated to reproduce the observed streamflow for 1990 using the observed rainfall as the input. Then, the relative roles of Horton runoff, Dunne runoff, and groundwater discharge are controlled by varying the infiltration capacity of the soil. For each infiltration capacity, the hydrologic and geomorphic behavior of the current topography is analyzed and the long-term evolution of the basin is simulated. The results indicate that the topography can be divided into three types of locations (unsaturated, saturated, and intermittently saturated) which control the patterns of streamflow generation for events with different return periods. The results also indicate that the streamflow generation processes can produce different geomorphically effective events at upstream and downstream locations. The model also suggests that a topography dominated by groundwater discharge evolves over a long period of time to a shape that tends to inhibit the development of saturated areas and Dunne runoff.
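The partitioning logic described above can be sketched compactly. The fragment below assumes a fixed infiltration capacity and a flag for surface saturation; the parameters are illustrative, not those of the WE-38 calibration.

```python
def partition_event(intensity, duration, infil_capacity, water_table_at_surface):
    """Split event rainfall into Horton runoff, Dunne runoff, and recharge."""
    rain = intensity * duration
    if water_table_at_surface:              # saturated location: Dunne runoff
        return {"horton": 0.0, "dunne": rain, "recharge": 0.0}
    infiltration = min(intensity, infil_capacity) * duration
    return {"horton": rain - infiltration,  # excess over infiltration capacity
            "dunne": 0.0,
            "recharge": infiltration}       # feeds groundwater discharge later

print(partition_event(intensity=12.0, duration=2.0,   # mm/h, h
                      infil_capacity=5.0, water_table_at_surface=False))
```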
Fault Diagnosis System of Wind Turbine Generator Based on Petri Net
NASA Astrophysics Data System (ADS)
Zhang, Han
Petri nets are an important tool for modeling and analyzing discrete event dynamic systems, with a strong ability to handle concurrent and non-deterministic phenomena. Petri nets used in wind turbine fault diagnosis, however, have not yet been deployed in actual systems. This article combines existing fuzzy Petri net algorithms, builds a wind turbine control system simulation based on a Siemens S7-1200 PLC, and provides a MATLAB GUI interface so that the system can be migrated to different platforms.
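A minimal fuzzy Petri net inference loop, in the spirit of the algorithms the article combines, is sketched below: places carry truth degrees, and a transition fires with a confidence factor once its inputs exceed a threshold. The places, thresholds, and fault rules are hypothetical examples, not the paper's rule base.

```python
places = {"vibration_high": 0.8, "temp_high": 0.6,
          "bearing_fault": 0.0, "gearbox_fault": 0.0}

# (input places, output place, threshold tau, confidence factor mu)
transitions = [(["vibration_high", "temp_high"], "bearing_fault", 0.5, 0.9),
               (["vibration_high"], "gearbox_fault", 0.7, 0.7)]

changed = True
while changed:                      # iterate until no truth degree increases
    changed = False
    for inputs, out, tau, mu in transitions:
        degree = min(places[p] for p in inputs)     # fuzzy AND of inputs
        if degree >= tau and degree * mu > places[out]:
            places[out] = degree * mu               # propagate belief
            changed = True

print(places)   # bearing_fault inferred with degree 0.54, gearbox_fault 0.56
```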
NASA Technical Reports Server (NTRS)
Trettel, D. W.; Aquino, J. T.; Piazza, T. R.; Taylor, L. E.; Trask, D. C.
1982-01-01
Correlations between standard meteorological data and wind power generation potential were developed. Combined with appropriate wind forecasts, these correlations can be useful to load dispatchers to supplement conventional energy sources. Hourly wind data were analyzed for four sites, each exhibiting a unique physiography. These sites are Amarillo, Texas; Ludington, Michigan; Montauk Point, New York; and San Gorgonio, California. Synoptic weather maps and tables are presented to illustrate various wind 'regimes' at these sites.
Fenner, Jack N
2005-10-01
The length of the human generation interval is a key parameter when using genetics to date population divergence events. However, no consensus exists regarding the generation interval length, and a wide variety of interval lengths have been used in recent studies. This makes comparison between studies difficult and casts doubt on the accuracy of divergence date estimations. Recent genealogy-based research suggests that the male generation interval is substantially longer than the female interval, and that both are greater than the values commonly used in genetics studies. This study evaluates each of these hypotheses in a broader cross-cultural context, using data from both nation states and recent hunter-gatherer societies. Both hypotheses are supported by this study; therefore, revised estimates of male, female, and overall human generation interval lengths are proposed. The nearly universal, cross-cultural nature of the evidence justifies using these proposed estimates in Y-chromosomal, mitochondrial, and autosomal DNA-based population divergence studies.
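The practical stakes are easy to illustrate: a divergence measured in generations shifts by tens of percent in calendar years depending on the assumed interval. The generation count below is purely hypothetical.

```python
generations_since_divergence = 800     # hypothetical genetic estimate
for interval in (20, 25, 30):          # candidate generation intervals (years)
    years = generations_since_divergence * interval
    print(f"{interval} yr/gen -> divergence dated to {years} years ago")
```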
A method for feature selection of APT samples based on entropy
NASA Astrophysics Data System (ADS)
Du, Zhenyu; Li, Yihong; Hu, Jinsong
2018-05-01
By studying known APT attack events in depth, this paper proposes a feature selection method for APT samples and a logic expression generation algorithm, IOCG (Indicator of Compromise Generate). The algorithm automatically generates machine-readable IOCs (Indicators of Compromise), addressing the limitations of existing IOCs, whose logical relationships are fixed, whose number of logical terms cannot change, which are large in scale, and which cannot be generated directly from samples. At the same time, it reduces the time spent processing redundant and useless APT samples, improves the sharing rate of analysis results, and supports an active response to a complex and volatile APT attack situation. The samples were divided into a test set and a training set, and the algorithm was used to generate logical expressions from the training set with the IOC_Aware plug-in; the generated expressions were then compared against the detection results. The experimental results show that the algorithm is effective and can improve detection performance.
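A hedged sketch of the core idea follows: keep features that are stable across related APT samples and emit them as a machine-readable boolean IOC expression. The feature names, the entropy-style selection, and the output format are illustrative, not the IOCG algorithm itself.

```python
from collections import Counter
import math

samples = [{"mutex": "Gh0st", "c2": "evil.example.com", "md5": "a1"},
           {"mutex": "Gh0st", "c2": "evil.example.com", "md5": "b2"},
           {"mutex": "Gh0st", "c2": "other.example.net", "md5": "c3"}]

def entropy(values):
    counts = Counter(values)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Keep low-entropy features (stable across samples); drop volatile ones
# such as per-sample hashes, which have maximal entropy here.
features = {k for k in samples[0]
            if entropy([s[k] for s in samples]) < 0.5}
clauses = [f'({k} == "{samples[0][k]}")' for k in sorted(features)]
print(" AND ".join(clauses))   # -> (mutex == "Gh0st")
```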
Modelling the effectiveness of grass buffer strips in managing muddy floods under a changing climate
NASA Astrophysics Data System (ADS)
Mullan, Donal; Vandaele, Karel; Boardman, John; Meneely, John; Crossley, Laura H.
2016-10-01
Muddy floods occur when rainfall generates runoff on agricultural land, detaching and transporting sediment into the surrounding natural and built environment. In the Belgian Loess Belt, muddy floods occur regularly and lead to considerable economic costs associated with damage to property and infrastructure. Mitigation measures designed to manage the problem have been tested in a pilot area within Flanders and were found to be cost-effective within three years. This study assesses whether these mitigation measures will remain effective under a changing climate. To test this, the Water Erosion Prediction Project (WEPP) model was used to examine muddy flooding diagnostics (precipitation, runoff, soil loss and sediment yield) for a case study hillslope in Flanders where grass buffer strips are currently used as a mitigation measure. The model was run for present day conditions and then under 33 future site-specific climate scenarios. These future scenarios were generated from three earth system models driven by four representative concentration pathways and downscaled using quantile mapping and the weather generator CLIGEN. Results reveal that under the majority of future scenarios, muddy flooding diagnostics are projected to increase, mostly as a consequence of large scale precipitation events rather than mean changes. The magnitude of muddy flood events for a given return period is also generally projected to increase. These findings indicate that present day mitigation measures may have a reduced capacity to manage muddy flooding given the changes imposed by a warming climate with an enhanced hydrological cycle. Revisions to the design of existing mitigation measures within existing policy frameworks are considered the most effective way to account for the impacts of climate change in future mitigation planning.
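The quantile-mapping step used to downscale the climate model output can be sketched briefly. The version below assumes empirical CDFs built from daily precipitation series; the gamma-distributed data are synthetic placeholders, not the study's model output.

```python
import numpy as np

rng = np.random.default_rng(2)
obs = rng.gamma(shape=0.8, scale=6.0, size=5000)       # observed precip (mm)
mod_hist = rng.gamma(shape=0.8, scale=4.5, size=5000)  # biased model, historical
mod_future = rng.gamma(shape=0.8, scale=5.5, size=5000)

def quantile_map(x, model_ref, observed):
    """Replace each model value with the observed value at the same quantile."""
    quantiles = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    return np.quantile(observed, np.clip(quantiles, 0.0, 1.0))

corrected = quantile_map(mod_future, mod_hist, obs)
print(f"raw future mean {mod_future.mean():.2f} mm, "
      f"bias-corrected mean {corrected.mean():.2f} mm")
```

Because the mapping is built from the full distribution rather than the mean, it preserves exactly the large-event tail behaviour that the study identifies as the driver of future muddy flooding.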
NASA Astrophysics Data System (ADS)
Colella, H.; Ellis, S. M.; Williams, C. A.
2015-12-01
The Hikurangi subduction zone (New Zealand) is one of many subduction zones that exhibit slow slip behavior. Geodetic observations along the Hikurangi subduction zone are unusual in that the subduction zone exhibits periodic slow slip events not only at "typical" subduction-zone depths of 25-50 km along the southern part of the margin, but also at much shallower depths of 8-15 km along the northern part of the margin. Furthermore, there is evidence for interplay between slow slip events at these different depth ranges (between the deep and shallow events) along the central part of the margin, and some of the slow slip behavior is observed along regions of the interface that were previously considered locked, which raises questions about the slip behavior of this region. This study employs the earthquake simulator RSQSim to explore the variations in effective normal stress (i.e., stress after the addition of pore fluid pressures) and frictional instability necessary to generate the complex slow slip events observed along the Hikurangi margin. Preliminary results suggest that, to generate slow slip events with recurrence intervals similar to those observed, the effective normal stress must be about three times higher in the south than in the north (6-9 MPa versus 2-3 MPa, respectively). Results also suggest that, at a minimum, some overlap along the central margin must exist between the slow slip sections in the north and south to reproduce the types of slip events observed along the Hikurangi subduction zone. To further validate the results from the simulations, Okada solutions for surface displacements will be compared to geodetic solutions to more accurately constrain the areas in which slip behavior varies and the cause(s) of the variation(s).
Identification of genomic indels and structural variations using split reads
2011-01-01
Background: Recent studies have demonstrated the genetic significance of insertions, deletions, and other more complex structural variants (SVs) in the human population. With the development of next-generation sequencing technologies, high-throughput surveys of SVs on the whole-genome level have become possible. Here we present split-read identification, calibrated (SRiC), a sequence-based method for SV detection. Results: We start by mapping each read to the reference genome in standard fashion using gapped alignment. Then to identify SVs, we score each of the many initial mappings with an assessment strategy designed to take into account both sequencing and alignment errors (e.g., scoring events gapped in the center of a read more highly). All current SV calling methods have multilevel biases in their identifications due to both experimental and computational limitations (e.g., calling more deletions than insertions). A key aspect of our approach is that we calibrate all our calls against synthetic data sets generated from simulations of high-throughput sequencing (with realistic error models). This allows us to calculate sensitivity and the positive predictive value under different parameter-value scenarios and for different classes of events (e.g., long deletions vs. short insertions). We run our calculations on representative data from the 1000 Genomes Project. Coupling the observed numbers of events on chromosome 1 with the calibrations gleaned from the simulations (for different length events) allows us to construct a relatively unbiased estimate for the total number of SVs in the human genome across a wide range of length scales. We estimate in particular that an individual genome contains ~670,000 indels/SVs. Conclusions: Compared with the existing read-depth and read-pair approaches for SV identification, our method can pinpoint the exact breakpoints of SV events, reveal the actual sequence content of insertions, and cover the whole size spectrum for deletions. Moreover, with the advent of third-generation sequencing technologies that produce longer reads, we expect our method to be even more useful. PMID:21787423
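The calibration idea, comparing calls against simulated truth to obtain sensitivity and positive predictive value and then rescaling observed counts, can be sketched as follows. The call sets and counts are hypothetical.

```python
# Simulated truth set and the caller's output on the same synthetic data,
# each event keyed by (chromosome, position, type).
truth = {("chr1", 10500, "DEL"), ("chr1", 22000, "INS"), ("chr1", 31000, "DEL")}
calls = {("chr1", 10500, "DEL"), ("chr1", 31000, "DEL"), ("chr1", 40000, "DEL")}

tp = len(truth & calls)
sensitivity = tp / len(truth)            # fraction of true events recovered
ppv = tp / len(calls)                    # fraction of calls that are real

observed_on_chr1 = 1200                  # calls on real data (hypothetical)
unbiased_estimate = observed_on_chr1 * ppv / sensitivity
print(f"sensitivity={sensitivity:.2f} ppv={ppv:.2f} "
      f"estimated true events={unbiased_estimate:.0f}")
```

In practice the paper stratifies this correction by event class and length, since both rates vary strongly across, say, long deletions versus short insertions.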
The Holocene floods and their affinity to climatic variability in the western Himalaya, India
NASA Astrophysics Data System (ADS)
Sharma, Shubhra; Shukla, A. D.; Bartarya, S. K.; Marh, B. S.; Juyal, Navin
2017-08-01
The present study in the middle Satluj valley explores the sedimentary records of past floods with an objective to understand the climatic processes responsible for their genesis. Based on lithostratigraphy, sedimentology, and grain size variability, 25 flood events are identified. The geochemical data indicate that the flood sediments were mostly generated and transported from the higher Himalayan crystalline and the trans-Himalaya. Our study suggests that the floods were generated by Landslide Lake Outburst Floods (LLOFs) during extreme precipitation events. However, the existing database does not allow us to rule out a contribution from Glacial Lake Outburst Floods (GLOFs). Field stratigraphy supported by optical chronology indicates four major flood phases that are dated to 13.4-10.4, 8.3-3.6, 2.2-1.4, and < 1.4 ka (kilo-annum). These phases correspond to cooler and less wet conditions and broadly correlate with phases of negative Arctic Oscillation (-AO) and negative North Atlantic Oscillation (-NAO), implying a coupling between the moisture-laden monsoon circulation and southward-penetrating mid-latitude westerly troughs in generating extreme precipitation events and the consequent LLOFs. Additionally, a broad synchronicity in Holocene floods between the western Himalaya and the wider mid-latitudinal region (30°N-40°N) suggests synoptic-scale Arctic and Atlantic climate variability.
Heavy ion event generator HYDJET++ (HYDrodynamics plus JETs)
NASA Astrophysics Data System (ADS)
Lokhtin, I. P.; Malinina, L. V.; Petrushanko, S. V.; Snigirev, A. M.; Arsene, I.; Tywoniuk, K.
2009-05-01
HYDJET++ is a Monte Carlo event generator for simulation of relativistic heavy ion AA collisions considered as a superposition of the soft, hydro-type state and the hard state resulting from multi-parton fragmentation. This model is the development and continuation of HYDJET event generator (Lokhtin and Snigirev, EPJC 45 (2006) 211). The main program is written in the object-oriented C++ language under the ROOT environment. The hard part of HYDJET++ is identical to the hard part of Fortran-written HYDJET and it is included in the generator structure as a separate directory. The soft part of HYDJET++ event is the "thermal" hadronic state generated on the chemical and thermal freeze-out hypersurfaces obtained from the parameterization of relativistic hydrodynamics with preset freeze-out conditions. It includes the longitudinal, radial and elliptic flow effects and the decays of hadronic resonances. The corresponding fast Monte Carlo simulation procedure, C++ code FAST MC (Amelin et al., PRC 74 (2006) 064901; PRC 77 (2008) 014903), is adapted to HYDJET++. It is designed for studying the multi-particle production in a wide energy range of heavy ion experimental facilities: from FAIR and NICA to RHIC and LHC. Program summary: Program title: HYDJET++, version 2. Catalogue identifier: AECR_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECR_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 100 387. No. of bytes in distributed program, including test data, etc.: 797 019. Distribution format: tar.gz. Programming language: C++ (however there is a Fortran-written part which is included in the generator structure as a separate directory). Computer: Hardware independent (both C++ and Fortran compilers and ROOT environment [1] (http://root.cern.ch/) should be installed). Operating system: Linux (Scientific Linux, Red Hat Enterprise, FEDORA, etc.). RAM: 50 MBytes (determined by ROOT requirements). Classification: 11.2. External routines: ROOT [1] (http://root.cern.ch/). Nature of problem: The experimental and phenomenological study of multi-particle production in relativistic heavy ion collisions is expected to provide valuable information on the dynamical behavior of strongly-interacting matter in the form of quark-gluon plasma (QGP) [2-4], as predicted by lattice Quantum Chromodynamics (QCD) calculations. Ongoing and future experimental studies in a wide range of heavy ion beam energies require the development of new Monte Carlo (MC) event generators and improvement of existing ones. Especially for experiments at the CERN Large Hadron Collider (LHC), implying very high parton and hadron multiplicities, one needs fast (but realistic) MC tools for heavy ion event simulations [5-7]. The main advantage of MC technique for the simulation of high-multiplicity hadroproduction is that it allows a visual comparison of theory and data, including if necessary the detailed detector acceptances, responses and resolutions.
The realistic MC event generator has to include maximum possible number of observable physical effects, which are important to determine the event topology: from the bulk properties of soft hadroproduction (domain of low transverse momenta p_T ≲ 1 GeV/c) such as collective flows, to hard multi-parton production in hot and dense QCD-matter, which reveals itself in the spectra of high-p_T particles and hadronic jets. Moreover, the role of hard and semi-hard particle production at LHC can be significant even for the bulk properties of created matter, and hard probes of QGP became clearly observable in various new channels [8-11]. In the majority of the available MC heavy ion event generators, the simultaneous treatment of collective flow effects for soft hadroproduction and hard multi-parton in-medium production (medium-induced partonic rescattering and energy loss, so-called "jet quenching") is lacking. Thus, in order to analyze existing data on low and high-p_T hadron production, test the sensitivity of physical observables at the upcoming LHC experiments (and other future heavy ion facilities) to the QGP formation, and study the experimental capabilities of constructed detectors, the development of adequate and fast MC models for simultaneous collective flow and jet quenching simulations is necessary. HYDJET++ event generator includes detailed treatment of soft hadroproduction as well as hard multi-parton production, and takes into account known medium effects. Solution method: A heavy ion event in HYDJET++ is a superposition of the soft, hydro-type state and the hard state resulting from multi-parton fragmentation. Both states are treated independently. HYDJET++ is the development and continuation of HYDJET MC model [12]. The main program is written in the object-oriented C++ language under the ROOT environment [1]. The hard part of HYDJET++ is identical to the hard part of Fortran-written HYDJET [13] (version 1.5) and is included in the generator structure as a separate directory. The routine for generation of single hard NN collision, generator PYQUEN [12,14], modifies the "standard" jet event obtained with the generator PYTHIA 6.4 [15]. The event-by-event simulation procedure in PYQUEN includes generation of initial parton spectra with PYTHIA and production vertexes at given impact parameter; rescattering-by-rescattering simulation of the parton path in a dense zone and its radiative and collisional energy loss; final hadronization according to the Lund string model for hard partons and in-medium emitted gluons. Then the PYQUEN multi-jets generated according to the binomial distribution are included in the hard part of the event. The mean number of jets produced in an AA event is the product of the number of binary NN subcollisions at a given impact parameter and the integral cross section of the hard process in NN collisions with the minimum transverse momentum transfer p_T^min. In order to take into account the effect of nuclear shadowing on parton distribution functions, the impact parameter dependent parameterization obtained in the framework of Glauber-Gribov theory [16] is used. The soft part of HYDJET++ event is the "thermal" hadronic state generated on the chemical and thermal freeze-out hypersurfaces obtained from the parameterization of relativistic hydrodynamics with preset freeze-out conditions (the adapted C++ code FAST MC [17,18]).
Hadron multiplicities are calculated using the effective thermal volume approximation and Poisson multiplicity distribution around its mean value, which is supposed to be proportional to the number of participating nucleons at a given impact parameter of AA collision. The fast soft hadron simulation procedure includes generation of the 4-momentum of a hadron in the rest frame of a liquid element in accordance with the equilibrium distribution function; generation of the spatial position of a liquid element and its local 4-velocity in accordance with phase space and the character of motion of the fluid; the standard von Neumann rejection/acceptance procedure to account for the difference between the true and generated probabilities; boost of the hadron 4-momentum in the center mass frame of the event; the two- and three-body decays of resonances with branching ratios taken from the SHARE particle decay table [19]. The high generation speed in HYDJET++ is achieved due to almost 100% generation efficiency of the "soft" part because of the nearly uniform residual invariant weights which appear in the freeze-out momentum and coordinate simulation. Although HYDJET++ is optimized for very high energies of RHIC and LHC colliders (c.m.s. energies of heavy ion beams √s = 200 and 5500 GeV per nucleon pair, respectively), in practice it can also be used for studying the particle production in a wider energy range down to √s ~ 10 GeV per nucleon pair at other heavy ion experimental facilities. As one moves from very high to moderately high energies, the contribution of the hard part of the event becomes smaller, while the soft part turns into just a multi-parameter fit to the data. Restrictions: HYDJET++ is only applicable for symmetric AA collisions of heavy (A ≳ 40) ions at high energies (c.m.s. energy √s ≳ 10 GeV per nucleon pair). The results obtained for very peripheral collisions (with the impact parameter of the order of two nucleus radii, b ~ 2R) and very forward rapidities may be not adequate. Additional comments: Accessibility: http://cern.ch/lokhtin/hydjet++. Running time: The generation of 100 central (0-5%) Au+Au events at √s = 200A GeV (Pb+Pb events at √s = 5500A GeV) with default input parameters takes about 7 (85) minutes on a PC 64 bit Intel Core Duo CPU @ 3 GHz with 8 GB of RAM memory under Red Hat Enterprise. References: [1] I.P. Lokhtin, A.M. Snigirev, Eur. Phys. J. C 46 (2006) 211. [2] N.S. Amelin, R. Lednicky, T.A. Pocheptsov, I.P. Lokhtin, L.V. Malinina, A.M. Snigirev, Iu.A. Karpenko, Yu.M. Sinyukov, Phys. Rev. C 74 (2006) 064901. [3] N.S. Amelin, I. Arsene, L. Bravina, Iu.A. Karpenko, R. Lednicky, I.P. Lokhtin, L.V. Malinina, A.M. Snigirev, Yu.M. Sinyukov, Phys. Rev. C 77 (2008) 014903.
Keefer, David K.; Moseley, Michael E.; DeFrance, Susan D.
2003-01-01
Previous work throughout the Ilo region of south coastal Peru has documented the existence of flood and debris-flow deposits produced by two El Niño events evidently much more severe than any in recent history. These two events have been dated to ca. AD 1300–1400 and AD 1607–08. The Late Pleistocene to Holocene record of older sedimentary deposits in this region is dominated by flood and debris-flow deposits of similar scale. These older deposits have been described and dated from three coastal, alluvial-fan sites. These deposits, which are as old as 38 200 years, are dominated by massive debris-flow deposits, several tens of cm thick, typically composed of cobble- and boulder-sized clasts in a matrix of silty sand, with characteristics indicating generation by heavy rainfall in an arid environment. Twenty-two radiocarbon dates and a single infrared-stimulated luminescence date show that particularly severe El Niño events occurred throughout the Late Pleistocene and two of three divisions of the Holocene with significantly different frequencies. The period of greatest activity was during the Early Holocene when at least six such events took place during a period of ca. 3600 years, beginning near the end of the Younger Dryas ca. 12 000 years ago. One of these events produced a debris flow that may have caused abandonment of the Paleo-Indian site at Quebrada Tacahuay, one of the oldest on the Andean coast. No severe events took place during the Middle Holocene between ca. 8400 and 5300 years ago, when a wide variety of other paleoclimate proxy records indicate that the El Niño–Southern Oscillation regime was particularly weak. Since ca. 5300 years ago, four of these severe events have taken place. The Late Pleistocene sequence is constrained by only two dates, which indicate that at least ten severe events took place between ca. 38 200 and 12 900 years ago. Mechanisms probably responsible for generating these large-scale deposits include: (1) ‘Mega-Niños’ that produced anomalously heavy rainfall along most or all of the central Andean coast; (2) El Niños that occurred shortly after great earthquakes that produced large amounts of sediment; or (3) El Niños that produced anomalously heavy local rainfall. The existence of these large-scale deposits in the Ilo region implies a level of hazard much higher than indicated by the historical record alone.
Challenges and Demands on Automated Software Revision
NASA Technical Reports Server (NTRS)
Bonakdarpour, Borzoo; Kulkarni, Sandeep S.
2008-01-01
In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of existence of numerous unverified and uncertified legacy software in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques where an algorithm generates a program, that is correct-by-construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in the so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.
Field Observations of Meteotsunami in Kami-koshiki Island, Japan
NASA Astrophysics Data System (ADS)
Asano, T.; Yamashiro, T.; Nishimura, N.
2012-12-01
BACKGROUND: Meteotsunami, atmospherically induced destructive ocean waves in the tsunami frequency band, are known in Japan by the local term "abiki", literally meaning "net-dragging waves" in Japanese. Large abiki occur in bays and harbors along the west coast of Kyushu almost every year during winter and early spring. On 24-25 February, 2009, Urauchi Bay, located on the west coast of Kami-Koshiki Island off the southwest coast of Kyushu, was subjected to a destructive meteotsunami. In this event, a maximum sea surface height of 3.1 m was observed at the inner part of the bay. At least 18 boats capsized and eight houses were flooded. This event surpassed the previous record height for an abiki in Japan: 278 cm in Nagasaki Bay, also on the west coast of Kyushu, in 1979. Generally, an elongated inlet with a narrow mouth such as Urauchi Bay provides calm water conditions even when offshore weather is stormy, so the area is regarded as a suitable place for the farming of large fish with a high market value. Possible damage to the extensive fish cage system as a result of meteotsunami events is of concern, especially because aquaculture is the main industry in the isolated islands. Forecasting of meteotsunami is a pressing request from the local people.
AIMS: The objectives of the present study are to detect a meteotsunami event in Urauchi Bay and to clarify the meteorological and hydrodynamic conditions related to its occurrence. This work attempts to observe the whole process of a meteotsunami event: generation offshore, resonance while it propagates, and finally amplification in the bay. Observations were conducted over a period of 82 days, 12 January to 4 April, 2010, aiming to record large secondary oscillations. A comprehensive measuring system for sea level, current, and barometric pressure fluctuations was deployed covering not only inside and near Urauchi Bay but also further offshore in the vicinity of Mejima in the East China Sea.
MAIN RESULTS: 1) Large meteotsunami events with total height in excess of 150 cm were observed five times during the 82-day observation period. On 1 February, 2010, one such event coincided with the high water of a spring tide, which resulted in flooding. The present observations have revealed that meteotsunami events occur more frequently than previously estimated from existing records of flooding. Even if a meteotsunami event does not result in flooding (e.g., if it coincides with a low tide), attention should be paid to the seiche-induced strong currents that may damage fishing boats or aquaculture installations. 2) Three dominant modes were found to exist in the sea level fluctuation data in Urauchi Bay using spectral analysis, wavelet analysis, and phase analysis of the extracted period band components. The node and anti-node structure of each mode governs the more energetic areas for sea level and current velocity fluctuations. 3) Analyses of barometric pressure data show that abrupt pressure changes of 1-2 hPa are generated in the open sea area at Mejima when major meteotsunami events occur. The pressure waves propagated eastward or northeastward to reach Kami-Koshiki within 1-2 hours. The propagation speed was found to nearly coincide with that of ocean long waves over the East China Sea. This air-sea resonant coupling is considered to be a source mechanism of meteotsunami generation.
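The air-sea resonance noted in the third result can be checked with the shallow-water relation c = sqrt(g * h): a travelling pressure disturbance pumps energy into the sea surface most efficiently when its speed matches the long-wave speed (Proudman resonance). The depths below are illustrative shelf values, not the study's bathymetry.

```python
import math

g = 9.81  # gravitational acceleration (m/s^2)
for depth_m in (50, 100, 200):
    c = math.sqrt(g * depth_m)          # shallow-water long-wave speed
    print(f"depth {depth_m:3d} m -> long-wave speed {c:5.1f} m/s "
          f"({c * 3.6:4.0f} km/h)")
```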
Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.
Das, Rahul Deb; Winter, Stephan
2016-11-23
Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries and for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Segmentation approaches are also not suited to real-time interpretation of open-ended segments and cannot cope with the frequent gaps in location traces. In order to address all these challenges, a novel state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that it can work in real-time, near-real-time, and offline modes and in different environmental conditions with varying quality of sensor traces. The results show the proposed bottom-up model outperforms existing event-based segmentation models in terms of adaptivity, flexibility, accuracy, and richness of information delivery pertinent to automated travel behavior interpretation.
Xu, Zhezhuang; Liu, Guanglun; Yan, Haotian; Cheng, Bin; Lin, Feilong
2017-01-01
In wireless sensor and actor networks, when an event is detected, the sensor node needs to transmit an event report to inform the actor. Since the actor moves through the network to execute missions, its location is always unavailable to the sensor nodes. A popular solution is a search strategy that can forward data to a node without knowing its location. However, most existing works do not consider the mobility of the node and thus incur significant energy consumption or transmission delay. In this paper, we propose the trail-based search (TS) strategy, which takes advantage of the actor's mobility to improve search efficiency. The main idea of TS is that, as the actor moves through the network, it leaves a trail composed of continuous footprints. The search packet carrying the event report is transmitted through the network to search for the actor or its footprints. Once an effective footprint is discovered, the packet is forwarded along the trail until it is received by the actor. Moreover, we derive the condition that guarantees trail connectivity, and propose a redundancy reduction scheme based on TS (TS-R) to reduce the nontrivial transmission redundancy generated by the trail. Theoretical and numerical analyses are provided to demonstrate the efficiency of TS. Compared with the well-known expanding ring search (ERS), TS significantly reduces energy consumption and search delay. PMID:29077017
A model of seismic coda arrivals to suppress spurious events.
NASA Astrophysics Data System (ADS)
Arora, N.; Russell, S.
2012-04-01
We describe a model of coda arrivals which has been added to NET-VISA (Network processing Vertically Integrated Seismic Analysis), our probabilistic generative model of seismic events, their transmission, and their detection on a global seismic network. The scattered energy that follows a seismic phase arrival tends to deceive typical STA/LTA-based arrival-picking software into believing that a real seismic phase has been detected. These coda arrivals, which tend to follow all seismic phases, cause most network processing software, including NET-VISA, to believe that multiple events have taken place. It is not a simple matter of ignoring closely spaced arrivals, since arrivals from multiple events can indeed overlap. The current practice in NET-VISA of pruning events within a small space-time neighborhood of a larger event works reasonably well, but it may mask real events produced in an aftershock sequence. Our new model allows any seismic arrival, even a coda arrival, to trigger a subsequent coda arrival. The probability of such a triggered arrival depends on the amplitude of the triggering arrival. Real seismic phases are more likely to generate such coda arrivals, and the coda arrivals they generate tend to have more strongly correlated parameters, for example azimuth and slowness. However, the SNR (signal-to-noise ratio) of a coda arrival immediately following a phase arrival tends to be lower because of the nature of the SNR calculation. We have calibrated our model on historical statistics of such triggered arrivals, and our inference accounts for them while searching for the best explanation of seismic events and the association of arrivals and coda arrivals to them. We have tested our new model on one week of global seismic data spanning March 22, 2009 to March 29, 2009. Our model was trained on two and a half months of data from April 5, 2009 to June 20, 2009. We use the LEB bulletin produced by the IDC (International Data Center) as the ground truth and compute the precision (percentage of reported events which are true) and recall (percentage of true events which are reported). The existing model has a precision of 32.2 and recall of 88.6, which changes to a precision of 50.7 and recall of 88.5 after pruning. The new model has a precision of 56.8 and recall of 86.9 without any pruning, and the corresponding precision-recall curve is dramatically improved. In contrast, the current automated bulletin at the IDC, SEL3, has a precision of 46.2 and recall of 69.7.
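A toy version of the triggering idea follows, assuming a logistic dependence of coda probability on the log-amplitude of the triggering arrival, with each coda weaker than its parent. The functional form and coefficients are illustrative, not the calibrated NET-VISA model.

```python
import math
import random

def coda_probability(log_amplitude, a=-1.0, b=0.8):
    """Probability that an arrival of this amplitude spawns a coda arrival."""
    return 1.0 / (1.0 + math.exp(-(a + b * log_amplitude)))

random.seed(0)
pending = [3.5]                        # log-amplitude of an initial phase
cascade = []
while pending:
    amp = pending.pop()
    if random.random() < coda_probability(amp):
        coda = amp - random.uniform(0.5, 1.5)   # codas are weaker (lower SNR)
        cascade.append(coda)
        pending.append(coda)           # codas may trigger further codas
print(f"{len(cascade)} coda arrivals generated")
```

Because each coda is weaker than its trigger, the cascade dies out quickly, which mirrors the short trains of spurious picks that follow real phases.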
Panigrahi, Priyabrata; Jere, Abhay; Anamika, Krishanpal
2018-01-01
Gene fusion is a chromosomal rearrangement event which plays a significant role in cancer due to the oncogenic potential of the chimeric proteins generated through fusions. At present, many databases are available in the public domain that provide detailed information about known gene fusion events and their functional roles. Existing gene fusion detection tools, based on analysis of transcriptomics data, usually report a large number of fusion genes as potential candidates, which could be known, novel, or false positives. Manual annotation of these putative genes is time-consuming. We have developed a web platform, FusionHub, which acts as an integrated search engine interfacing various fusion gene databases and simplifies large-scale annotation of fusion genes in a seamless way. In addition, FusionHub provides three ways of visualizing fusion events: a circular view, a domain architecture view, and a network view. Design of potential siRNA molecules through an ensemble method is another utility integrated into FusionHub that could aid in siRNA-based targeted therapy. FusionHub is freely available at https://fusionhub.persistent.co.in.
Chai, C T; Putuhena, F J; Selaman, O S
2017-12-01
The influence of climate on the retention capability of green roofs has been widely discussed in the existing literature. However, knowledge of how the retention capability of green roofs is affected by tropical climates is limited. This paper highlights the retention performance of a green roof situated in Kuching under hot-humid tropical climatic conditions. Using a green roof water balance modelling approach, this study simulated the hourly runoff generated from a virtual green roof from November 2012 to October 2013 based on past meteorological data. The results showed that the overall retention performance was satisfactory, with a mean retention rate of 72.5% across the 380 analysed rainfall events, but that the rate fell to only 12.0% for the events that could potentially trigger flash flooding. Spearman's rank correlation analysis showed that rainfall depth and mean rainfall intensity each had a strong negative correlation with the event retention rate, indicating that the retention rate increases as rainfall depth decreases. The expected direct relationship between retention rate and antecedent dry weather period was found to be event-size dependent.
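The event-level analysis can be sketched as follows, assuming per-event rainfall and runoff depths in millimetres from the water-balance model and SciPy for the rank correlation. The numbers are synthetic placeholders, not the Kuching series.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
rain = rng.gamma(shape=1.2, scale=15.0, size=380)         # event rainfall (mm)
storage = rng.uniform(5.0, 25.0, rain.size)               # available storage
runoff = np.maximum(rain - storage, 0.0)                  # modelled runoff
retention = 100.0 * (rain - runoff) / rain                # % retained per event

rho, p = spearmanr(rain, retention)
print(f"mean retention {retention.mean():.1f}%  "
      f"Spearman rho={rho:.2f} (p={p:.1e})")
```

The strongly negative rho reproduces the qualitative finding: the bigger the event, the smaller the fraction the roof can hold back.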
A Review of State Public Health Emergency Declarations in Peru: 2014-2016.
Bambarén, Celso; Alatrista, Maria Del Socorro
2018-04-01
Peru has different legal mechanisms of emergency, one of which is the Public Health Emergency, applicable when: there is a high risk of, or the existence of, an outbreak, epidemic, or pandemic; cases occur of a disease classified as eliminated or eradicated; emerging or re-emerging infectious diseases with high epidemic potential occur; rapidly disseminating epidemics simultaneously affect more than one department; or an event affects the continuity of health services. From July 2014 to December 2016, 23 Public Health Emergencies were declared, of which 57% concerned the high risk or existence of epidemics, 30% were due to natural or anthropic events that generated a sudden decrease in the operative capacity of health services, and 13% were due to the existence of a rapidly spreading epidemic that could affect more than one department in the country. The risk or occurrence of epidemiological outbreaks, mainly of Dengue, was the main cause of emergency declaration. One hundred and forty million US dollars were allocated to implement the action plans that were part of the declarations, of which 72% was used to maintain the operational capacity of health services and 28% for vector and epidemiological control measures. Bambarén C, Alatrista MdS. A review of state public health emergency declarations in Peru: 2014-2016. Prehosp Disaster Med. 2018;33(2):197-200.
Mechanism-based Pharmacovigilance over the Life Sciences Linked Open Data Cloud.
Kamdar, Maulik R; Musen, Mark A
2017-01-01
Adverse drug reactions (ADR) result in significant morbidity and mortality in patients, and a substantial proportion of these ADRs are caused by drug-drug interactions (DDIs). Pharmacovigilance methods are used to detect unanticipated DDIs and ADRs by mining Spontaneous Reporting Systems, such as the US FDA Adverse Event Reporting System (FAERS). However, these methods do not provide mechanistic explanations for the discovered drug-ADR associations in a systematic manner. In this paper, we present a systems pharmacology-based approach to perform mechanism-based pharmacovigilance. We integrate data and knowledge from four different sources using Semantic Web Technologies and Linked Data principles to generate a systems network. We present a network-based Apriori algorithm for association mining in FAERS reports. We evaluate our method against existing pharmacovigilance methods for three different validation sets. Our method has AUROC statistics of 0.7-0.8, similar to current methods, and event-specific thresholds generate AUROC statistics greater than 0.75 for certain ADRs. Finally, we discuss the benefits of using Semantic Web technologies to attain the objectives for mechanism-based pharmacovigilance.
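The counting step behind association mining on spontaneous reports can be sketched as follows; this is a plain support/confidence filter over drug-event pairs, a simplification of the network-based Apriori algorithm the paper describes, with toy data:

```python
from collections import Counter

# Each toy report lists the drugs taken and the adverse events observed.
reports = [
    ({"drugA", "drugB"}, {"rash"}),
    ({"drugA"}, {"rash", "nausea"}),
    ({"drugB"}, {"nausea"}),
    ({"drugA", "drugB"}, {"rash"}),
]

pair_counts, drug_counts = Counter(), Counter()
for drugs, events in reports:
    for d in drugs:
        drug_counts[d] += 1
        for e in events:
            pair_counts[(d, e)] += 1

min_support, min_confidence = 2, 0.5
for (d, e), n in sorted(pair_counts.items()):
    confidence = n / drug_counts[d]
    if n >= min_support and confidence >= min_confidence:
        print(f"{d} -> {e}: support={n}, confidence={confidence:.2f}")
```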
Using a simulation assistant in modeling manufacturing systems
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.
1988-01-01
Numerous simulation languages exist for modeling discrete event processes, and many have now been ported to microcomputers. Graphics and animation capabilities have been added to many of these languages to help users build models and evaluate simulation results. Even with these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimize this time requirement is to use predefined macros that describe common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the assistant, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colaneri, Luca
2017-04-01
With the experimental discovery of the Higgs boson, the Standard Model has been considered verified in all its predictions. The Standard Model, though, is still considered an incomplete theory, because it fails to address many theoretical and phenomenological issues. Among those, it does not provide any viable Dark Matter candidate. Many Beyond-Standard-Model theories, such as the Supersymmetric Standard Model, provide possible solutions. In this work we report the experimental observations that led to considering the existence of a new force, mediated by a new massive vector boson, that could address all the observed phenomenology. This new dark force could open an observational channel between the Standard Model and a new Dark Sector, conveyed by the interaction of the Standard Model photon with the massive dark photon, also called the A'. The purpose of this work was to develop an independent study of the background processes and to implement an independent event generator, to better understand the kinematics of the particles produced in the process e- + W → e- + W′ + e+ + e- and to validate, or invalidate, the official event generator.
Earthquake and submarine landslide tsunamis: how can we tell the difference? (Invited)
NASA Astrophysics Data System (ADS)
Tappin, D. R.; Grilli, S. T.; Harris, J.; Geller, R. J.; Masterlark, T.; Kirby, J. T.; Ma, G.; Shi, F.
2013-12-01
Several major recent events have shown the tsunami hazard from submarine mass failures (SMF), i.e., submarine landslides. In 1992 a small, earthquake-triggered landslide generated a tsunami over 25 meters high on Flores Island. In 1998 another small, earthquake-triggered sediment slump generated a tsunami up to 15 meters high that devastated the local coast of Papua New Guinea, killing 2,200 people. It was this event that led to the recognition of the importance of marine geophysical data in mapping the architecture of seabed sediment failures, which could then be used in modeling and validating the tsunami-generating mechanism. Seabed mapping of the 2004 Indian Ocean earthquake rupture zone demonstrated, however, that large, if not great, earthquakes do not necessarily cause major seabed failures, but that along some convergent margins frequent earthquakes result in smaller sediment failures that are not tsunamigenic. Older events, such as Messina, 1908, Makran, 1945, Alaska, 1946, and Java, 2006, all have the characteristics of SMF tsunamis, but for these an SMF source has not been proven. When the 2011 tsunami struck Japan, it was generally assumed to have been directly generated by the earthquake. The earthquake has some unusual characteristics, such as a shallow rupture that is somewhat slow, but it is not a 'tsunami earthquake.' A number of simulations of the tsunami based on an earthquake source have been published, but in general the best results are obtained by adjusting fault rupture models with tsunami wave gauge or other data; to the extent that they can model the recorded tsunami data, this demonstrates self-consistency rather than validation. Here we consider some of the existing source models of the 2011 Japan event and present new tsunami simulations based on a combination of an earthquake source and an SMF mapped from offshore data. We show that the multi-source tsunami agrees well with available tide gauge data, field observations, and wave data from offshore buoys, and that the SMF generated the large runups in the Sanriku region (northern Tohoku). Our new results for the 2011 Tohoku event suggest that care is required in using tsunami wave and tide gauge data to both model and validate earthquake tsunami sources. They also suggest a potential pitfall in the use of tsunami waveform inversion from tide gauges and buoys to estimate the size and spatial characteristics of earthquake rupture: if the tsunami source has a significant SMF component, such studies may overestimate earthquake magnitude. Our seabed mapping identifies other large SMFs off Sanriku that have the potential to generate significant tsunamis and which should be considered in future analyses of the tsunami hazard in Japan. The identification of two major SMF-generated tsunamis (PNG and Tohoku), especially one associated with an M9 earthquake, is important in guiding future efforts at forecasting and mitigating the tsunami hazard from large megathrust plus SMF events, both in Japan and globally.
Rüdelsheim, Patrick; Dumont, Philippe; Freyssinet, Georges; Pertry, Ine; Heijde, Marc
2018-01-01
More than 20 years ago, the first genetically modified (GM) plants entered the seed market. The patents covering the first GM plants have begun to expire, and these can now be considered Off-Patent Events. Here we describe the challenges that will be faced by a Secondary Party in the further use and development of these Off-Patent Events. Indeed, the conditions are not yet in place for Off-Patent Events to form the basis of a new viable industry similar to the generic manufacturers of agrochemicals or pharmaceutical products, primarily because of (i) unharmonized global regulatory requirements for GM organisms, (ii) inaccessibility of regulatory submissions and data, and (iii) potential difficulties in obtaining seeds and genetic material of the unique genotypes used to generate regulatory data. We propose certain adaptations by comparing what has been done in the agrochemical and pharmaceutical markets to facilitate the development of generics. Finally, we present opportunities that still exist for further development of Off-Patent Events in collaboration with Proprietary Regulatory Property Holders in emerging markets, provided (i) various countries approve these events without additional regulatory burdens (i.e., acceptance of the concept of data transportability), and (ii) local breeders agree to meet product stewardship requirements.
Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.
Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran
2017-04-01
Media-based event data, i.e., data compiled from reporting by media outlets, are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risks inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences is drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). We therefore propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and for each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories on both the causes of an event of interest and the reporting on it. Simulations show that our technique regularly outperforms current strategies that neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.
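One way to write down such a likelihood, assuming an event can be missed independently by each of two sources and that sources never report non-events (the names, logistic links, and no-false-positive assumption are mine, not the authors' exact specification):

```python
import numpy as np

def neg_log_lik(params, X, y1, y2):
    """y1, y2: 0/1 report indicators from two overlapping sources.
    Event probability uses predictors X; each source has its own
    reporting rate. All links are logistic (a simplification)."""
    b, a1, a2 = params[:X.shape[1]], params[-2], params[-1]
    p = 1.0 / (1.0 + np.exp(-X @ b))      # P(true event | X)
    r1 = 1.0 / (1.0 + np.exp(-a1))        # P(source 1 reports | event)
    r2 = 1.0 / (1.0 + np.exp(-a2))        # P(source 2 reports | event)
    # Mixture over event / no-event; only (0, 0) is possible without an event.
    lik = (p * r1**y1 * (1 - r1)**(1 - y1) * r2**y2 * (1 - r2)**(1 - y2)
           + (1 - p) * (y1 == 0) * (y2 == 0))
    return -np.sum(np.log(lik + 1e-12))

# Fit (sketch): scipy.optimize.minimize(neg_log_lik, x0, args=(X, y1, y2))
```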
Using open data in near real time disaster analysis and knowledge generation
NASA Astrophysics Data System (ADS)
She, Jun
2017-04-01
This presentation addresses the value of using open operational geo-data in near-real-time disaster analysis and knowledge generation. In the past, mechanism analysis of a meteo-hydrological extreme event could take months or years and substantial resources, since many restrictions applied to model and observation data, e.g., in availability, accessibility, adequacy of resolution, quality, and delivery time. In recent years, thanks to open data and open service programs such as Copernicus and EMODnet (European Marine Observation Data Network) and data-sharing activities in ROOSs (Regional Operational Oceanography Systems) and national agencies, disaster analysis has become a much faster and more efficient procedure. We present such a case study analyzing a hundred-year storm event in January 2017 which affected the Danish and German coasts in the western Baltic Sea. The event and its forecasts attracted considerable attention in the Danish and German media. However, explanations of how the storm surge formed and why the predictions were good or bad in each country were largely absent from the media reports. All the data and plots used in the analysis are from open sources. It is found that with open data, the spatiotemporal variation and the internal links between weather, sea level, and water mass movements can be well understood. New knowledge on key factors for the unusually high waters in the western Baltic is obtained from this analysis. Finally, recommendations for using open operational data in generating open science are given.
Modelling of historical tsunami in Eastern Indonesia: 1674 Ambon and 1992 Flores case studies
NASA Astrophysics Data System (ADS)
Pranantyo, Ignatius Ryan; Cummins, Phil; Griffin, Jonathan; Davies, Gareth; Latief, Hamzah
2017-07-01
In order to reliably assess tsunami hazard in eastern Indonesia, we need to understand how historical events were generated. Here we consider two such events: the 1674 Ambon and the 1992 Flores tsunamis. Firstly, Ambon Island suffered a devastating earthquake in 1674 that generated a tsunami with a 100 m run-up height on the north coast of the island. However, there is no known active fault around the island capable of generating such a gigantic wave. Rumphius' report describes the initial wave as coming from three villages that collapsed immediately after the earthquake, with a width as far as a musket shot. Moreover, a very high tsunami was only observed locally. We suspect that a submarine landslide was the main cause of the gigantic tsunami on the north side of Ambon Island. Unfortunately, no data are available to confirm whether a landslide occurred in this region. Secondly, several tsunami source models for the 1992 Flores event have been suggested. However, the fault strike is quite different compared to the existing Flores back-arc thrust and has not been well validated against the tide gauge waveform at Palopo, Sulawesi. We considered a tsunami model based on Griffin et al., 2015, extended with high-resolution bathymetry around Palopo, in order to validate the latest available tsunami source model. In general, the model produces good agreement with the tsunami waveforms, but arrives 10 minutes late compared to the observed data. In addition, the source overestimates the tsunami inundation west of Maumere and does not account for the presumed landslide tsunami on the east side of Flores Island.
Event reweighting with the NuWro neutrino interaction generator
NASA Astrophysics Data System (ADS)
Pickering, Luke; Stowell, Patrick; Sobczyk, Jan
2017-09-01
Event reweighting has been implemented in the NuWro neutrino event generator for a number of free theory parameters in the interaction model. Event reweighting is a key analysis technique, used to efficiently study the effect of neutrino interaction model uncertainties. This opens up the possibility for NuWro to be used as a primary event generator by experimental analysis groups. A preliminary model tuning to ANL and BNL data of quasi-elastic and single pion production events was performed to validate the reweighting engine.
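The core of event reweighting is a per-event ratio of model densities evaluated at new versus nominal parameter values; the callables below are placeholders, not NuWro's actual interface:

```python
def reweight(events, density_nominal, density_new):
    """Weight each generated event by density_new(ev) / density_nominal(ev).
    The density callables are model-specific placeholders."""
    weights = []
    for ev in events:
        w_old = density_nominal(ev)
        weights.append(density_new(ev) / w_old if w_old > 0 else 0.0)
    return weights
```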
U.S. Spacesuit Knowledge Capture Status and Initiatives in Fiscal Year 2014
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2015-01-01
Since its 2008 inception, the NASA U.S. Spacesuit Knowledge Capture (KC) program has shared historical spacesuit information with engineers and other technical team members to expand their understanding of the spacesuit's evolution, known capability and limitations, and future desires and needs for its use. As part of the U.S. Spacesuit KC program, subject-matter experts have delivered presentations, held workshops, and participated in interviews to share valuable spacesuit lessons learned to ensure this vital information will survive for existing and future generations to use. These events have included spacesuit knowledge from the inception of NASA's first spacesuit to current spacesuit design. To ensure that this information is shared with the entire NASA community and other interested or invested entities, these KC events were digitally recorded and transcribed to be uploaded onto several applicable NASA Web sites. This paper discusses the various Web sites that the KC events are uploaded to and possible future sites that will channel this information.
Statistical analysis of Hasegawa-Wakatani turbulence
NASA Astrophysics Data System (ADS)
Anderson, Johan; Hnat, Bogdan
2017-06-01
Resistive drift wave turbulence is a multipurpose paradigm that can be used to understand transport at the edge of fusion devices. The Hasegawa-Wakatani model captures the essential physics of drift turbulence while retaining the simplicity needed to gain a qualitative understanding of this process. We provide a theoretical interpretation of numerically generated probability density functions (PDFs) of intermittent events in Hasegawa-Wakatani turbulence with enforced equipartition of energy in large scale zonal flows, and small scale drift turbulence. We find that for a wide range of adiabatic index values, the stochastic component representing the small scale turbulent eddies of the flow, obtained from the autoregressive integrated moving average model, exhibits super-diffusive statistics, consistent with intermittent transport. The PDFs of large events (above one standard deviation) are well approximated by the Laplace distribution, while small events often exhibit a Gaussian character. Furthermore, there exists a strong influence of zonal flows, for example, via shearing and then viscous dissipation maintaining a sub-diffusive character of the fluxes.
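The distribution fits described above can be reproduced in outline with scipy; the flux series here is synthetic, standing in for the simulation output, and thresholding at one standard deviation mirrors the abstract's definition of large events:

```python
import numpy as np
from scipy.stats import laplace

rng = np.random.default_rng(0)
flux = rng.laplace(loc=0.0, scale=1.0, size=10_000)  # synthetic stand-in

large = flux[np.abs(flux) > flux.std()]  # events above one standard deviation
loc, scale = laplace.fit(large)
print(f"Laplace fit to large events: loc={loc:.3f}, scale={scale:.3f}")
```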
NASA Astrophysics Data System (ADS)
Sellars, S. L.; Kawzenuk, B.; Nguyen, P.; Ralph, F. M.; Sorooshian, S.
2017-12-01
The CONNected objECT (CONNECT) algorithm is applied to global Integrated Water Vapor Transport data from NASA's Modern-Era Retrospective Analysis for Research and Applications, Version 2 reanalysis product for the period 1980 to 2016. The algorithm generates life-cycle records of strong vapor transport events evolving in time and space. We show five regions, located in the midlatitudes, where events typically exist (off the coast of the southeast United States, eastern China, eastern South America, off the southern tip of South Africa, and in the southeastern Pacific Ocean). Global statistics show distinct genesis and termination regions and a global seasonal peak frequency during Northern Hemisphere late fall/winter and Southern Hemisphere winter. In addition, event frequency and geographical location are shown to be modulated by the Arctic Oscillation, the Pacific-North American pattern, and the quasi-biennial oscillation. Moreover, a positive linear trend in the annual number of objects is reported, increasing by 3.58 objects year-over-year.
Evaluation of extreme temperature events in northern Spain based on process control charts
NASA Astrophysics Data System (ADS)
Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.
2018-02-01
Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 was evaluated by applying statistical process control charts to test whether evidence existed for an increase or a decrease in extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends in extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
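A classic attribute (p-)chart for the annual fraction of extreme days looks like the sketch below; unlike the binomial Markov extension in the study, it ignores autocorrelation between extreme days:

```python
import numpy as np

def p_chart_limits(extreme_days_per_year, days_per_year=365):
    """Centre line and 3-sigma limits for the annual fraction of
    extreme days; flags out-of-control years."""
    p = np.asarray(extreme_days_per_year) / days_per_year
    p_bar = p.mean()
    sigma = np.sqrt(p_bar * (1 - p_bar) / days_per_year)
    ucl = p_bar + 3 * sigma
    lcl = max(0.0, p_bar - 3 * sigma)
    out_of_control = np.where((p > ucl) | (p < lcl))[0]
    return p_bar, lcl, ucl, out_of_control

# Hypothetical counts of extreme days over ten years:
print(p_chart_limits([3, 5, 2, 4, 6, 3, 9, 4, 11, 12]))
```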
Towards marine seismological Network: real time small aperture seismic array
NASA Astrophysics Data System (ADS)
Ilinskiy, Dmitry
2017-04-01
The most powerful and dangerous seismic events are generated in underwater subduction zones. Existing seismological networks are based on land stations. Increased demands on the accuracy of location, magnitude, and rupture process of earthquakes, together with the need to reduce data processing time, require information from seabed seismic stations located near the earthquake generation area. Marine stations provide an important contribution to clarifying the tectonic settings in the most active subduction zones of the world. An early warning system for a subduction zone is based on a marine seabed array located near the most hazardous seismic zone in the region. Fast-track processing to locate the earthquake hypocenter and estimate its energy takes place in the buoy surface unit. Information about a detected and located earthquake reaches the onshore seismological center earlier than the first waves from the same earthquake reach the nearest onshore seismological station. Implementation of the small-aperture array builds on existing, proven, and cost-effective solutions such as moored weather buoys and self-pop-up autonomous seabed seismic nodes. A permanent seabed system for real-time operation has to be installed in deep waters far from the coast. The seabed array consists of several self-pop-up seismological stations which continuously acquire data, detect events of a certain energy class, and send detected event parameters to the surface buoy via an acoustic link. The surface buoy unit determines the earthquake location from the event parameters received from the seabed units and sends this information in semi-real time to the onshore seismological center via a narrowband satellite link. Upon request from the coast, the system can send waveforms of events of a certain energy class, seabed station battery status, and other environmental parameters. When the battery of a particular seabed unit is nearly empty, the unit switches into sleep mode and sends that information to the surface buoy and further to the onshore data center. The seabed unit can then wait for a vessel of opportunity to recover it to the sea surface and replace it with another unit with fresh batteries. All seismic data collected by the seabed unit can then be downloaded for further processing and analysis. In our presentation we will demonstrate several working prototypes of the proposed system, such as a real-time cabled broadband seismological station and a real-time buoy seabed seismological station.
Continental-Scale Estimates of Runoff Using Future Climate ...
Recent runoff events have had serious repercussions for both natural ecosystems and human infrastructure. Understanding how shifts in storm event intensities are expected to change runoff responses is valuable for local, regional, and landscape planning. To address this challenge, relative changes in runoff under predicted future climate conditions were estimated over different biophysical areas for the CONterminous U.S. (CONUS). Runoff was estimated using the Curve Number (CN) method developed by the USDA Soil Conservation Service (USDA, 1986). A seamless gridded dataset representing CN for existing land use/land cover (LULC) across the CONUS was used along with two different storm event grids created specifically for this effort. The two storm event grids represent a 2-year and a 100-year, 24-hour storm event under current climate conditions. The storm event grids were generated using a compilation of county-scale Texas USGS Intensity-Duration-Frequency (IDF) data (provided by William Asquith, USGS, Lubbock, Texas), and NOAA Atlas-2 and NOAA Atlas-14 gridded data sets. Future CN runoff was predicted using extreme storm event grids created using a method based on Kao and Ganguly (2011), where precipitation extremes reflect changes in the saturated water vapor pressure of the atmosphere in response to temperature changes. The Clausius-Clapeyron relationship establishes that the total water vapor mass of fully saturated air increases with increasing temperature, leading to
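The underlying runoff calculation is the textbook SCS Curve Number relation, sketched here (the gridded, continental-scale implementation in the study is not reproduced):

```python
def scs_runoff(p_inches, cn):
    """SCS Curve Number runoff (USDA, 1986):
    S = 1000/CN - 10, Ia = 0.2*S, Q = (P - Ia)^2 / (P - Ia + S)."""
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s
    if p_inches <= ia:
        return 0.0  # all rainfall absorbed before runoff begins
    return (p_inches - ia) ** 2 / (p_inches - ia + s)

# Example: a hypothetical 7-inch, 24-hour storm on land cover with CN = 80
print(f"{scs_runoff(7.0, 80):.2f} inches of runoff")
```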
Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig
2014-08-01
The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to understanding mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.
Swarm observation of field-aligned current and electric field in multiple arc systems
NASA Astrophysics Data System (ADS)
Wu, J.; Knudsen, D. J.; Gillies, M.; Donovan, E.; Burchill, J. K.
2017-12-01
It is often thought that auroral arcs are a direct consequence of upward field-aligned currents (FACs). In fact, the relation between currents and brightness is more complicated. Multiple auroral arc systems provide an opportunity to study this relation in detail. In this study, we have identified two types of FAC configurations in multiple parallel arc systems using ground-based optical data from the THEMIS all-sky imagers (ASIs) and the magnetometers and electric field instruments onboard the Swarm satellites during the period from December 2013 to March 2015. In type 1 events, each arc is an intensification within a broad, unipolar current sheet, and downward currents exist only outside the upward current sheet; these are termed "unipolar FAC" events. In type 2 events, multiple arc systems represent a collection of multiple up/down current pairs, termed "multipolar FAC" events. Comparisons of these two types are presented with 17 "unipolar FAC" events and 12 "multipolar FAC" events. The results show that "unipolar FAC" and "multipolar FAC" events have systematic differences in terms of MLT, arc width and separation, and dependence on substorm onset time. For "unipolar FAC" events, significant electric field enhancements appear on the edges of the broad upward current sheet, while electric field fluctuations inside the multiple arc system can be large or small. For "multipolar FAC" events, a strong correlation between the magnetic and electric fields indicates uniform conductance within each upward current sheet. The electrodynamic structures of multiple arc systems presented in this paper represent a step toward understanding arc generation.
Distributed plug-and-play optimal generator and load control for power system frequency regulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Changhong; Mallada, Enrique; Low, Steven H.
A distributed control scheme, which can be implemented on generators and controllable loads in a plug-and-play manner, is proposed for power system frequency regulation. The proposed scheme is based on local measurements, local computation, and neighborhood information exchanges over a communication network with an arbitrary (but connected) topology. In the event of a sudden change in generation or load, the proposed scheme can restore the nominal frequency and the reference inter-area power flows, while minimizing the total cost of control for participating generators and loads. Power network stability under the proposed control is proved with a relatively realistic model which includes nonlinear power flow and a generic (potentially nonlinear or high-order) turbine-governor model, and further with first- and second-order turbine-governor models as special cases. Finally, in simulations, the proposed control scheme shows a comparable performance to the existing automatic generation control (AGC) when implemented only on the generator side, and demonstrates better dynamic characteristics than AGC when each scheme is implemented on both generators and controllable loads. Simulation results also show robustness of the proposed scheme to communication link failure.
Parameterizing the Variability and Uncertainty of Wind and Solar in CEMs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frew, Bethany
We present current and improved methods for estimating the capacity value and curtailment impacts of variable generation (VG) in capacity expansion models (CEMs). The ideal calculation of these variability metrics is through an explicit co-optimized investment-dispatch model using multiple years of VG and load data. Because of data and computational limitations, existing CEMs typically approximate these metrics using a subset of all hours from a single year and/or using statistical methods, which often do not capture tail-event impacts or the broader set of interactions between VG, storage, and conventional generators. In our proposed new methods, we use hourly generation and load values across all hours of the year to characterize (1) the contribution of VG to system capacity during high-load hours, (2) the curtailment level of VG, and (3) the reduction in VG curtailment due to storage and the shutdown of select thermal generators. Using CEM outputs from a preceding model solve period, we apply these methods to exogenously calculate capacity value and curtailment metrics for the subsequent model solve period. Preliminary results suggest that these hourly methods offer improved representations of VG capacity value and curtailment in the CEM over existing approximation methods, without additional computational burden.
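A minimal hourly approximation of the two metrics might look like this; the top-hours capacity credit and the simple load-net-of-must-run curtailment rule are my simplifications, not the authors' exact method:

```python
import numpy as np

def vg_metrics(load, vg, top_hours=100, must_run=0.0):
    """Capacity value: mean VG output during the highest-load hours.
    Curtailment: VG energy exceeding load minus must-run generation."""
    load, vg = np.asarray(load, float), np.asarray(vg, float)
    peak_idx = np.argsort(load)[-top_hours:]      # highest-load hours
    capacity_value = vg[peak_idx].mean()
    curtailed = np.clip(vg - (load - must_run), 0.0, None)
    return capacity_value, curtailed.sum() / vg.sum()
```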
Jing, Helen G; Madore, Kevin P; Schacter, Daniel L
2017-12-01
A critical adaptive feature of future thinking involves the ability to generate alternative versions of possible future events. However, little is known about the nature of the processes that support this ability. Here we examined whether an episodic specificity induction (brief training in recollecting details of a recent experience that selectively impacts tasks drawing on episodic retrieval) (1) boosts alternative event generation and (2) changes one's initial perceptions of negative future events. In Experiment 1, an episodic specificity induction significantly increased the number of alternative positive outcomes that participants generated for a series of standardized negative events, compared with a control induction not focused on episodic specificity. We also observed larger decreases in the perceived plausibility and negativity of the original events in the specificity condition, where participants generated more alternative outcomes, relative to the control condition. In Experiment 2, we replicated and extended these findings using a series of personalized negative events. Our findings support the idea that episodic memory processes are involved in generating alternative outcomes to anticipated future events, and that boosting the number of alternative outcomes is related to subsequent changes in the perceived plausibility and valence of the original events, which may have implications for psychological well-being. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Trubilowicz, J. W.; Moore, D.
2015-12-01
Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.
NASA Astrophysics Data System (ADS)
Pecoits, Ernesto; Aubet, Natalie R.; Heaman, Larry M.; Philippot, Pascal; Rosière, Carlos A.; Veroslavsky, Gerardo; Konhauser, Kurt O.
2016-11-01
The Neoproterozoic volcano-sedimentary successions of Uruguay have been the subject of several sedimentologic, chrono-stratigraphic and tectonic interpretation studies. Recent studies have shown, however, that the stratigraphy, age and tectonic evolution of these units remain uncertain. Here we use new U–Pb detrital zircon ages, combined with previously published geochronologic and stratigraphic data, in order to provide more precise temporal constraints on their depositional age and to establish a more solid framework for the stratigraphic and tectonic evolution of these units. The sequence of events begins with a period of tectonic quiescence and deposition of extensive mixed siliciclastic-carbonate sedimentary successions. This is followed by the development of small fault-bounded siliciclastic and volcaniclastic basins and the emplacement of voluminous granites associated with episodic terrane accretion. According to our model, the Arroyo del Soldado Group and the Piedras de Afilar Formation were deposited sometime between ∼1000 and 650 Ma, and represent passive continental margin deposits of the Nico Pérez and Piedra Alta terranes, respectively. In contrast, the Ediacaran San Carlos (<552 ± 3 Ma) and Barriga Negra (<581 ± 6 Ma) formations, and the Maldonado Group (<580-566 Ma) were deposited in tectonically active basins developed on the Nico Pérez and Cuchilla Dionisio terranes, and the herein defined Edén Terrane. The Edén and the Nico Pérez terranes likely accreted at ∼650-620 Ma (Edén Accretionary Event), followed by their accretion to the Piedra Alta Terrane at ∼620-600 Ma (Piedra Alta Accretionary Event), and culminating with the accretion of the Cuchilla Dionisio Terrane at ∼600-560 Ma (Cuchilla Dionisio Accretionary Event). Although existing models consider all the Ediacaran granites as a result of a single orogenic event, recently published age constraints point to the existence of at least two distinct stages of granite generation, which are spatially and temporally associated with the Edén and Cuchilla Dionisio accretionary events.
NASA Technical Reports Server (NTRS)
Sibeck, D. G.; Lin, R.-Q.
2011-01-01
We employ the Cooling et al. (2001) model to predict the location, orientation, motion, and signatures of flux transfer events (FTEs) generated at the solstices and equinoxes along extended subsolar component and high-latitude antiparallel reconnection curves for typical solar wind plasma conditions and various interplanetary magnetic field (IMF) strengths and directions. In general, events generated by the two mechanisms maintain the strikingly different orientations they begin with as they move toward the terminator in opposite pairs of magnetopause quadrants. The curves along which events generated by component reconnection form bow toward the winter cusp. Events generated by antiparallel reconnection form on the equatorial magnetopause during intervals of strongly southward IMF orientation during the equinoxes, form in the winter hemisphere and only reach the dayside equatorial magnetopause during the solstices when the IMF strength is very large and the IMF points strongly southward, never reach the equatorial dayside magnetopause when the IMF has a substantial dawnward or duskward component, and never reach the equatorial flank magnetopause during intervals of northward and dawnward or duskward IMF orientation. Magnetosheath magnetic fields typically have strong components transverse to events generated by component reconnection but only weak components transverse to the axes of events generated by antiparallel reconnection. As a result, much stronger bipolar magnetic field signatures normal to the nominal magnetopause should accompany events generated by component reconnection. The results presented in this paper suggest that events generated by component reconnection predominate on the dayside equatorial and flank magnetopause for most solar wind conditions.
Use of Synchronized Phasor Measurements for Model Validation in ERCOT
NASA Astrophysics Data System (ADS)
Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill
2013-05-01
This paper discusses experiences in the use of synchronized phasor measurement technology in Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMU) have been installed in ERCOT grid in recent years, phasor data with the resolution of 30 samples per second is being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. Real time monitoring software "RTDMS"® enables ERCOT to analyze small signal stability conditions by monitoring the phase angles and oscillations. The recorded phasor data enables ERCOT to validate the existing dynamic models of conventional and/or wind generator.
New data products available at the IRIS DMC
NASA Astrophysics Data System (ADS)
Trabant, C. M.; Bahavar, M.; Hutko, A.; Karstens, R.
2010-12-01
The research supported by raw data from the observatories of NSF's EarthScope project is having a tremendous impact on our understanding of the structure and geologic history of North America, how and why earthquakes occur, and many other areas of modern geophysics. The IRIS Data Management Center (DMC) is the primary access point for EarthScope/USArray data and has embarked on a new effort to produce higher-level data products beyond raw time series in order to assist the community in extracting the highest value possible from these data. These new products will serve many purposes: stepping-stones for future research projects, data visualizations, research result comparisons, and compilation of unique data sets as well as outreach material. To ensure community involvement in the development of new products, the requirements and priorities are reviewed and approved by the IRIS Data Products Working Group (DPWG). Many new products are now available at the IRIS DMC. These include two event-based products generated in near real time. 1) USArray Ground Motion Visualizations, routinely generated animations showing both the vertical and horizontal seismic wavefields sweeping across the USArray Transportable Array from earthquakes around the world. 2) Event Plots, a suite of figures automatically generated following all M6.0+ events, which include phase-aligned record sections, global body wave envelope stacks, regional network vespagrams, and source-time functions. 3) Earth Model Collaboration, a new web repository for community-supplied regional and global tomography models with the ability to preview, request, and compare models. 4) EARS, the EarthScope Automated Receiver Survey, developed at the University of South Carolina, which aims to calculate crustal thickness and bulk crustal properties beneath USArray stations as well as many other broadband stations whose data are archived at the IRIS DMC. 5) Archiving and distribution of Princeton 3D SEM and 1D synthetic seismograms generated for all Global CMT events. 6) Archiving and distribution of GPS displacement time series produced by the Plate Boundary Observatory. Other data products are under consideration and will be moved to the development pipeline once approved by the IRIS DPWG. Feedback on existing products and ideas for new products are welcome at any time.
Weirather, Jason L; Afshar, Pegah Tootoonchi; Clark, Tyson A; Tseng, Elizabeth; Powers, Linda S; Underwood, Jason G; Zabner, Joseph; Korlach, Jonas; Wong, Wing Hung; Au, Kin Fai
2015-10-15
We developed an innovative hybrid sequencing approach, IDP-fusion, to detect fusion genes, determine fusion sites and identify and quantify fusion isoforms. IDP-fusion is the first method to study gene fusion events by integrating Third Generation Sequencing long reads and Second Generation Sequencing short reads. We applied IDP-fusion to PacBio data and Illumina data from the MCF-7 breast cancer cells. Compared with the existing tools, IDP-fusion detects fusion genes at higher precision and a very low false positive rate. The results show that IDP-fusion will be useful for unraveling the complexity of multiple fusion splices and fusion isoforms within tumorigenesis-relevant fusion genes. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Premonitory acoustic emissions and stick-slip in natural and smooth-faulted Westerly granite
Thompson, B.D.; Young, R.P.; Lockner, David A.
2009-01-01
A stick-slip event was induced in a cylindrical sample of Westerly granite containing a preexisting natural fault by loading at constant confining pressure of 150 MPa. Continuously recorded acoustic emission (AE) data and computer tomography (CT)-generated images of the fault plane were combined to provide a detailed examination of microscale processes operating on the fault. The dynamic stick-slip event, considered to be a laboratory analog of an earthquake, generated an ultrasonic signal that was recorded as a large-amplitude AE event. First arrivals of this event were inverted to determine the nucleation site of slip, which is associated with a geometric asperity on the fault surface. CT images and AE locations suggest that a variety of asperities existed in the sample because of the intersection of branch or splay faults with the main fault. This experiment is compared with a stick-slip experiment on a sample prepared with a smooth, artificial saw-cut fault surface. Nearly a thousand times more AE were observed for the natural fault, which has a higher friction coefficient (0.78 compared to 0.53) and larger shear stress drop (140 compared to 68 MPa). However, at the measured resolution, the ultrasonic signal emitted during slip initiation does not vary significantly between the two experiments, suggesting a similar dynamic rupture process. We propose that the natural faulted sample under triaxial compression provides a good laboratory analogue for a field-scale fault system in terms of the presence of asperities, fault surface heterogeneity, and interaction of branching faults. © 2009.
Phillips, Anna C; Carroll, Douglas; Der, Geoff
2015-01-01
Stressful life events are known to contribute to development of depression; however, it is possible this link is bidirectional. The present study examined whether such stress generation effects are greater than the effects of stressful life events on depression, and whether stress generation is also evident with anxiety. Participants were two large age cohorts (N = 732 aged 44 years; N = 705 aged 63 years) from the West of Scotland Twenty-07 study. Stressful life events, depression, and anxiety symptoms were measured twice five years apart. Cross-lagged panel analysis examined the mutual influences of stressful life events on depression and on anxiety over time. Life events predicted later depressive symptomatology (p = .01), but the depression predicting life events relationship was less strong (p = .06), whereas earlier anxiety predicted life events five years later (p = .001). There was evidence of sex differences in the extent to which life events predicted later anxiety. This study provides evidence of stress causation for depression and weaker evidence for stress generation. In contrast, there was strong evidence of stress generation for anxiety but weaker evidence for stress causation, and that differed for men and women.
Phillips, Anna C.; Carroll, Douglas; Der, Geoff
2016-01-01
Background and Objectives Stressful life events are known to contribute to development of depression, however, it is possible this link is bi-directional. The present study examined whether such stress generation effects are greater than the effects of stressful life events on depression, and whether stress generation is also evident with anxiety. Design Participants were two large age cohorts (N = 732 aged 44 years; N = 705 aged 63 years) from the West of Scotland Twenty-07 study. Methods Stressful life events, depression and anxiety symptoms were measured twice five years apart. Cross-lagged panel analysis examined the mutual influences of stressful life events on depression and on anxiety over time. Results Life events predicted later depressive symptomatology (p = .01), but the depression predicting life events relationship was less strong (p = .06), whereas earlier anxiety predicted life events five years later (p = .001). There was evidence of sex differences in the extent to which life events predicted later anxiety. Conclusions This study provides evidence of stress causation for depression and weaker evidence for stress generation. In contrast, there was strong evidence of stress generation for anxiety but weaker evidence for stress causation, and that differed for men and women. PMID:25572915
NASA Astrophysics Data System (ADS)
Pankow, C.; Brady, P.; Ochsner, E.; O'Shaughnessy, R.
2015-07-01
We introduce a highly parallelizable architecture for estimating parameters of compact binary coalescence using gravitational-wave data and waveform models. Using a spherical harmonic mode decomposition, the waveform is expressed as a sum over modes that depend on the intrinsic parameters (e.g., masses), with coefficients that depend on the observer-dependent extrinsic parameters (e.g., distance, sky position). The data is then prefiltered against those modes, at fixed intrinsic parameters, enabling efficient evaluation of the likelihood for generic source positions and orientations, independent of waveform length or generation time. We efficiently parallelize our intrinsic-space calculation by integrating over all extrinsic parameters using a Monte Carlo integration strategy. Since the waveform generation and prefiltering happen only once, the cost of integration dominates the procedure. Also, we operate hierarchically, using information from existing gravitational-wave searches to identify the regions of parameter space to emphasize in our sampling. As proof of concept and verification of the result, we have implemented this algorithm using standard time-domain waveforms, processing each event in less than one hour on recent computing hardware. For most events we evaluate the marginalized likelihood (evidence) with statistical errors of ≲5%, and even smaller in many cases. With a bounded runtime independent of the waveform model starting frequency, a nearly unchanged strategy could estimate neutron star (NS)-NS parameters in the 2018 advanced LIGO era. Our algorithm is usable with any noise curve and existing time-domain model at any mass, including some waveforms which are computationally costly to evolve.
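The marginalization strategy reduces, at fixed intrinsic parameters, to a Monte Carlo average of the likelihood over extrinsic draws. A toy version with a placeholder Gaussian likelihood (not a real gravitational-wave likelihood) and an assumed uniform distance prior:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

distance = rng.uniform(10.0, 1000.0, n)   # Mpc, assumed uniform prior
sky = rng.normal(size=(n, 2))             # stand-in for sky-position draws

# Placeholder log-likelihood peaked at 400 Mpc; vectorized over draws.
logL = -0.5 * ((distance - 400.0) / 50.0) ** 2 - 0.5 * np.sum(sky**2, axis=1)

# Numerically stable Monte Carlo estimate of the marginalized likelihood.
m = logL.max()
evidence = np.exp(m) * np.mean(np.exp(logL - m))
print(f"marginalized likelihood ~ {evidence:.3e}")
```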
Vertically Integrated Seismological Analysis II : Inference
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.; Sudderth, E.
2009-12-01
Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with acceptance probability α(x′ | x) = min(1, [π(x′) q(x | x′)] / [π(x) q(x′ | x)]). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, each of which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then aligns with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or by human analysts.
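A generic Metropolis-Hastings step implementing the acceptance rule quoted above, with a toy target and proposal standing in for the posterior over worlds and the event-level moves:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(log_pi, propose, log_q, x0, n_steps=10_000):
    """Accept x' with probability min(1, pi(x') q(x|x') / (pi(x) q(x'|x))),
    computed in log space for numerical stability."""
    x, chain = x0, []
    for _ in range(n_steps):
        x_new = propose(x)
        log_alpha = (log_pi(x_new) + log_q(x, x_new)
                     - log_pi(x) - log_q(x_new, x))
        if np.log(rng.random()) < min(0.0, log_alpha):
            x = x_new
        chain.append(x)
    return chain

# Toy target: standard normal; symmetric random walk, so log_q cancels.
samples = metropolis_hastings(
    log_pi=lambda x: -0.5 * x * x,
    propose=lambda x: x + rng.normal(scale=0.5),
    log_q=lambda a, b: 0.0,
    x0=0.0,
)
```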
Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies.
Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos
2018-03-23
Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.
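As a concrete illustration of the age-ordered token rule, the toy Python sketch below picks the next token holder. The data layout, vehicle identifiers, and times are our own invention; the real protocol of course also handles channel access, acknowledgements, and the event-driven message modes.

```python
def next_token_holder(last_beacon_time, now):
    """Pick the platoon member whose status info is stalest (largest data age).

    last_beacon_time maps vehicle id -> time its status was last successfully
    collected. Replacing round-robin with age-ordered circulation means a
    vehicle whose beacon was lost is revisited sooner, giving implicit
    retransmission opportunities within the beacon generation interval.
    """
    return max(last_beacon_time, key=lambda v: now - last_beacon_time[v])

# Toy usage: vehicle 2's beacon was lost at t=0.00 s, so it gets the token next.
ages = {1: 0.09, 2: 0.00, 3: 0.08}   # last successful collection times (s)
print(next_token_holder(ages, now=0.10))  # -> 2
```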
Automatic detection of snow avalanches in continuous seismic data using hidden Markov models
NASA Astrophysics Data System (ADS)
Heck, Matthias; Hammer, Conny; van Herwijnen, Alec; Schweizer, Jürg; Fäh, Donat
2018-01-01
Snow avalanches, like many other mass movements, generate seismic signals. Detection of avalanches by seismic monitoring is highly relevant for assessing avalanche danger. In contrast to other seismic events, signals generated by avalanches do not have a characteristic first arrival, nor is it possible to detect different wave phases. In addition, the moving-source character of avalanches increases the intricacy of the signals. Although it is possible to visually detect seismic signals produced by avalanches, reliable automatic detection methods for all types of avalanches do not exist yet. We therefore evaluate whether hidden Markov models (HMMs) are suitable for the automatic detection of avalanches in continuous seismic data. We analyzed data recorded during the winter season 2010 by a seismic array deployed in an avalanche starting zone above Davos, Switzerland. We re-evaluated a reference catalogue containing 385 events by grouping the events into seven probability classes. Since most of the data consist of noise, we first applied a simple amplitude threshold to reduce the amount of data. As the first classification results were unsatisfactory, we analyzed the temporal behavior of the seismic signals for the whole data set and found a high variability in the seismic signals. We therefore applied further post-processing steps to reduce the number of false alarms by defining a minimal duration for the detected event, implementing a voting-based approach and analyzing the coherence of the detected events. We obtained the best classification results for events detected by at least five sensors and with a minimal duration of 12 s. These processing steps allowed us to identify two periods of high avalanche activity, suggesting that HMMs are suitable for the automatic detection of avalanches in seismic data. However, our results also showed that more sensitive sensors and more appropriate sensor locations are needed to improve the signal-to-noise ratio of the signals and therefore the classification.
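The post-processing described here reduces, in essence, to merging overlapping per-sensor triggers and keeping only well-corroborated, sufficiently long events. A hedged Python sketch, with a data layout of our own choosing and the thresholds quoted in the abstract:

```python
def vote_filter(detections, min_sensors=5, min_duration=12.0):
    """Keep HMM detections corroborated across the array.

    `detections` is a list of (sensor_id, t_start, t_end) triggers.
    Overlapping triggers are merged into one candidate event, which is
    retained only if at least `min_sensors` distinct sensors contributed
    and the merged interval lasts at least `min_duration` seconds.
    """
    events = []
    for sensor, t0, t1 in sorted(detections, key=lambda d: d[1]):
        if events and t0 <= events[-1]["end"]:      # overlaps previous event
            ev = events[-1]
            ev["end"] = max(ev["end"], t1)
            ev["sensors"].add(sensor)
        else:                                        # start a new candidate
            events.append({"start": t0, "end": t1, "sensors": {sensor}})
    return [ev for ev in events
            if len(ev["sensors"]) >= min_sensors
            and ev["end"] - ev["start"] >= min_duration]
```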
2013-01-01
Background Multicellular organisms consist of cells of many different types that are established during development. Each type of cell is characterized by the unique combination of expressed gene products as a result of spatiotemporal gene regulation. Currently, a fundamental challenge in regulatory biology is to elucidate the gene expression controls that generate the complex body plans during development. Recent advances in high-throughput biotechnologies have generated spatiotemporal expression patterns for thousands of genes in the model organism fruit fly Drosophila melanogaster. Enhancing existing qualitative methods with the quantitative analysis enabled by the computational tools we present in this paper provides promising ways to address key scientific questions. Results We develop a set of computational methods and open source tools for identifying co-expressed embryonic domains and the associated genes simultaneously. To map the expression patterns of many genes into the same coordinate space and account for the embryonic shape variations, we develop a mesh generation method to deform a meshed generic ellipse to each individual embryo. We then develop a co-clustering formulation to cluster the genes and the mesh elements, thereby identifying co-expressed embryonic domains and the associated genes simultaneously. Experimental results indicate that the gene and mesh co-clusters can be correlated to key developmental events during the stages of embryogenesis we study. The open source software tool has been made available at http://compbio.cs.odu.edu/fly/. Conclusions Our mesh generation and machine learning methods and tools improve upon the flexibility, ease-of-use and accuracy of existing methods. PMID:24373308
Pore-level mechanics of foam generation and coalescence in the presence of oil.
Almajid, Muhammad M; Kovscek, Anthony R
2016-07-01
The stability of foam in porous media is extremely important for realizing the advantages of foamed gas for gas-mobility reduction. The foam texture (i.e., bubbles per volume of gas) achieved is dictated by foam generation and coalescence processes occurring at the pore level. For foam injection to be widely applied during gas injection projects, we need to understand these pore-scale events that lead to foam stability/instability so that they can be modeled accurately. Foam flow has been studied for decades, but most efforts focused on studying foam generation and coalescence in the absence of oil. Here, the extensive existing literature is reviewed and analyzed to identify open questions. Then, we use etched-silicon micromodels to observe foam generation and coalescence processes at the pore level. Special emphasis is placed on foam coalescence in the presence of oil. For the first time, lamella pinch-off as described by Myers and Radke [40] is observed in porous media and documented. Additionally, a new mechanism, coined "hindered generation", is identified. Hindered generation refers to the role oil plays in preventing the successful formation of a lamella following snap-off near a pore throat. Copyright © 2015 Elsevier B.V. All rights reserved.
Detecting gravity waves from binary black holes
NASA Technical Reports Server (NTRS)
Wahlquist, Hugo D.
1989-01-01
One of the most attractive possible sources of strong gravitational waves would be a binary system comprising massive black holes (BH). The gravitational radiation from a binary is an elliptically polarized, periodic wave which could be observed continuously - or at intervals whenever a detector was available. This continuity of the signal is certainly appealing compared to waiting for individual pulses from infrequent random events. It also has the advantage over pulses that continued observation can increase the signal-to-noise ratio almost indefinitely. Furthermore, this system is dynamically simple; the theory of the generation of the radiation is unambiguous; all characteristics of the signal can be precisely related to the dynamical parameters of the source. The current situation is that while there is no observational evidence as yet for the existence of massive binary BH, their formation is theoretically plausible, and within certain coupled constraints of mass and location, their existence cannot be observationally excluded. Detecting gravitational waves from these objects might be the first observational proof of their existence.
Hybridisation and diversification in the adaptive radiation of clownfishes.
Litsios, Glenn; Salamin, Nicolas
2014-11-30
The importance of hybridisation during species diversification has long been debated among evolutionary biologists. It is increasingly recognised that hybridisation events occurred during the evolutionary history of numerous species, especially during the early stages of adaptive radiation. We study the effect of hybridisation on diversification in the clownfishes, a clade of coral reef fish that diversified through an adaptive radiation process. While two species of clownfish are likely to have been described from hybrid specimens, the occurrence and effect of hybridisation on the clade's diversification are as yet unknown. We generate sequences of three mitochondrial genes to complete an existing dataset of nuclear sequences and document cytonuclear discordance at a node which shows a drastic increase of diversification rate. Then, using a tree-based jack-knife method, we identify clownfish species likely stemming from hybridisation events. Finally, we use molecular cloning and identify the putative parental species of four clownfish specimens that display the morphological characteristics of hybrids. Our results show that, consistent with the syngameon hypothesis, hybridisation events are linked with a burst of diversification in the clownfishes. Moreover, several recently diverged clownfish lineages likely originated through hybridisation, which indicates that diversification, catalysed by hybridisation events, may still be happening.
Generating heavy particles with energy and momentum conservation
NASA Astrophysics Data System (ADS)
Mereš, Michal; Melo, Ivan; Tomášik, Boris; Balek, Vladimír; Černý, Vladimír
2011-12-01
We propose a novel algorithm, called REGGAE, for the generation of momenta of a given sample of particle masses, evenly distributed in Lorentz-invariant phase space and obeying energy and momentum conservation. In comparison to other existing algorithms, REGGAE is designed for use in multiparticle production in hadronic and nuclear collisions, where many hadrons are produced and a large part of the available energy is stored in the form of their masses. The algorithm uses a loop simulating multiple collisions, which leads to production of configurations with reasonably large weights.
Program summary
Program title: REGGAE (REscattering-after-Genbod GenerAtor of Events)
Catalogue identifier: AEJR_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJR_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 1523
No. of bytes in distributed program, including test data, etc.: 9608
Distribution format: tar.gz
Programming language: C++
Computer: PC Pentium 4, though no particular tuning for this machine was performed.
Operating system: Originally designed on a Linux PC with g++, but it has been compiled and run successfully on OS X with g++ and on MS Windows with Microsoft Visual C++ 2008 Express Edition as well.
RAM: Depends on the number of particles generated; for 10 particles, as in the attached example, about 120 kB.
Classification: 11.2
Nature of problem: Generate the momenta of a sample of particles with given masses which obey energy and momentum conservation. Generated samples should be evenly distributed in the available Lorentz-invariant phase space.
Solution method: In general, the algorithm works in two steps. First, all momenta are generated with the GENBOD algorithm, in which particle production is modeled as a sequence of two-body decays of heavy resonances. After all momenta are generated this way, they are reshuffled: each particle undergoes a collision with some other partner such that in the pair center-of-mass system the new directions of the momenta are distributed isotropically. After each particle collides only a few times, the momenta are distributed evenly across the whole available phase space. Starting with GENBOD is not essential for the procedure, but it improves the performance.
Running time: Depends on the number of particles and the number of events to generate. On a Linux PC with a 2 GHz processor, generating 1000 events with 10 particles each takes about 3 s.
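The reshuffling step lends itself to a compact sketch. The Python fragment below is a simplified re-implementation of the idea, pairwise "collisions" that redraw directions isotropically in each pair's center-of-mass frame while conserving the pair's total four-momentum; it is not the distributed C++ code, and the number of reshuffling rounds here is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def boost(p, beta):
    """Lorentz-boost four-vector p = (E, px, py, pz) by velocity beta (3-vector)."""
    b2 = beta @ beta
    if b2 < 1e-12:
        return p.copy()
    gamma = 1.0 / np.sqrt(1.0 - b2)
    bp = beta @ p[1:]
    E = gamma * (p[0] + bp)
    coef = (gamma - 1.0) * bp / b2 + gamma * p[0]
    return np.concatenate(([E], p[1:] + coef * beta))

def reshuffle(momenta, masses, n_rounds=3):
    """GENBOD output in, evenly spread configuration out: each 'collision'
    keeps the pair's invariant mass (|p| in the CM is unchanged) but redraws
    the CM directions isotropically, so total E and p are conserved exactly."""
    p = momenta.copy()
    n = len(p)
    for _ in range(n_rounds * n):
        i, j = rng.choice(n, size=2, replace=False)
        tot = p[i] + p[j]
        beta = tot[1:] / tot[0]                   # pair CM velocity
        pi_cm = boost(p[i], -beta)                # particle i in the CM frame
        k = np.linalg.norm(pi_cm[1:])             # CM momentum magnitude
        cos_t = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * np.pi)
        sin_t = np.sqrt(1.0 - cos_t**2)
        khat = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
        Ei = np.sqrt(masses[i]**2 + k**2)
        Ej = np.sqrt(masses[j]**2 + k**2)
        p[i] = boost(np.concatenate(([Ei],  k * khat)), beta)
        p[j] = boost(np.concatenate(([Ej], -k * khat)), beta)
    return p
```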
Geophysical Event Casting: Assembling & Broadcasting Data Relevant to Events and Disasters
NASA Astrophysics Data System (ADS)
Manipon, G. M.; Wilson, B. D.
2012-12-01
Broadcast Atom feeds are already being used to publish metadata and support discovery of data collections, granules, and web services. Such data and service casting advertises the existence of new granules in a dataset and available services to access or transform data. Similarly, data and services relevant to studying topical geophysical events (earthquakes, hurricanes, etc.) or periodic/regional structures (El Nino, deep convection) can be broadcast by publishing new entries and links in a feed for that topic. By using the geoRSS conventions, the time and space location of the event (e.g. a moving hurricane track) is specified in the feed, along with a science description, images, relevant data granules, and links to useful web services (e.g. OGC/WMS). The topic cast is used to assemble all of the relevant data/images as they come in, and publish the metadata (images, links, services) to a broad group of subscribers. All of the information in the feed is structured using standardized XML tags (e.g. georss for space & time, and tags to point to external data & services), and is thus machine-readable, which is an improvement over collecting ad hoc links on a wiki. We have created a software suite in python to generate such "event casts" when a geophysical event first happens, then update them with more information as it becomes available, and display them as an event album in a web browser. Figure 1 shows a snapshot of our Event Cast Browser displaying information from a set of casts about the hurricanes in the Western Pacific during the year 2011. The 19th cyclone is selected in the left panel, so the top right panels display the entries in that feed with metadata such as maximum wind speed, while the bottom right panel displays the hurricane track (positions every 12 hours) as KML in the Google Earth plug-in, where additional data/image layers from the feed can be turned on or off by the user. The software automatically converts (georss) space & time information to KML placemarks, and can also generate various KML visualizations for other data layers that are pointed to in the feed. The user can replay all of the data images as an animation over the several days as the cyclone develops. The goal of "event casting" is to standardize several metadata micro-formats and use them within Atom feeds to create a rich ecosystem of topical event data that can be automatically manipulated by scripts and many interfaces. For our event cast browser, the same code can display all kinds of casts, whether about hurricanes, fire, earthquakes, or even El Nino. The presentation will describe: the event cast format and its standard micro-formats, software to generate and augment casts, and the browser GUI with KML visualizations.
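For illustration, the fragment below builds a single Atom entry carrying a georss point and data links, in the spirit of the event-cast format. The exact tag set and metadata micro-formats of the actual suite are richer than this sketch, and the example values are invented.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"

def event_entry(title, lat, lon, when, links):
    """Build one Atom entry for an event cast; the layout is illustrative,
    not the exact schema used by the authors' tool."""
    ET.register_namespace("", ATOM)
    ET.register_namespace("georss", GEORSS)
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    ET.SubElement(entry, f"{{{ATOM}}}updated").text = when.isoformat()
    ET.SubElement(entry, f"{{{GEORSS}}}point").text = f"{lat} {lon}"
    for href in links:   # e.g., granule URLs or an OGC/WMS endpoint
        ET.SubElement(entry, f"{{{ATOM}}}link", href=href)
    return entry

e = event_entry("Typhoon position, 2011-09-18T00Z", 21.4, 131.7,
                datetime(2011, 9, 18, tzinfo=timezone.utc),
                ["http://example.org/wms?layers=wind_speed"])
print(ET.tostring(e, encoding="unicode"))
```

Because space and time live in standardized georss tags rather than free text, a downstream script can convert each entry to a KML placemark mechanically, which is what enables the same browser code to display hurricanes, fires, or earthquakes.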
Integrated assessment of water-power grid systems under changing climate
NASA Astrophysics Data System (ADS)
Yan, E.; Zhou, Z.; Betrie, G.
2017-12-01
Energy and water systems are intrinsically interconnected. Due to an increase in climate variability and extreme weather events, the interdependency between these two systems has recently intensified, resulting in significant impacts on both systems and on energy output. To address this challenge, an Integrated Water-Energy Systems Assessment Framework (IWESAF) is being developed to integrate multiple existing or newly developed models from various sectors. In this presentation, we focus on recent improvements in the development of a thermoelectric power plant water-use simulator, a power grid operation and cost optimization model, and the model integration that facilitates interaction between water use and electricity generation under extreme climate events. The process-based thermoelectric power water-use simulator includes heat-balance, climate, and cooling-system modules that account for power plant characteristics, fuel types, and cooling technology. The model has been validated against more than 800 fossil-fired, nuclear, and gas-turbine power plants with different cooling systems. The power grid operation and cost optimization model was implemented for a selected region in the Midwest. A case study will be presented to evaluate the sensitivity and resilience of thermoelectricity generation and the power grid under various climate and hydrologic extremes, and the potential economic consequences.
Sense, decide, act, communicate (SDAC): next generation of smart sensor systems
NASA Astrophysics Data System (ADS)
Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian
2004-09-01
The recent war on terrorism and increased urban warfare have been a major catalyst for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has generally been governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality related to sensor systems. This functionality includes a sensor's ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communicate - internal to the system and external to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the usage of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collectives) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper will also discuss and evaluate the demonstration system developed to test the concepts related to SDAC systems.
77 FR 57994 - Airworthiness Directives; The Cessna Aircraft Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-19
... unit (GCU). We are issuing this AD to prevent DC generator overvoltage events, which could result in... AD was prompted by reports of direct current (DC) generator overvoltage events. We are issuing this AD to prevent DC generator overvoltage events, which could result in smoke in the cockpit and loss of...
NASA Astrophysics Data System (ADS)
Haruki, W.; Iseri, Y.; Takegawa, S.; Sasaki, O.; Yoshikawa, S.; Kanae, S.
2016-12-01
Natural disasters caused by heavy rainfall occur every year in Japan, so effective countermeasures against such events are important. In 2015, a catastrophic flood occurred in the Kinu river basin, located in the northern part of the Kanto region. The remarkable feature of this flood event lay not only in the intensity of the rainfall but also in the spatial characteristics of the heavy rainfall area: the flood was caused by the continuous overlapping of the heavy rainfall area over the Kinu river basin, suggesting that consideration of spatial extent is quite important when assessing the impacts of heavy rainfall events. However, the spatial extent of heavy rainfall events cannot be properly measured by rain gauges at observation points. On the other hand, radar observations provide spatially and temporally high-resolution rainfall data which are useful for capturing the characteristics of heavy rainfall events. For long-term, effective countermeasures, extreme heavy rainfall scenarios considering rainfall area and distribution are required. In this study, a new method for generating extreme heavy rainfall events using Monte Carlo simulation has been developed in order to produce such scenarios. This study used AMeDAS analyzed precipitation data, a high-resolution gridded precipitation dataset produced by the Japan Meteorological Agency. A depth-area-duration (DAD) analysis was conducted to extract extreme rainfall events from the past, considering time and spatial scales. In the Monte Carlo simulation, extreme rainfall events are generated based on the events extracted by the DAD analysis. Extreme heavy rainfall events are generated for a specific region of Japan, and the types of generated events can be changed by varying the parameters. As an application of this method, we focused on the Kanto region and generated 3000 years of rainfall data. The 100-year probable rainfall and the return period of the 2015 Kinu River flood are obtained using the generated data, and we compared the 100-year probable rainfall calculated by this method with that from traditional methods. The newly developed method enables us to generate extreme rainfall events considering time and spatial scales and to produce extreme rainfall scenarios.
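The abstract leaves the simulation details open; purely as a rough illustration, the Python sketch below resamples DAD-extracted historical events and perturbs their depth and position to assemble a long synthetic record. The perturbation model and every parameter value are our assumptions, not the authors' scheme.

```python
import numpy as np

rng = np.random.default_rng(42)

def synthesize_years(dad_events, n_years=3000, events_per_year=5):
    """Toy Monte Carlo generator in the spirit of the abstract: resample
    historical extreme events (from a DAD analysis) and perturb their depth
    and center position to build a long synthetic record."""
    years = []
    for _ in range(n_years):
        year = []
        for _ in range(events_per_year):
            ev = dad_events[rng.integers(len(dad_events))]
            scale = rng.lognormal(mean=0.0, sigma=0.2)   # depth perturbation
            dx, dy = rng.normal(0.0, 0.1, size=2)        # center shift (toy units)
            year.append({"depth_mm": ev["depth_mm"] * scale,
                         "area_km2": ev["area_km2"],
                         "duration_h": ev["duration_h"],
                         "center": (ev["center"][0] + dx, ev["center"][1] + dy)})
        years.append(year)
    return years

# Toy usage with a single extracted event.
dad_events = [{"depth_mm": 300.0, "area_km2": 2500.0, "duration_h": 24.0,
               "center": (140.0, 36.5)}]
record = synthesize_years(dad_events, n_years=10)
```

From such a multi-millennial synthetic record, a 100-year probable rainfall can be read off empirically as the 99th percentile of annual maxima, which is the kind of comparison the study makes against traditional methods.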
HepML, an XML-based format for describing simulated data in high energy physics
NASA Astrophysics Data System (ADS)
Belov, S.; Dudko, L.; Kekelidze, D.; Sherstnev, A.
2010-10-01
In this paper we describe the HepML format and a corresponding C++ library developed for keeping a complete description of parton-level events in a unified and flexible form. HepML tags contain enough information to understand what kind of physics the simulated events describe and how the events have been prepared. A HepML block can be included into event files in the LHEF format. The structure of the HepML block is described by means of several XML Schemas. The Schemas define the necessary information for the HepML block and how this information should be located within the block. The library libhepml is a C++ library intended for parsing and serialization of HepML tags, and for representing the HepML block in computer memory. The library is an API for external software. For example, Matrix Element Monte Carlo event generators can use the library for preparing and writing a header of an LHEF file in the form of HepML tags. In turn, Showering and Hadronization event generators can parse the HepML header and get the information in the form of C++ classes. libhepml can be used in C++, C, and Fortran programs. All necessary parts of HepML have been prepared and we present the project to the HEP community.
Program summary
Program title: libhepml
Catalogue identifier: AEGL_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGL_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU GPLv3
No. of lines in distributed program, including test data, etc.: 138 866
No. of bytes in distributed program, including test data, etc.: 613 122
Distribution format: tar.gz
Programming language: C++, C
Computer: PCs and workstations
Operating system: Scientific Linux CERN 4/5, Ubuntu 9.10
RAM: 1 073 741 824 bytes (1 Gb)
Classification: 6.2, 11.1, 11.2
External routines: Xerces XML library (http://xerces.apache.org/xerces-c/), Expat XML Parser (http://expat.sourceforge.net/)
Nature of problem: Monte Carlo simulation in high energy physics is divided into several stages, and various programs exist for these stages. Here we are interested in interfacing different Monte Carlo event generators via data files, in particular Matrix Element (ME) generators and Showering and Hadronization (SH) generators. There is a widely accepted format for data files for such interfaces - the Les Houches Event Format (LHEF). Although the information kept in an LHEF file is enough for the proper working of SH generators, it is insufficient for understanding how the events in the LHEF file have been prepared and which physical model has been applied. We propose an extension of the format for keeping additional information available in generators: a new information block, marked up with XML tags, added to the LHEF file. This block describes the events in the file in more detail; in particular, it stores information about the physical model, kinematical cuts, the generator, etc. This helps to make LHEF files self-documented. Certainly, HepML can be applied in a more general context, not only in LHEF files.
Solution method: In order to overcome the drawbacks of the original LHEF accord we add a new information block of HepML tags. HepML is an XML-based markup language. We designed several XML Schemas for all tags in the language; any HepML document should follow the rules of the Schemas. The language is equipped with a library, libhepml, for operating on HepML tags and documents. This C++ library consists of classes for HepML objects, which represent a HepML document in computer memory, parsing classes, serialization classes, and some auxiliary classes.
Restrictions: The software is adapted for solving the problems described in the article. There are no additional restrictions.
Running time: Tests have been done on a computer with an Intel(R) Core(TM)2 Solo, 1.4 GHz. Parsing of a HepML file: 6 ms (HepML file size 12.5 Kb). Writing of a HepML block to file: 14 ms (file size 12.5 Kb). Merging of two HepML blocks and writing to file: 18 ms (file size 25.0 Kb).
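To give a flavor of what such a self-documenting header looks like, the Python sketch below assembles a small HepML-like XML block. The element names here are illustrative stand-ins; the real tag set is fixed by the HepML XML Schemas, and the real library is the C++ libhepml described above.

```python
import xml.etree.ElementTree as ET

def hepml_header(generator, version, model, cuts):
    """Sketch of a HepML-like header block for an LHEF file. Tag and
    attribute names are illustrative, not the Schema-defined ones."""
    root = ET.Element("hepml")
    gen = ET.SubElement(root, "generator", name=generator, version=version)
    ET.SubElement(gen, "physicsModel").text = model
    cuts_el = ET.SubElement(root, "cuts")
    for name, value in cuts.items():
        ET.SubElement(cuts_el, "cut", name=name, value=str(value))
    return ET.tostring(root, encoding="unicode")

print(hepml_header("SomeMEGenerator", "1.0", "SM, 5-flavour scheme",
                   {"ptj_min": 20.0, "eta_j_max": 5.0}))
```

The point of the design is exactly this machine-readability: an SH generator can parse the header and recover the model and cuts as structured objects rather than scraping free-text comments.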
Strategies for automatic processing of large aftershock sequences
NASA Astrophysics Data System (ADS)
Kvaerna, T.; Gibbons, S. J.
2017-12-01
Aftershock sequences following major earthquakes present great challenges to seismic bulletin generation. The analyst resources needed to locate events increase with increased event numbers as the quality of underlying, fully automatic, event lists deteriorates. While current pipelines, designed a generation ago, are usually limited to single passes over the raw data, modern systems also allow multiple passes. Processing the raw data from each station currently generates parametric data streams that are later subject to phase-association algorithms which form event hypotheses. We consider a major earthquake scenario and propose to define a region of likely aftershock activity in which we will detect and accurately locate events using a separate, specially targeted, semi-automatic process. This effort may use either pattern detectors or more general algorithms that cover wider source regions without requiring waveform similarity. An iterative procedure to generate automatic bulletins would incorporate all the aftershock event hypotheses generated by the auxiliary process, and filter all phases from these events from the original detection lists prior to a new iteration of the global phase-association algorithm.
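Schematically, each iteration removes the phases claimed by the targeted aftershock process before re-running global association. A minimal Python sketch, with `associate` standing in for the global phase-association algorithm and the data structures being our own simplification:

```python
def iterate_bulletin(detections, aftershock_events, associate):
    """One iteration of the proposed scheme: take the phases claimed by the
    specially targeted aftershock process out of the detection lists, then
    rerun global phase-association on the remainder and merge the results.

    detections: iterable of hashable phase identifiers.
    aftershock_events: list of {"phases": set_of_phase_ids, ...} hypotheses.
    associate: callable mapping remaining detections -> new event hypotheses.
    """
    claimed = {ph for ev in aftershock_events for ph in ev["phases"]}
    remaining = [d for d in detections if d not in claimed]
    return aftershock_events + associate(remaining)
```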
NASA Astrophysics Data System (ADS)
Vannucchi, Paola; Morgan, Jason P.
2015-04-01
Our paper (Vannucchi et al., 2015) focuses on geologic evidence for shock metamorphism found at the epicentral region of the 1908 Tunguska event. None of the currently proposed bolide explanations for the 1908 event can produce the shock pressures indicated by the geological evidence described in Vannucchi et al. (2015). Had the 1908 event generated these pressures over the epicentral region, an observable crater should also have formed. The comment by Melott and Overholt discusses the possibility that a 1908 cometary bolide strike in Tunguska cannot be excluded because of the absence of a detectable 14C increase at this site. They dispute the findings of a recent study by Liu et al. (2014) that an East Asian comet impact recorded by eyewitness accounts in 773 AD was coincident with a detectable 14C increase in regional South China Sea corals that grew at that time. Their point, whether true or not, is fairly peripheral to our study because the bolide hypothesis for the 1908 Tunguska event, no matter the nature of the bolide itself, does not provide a viable explanation for the geological evidence of shock metamorphism found at the 1908 Tunguska site. Furthermore, as we discuss in our paper, the probability of a prior large impact-shock event having occurred at the site of the 1908 event is extremely low, suggesting that a terrestrial shock-generating mechanism may be linked to the resolution of the Tunguska enigma. Our preferred resolution is that a terrestrial hyper-explosive gas release event, a Verneshot (Morgan et al., 2004), created the large shock event during the emplacement of the Siberian Traps. In this scenario, the 1908 Tunguska event was due to a much smaller gas burst that re-used the lithospheric weakness created by the ancient Verneshot. Melott and Overholt's discussion regarding the existence and size of regional and global 14C anomalies related to cometary impacts therefore seems better addressed in response to the work of Liu et al. (2014), as appears to be done in a paper and preprint that Melott and Overholt self-cite in their comment.
Cox model with interval-censored covariate in cohort studies.
Ahn, Soohyun; Lim, Johan; Paik, Myunghee Cho; Sacco, Ralph L; Elkind, Mitchell S
2018-05-18
In cohort studies the outcome is often the time to a particular event, and subjects are followed at regular intervals. Periodic visits may also monitor a secondary irreversible event influencing the event of primary interest, and a significant proportion of subjects develop the secondary event over the period of follow-up. The status of the secondary event serves as a time-varying covariate, but is recorded only at the times of the scheduled visits, generating incomplete time-varying covariates. While information on a typical time-varying covariate is missing for the entire follow-up period except at the visit times, the status of the secondary event is unavailable only between the visits at which the status changed, and is thus interval-censored. One may view the interval-censored covariate of the secondary event status as a missing time-varying covariate, yet the missingness is partial, since partial information is provided throughout the follow-up period. The current practice of using the latest observed status produces biased estimators, and the existing missing-covariate techniques cannot accommodate this special feature of missingness due to interval censoring. To handle interval-censored covariates in the Cox proportional hazards model, we propose an available-data estimator and a doubly robust-type estimator, as well as the maximum likelihood estimator via the EM algorithm, and present their asymptotic properties. We also present practical approaches that are valid. We demonstrate the proposed methods using our motivating example from the Northern Manhattan Study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Implementing NLO DGLAP evolution in parton showers
Hoche, Stefan; Krauss, Frank; Prestel, Stefan
2017-10-13
Here, we present a parton shower which implements the DGLAP evolution of parton densities and fragmentation functions at next-to-leading order precision up to effects stemming from local four-momentum conservation. The Monte-Carlo simulation is based on including next-to-leading order collinear splitting functions in an existing parton shower and combining their soft enhanced contributions with the corresponding terms at leading order. Soft double counting is avoided by matching to the soft eikonal. Example results from two independent realizations of the algorithm, implemented in the two event generation frameworks Pythia and Sherpa, illustrate the improved precision of the new formalism.
Backup key generation model for one-time password security protocol
NASA Astrophysics Data System (ADS)
Jeyanthi, N.; Kundu, Sourav
2017-11-01
The use of one-time passwords (OTP) has ushered new life into the existing authentication protocols used by the software industry. It introduced a second layer of security to the traditional username-password authentication, thus coining the term two-factor authentication. One of the drawbacks of this protocol is the unreliability of the hardware token at the time of authentication. This paper proposes a simple backup key model that can be associated with a real-world application's user database, which would allow a user to circumvent the second authentication stage in the event of unavailability of the hardware token.
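One plausible realization of such a backup-key model issues single-use codes and stores only salted hashes in the user database, so a database leak does not reveal usable keys. This is a sketch under our own assumptions, not the paper's exact protocol.

```python
import hashlib, hmac, secrets

def issue_backup_keys(n=10):
    """Generate n one-time backup keys; return the plaintext keys (shown to
    the user once) and the salted hashes to store in the user database."""
    plain, stored = [], []
    for _ in range(n):
        key = secrets.token_hex(8)
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", key.encode(), salt, 100_000)
        plain.append(key)
        stored.append((salt, digest))
    return plain, stored

def redeem(key, stored):
    """Accept a key if its hash matches a stored entry, then invalidate it
    so each backup key can bypass the hardware token only once."""
    for i, (salt, digest) in enumerate(stored):
        cand = hashlib.pbkdf2_hmac("sha256", key.encode(), salt, 100_000)
        if hmac.compare_digest(cand, digest):
            del stored[i]            # single use
            return True
    return False
```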
Visual Culture and Electronic Government: Exploring a New Generation of E-Government
NASA Astrophysics Data System (ADS)
Bekkers, Victor; Moody, Rebecca
E-government is becoming more picture-oriented. What meaning do stakeholders attach to visual events and visualization? Comparative case study research shows that the functional meaning primarily refers to registration, integration, transparency, and communication. The political meaning refers to new ways of framing in order to secure specific interests and claims. The institutional meaning is ambiguous: either it improves the position of citizens, or it reinforces the existing bias presented by governments. Hence, we expect that the emergence of a visualized public space, through the omnipresent penetration of (mobile) multimedia technologies, will influence government-citizen interactions.
Positron emission tomography wrist detector
Schlyer, David J.; O'Connor, Paul; Woody, Craig; Junnarkar, Sachin Shrirang; Radeka, Veljko; Vaska, Paul; Pratte, Jean-Francois
2006-08-15
A method of serially transferring annihilation information in a compact positron emission tomography (PET) scanner includes generating a time signal representing a time-of-occurrence of an annihilation event, generating an address signal representing a channel detecting the annihilation event, and generating a channel signal including the time and address signals. The method also includes generating a composite signal including the channel signal and another similarly generated channel signal concerning another annihilation event. An apparatus that serially transfers annihilation information includes a time signal generator, address signal generator, channel signal generator, and composite signal generator. The time signal is asynchronous and the address signal is synchronous to a clock signal. A PET scanner includes a scintillation array, detection array, front-end array, and a serial encoder. The serial encoders include the time signal generator, address signal generator, channel signal generator, and composite signal generator.
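Functionally, the serial encoder concatenates a time signal and an address signal into a channel signal, and two channel signals into a composite signal. A toy bit-packing sketch in Python; the field widths are illustrative choices of ours, not values from the patent.

```python
def channel_word(event_time_ticks, channel_addr):
    """Pack one annihilation event into a channel signal: time-of-occurrence
    in the high bits, detecting-channel address in the low bits
    (24-bit time and 8-bit address are illustrative widths)."""
    return ((event_time_ticks & 0xFFFFFF) << 8) | (channel_addr & 0xFF)

def composite_word(ch_a, ch_b):
    """Concatenate two channel signals for serial transfer as one composite."""
    return (ch_a << 32) | ch_b

# Toy usage: two events at ticks 1000 and 1003 on channels 7 and 12.
print(hex(composite_word(channel_word(1000, 7), channel_word(1003, 12))))
```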
77 FR 37827 - Airworthiness Directives; The Cessna Aircraft Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-25
...) generator control unit (GCU). We are proposing this AD to prevent DC generator overvoltage events, which... proposed AD. Discussion We have received reports of direct current (DC) generator overvoltage events. The... generator and the left and right engine DC generators, and corrective actions if necessary. That AD also...
Discerning the Chemistry in Individual Organelles with Small-Molecule Fluorescent Probes.
Xu, Wang; Zeng, Zebing; Jiang, Jian-Hui; Chang, Young-Tae; Yuan, Lin
2016-10-24
It is a truism that even the most advanced super-resolution microscope would be futile in providing biological insight into subcellular matrices without well-designed fluorescent tags/probes. Developments in biology have increasingly been boosted by advances in chemistry, with one prominent example being small-molecule fluorescent probes that allow not only cellular-level imaging, but also subcellular imaging. A majority, if not all, of chemical/biological events take place inside cellular organelles, and researchers have been shifting their attention towards these substructures with the help of fluorescence techniques. This Review summarizes the existing fluorescent probes that target chemical/biological events within a single organelle. More importantly, organelle-anchoring strategies are described and emphasized to inspire the design of new generations of fluorescent probes, before concluding with future prospects on the possible further development of chemical biology. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
López-Bueno, Alberto; Parras-Moltó, Marcos; López-Barrantes, Olivia; Belda, Sylvia; Alejo, Alí
2017-05-01
Molluscum contagiosum virus (MCV) is the sole member of the Molluscipoxvirus genus and causes a highly prevalent human disease of the skin characterized by the formation of a variable number of lesions that can persist for prolonged periods of time. Two major genotypes, subtype 1 and subtype 2, are recognized, although currently only a single complete genomic sequence corresponding to MCV subtype 1 is available. Using next-generation sequencing techniques, we report the complete genomic sequence of four new MCV isolates, including the first one derived from a subtype 2. Comparisons suggest a relatively distant evolutionary split between both MCV subtypes. Further, our data illustrate concurrent circulation of distinct viruses within a population and reveal the existence of recombination events among them. These results help identify a set of MCV genes with potentially relevant roles in molluscum contagiosum epidemiology and pathogenesis.
NASA Astrophysics Data System (ADS)
Mac Dougall, Jean S.; Mc Leod, David M.; Mc Leod, Roger D.
2002-10-01
Florida invested in preserving the Tequesta Indians' "Stonehenge-like" site along the Miami River. Direct observation and telecast reports show that a strong association exists between this area and Native American place names, hurricanes, tornados, a waterspout, and other nearby phenomena. Electromagnetic stimulation of human nervous systems in areas like these, discernible by appropriately sensitive individuals when these types of events occur, could plausibly account for some correct "predictions" of events like earthquakes. Various sensory modalities may be activated there. It may be important to understand other historic aspects associated with cultural artifacts like Miami's Tequesta remains. If the site also generates instrumentally detectable signals that correlate with visual, "auditory", or nerve-ending "tinglings" like those cited by the psychiatrist Arthur Guirdham in books like his Obsessions, applied physicists could partly vindicate the investment and also provide a net return. Society and comparative religious study may benefit.
Seismic activity monitoring in the Izvorul Muntelui dam region
NASA Astrophysics Data System (ADS)
Borleanu, Felix; Otilia Placinta, Anca; Popa, Mihaela; Adelin Moldovan, Iren; Popescu, Emilia
2016-04-01
Earthquake occurrences near artificial water reservoirs are caused by stress variation due to the weight of the water, weakening of fractures or faults, and increasing pore pressure in crustal rocks. In the present study we aim to investigate how the Izvorul Muntelui dam, located in the Eastern Carpathians, influences local seismicity. For this purpose we selected, from the seismic bulletins computed within the National Data Center of the National Institute for Earth Physics, Romania, crustal events that occurred between 984 and 2015 within a range of 0.3 deg around the artificial lake. Subsequently, to improve the seismic monitoring of the region, we applied a cross-correlation detector to the continuous recordings of the Bicaz (BIZ) seismic station. Besides the tectonic events, we detected sources within this region that periodically generate artificial events. We could not establish a direct correlation between the water level variations and the natural seismicity of the investigated area.
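A cross-correlation detector of this kind slides a master-event template over the continuous stream and flags windows whose normalized correlation exceeds a threshold, which is well suited to the repetitive artificial sources mentioned above. A compact Python sketch under assumed parameters (the threshold and data layout are ours):

```python
import numpy as np

def xcorr_detect(trace, template, threshold=0.7):
    """Sliding normalized cross-correlation of a master-event template
    against continuous data; peaks above `threshold` flag candidate
    repeating events. Returns (sample_index, correlation) pairs."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() * nt)
    hits = []
    for i in range(len(trace) - nt):
        w = trace[i:i + nt]
        s = w.std()
        if s == 0:
            continue                         # skip flat (dead) segments
        cc = np.dot(t, (w - w.mean()) / s)   # Pearson correlation in [-1, 1]
        if cc > threshold:
            hits.append((i, cc))
    return hits
```

In practice one would run this per channel and cluster nearby hits; an FFT-based implementation makes the same computation fast enough for years of continuous data.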
Rapid changes in the electrical state of the 1999 Izmit earthquake rupture zone
Honkura, Yoshimori; Oshiman, Naoto; Matsushima, Masaki; Barış, Şerif; Kemal Tunçer, Mustafa; Bülent Tank, Sabri; Çelik, Cengiz; Çiftçi, Elif Tolak
2013-01-01
Crustal fluids exist near fault zones, but their relation to the processes that generate earthquakes, including slow-slip events, is unclear. Fault-zone fluids are characterized by low electrical resistivity. Here we investigate the time-dependent crustal resistivity in the rupture area of the 1999 Mw 7.6 Izmit earthquake using electromagnetic data acquired at four sites before and after the earthquake. Most estimates of apparent resistivity in the frequency range of 0.05 to 2.0 Hz show abrupt co-seismic decreases on the order of tens of per cent. Data acquired at two sites 1 month after the Izmit earthquake indicate that the resistivity had already returned to pre-seismic levels. We interpret such changes as the pressure-induced transition between isolated and interconnected fluids. Some data show pre-seismic changes and this suggests that the transition is associated with foreshocks and slow-slip events before large earthquakes. PMID:23820970
Two faces of entropy and information in biological systems.
Mitrokhin, Yuriy
2014-10-21
The article attempts to overcome the well-known paradox of contradictions between emerging biological organization and entropy production in biological systems. It is assumed that a qualitative, speculative correlation between the entropy and antientropy processes taking place both in the past and today in metabolic and genetic cellular systems may suffice for an adequate description of the evolution of biological organization. Insofar as thermodynamic entropy itself cannot compensate for the high degree of organization which exists in the cell, we discuss the mode of conjunction of positive entropy events (mutations) in the genetic systems of past generations and the formation of organized structures in current cells. We argue that only the information which is generated under conditions of information entropy production (mutations and other genome reorganizations) in the genetic systems of past generations provides the physical conjunction of entropy and antientropy processes separated from each other in time across generations. This follows readily from the requirements of the Second Law of thermodynamics. Copyright © 2014 Elsevier Ltd. All rights reserved.
Operations analysis (study 2.1): Program manual and users guide for the LOVES computer code
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1975-01-01
This document provides the information necessary to use the LOVES Computer Program in its existing state, or to modify the program to include studies not properly handled by the basic model. The Users Guide defines the basic elements assembled together to form the model for servicing satellites in orbit. As the program is a simulation, the method of attack is to disassemble the problem into a sequence of events, each occurring instantaneously and each creating one or more other events in the future. The main driving force of the simulation is the deterministic launch schedule of satellites and the subsequent failure of the various modules which make up the satellites. The LOVES Computer Program uses a random number generator to simulate the failure of module elements and therefore operates over a long span of time, typically 10 to 15 years. The sequence of events is varied by making several runs in succession with different random numbers, resulting in a Monte Carlo technique to determine the minimum, average, and maximum values of the statistical parameters.
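The event-driven structure described here maps naturally onto a priority-queue simulation. A minimal Python skeleton follows; the schedule, failure rates, and event kinds are illustrative stand-ins, not those of the LOVES code.

```python
import heapq, random

rng = random.Random(7)

def simulate(horizon_years=15.0, n_modules=20, mttf_years=4.0):
    """Minimal event-queue skeleton of the LOVES approach: a deterministic
    launch schedule seeds the queue, and each processed event may create
    future events (here, random module failures drawn from an exponential
    distribution via the random number generator)."""
    events = []                                # (time, kind, payload) min-heap
    for t_launch in (0.5, 1.5, 3.0):           # deterministic launch schedule
        heapq.heappush(events, (t_launch, "launch", None))
    failures = 0
    while events:
        t, kind, payload = heapq.heappop(events)
        if t > horizon_years:
            break
        if kind == "launch":
            for m in range(n_modules):         # schedule each module's failure
                dt = rng.expovariate(1.0 / mttf_years)
                heapq.heappush(events, (t + dt, "failure", m))
        elif kind == "failure":
            failures += 1                      # a servicing event could follow
    return failures

# Monte Carlo: repeated runs with different random draws give the
# minimum, average, and maximum statistics the guide describes.
counts = [simulate() for _ in range(100)]
print(min(counts), sum(counts) / len(counts), max(counts))
```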
Correlated bursts and the role of memory range
NASA Astrophysics Data System (ADS)
Jo, Hang-Hyun; Perotti, Juan I.; Kaski, Kimmo; Kertész, János
2015-08-01
Inhomogeneous temporal processes in natural and social phenomena have been described by bursts that are rapidly occurring events within short time periods alternating with long periods of low activity. In addition to the analysis of heavy-tailed interevent time distributions, higher-order correlations between interevent times, called correlated bursts, have been studied only recently. As the underlying mechanism behind such correlated bursts is far from being fully understood, we devise a simple model for correlated bursts using a self-exciting point process with a variable range of memory. Whether a new event occurs is stochastically determined by a memory function that is the sum of decaying memories of past events. In order to incorporate the noise and/or limited memory capacity of systems, we apply two memory loss mechanisms: a fixed number or a variable number of memories. By analysis and numerical simulations, we find that too much memory effect may lead to a Poissonian process, implying that there exists an intermediate range of memory effect to generate correlated bursts comparable to empirical findings. Our conclusions provide a deeper understanding of how long-range memory affects correlated bursts.
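A bare-bones version of such a self-exciting point process with a fixed memory capacity can be sketched as follows. The kernel shape and parameter names are ours rather than the paper's notation, and the next-event draw uses a simple rate-freezing approximation rather than exact thinning.

```python
import random

rng = random.Random(3)

def simulate_bursts(n_events=10_000, capacity=50, nu=1.0, mu=0.05):
    """Self-exciting point process with fixed-size memory loss: the event
    rate is a baseline mu plus the sum of decaying memories of the last
    `capacity` events; older events are forgotten."""
    t, events = 0.0, []
    memory = []                                    # times of remembered events
    while len(events) < n_events:
        rate = mu + sum(nu / (1.0 + (t - s)) for s in memory)
        t += rng.expovariate(rate)                 # rate frozen over the gap
        events.append(t)
        memory.append(t)
        if len(memory) > capacity:
            memory.pop(0)                          # forget the oldest event
    return events

times = simulate_bursts()
interevent = [b - a for a, b in zip(times, times[1:])]  # for burst statistics
```

Varying `capacity` then probes the paper's central observation: very long memory drives the process toward Poissonian behavior, while an intermediate range yields correlated bursts.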
Microtubule nucleation and organization in dendrites
Delandre, Caroline; Amikura, Reiko; Moore, Adrian W.
2016-01-01
Dendrite branching is an essential process for building complex nervous systems. It determines the number, distribution and integration of inputs into a neuron, and is regulated to create the diverse dendrite arbor branching patterns characteristic of different neuron types. The microtubule cytoskeleton is critical to provide structure and exert force during dendrite branching. It also supports the functional requirements of dendrites, reflected by differential microtubule architectural organization between neuron types, illustrated here for sensory neurons. Both anterograde and retrograde microtubule polymerization occur within growing dendrites, and recent studies indicate that branching is enhanced by anterograde microtubule polymerization events in nascent branches. The polarities of microtubule polymerization events are regulated by the position and orientation of microtubule nucleation events in the dendrite arbor. Golgi outposts are a primary microtubule nucleation center in dendrites and share common nucleation machinery with the centrosome. In addition, pre-existing dendrite microtubules may act as nucleation sites. We discuss how balancing the activities of distinct nucleation machineries within the growing dendrite can alter microtubule polymerization polarity and dendrite branching, and how regulating this balance can generate neuron type-specific morphologies. PMID:27097122
The Climatology and Impacts of Atmospheric Rivers near the Coast of Southern Alaska
NASA Astrophysics Data System (ADS)
Nardi, K.; Barnes, E. A.; Mundhenk, B. D.
2015-12-01
Atmospheric rivers, narrow plumes of anomalously high tropospheric water vapor transport, frequently appear over the Pacific Ocean. Popularized by colloquialisms such as the "Pineapple Express," atmospheric rivers often interact with synoptic-scale disturbances to produce significant precipitation events over land masses. Previous research has focused extensively on the impacts of this phenomenon with respect to high-precipitation storms, namely during boreal winter, on the western coast of the contiguous United States. These events generate great scientific, political, and economic concerns for nearby cities, farms, and tourist destinations. Recently, researchers have investigated similar high-precipitation events along the southern coast of Alaska. Specifically, previous work has discussed several major events occurring during the September-November timeframe. One particular event, in October 2006, produced an all-time record for water levels at several river observation sites. This study examines the climatology of atmospheric rivers in the vicinity of southern Alaska. Data (1979-2014) from the Modern-Era Retrospective Analysis for Research and Applications (MERRA) is used to detect atmospheric rivers approaching, and making landfall on, the southern Alaskan coast from the Kenai Peninsula to the Gulf of Alaska region. A seasonal cycle in the strength and frequency of atmospheric rivers over Alaska is shown. Furthermore, the study assesses the synoptic conditions coincident with atmospheric rivers and examines several instances of particularly strong precipitation events. For example, wintertime atmospheric river events tend to occur when a blocking high exists over southeastern Alaska. These results have the potential to help forecasters and emergency managers predict high-precipitation events and lessen potential negative impacts.
Mohr, Christine; Koutrakis, Nikolaos; Kuhn, Gustav
2015-01-01
Magical ideation and belief in the paranormal is considered to represent a trait-like character; people either believe in it or not. Yet, anecdotes indicate that exposure to an anomalous event can turn skeptics into believers. This transformation is likely to be accompanied by altered cognitive functioning such as impaired judgments of event likelihood. Here, we investigated whether the exposure to an anomalous event changes individuals’ explicit traditional (religious) and non-traditional (e.g., paranormal) beliefs as well as cognitive biases that have previously been associated with non-traditional beliefs, e.g., repetition avoidance when producing random numbers in a mental dice task. In a classroom, 91 students saw a magic demonstration after their psychology lecture. Before the demonstration, half of the students were told that the performance was done respectively by a conjuror (magician group) or a psychic (psychic group). The instruction influenced participants’ explanations of the anomalous event. Participants in the magician, as compared to the psychic group, were more likely to explain the event through conjuring abilities while the reverse was true for psychic abilities. Moreover, these explanations correlated positively with their prior traditional and non-traditional beliefs. Finally, we observed that the psychic group showed more repetition avoidance than the magician group, and this effect remained the same regardless of whether assessed before or after the magic demonstration. We conclude that pre-existing beliefs and contextual suggestions both influence people’s interpretations of anomalous events and associated cognitive biases. Beliefs and associated cognitive biases are likely flexible well into adulthood and change with actual life events. PMID:25653626
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winans, J.
The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done by the use of four new record types. These records are customized and are only supported by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with the vanilla EPICS events and the associated event records. They are very different.
NASA Astrophysics Data System (ADS)
Rodas, Claudio; Pulido, Manuel
2017-09-01
A climatological characterization of Rossby wave generation events in the middle atmosphere of the Southern Hemisphere is conducted using 20 years of Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis. An automatic detection technique for wave generation events is developed and applied to the MERRA reanalysis. Rossby wave generation events with wave periods of 1.25 to 5.5 days and zonal wave numbers from one to three dominate the Eliassen-Palm flux divergence around the stratopause at high latitudes in the examined 20 year period. These produce an eastward forcing of the general circulation between May and mid-August in that region. Afterward, from mid-August to the final warming date, Rossby wave generation events are still present, but the Eliassen-Palm flux divergence in the polar stratopause is dominated by low-frequency Rossby waves that propagate from the troposphere. The Rossby wave generation events are associated with potential vorticity gradient inversion, and so they are a manifestation of the dominant barotropic/baroclinic unstable modes that grow at the cost of smearing the negative meridional gradient of potential vorticity. The most likely region of wave generation is found between 60° and 80°S and at a height of 0.7 hPa, but events were detected from 40 hPa to 0.3 hPa (which is the top of the examined region). The mean number of events per year is 24, and their mean duration is 3.35 days. The event duration follows an exponential distribution.
Realistic computer network simulation for network intrusion detection dataset generation
NASA Astrophysics Data System (ADS)
Payer, Garrett
2015-05-01
The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross-section of the attacks found on the Internet today could be useful, but it would eventually succumb to the same problem as the KDD-99 Cup: its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but also enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology, and models.
Event generators for address event representation transmitters
NASA Astrophysics Data System (ADS)
Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe
2005-06-01
Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. Two main approaches for implementing such "AER Generator" circuits have been published in the literature. They differ in how they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event off the chip, the rest of the neurons in the array were frozen and could not transmit further events during this time window. This limited the maximum transmission speed. To improve this speed, Boahen proposed an improved 'burst mode' scheme, in which, after the row arbitration, a complete row of events is pipelined out of the array and arbitrated off the chip at higher speed. During this single-row event arbitration, the array is free to generate new events and communicate with the row arbiter in a pipelined mode. This scheme significantly improves the maximum event transmission speed, especially in the high-traffic situations where speed is most critical. We have analyzed and studied this approach and have detected some shortcomings in the circuits reported by Boahen, which may produce erroneous behavior under some statistical conditions. The present paper proposes improvements to overcome such situations. The improved "AER Generator" has been implemented in an AER transmitter system.
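The row-then-column 'burst mode' arbitration described above is easy to mimic in software. The following toy discrete-time model (a simple round-robin row selection stands in for the fair arbiter tree, and all sizes and rates are invented) shows the key property: once a row wins arbitration, its whole set of pending events is streamed out while the rest of the array keeps generating:

```python
import random

random.seed(42)
N = 8  # toy 8x8 neuron array

# pending[r] holds the column addresses of row-r neurons with an event waiting
pending = [set() for _ in range(N)]

def generate_events(rate=0.3):
    """Neurons fire independently; more active cells request the bus more often."""
    for r in range(N):
        for c in range(N):
            if random.random() < rate:
                pending[r].add(c)

bus, start = [], 0  # bus: sequence of (row, col) addresses sent off-chip
for cycle in range(20):
    generate_events()
    rows = [r for r in range(N) if pending[r]]
    if not rows:
        continue
    # Row arbitration: round-robin priority stands in for a fair arbiter tree.
    r = min(rows, key=lambda rr: (rr - start) % N)
    start = (r + 1) % N
    # Burst mode: the whole row is latched and pipelined out, so the array
    # is free to keep generating events while the columns are sequenced.
    for c in sorted(pending[r]):
        bus.append((r, c))
    pending[r].clear()

print(len(bus), "events transmitted; first five:", bus[:5])
```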
The time and place of European admixture in Ashkenazi Jewish history.
Xue, James; Lencz, Todd; Darvasi, Ariel; Pe'er, Itsik; Carmi, Shai
2017-04-01
The Ashkenazi Jewish (AJ) population is important in genetics due to its high rate of Mendelian disorders. AJ appeared in Europe in the 10th century, and their ancestry is thought to comprise European (EU) and Middle-Eastern (ME) components. However, both the time and place of admixture are subject to debate. Here, we attempt to characterize the AJ admixture history using a careful application of new and existing methods on a large AJ sample. Our main approach was based on local ancestry inference, in which we first classified each AJ genomic segment as EU or ME, and then compared allele frequencies along the EU segments to those of different EU populations. The contribution of each EU source was also estimated using GLOBETROTTER and haplotype sharing. The time of admixture was inferred based on multiple statistics, including ME segment lengths, the total EU ancestry per chromosome, and the correlation of ancestries along the chromosome. The major source of EU ancestry in AJ was found to be Southern Europe (≈60-80% of EU ancestry), with the rest being likely Eastern European. The inferred admixture time was ≈30 generations ago, but multiple lines of evidence suggest that it represents an average over two or more events, pre- and post-dating the founder event experienced by AJ in late medieval times. The time of the pre-bottleneck admixture event, which was likely Southern European, was estimated at ≈25-50 generations ago.
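One of the statistics named above, ancestry segment length, supports a back-of-the-envelope dating rule: under a single-pulse approximation (e.g., Gravel 2012), tracts of an ancestry that entered g generations ago are roughly exponentially distributed with mean 100/((1 - m)·g) centimorgans, where m is that ancestry's fraction. A hedged sketch with invented input numbers (not the paper's estimates, which combine several statistics):

```python
def admixture_generations(mean_tract_cm, minor_fraction):
    """
    Pulse-admixture dating from ancestry segment lengths: tracts are
    approximately exponential with mean 100 / ((1 - m) * g) cM for a
    pulse g generations ago contributing fraction m.
    """
    mean_morgans = mean_tract_cm / 100.0
    return 1.0 / ((1.0 - minor_fraction) * mean_morgans)

# Illustrative numbers only: ME segments averaging 7 cM with a 50% ME
# ancestry fraction date the pulse to roughly 29 generations, the same
# order of magnitude as the estimate reported above.
g = admixture_generations(mean_tract_cm=7.0, minor_fraction=0.5)
print(f"~{g:.0f} generations ago")
```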
Formation of cratonic lithosphere during the initiation of plate tectonics
NASA Astrophysics Data System (ADS)
Moresi, L. N.; Beall, A.; Cooper, C. M.
2017-12-01
The Earth's oldest near-surface material, the cratonic crust, is typically underlain by unusually thick Archean lithosphere (up to ~300 km). This cratonic lithosphere likely thickened in a high compressional stress environment. Mantle convection in the hotter Archean Earth would have imparted relatively low stresses on the lithosphere, whether or not tectonics was operating, so a high-stress signal from the early Earth is paradoxical. We propose that a rapid transition, from a stagnant-lid Earth to the onset of plate tectonics, generated the high stresses required to thicken the cratonic lithosphere. Numerical calculations are used to demonstrate that an existing buoyant and strong layer, representing harzburgite and felsic crust, can thicken and stabilize during the lid-breaking event. The peak compressional stress experienced by the lithosphere is 3-4 times higher than in the stagnant-lid or mobile-lid regimes immediately before and after. It is plausible that the cratonic lithosphere has still not returned to this high-stress state, explaining its stability. The lid-breaking thickening event reproduces craton features previously attributed to subduction: thrust structures, assembled crustal fragments, and transport of basaltic upper crust to depths required to generate felsic melt. Palaeoarchean `pre-tectonic' structures can also survive the lid-breaking event, acting as strong crustal rafts. Together, the results indicate that the signature of a catastrophic switch, from a stagnant-lid Earth to the initiation of plate tectonics, has been captured and preserved in the unusual characteristics of cratonic crust and lithosphere.
TOWARD THE DEVELOPMENT OF A CONSENSUS MATERIALS DATABASE FOR PRESSURE TECHNOLOGY APPLICATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swindeman, Robert W; Ren, Weiju
The ASME construction code books specify materials and fabrication procedures that are acceptable for pressure technology applications. However, with few exceptions, the materials properties provided in the ASME code books include no statistics or other information pertaining to material variability. Such information is central to the prediction and prevention of failure events. Many sources of materials data exist that provide variability information, but such sources do not necessarily represent a consensus of experts with respect to the reported trends. Such a need has been identified by the ASME Standards Technology, LLC, and initial steps have been taken to address it; however, these steps are limited to project-specific applications, such as the joint DOE-ASME project on materials for Generation IV nuclear reactors. In contrast to light-water reactor technology, the experience base for Generation IV nuclear reactors is somewhat lacking, and heavy reliance must be placed on model development and predictive capability. The database for model development is being assembled and includes existing code alloys such as alloy 800H and 9Cr-1Mo-V steel. Ownership and use rights are potential barriers that must be addressed.
NASA Astrophysics Data System (ADS)
Hussain, Nur; Bhattacharjee, Buddhadeb
2017-08-01
Widths of the rapidity distributions of various identified hadrons generated with the UrQMD-3.4 event generator at all the Super Proton Synchrotron (SPS) energies have been presented and compared with the existing experimental results. An increase in the width of the rapidity distribution of Λ could be seen with both Monte Carlo (MC) and experimental data for the studied energies. Using MC data, the study has been extended to Relativistic Heavy Ion Collider (RHIC) and Large Hadron Collider (LHC) energies. A similar jump, as observed in the plot of rapidity width versus rest mass at Alternating Gradient Synchrotron (AGS) and all SPS energies, persists even at RHIC and LHC energies, confirming its universal nature from AGS to the highest LHC energies. Such observation indicates that pair production may not be the only mechanism of particle production at the highest LHC energies. However, with MC data, the separate mass scaling for mesons and baryons is found to exist even at the top LHC energy.
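The quantity tracked throughout this study, the width of a rapidity distribution, reduces to a one-line statistic once per-particle rapidities are in hand. A minimal sketch with Gaussian stand-ins for generator output (the widths below are arbitrary; a real analysis would read them species by species from UrQMD-3.4 event files):

```python
import math
import random

random.seed(7)

def rapidity_width(ys):
    """RMS width of a rapidity distribution about its mean."""
    mean = sum(ys) / len(ys)
    return math.sqrt(sum((y - mean) ** 2 for y in ys) / len(ys))

# Stand-in samples for two species; widths here are invented placeholders.
pions = [random.gauss(0.0, 1.4) for _ in range(20000)]
lambdas = [random.gauss(0.0, 0.9) for _ in range(2000)]
for name, ys in (("pion", pions), ("Lambda", lambdas)):
    print(f"{name}: sigma_y = {rapidity_width(ys):.3f}")
```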
Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.
Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon
2016-01-01
Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as Poisson processes, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that, compared to distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for estimating the significance of joint spike events seem to be inadequate.
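The Monte Carlo approach sketched in the abstract can be reproduced in miniature: simulate pairs of renewal trains with gamma-distributed inter-spike intervals (shape 1 recovers a Poisson process; larger shapes give more regular, history-dependent firing at the same mean rate), count coincidences, and compare the spread of the resulting distributions. All rates, windows, and counts below are illustrative choices, not the paper's parameters:

```python
import random

random.seed(3)

def spike_train(rate_hz, duration_s, shape):
    """Renewal train with gamma ISIs; shape=1 is Poisson, shape>1 is more regular."""
    t, spikes = 0.0, []
    scale = 1.0 / (rate_hz * shape)  # keeps the mean rate fixed across shapes
    while True:
        t += random.gammavariate(shape, scale)
        if t > duration_s:
            return spikes
        spikes.append(t)

def count_coincidences(a, b, window=0.005):
    """Spikes in train a with a partner in train b within +/- window seconds."""
    j, hits = 0, 0
    for t in a:
        while j < len(b) and b[j] < t - window:
            j += 1
        if j < len(b) and abs(b[j] - t) <= window:
            hits += 1
    return hits

for shape in (1, 4):  # Poisson vs. more regular non-Poisson firing
    counts = [count_coincidences(spike_train(20, 10, shape),
                                 spike_train(20, 10, shape))
              for _ in range(500)]
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    print(f"gamma shape {shape}: coincidence mean {mean:.1f}, variance {var:.1f}")
```

The differing variances across shapes illustrate the paper's point: the width of the coincidence distribution depends on the autostructure, not just the firing rate.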
NASA Astrophysics Data System (ADS)
Green, David N.; Evers, Läslo G.; Fee, David; Matoza, Robin S.; Snellen, Mirjam; Smets, Pieter; Simons, Dick
2013-05-01
Explosive submarine volcanic processes are poorly understood, due to the difficulties associated with both direct observation and continuous monitoring. In this study hydroacoustic, infrasound, and seismic signals recorded during the May 2010 submarine eruption of South Sarigan seamount, Marianas Arc, are used to construct a detailed event chronology. The signals were recorded on stations of the International Monitoring System, which is a component of the verification measures for the Comprehensive Nuclear-Test-Ban Treaty. Numerical hydroacoustic and infrasound propagation modelling confirms that viable propagation paths from the source to receivers exist, and provides traveltimes allowing signals recorded on the different technologies to be associated. The eruption occurred in three stages, separated by three-hour periods of quiescence. 1) A 46 h period during which broadband impulsive hydroacoustic signals were generated in clusters lasting between 2 and 13 min. 95% of the 7602 identified events could be classified into 4 groups based on their waveform similarity. The time interval between clusters decreased steadily from 80 to 25 min during this period. 2) A five-hour period of 10 Hz hydroacoustic tremor, interspersed with large-amplitude, broadband signals. Associated infrasound signals were also recorded at this time. 3) An hour-long period of transient broadband events culminated in two large-amplitude hydroacoustic events and one broadband infrasound signal. A speculative interpretation, consistent with the data, suggests that during phase (1) transitions between endogenous dome growth and phreatomagmatic explosions occurred, with the magma ascent rate accelerating throughout the period; during phase (2) continuous venting of fragmented magma occurred, and was powerful enough to breach the sea surface. During the climactic phase (3) discrete powerful explosions occurred, and sufficient seawater was vaporised to produce the contemporaneous 12 km altitude steam plume.
NASA Astrophysics Data System (ADS)
Odaka, Shigeru; Kurihara, Yoshimasa
2016-05-01
We have developed an event generator for direct-photon production in hadron collisions, including associated 2-jet production, in the framework of the GR@PPA event generator. The event generator consistently combines γ + 2-jet production processes with the lowest-order γ + jet and photon-radiation (fragmentation) processes from quantum chromodynamics (QCD) 2-jet production using a subtraction method. The generated events can be fed to general-purpose event generators to facilitate the addition of hadronization and decay simulations. Using the obtained event information, we can simulate photon isolation and hadron-jet reconstruction at the particle (hadron) level. The simulation reasonably reproduces measurement data obtained at the Large Hadron Collider (LHC) concerning not only the inclusive photon spectrum, but also the correlation between the photon and jet. The simulation implies that the contribution of the γ + 2-jet process is very large, especially in low photon-pT (≲50 GeV) regions. Discrepancies observed at low pT, although marginal, may indicate the necessity of considering further higher-order processes. An unambiguous particle-level definition of the photon-isolation condition for the signal events should be given explicitly in future measurements.
Climate, Waterborne Disease, and Public Health in Eastern Russia
NASA Astrophysics Data System (ADS)
Tirrell, Andrew
2013-04-01
As global temperatures rise, waterborne diseases have expanded their ranges northward. Exposure to new diseases is especially threatening to isolated communities, whose remote locations and lack of health resources and infrastructure leave them particularly vulnerable. For this project, a time series analysis of existing data will be used to assess temporal and spatial associations between long-term, seasonal and short-term weather variability, and waterborne infectious diseases in several Siberian communities. Building on these associations, we will generate estimates of future changes in infectious disease patterns based upon existing forecasts of climate change and likely increases in extreme weather events in eastern Russia. Finally, we will contemplate the public health implications of these findings and offer appropriate policy recommendations. One of our policy aims will be to identify easily measured water quality indicators that may serve as useful proxies for environmental health in rural, especially indigenous, communities.
GENXICC2.1: An improved version of GENXICC for hadronic production of doubly heavy baryons
NASA Astrophysics Data System (ADS)
Wang, Xian-You; Wu, Xing-Gang
2013-03-01
We present an improved version of GENXICC, which is a generator for hadronic production of the doubly heavy baryons Ξcc, Ξbc and Ξbb, introduced by C.H. Chang, J.X. Wang and X.G. Wu [Comput. Phys. Commun. 177 (2007) 467; Comput. Phys. Commun. 181 (2010) 1144]. In comparison with the previous GENXICC versions, we update the program to generate unweighted baryon events more effectively under various simulation environments; the event distributions are now generated according to the probability proportional to the integrand. One Les Houches Event (LHE) common block has been added to produce a standard LHE data file that contains useful information on the doubly heavy baryon and its accompanying partons. Such LHE data can be conveniently imported into PYTHIA for further hadronization and decay simulation; in particular, the color-flow problem can be solved with PYTHIA8.0.
NEW VERSION PROGRAM SUMMARY
Title of program: GENXICC2.1
Program obtained from: CPC Program Library
Reference to original program: GENXICC
Reference in CPC: Comput. Phys. Commun. 177, 467 (2007); Comput. Phys. Commun. 181, 1144 (2010)
Does the new version supersede the old program: No
Computer: Any LINUX-based PC with FORTRAN 77 or FORTRAN 90 and a GNU C compiler
Operating systems: LINUX
Programming language used: FORTRAN 77/90
Memory required to execute with typical data: About 2.0 MB
No. of bytes in distributed program: About 2 MB, including PYTHIA6.4
Distribution format: .tar.gz
Nature of physical problem: Hadronic production of doubly heavy baryons Ξcc, Ξbc and Ξbb.
Method of solution: The upgraded version, with a proper interface to PYTHIA, can generate full production and decay events, either weighted or unweighted, conveniently and effectively. In particular, the unweighted events are generated using an improved hit-and-miss approach.
Reasons for new version: Responding to feedback from users in the CMS and LHCb groups at the Large Hadron Collider, and based on the recent improvements of PYTHIA on the color-flow problem, we improve the efficiency of generating unweighted events and also improve the color-flow part for further hadronization. In particular, an interface has been added to export the production events in a form suitable for PYTHIA8.0 simulation, in which the color flow can be correctly set.
Typical running time: This depends on which option is chosen to match PYTHIA when generating the full events and on which mechanism is chosen to generate the events. Typically, for the dominant gluon-gluon fusion mechanism generating mixed events via the intermediate diquarks in (cc)[3S1]3¯ and (cc)[1S0]6 states, setting IDWTUP=3 and unwght=.true. takes 30 min to generate 10^5 unweighted events on a 2.27 GHz Intel Xeon E5520 machine; setting IDWTUP=3 and unwght=.false., or IDWTUP=1 and IGENERATE=0, needs only 2 min to generate the 10^5 baryon events (the fastest way, for theoretical purposes only). As a comparison, for previous GENXICC versions, setting IDWTUP=1 and IGENERATE=1 takes about 22 hours to generate 1000 unweighted events.
Keywords: Event generator; Doubly heavy baryons; Hadronic production.
Summary of the changes (improvements): (1) The scheme for generating unweighted events has been improved; (2) One Les Houches Event (LHE) common block has been added to record the standard LHE data as correct input for PYTHIA8.0 for later simulation; (3) We present the code for connecting GENXICC to PYTHIA8.0, in which three color-flows have to be correctly set for later simulation. More specifically, we present the changes together with their detailed explanations in the following:
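The "hit-and-miss approach" named in the summary builds on plain accept-reject unweighting: a weighted candidate event is kept with probability w/w_max, so the accepted sample follows the target distribution. A minimal sketch of the basic scheme (the toy sampler below and the details of GENXICC's improvement are not from the program itself):

```python
import random

random.seed(5)

def unweighted_events(weighted_sampler, w_max, n_events):
    """Hit-and-miss unweighting: keep a candidate with probability w / w_max."""
    out = []
    while len(out) < n_events:
        event, w = weighted_sampler()
        if random.random() * w_max < w:
            out.append(event)
    return out

def toy_sampler():
    """Stand-in weighted generator: x uniform on [0,1] carrying weight x**2."""
    x = random.random()
    return x, x * x

events = unweighted_events(toy_sampler, w_max=1.0, n_events=10000)
print(f"mean of accepted x: {sum(events) / len(events):.3f} (expect 0.75)")
```

The efficiency of this scheme is the ratio of the mean weight to w_max, which is why generators work to flatten the integrand before unweighting.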
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Potirakis, Stelios M.; Papadimitriou, Constantinos; Zitis, Pavlos I.; Eftaxias, Konstantinos
2015-04-01
The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. We apply concepts of nonextensive statistical physics to time-series data of observable manifestations of the underlying complex processes ending up in different extreme events, in order to support the suggestion that a dynamical analogy characterizes the generation of a single magnetic storm, solar flare, earthquake (in terms of pre-seismic electromagnetic signals), epileptic seizure, and economic crisis. The analysis reveals that all of the above-mentioned extreme events can be analyzed within a similar mathematical framework. More precisely, we show that the populations of magnitudes of fluctuations included in all of the above-mentioned pulse-like time series follow the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar nonextensive q-parameter values. Moreover, based on a multidisciplinary statistical analysis, we show that the extreme events are characterized by crucial common symptoms, namely: (i) high organization, high compressibility, low complexity, high information content; (ii) strong persistency; and (iii) the existence of a clear preferred direction of emerged activities. These symptoms clearly discriminate the appearance of the extreme events under study from the corresponding background noise.
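The Gutenberg-Richter component of this analysis has a standard one-line estimator: Aki's maximum-likelihood b-value, b = log10(e)/(⟨M⟩ - Mc), for magnitudes above a completeness threshold Mc. A sketch on a synthetic catalogue (the paper's data and its nonextensive q-fits are not reproduced here):

```python
import math
import random

random.seed(11)

def aki_b_value(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for magnitudes at or above m_c."""
    above = [m for m in mags if m >= m_c]
    return math.log10(math.e) / (sum(above) / len(above) - m_c)

# Toy catalogue: under the GR law with b-value b, magnitudes above m_c
# are exponential with rate b * ln(10).
m_c, b_true = 2.0, 1.0
mags = [m_c + random.expovariate(b_true * math.log(10)) for _ in range(5000)]
print(f"estimated b = {aki_b_value(mags, m_c):.2f} (true {b_true})")
```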
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenkranz, Joshua-Benedict; Brancucci Martinez-Anido, Carlo; Hodge, Bri-Mathias
Solar power generation, unlike conventional forms of electricity generation, has higher variability and uncertainty in its output because solar plant output is strongly impacted by weather. As the penetration rate of solar capacity increases, grid operators are increasingly concerned about accommodating the increased variability and uncertainty that solar power provides. This paper illustrates the impacts of increasing solar power penetration on the ramping of conventional electricity generators by simulating the operation of the Independent System Operator-New England power system. A production cost model was used to simulate the power system under five different scenarios, one without solar power and four with increasing solar power penetrations of up to 18% in terms of annual energy. The impact of solar power is analyzed on six different temporal intervals, including hourly and multi-hourly (2- to 6-hour) ramping. The results show how the integration of solar power increases the 1- to 6-hour ramping events of the net load (electric load minus solar power). The study also analyzes the impact of solar power on the distribution of multi-hourly ramping events of fossil-fueled generators and shows increasing 1- to 6-hour ramping events for all generator types. Generators with higher ramp rates, such as gas and oil turbines and internal combustion engine generators, increased their ramping events by 200% to 280%. For other generator types--including gas combined-cycle generators, coal steam turbine generators, and gas and oil steam turbine generators--more and higher ramping events also occurred at higher solar power penetration levels.
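The core metric of the study, the multi-hour ramp, is simply the difference of a net-load series across a sliding horizon. A minimal sketch on an invented 24-hour profile (a real study would use a full year of hourly values from the production cost model):

```python
def multi_hour_ramps(series, horizon_h):
    """Ramps over a given horizon: x[t + h] - x[t] for an hourly series."""
    return [series[i + horizon_h] - series[i]
            for i in range(len(series) - horizon_h)]

# Illustrative hourly profiles in MW; the shapes are invented placeholders.
load = [900, 870, 860, 880, 950, 1050, 1150, 1200, 1180, 1150, 1120, 1100,
        1090, 1100, 1150, 1250, 1400, 1500, 1450, 1350, 1200, 1050, 950, 900]
solar = [0, 0, 0, 0, 0, 20, 80, 160, 240, 300, 330, 340,
         330, 300, 240, 160, 80, 20, 0, 0, 0, 0, 0, 0]
net_load = [l - s for l, s in zip(load, solar)]

for h in (1, 3, 6):
    ramps = multi_hour_ramps(net_load, h)
    print(f"{h}-hour net-load ramps: max up {max(ramps)} MW, "
          f"max down {min(ramps)} MW")
```

Note how the evening decline of solar output steepens the upward net-load ramps that conventional generators must cover.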
Choi, Yun Ho; Yoo, Sung Jin
2018-06-01
This paper investigates the event-triggered decentralized adaptive tracking problem for a class of uncertain interconnected nonlinear systems with unexpected actuator failures. It is assumed that local control signals are transmitted to local actuators with time-varying faults whenever predefined conditions for triggering events are satisfied. Compared with the existing control-input-based event-triggering strategy for adaptive control of uncertain nonlinear systems, the aim of this paper is to propose a tracking-error-based event-triggering strategy in the decentralized adaptive fault-tolerant tracking framework. The proposed approach can mitigate the drastic changes in control inputs caused by actuator faults under the existing triggering strategy. The stability of the proposed event-triggering control system is analyzed in the Lyapunov sense. Finally, simulation comparisons of the proposed and existing approaches are provided to show the effectiveness of the proposed theoretical result in the presence of actuator faults. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
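A tracking-error-based trigger of the kind proposed here can be illustrated on a scalar toy plant: the actuator holds its last value and receives a new control signal only when the tracking error has drifted beyond a threshold since the previous transmission. The sketch below uses a plain proportional law on an integrator plant, standing in for the paper's decentralized adaptive fault-tolerant controller (all gains and thresholds are invented):

```python
def simulate(threshold=0.05, dt=0.01, steps=2000):
    """
    Toy tracking-error-based event triggering: a new control value is sent
    only when the tracking error has moved more than `threshold` from its
    value at the last transmission. Plant: simple integrator x' = u.
    """
    k, ref = 4.0, 1.0
    x, u, last_e, sent = 0.0, 0.0, None, 0
    for _ in range(steps):
        e = ref - x
        if last_e is None or abs(e - last_e) > threshold:  # trigger rule
            u, last_e, sent = k * e, e, sent + 1
        x += u * dt  # the actuator holds u between triggering events
    return x, sent, steps

x, sent, steps = simulate()
print(f"final state {x:.3f} (ref 1.0); transmissions: {sent}/{steps} samples")
```

The state settles near the reference while only a small fraction of sampling instants trigger a transmission, which is the communication saving event triggering is designed to buy.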
A Critical Evaluation of the Gamma-Hydroxybutyrate (GHB) Model of Absence Seizures
Venzi, Marcello; Di Giovanni, Giuseppe; Crunelli, Vincenzo
2015-01-01
Typical absence seizures (ASs) are nonconvulsive epileptic events which are commonly observed in pediatric and juvenile epilepsies and may be present in adults suffering from other idiopathic generalized epilepsies. Our understanding of the pathophysiological mechanisms of ASs has been greatly advanced by the availability of genetic and pharmacological models, in particular the γ-hydroxybutyrate (GHB) model which, in recent years, has been extensively used in studies in transgenic mice. GHB is an endogenous brain molecule that upon administration to various species, including humans, induces not only ASs but also a state of sedation/hypnosis. Analysis of the available data clearly indicates that only in the rat does there exist a set of GHB-elicited behavioral and EEG events that can be confidently classified as ASs. Other GHB activities, particularly in mice, appear to be mostly of a sedative/hypnotic nature: thus, their relevance to ASs requires further investigation. At the molecular level, GHB acts as a weak GABA-B agonist, while the existence of a GHB receptor remains elusive. The pre- and postsynaptic actions underlying GHB-elicited ASs have been thoroughly elucidated in thalamus, but little is known about the cellular/network effects of GHB in neocortex, the other brain region involved in the generation of ASs. PMID:25403866
Using the Statecharts paradigm for simulation of patient flow in surgical care.
Sobolev, Boris; Harel, David; Vasilakis, Christos; Levy, Adrian
2008-03-01
Computer simulation of patient flow has been used extensively to assess the impacts of changes in the management of surgical care. However, little research is available on the utility of existing modeling techniques. The purpose of this paper is to examine the capacity of Statecharts, a system of graphical specification, for constructing a discrete-event simulation model of the perioperative process. The Statecharts specification paradigm was originally developed for representing reactive systems by extending the formalism of finite-state machines through notions of hierarchy, parallelism, and event broadcasting. Hierarchy permits subordination between states, so that one state may contain other states. Parallelism permits more than one state to be active at any given time. Broadcasting of events allows one state to detect changes in another state. In the context of the perioperative process, hierarchy provides the means to describe steps within activities and to cluster related activities, parallelism provides the means to specify concurrent activities, and event broadcasting provides the means to trigger a series of actions in one activity according to transitions that occur in another activity. Combined with hierarchy and parallelism, event broadcasting offers a convenient way to describe the interaction of concurrent activities. We applied the Statecharts formalism to describe the progress of individual patients through surgical care as a series of asynchronous updates to patient records, generated in reaction to events produced by parallel finite-state machines representing concurrent clinical and managerial activities. We conclude that Statecharts successfully capture the behavioral aspects of surgical care delivery by specifying the permissible chronology of events, conditions, and actions.
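Two of the three Statecharts notions used here, parallelism and event broadcasting, fit in a short sketch: concurrent regions all see each external event, and a transition in one region may emit an event that other regions react to on a later step. The state and event names below are invented, not taken from the authors' perioperative model (hierarchy is omitted for brevity):

```python
class Region:
    """One parallel region: a flat state machine with broadcast output."""
    def __init__(self, name, transitions, state):
        self.name, self.transitions, self.state = name, transitions, state

    def step(self, event, broadcast):
        key = (self.state, event)
        if key in self.transitions:
            self.state, emitted = self.transitions[key]
            if emitted:
                broadcast.append(emitted)  # event broadcasting between regions

# Two concurrent regions of a toy perioperative chart. On 'incision', the
# clinical region advances and broadcasts 'case_started', which the
# managerial region reacts to; there is no direct coupling between the two.
clinical = Region("clinical",
                  {("prep", "incision"): ("surgery", "case_started"),
                   ("surgery", "close"): ("recovery", "case_ended")},
                  "prep")
managerial = Region("managerial",
                    {("or_idle", "case_started"): ("or_busy", None),
                     ("or_busy", "case_ended"): ("or_turnover", None)},
                    "or_idle")

regions = [clinical, managerial]
queue = ["incision", "close"]
while queue:
    event, broadcast = queue.pop(0), []
    for r in regions:           # parallelism: every region sees every event
        r.step(event, broadcast)
    queue.extend(broadcast)     # broadcast events feed back into the chart

print({r.name: r.state for r in regions})
```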
NASA Astrophysics Data System (ADS)
Gibbons, S. J.; Harris, D. B.; Dahl-Jensen, T.; Kværna, T.; Larsen, T. B.; Paulsen, B.; Voss, P. H.
2017-12-01
The oceanic boundary separating the Eurasian and North American plates between 70° and 84° north hosts large earthquakes which are well recorded teleseismically, and many more seismic events at far lower magnitudes that are well recorded only at regional distances. Existing seismic bulletins have considerable spread and bias resulting from limited station coverage and deficiencies in the velocity models applied. This is particularly acute for the lower magnitude events which may only be constrained by a small number of Pn and Sn arrivals. Over the past two decades there has been a significant improvement in the seismic network in the Arctic: a difficult region to instrument due to the harsh climate, a sparsity of accessible sites (particularly at significant distances from the sea), and the expense and difficult logistics of deploying and maintaining stations. New deployments and upgrades to stations on Greenland, Svalbard, Jan Mayen, Hopen, and Bjørnøya have resulted in a sparse but stable regional seismic network in which events down to magnitudes below 3 generate high-quality Pn and Sn signals on multiple stations. A catalogue of several hundred events in the region since 1998 has been generated using many new phase readings on stations on both sides of the spreading ridge in addition to teleseismic P phases. A Bayesian multiple-event relocation has resulted in a significant reduction in the spread of hypocentre estimates for both large and small events. Whereas single-event location algorithms minimize vectors of time residuals on an event-by-event basis, the Bayesloc program finds a joint probability distribution of origins, hypocentres, and corrections to traveltime predictions for large numbers of events. The solutions obtained favour those event hypotheses resulting in time residuals which are most consistent over a given source region. The relocations have been performed with different 1-D velocity models applicable to the Arctic region, and hypocentres obtained using Bayesloc have been shown to be relatively insensitive to the specified velocity structure in the crust and upper mantle, even for events only constrained by regional phases. The patterns of time residuals resulting from the multiple-event location procedure provide well-constrained time correction surfaces for single-event location estimates and are sufficiently stable to identify a number of picking errors and instrumental timing anomalies. This allows for subsequent quality control of the input data and further improvement in the location estimates. We use the relocated events to form narrowband empirical steering vectors for wave fronts arriving at the SPITS array on Svalbard for azimuth and apparent velocity estimation. We demonstrate that empirical matched field parameter estimation determined by source region is a viable supplement to plane-wave f-k analysis, mitigating bias and obviating the need for Slowness and Azimuth Station Corrections. A database of reference events and phase arrivals is provided to facilitate further refinement of event locations and the construction of empirical signal detectors.
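The plane-wave analysis mentioned at the end has a compact least-squares core: given array coordinates and relative arrival times, fit t_i = t0 + s·r_i for the horizontal slowness vector s, from which back-azimuth and apparent velocity follow. A self-contained sketch on a synthetic four-element array (not the SPITS geometry; empirical matched field processing would replace this plane-wave model with measured steering vectors):

```python
import math

def plane_wave_fit(coords_km, times_s):
    """
    Least-squares plane-wave fit t_i = t0 + s . r_i for horizontal slowness s;
    returns (back-azimuth in degrees, apparent velocity in km/s). Demeaning
    removes t0, leaving a 2x2 normal-equation solve.
    """
    n = len(times_s)
    xm = sum(x for x, _ in coords_km) / n
    ym = sum(y for _, y in coords_km) / n
    tm = sum(times_s) / n
    sxx = sxy = syy = sxt = syt = 0.0
    for (x, y), t in zip(coords_km, times_s):
        dx, dy, dt = x - xm, y - ym, t - tm
        sxx += dx * dx; sxy += dx * dy; syy += dy * dy
        sxt += dx * dt; syt += dy * dt
    det = sxx * syy - sxy * sxy
    sx = (syy * sxt - sxy * syt) / det
    sy = (sxx * syt - sxy * sxt) / det
    baz = (math.degrees(math.atan2(-sx, -sy)) + 360.0) % 360.0
    return baz, 1.0 / math.hypot(sx, sy)

# Synthetic array and a wave arriving from the north-east (baz 45 deg) at 8 km/s.
coords = [(0.0, 0.0), (1.5, 0.2), (0.3, 1.8), (-1.2, 1.0)]
sx = -math.sin(math.radians(45)) / 8.0
sy = -math.cos(math.radians(45)) / 8.0
times = [sx * x + sy * y for x, y in coords]
print(plane_wave_fit(coords, times))  # ~ (45.0, 8.0)
```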
Zheng, Huimin; Luo, Jiayi; Yu, Rongjun
2014-01-01
Reflecting on past events and reflecting on future events are two fundamentally different processes, each traveling in the opposite direction of the other through conceptual time. But what we are able to imagine seems to be constrained by what we have previously experienced, suggesting a close link between memory and prospection. Recent theories suggest that recalling the past lies at the core of imagining and planning for the future. The existence of this link is supported by evidence gathered from neuroimaging, lesion, and developmental studies. Yet it is not clear exactly how the novel episodes people construct in their sense of the future develop out of their historical memories. There must be intermediary processes that utilize memory as a basis on which to generate future oriented thinking. Here, we review studies on goal-directed processing, associative learning, cognitive control, and creativity and link them with research on prospection. We suggest that memory cooperates with additional functions like goal-directed learning to construct and simulate novel events, especially self-referential events. The coupling between memory-related hippocampus and other brain regions may underlie such memory-based prospection. Abnormalities in this constructive process may contribute to mental disorders such as schizophrenia. PMID:25147532
The Search for Muon Neutrinos from Northern Hemisphere Gamma-Ray Bursts with AMANDA
NASA Astrophysics Data System (ADS)
Achterberg, A.; Ackermann, M.; Adams, J.; Ahrens, J.; Andeen, K.; Auffenberg, J.; Bahcall, J. N.; Bai, X.; Baret, B.; Barwick, S. W.; Bay, R.; Beattie, K.; Becka, T.; Becker, J. K.; Becker, K.-H.; Berghaus, P.; Berley, D.; Bernardini, E.; Bertrand, D.; Besson, D. Z.; Blaufuss, E.; Boersma, D. J.; Bohm, C.; Bolmont, J.; Böser, S.; Botner, O.; Bouchta, A.; Braun, J.; Burgess, C.; Burgess, T.; Castermans, T.; Chirkin, D.; Christy, B.; Clem, J.; Cowen, D. F.; D'Agostino, M. V.; Davour, A.; Day, C. T.; De Clercq, C.; Demirörs, L.; Descamps, F.; Desiati, P.; DeYoung, T.; Diaz-Velez, J. C.; Dreyer, J.; Dumm, J. P.; Duvoort, M. R.; Edwards, W. R.; Ehrlich, R.; Eisch, J.; Ellsworth, R. W.; Evenson, P. A.; Fadiran, O.; Fazely, A. R.; Filimonov, K.; Foerster, M. M.; Fox, B. D.; Franckowiak, A.; Gaisser, T. K.; Gallagher, J.; Ganugapati, R.; Geenen, H.; Gerhardt, L.; Goldschmidt, A.; Goodman, J. A.; Gozzini, R.; Griesel, T.; Gross, A.; Grullon, S.; Gunasingha, R. M.; Gurtner, M.; Hallgren, A.; Halzen, F.; Han, K.; Hanson, K.; Hardtke, D.; Hardtke, R.; Hart, J. E.; Hasegawa, Y.; Hauschildt, T.; Hays, D.; Heise, J.; Helbing, K.; Hellwig, M.; Herquet, P.; Hill, G. C.; Hodges, J.; Hoffman, K. D.; Hommez, B.; Hoshina, K.; Hubert, D.; Hughey, B.; Hulth, P. O.; Hülss, J.-P.; Hultqvist, K.; Hundertmark, S.; Inaba, M.; Ishihara, A.; Jacobsen, J.; Japaridze, G. S.; Johansson, H.; Jones, A.; Joseph, J. M.; Kampert, K.-H.; Kappes, A.; Karg, T.; Karle, A.; Kawai, H.; Kelley, J. L.; Kitamura, N.; Klein, S. R.; Klepser, S.; Kohnen, G.; Kolanoski, H.; Köpke, L.; Kowalski, M.; Kowarik, T.; Krasberg, M.; Kuehn, K.; Labare, M.; Landsman, H.; Leich, H.; Leier, D.; Liubarsky, I.; Lundberg, J.; Lünemann, J.; Madsen, J.; Mase, K.; Matis, H. S.; McCauley, T.; McParland, C. P.; Meli, A.; Messarius, T.; Mészáros, P.; Miyamoto, H.; Mokhtarani, A.; Montaruli, T.; Morey, A.; Morse, R.; Movit, S. M.; Münich, K.; Nahnhauer, R.; Nam, J. W.; Niessen, P.; Nygren, D. R.; Ögelman, H.; Olivas, A.; Patton, S.; Peña-Garay, C.; Pérez de los Heros, C.; Piegsa, A.; Pieloth, D.; Pohl, A. C.; Porrata, R.; Pretz, J.; Price, P. B.; Przybylski, G. T.; Rawlins, K.; Razzaque, S.; Resconi, E.; Rhode, W.; Ribordy, M.; Rizzo, A.; Robbins, S.; Roth, P.; Rott, C.; Rutledge, D.; Ryckbosch, D.; Sander, H.-G.; Sarkar, S.; Schlenstedt, S.; Schmidt, T.; Schneider, D.; Seckel, D.; Semburg, B.; Seo, S. H.; Seunarine, S.; Silvestri, A.; Smith, A. J.; Solarz, M.; Song, C.; Sopher, J. E.; Spiczak, G. M.; Spiering, C.; Stamatikos, M.; Stanev, T.; Steffen, P.; Stezelberger, T.; Stokstad, R. G.; Stoufer, M. C.; Stoyanov, S.; Strahler, E. A.; Straszheim, T.; Sulanke, K.-H.; Sullivan, G. W.; Sumner, T. J.; Taboada, I.; Tarasova, O.; Tepe, A.; Thollander, L.; Tilav, S.; Tluczykont, M.; Toale, P. A.; Turčan, D.; van Eijndhoven, N.; Vandenbroucke, J.; Van Overloop, A.; Viscomi, V.; Voigt, B.; Wagner, W.; Walck, C.; Waldmann, H.; Walter, M.; Wang, Y.-R.; Wendt, C.; Wiebusch, C. H.; Wikström, G.; Williams, D. R.; Wischnewski, R.; Wissing, H.; Woschnagg, K.; Xu, X. W.; Yodh, G.; Yoshida, S.; Zornoza, J. D.; Interplanetary Network, The
2008-02-01
We present the results of the analysis of neutrino observations by the Antarctic Muon and Neutrino Detector Array (AMANDA) correlated with photon observations of more than 400 gamma-ray bursts (GRBs) in the northern hemisphere from 1997 to 2003. During this time period, AMANDA's effective collection area for muon neutrinos was larger than that of any other existing detector. After the application of various selection criteria to our data, we expect ~1 neutrino event and <2 background events. Based on our observations of zero events during and immediately prior to the GRBs in the data set, we set the most stringent upper limit on muon neutrino emission correlated with GRBs. Assuming a Waxman-Bahcall spectrum and incorporating all systematic uncertainties, our flux upper limit has a normalization at 1 PeV of E²Φν ≤ 6.3 × 10⁻⁹ GeV cm⁻² s⁻¹ sr⁻¹, with 90% of the events expected within the energy range of ~10 TeV to ~3 PeV. The impact of this limit on several theoretical models of GRBs is discussed, as well as the future potential for detection of GRBs by next-generation neutrino telescopes. Finally, we briefly describe several modifications to this analysis in order to apply it to other types of transient point sources.
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background: Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective: To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results: The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion: The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
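The fingerprint side of this methodology rests on Tanimoto similarity between bit vectors. Below is a hedged sketch of one plausible way to combine a disproportionality score with structural similarity; the combination rule, weights, and fingerprints are illustrative, not the published model:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto similarity of two fingerprints given as sets of on-bits."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)

def enhanced_score(disprop_score, fp, known_ade_fps, alpha=0.5):
    """
    Blend a disproportionality signal from spontaneous reports with the
    best structural similarity to drugs already known to cause the ADE.
    The linear blend and alpha are illustrative choices.
    """
    sim = max((tanimoto(fp, ref) for ref in known_ade_fps), default=0.0)
    return (1 - alpha) * disprop_score + alpha * sim

# Toy fingerprints: sets of on-bit indices stand in for hashed substructures.
candidate = {2, 5, 9, 14, 21}
known_rhabdo = [{2, 5, 9, 14, 30}, {1, 4, 22}]
print(f"enhanced signal: {enhanced_score(0.6, candidate, known_rhabdo):.2f}")
```

A candidate whose structure closely resembles known causative drugs has its AERS signal boosted, which is the intuition behind using chemistry to prioritize signals for expert review.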
Carreras, Carlos; Pascual, Marta; Tomás, Jesús; Marco, Adolfo; Hochscheid, Sandra; Castillo, Juan José; Gozalbes, Patricia; Parga, Mariluz; Piovano, Susanna; Cardona, Luis
2018-01-23
The colonisation of new suitable habitats is crucial for species survival at evolutionary scale under changing environmental conditions. However, colonisation potential may be limited by philopatry that facilitates exploiting successful habitats across generations. We examine the mechanisms of long distance dispersal of the philopatric loggerhead sea turtle (Caretta caretta) by analysing 40 sporadic nesting events in the western Mediterranean. The analysis of a fragment of the mitochondrial DNA and 7 microsatellites of 121 samples from 18 of these nesting events revealed that these nests were colonising events associated with juveniles from distant populations feeding in nearby foraging grounds. Considering the temperature-dependent sex determination of the species, we simulated the effect of the incubation temperature and propagule pressure on a potential colonisation scenario. Our results indicated that colonisation will succeed if warm temperature conditions, already existing in some of the beaches in the area, extend to the whole western Mediterranean. We hypothesize that the sporadic nesting events in developmental foraging grounds may be a mechanism to overcome philopatry limitations thus increasing the dispersal capabilities of the species and the adaptability to changing environments. Sporadic nesting in the western Mediterranean can be viewed as potential new populations in a scenario of rising temperatures.
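The simulation component, temperature-dependent sex determination plus propagule pressure, can be caricatured with a logistic sex-ratio curve around a pivotal temperature and a Monte Carlo count of surviving recruits of each sex. Every parameter below (pivotal temperature, slope, survival, nest counts) is an invented placeholder, not an estimate from the paper:

```python
import math
import random

random.seed(2)

def female_fraction(temp_c, pivotal=29.0, slope=1.4):
    """Toy logistic TSD curve: warmer nests skew female (parameters invented)."""
    return 1.0 / (1.0 + math.exp(-slope * (temp_c - pivotal)))

def colonisation_prob(temp_c, nests, hatchlings=80, survival=0.002, trials=300):
    """Fraction of Monte Carlo trials yielding at least one recruit of each sex."""
    f, ok = female_fraction(temp_c), 0
    for _ in range(trials):
        females = males = 0
        for _ in range(nests * hatchlings):
            if random.random() < survival:
                if random.random() < f:
                    females += 1
                else:
                    males += 1
        if females and males:
            ok += 1
    return ok / trials

# Propagule pressure of 20 nests in total, at three incubation regimes.
for t in (27.5, 29.0, 30.5):
    print(f"{t:.1f} degC: P(both sexes recruited) ~ {colonisation_prob(t, 20):.2f}")
```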
Towards a Comprehensive Catalog of Volcanic Seismicity
NASA Astrophysics Data System (ADS)
Thompson, G.
2014-12-01
Catalogs of earthquakes located using differential travel-time techniques are a core product of volcano observatories, and while vital, they represent an incomplete perspective of volcanic seismicity. Many (often most) earthquakes are too small to locate accurately, and are omitted from available catalogs. Low frequency events, tremor and signals related to rockfalls, pyroclastic flows and lahars are not systematically catalogued, and yet from a hazard management perspective are exceedingly important. Because STA/LTA detection schemes break down in the presence of high amplitude tremor, swarms or dome collapses, catalogs may suggest low seismicity when seismicity peaks. We propose to develop a workflow and underlying software toolbox that can be applied to near-real-time and offline waveform data to produce comprehensive catalogs of volcanic seismicity. Existing tools to detect and locate phaseless signals will be adapted to fit within this framework. For this proof of concept the toolbox will be developed in MATLAB, extending the existing GISMO toolbox (an object-oriented MATLAB toolbox for seismic data analysis). Existing database schemas such as the CSS 3.0 will need to be extended to describe this wider range of volcano-seismic signals. WOVOdat may already incorporate many of the additional tables needed. Thus our framework may act as an interface between volcano observatories (or campaign-style research projects) and WOVOdat. We aim to take the further step of reducing volcano-seismic catalogs to sets of continuous metrics that are useful for recognizing data trends, and for feeding alarm systems and forecasting techniques. Previous experience has shown that frequency index, peak frequency, mean frequency, mean event rate, median event rate, and cumulative magnitude (or energy) are potentially useful metrics to generate for all catalogs at a 1-minute sample rate (directly comparable with RSAM and similar metrics derived from continuous data). Our framework includes tools to plot these metrics in a consistent manner. We work with data from unrest at Redoubt volcano and Soufriere Hills volcano to develop our framework.
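Of the metrics proposed above, most are simple arithmetic on a catalog, while the frequency index requires a spectral ratio. A dependency-free sketch of one common definition, FI = log10 of the ratio of mean spectral amplitude in a high band to that in a low band (after Buurman and West, 2010); the bands, sampling rate, and test signals are illustrative:

```python
import math
import random

random.seed(8)

def frequency_index(samples, fs, low=(1.0, 2.5), high=(10.0, 20.0)):
    """
    Frequency index: log10(mean |spectrum| in high band / low band).
    A plain DFT keeps the sketch dependency-free (O(n^2), fine for
    short windows; production code would use an FFT).
    """
    n = len(samples)
    def band_mean(f1, f2):
        amps = []
        for k in range(1, n // 2):
            if f1 <= k * fs / n <= f2:
                re = sum(s * math.cos(2 * math.pi * k * i / n)
                         for i, s in enumerate(samples))
                im = sum(s * math.sin(2 * math.pi * k * i / n)
                         for i, s in enumerate(samples))
                amps.append(math.hypot(re, im))
        return sum(amps) / len(amps)
    return math.log10(band_mean(*high) / band_mean(*low))

fs, n = 50.0, 500
noise = lambda: random.gauss(0.0, 0.05)
lf_event = [math.sin(2 * math.pi * 2.0 * i / fs) + noise() for i in range(n)]
hf_event = [math.sin(2 * math.pi * 15.0 * i / fs) + noise() for i in range(n)]
print(f"low-frequency event:  FI = {frequency_index(lf_event, fs):+.2f}")
print(f"high-frequency event: FI = {frequency_index(hf_event, fs):+.2f}")
```

Negative FI flags low-frequency events and positive FI high-frequency ones, so the metric separates event families even when locations are unavailable.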
DOE Office of Scientific and Technical Information (OSTI.GOV)
Binh T. Pham; Nancy J. Lybeck; Vivek Agarwal
The Light Water Reactor Sustainability program at Idaho National Laboratory is actively conducting research to develop and demonstrate online monitoring capabilities for active components in existing nuclear power plants. Idaho National Laboratory and the Electric Power Research Institute are working jointly to implement a pilot project to apply these capabilities to emergency diesel generators and generator step-up transformers. The Electric Power Research Institute Fleet-Wide Prognostic and Health Management Software Suite will be used to implement monitoring in conjunction with utility partners: Braidwood Generating Station (owned by Exelon Corporation) for emergency diesel generators, and Shearon Harris Nuclear Generating Station (owned by Duke Energy Progress) for generator step-up transformers. This report presents monitoring techniques, fault signatures, and diagnostic and prognostic models for emergency diesel generators. Emergency diesel generators provide backup power to the nuclear power plant, allowing operation of essential equipment such as pumps in the emergency core coolant system during catastrophic events, including loss of offsite power. Technical experts from Braidwood are assisting Idaho National Laboratory and the Electric Power Research Institute in identifying critical faults and defining fault signatures associated with each fault. The resulting diagnostic models will be implemented in the Fleet-Wide Prognostic and Health Management Software Suite and tested using data from Braidwood. Parallel research on generator step-up transformers was summarized in an interim report during the fourth quarter of fiscal year 2012.
What are the characteristics of perinatal events perceived to be traumatic by midwives?
Sheen, Kayleigh; Spiby, Helen; Slade, Pauline
2016-09-01
There is potential for midwives to indirectly experience events whilst providing clinical care that fulfil criteria for trauma. This research aimed to investigate the characteristics of events perceived as traumatic by UK midwives. As part of a postal questionnaire survey conducted between December 2011 and April 2012, midwives (n=421) who had witnessed and/or listened to an account of an event and perceived this as traumatic for themselves provided a written description of their experience. A traumatic perinatal event was defined as occurring during labour or shortly after birth, where the midwife perceived the mother or her infant to be at risk and they (the midwife) had experienced fear, helplessness or horror in response. Descriptions of events were analysed using thematic analysis. Witnessed (W; n=299) and listened-to (H; n=383) events were analysed separately and collated to identify common and distinct themes across both types of exposure. Six themes were identified, each with subthemes. Five themes were identified in both witnessed and listened-to accounts, and one was salient to witnessed accounts only. Themes indicated that events were characterised as severe, unexpected and complex. They involved aspects relating to the organisational context, typically limited or delayed access to resources or personnel. There were aspects relating to parents, such as having an existing relationship with the parents, and negative perceptions of the conduct of colleagues. Traumatic events had a common theme of generating feelings of responsibility and blame. Finally, witnessed events that were perceived as traumatic sometimes held personal salience, resonating in some way with the midwife's own life experience. Midwives are exposed to events as part of their work that they may find traumatic. Understanding the characteristics of the events that may trigger this perception may facilitate prevention of associated distress and inform the development of supportive interventions. Copyright © 2016 Elsevier Ltd. All rights reserved.
Simplifying operations with an uplink/downlink integration toolkit
NASA Technical Reports Server (NTRS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called "closing the loop") in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non-real-time sources of information to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also to provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
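The closed-loop matching described above amounts to joining two time-ordered streams with a light-time offset: each predicted command is paired with the first telemetry sample of its verification point that arrives after the round-trip delay. A minimal sketch with invented command and telemetry-point names (the real toolkit works against the Multimission Ground Data System interfaces, not in-memory lists):

```python
from datetime import datetime, timedelta

OWLT = timedelta(minutes=8, seconds=20)  # illustrative one-way light time

# Predicted SOE entries: (time sent, command, telemetry point that verifies it).
soe = [
    (datetime(1994, 3, 1, 10, 0, 0), "HTR_ON", "TEMP_HTR1"),
    (datetime(1994, 3, 1, 10, 5, 0), "TWTA_PWR", "PWR_TWTA"),
]

# Downlinked telemetry stream: (earth-received time, point, value).
telemetry = [
    (datetime(1994, 3, 1, 10, 17, 2), "TEMP_HTR1", 24.8),
    (datetime(1994, 3, 1, 10, 22, 9), "PWR_TWTA", "ON"),
]

def close_the_loop(soe, telemetry, window=timedelta(minutes=5)):
    """Pair each command with the first matching telemetry value arriving
    after command time + 2 * OWLT (command up, verification down)."""
    for sent, cmd, point in soe:
        earliest = sent + 2 * OWLT
        match = next((v for t, p, v in telemetry
                      if p == point and earliest <= t <= earliest + window), None)
        yield cmd, point, match

for cmd, point, value in close_the_loop(soe, telemetry):
    print(f"{cmd}: {point} = {value}")
```

Because matching is driven by predicted times rather than operator action, verification still happens when telemetry arrives hours after the command was radiated.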
NASA Astrophysics Data System (ADS)
Sklarew, Jennifer F.
External shocks do not always generate energy system transformation. This dissertation examines how government relationships with electric utilities and the public impact whether shocks catalyze energy system change. The study analyzes Japanese energy policymaking from the oil crises through the Fukushima nuclear disaster. Findings reveal that policymakers' cooperation with and clout over electric utilities and the public can enable shocks to transform energy systems. When electric utilities wield clout, public trust in and influence on the government determine the existing system's resilience and the potential for a new system to emerge. Understanding this effect informs energy policy design and innovation.
LHC collider phenomenology of minimal universal extra dimensions
NASA Astrophysics Data System (ADS)
Beuria, Jyotiranjan; Datta, AseshKrishna; Debnath, Dipsikha; Matchev, Konstantin T.
2018-05-01
We discuss the collider phenomenology of the model of Minimal Universal Extra Dimensions (MUED) at the Large Hadron Collider (LHC). We derive analytical results for all relevant strong pair-production processes of two level 1 Kaluza-Klein partners and use them to validate and correct the existing MUED implementation in the Fortran version of the PYTHIA event generator. We also develop a new implementation of the model in the C++ version of PYTHIA. We use our implementations in conjunction with the CHECKMATE package to derive the LHC bounds on MUED from a large number of published experimental analyses from Run 1 at the LHC.
Wolff, Phillip; Barbey, Aron K.
2015-01-01
Causal composition allows people to generate new causal relations by combining existing causal knowledge. We introduce a new computational model of such reasoning, the force theory, which holds that people compose causal relations by simulating the processes that join forces in the world, and compare this theory with the mental model theory (Khemlani et al., 2014) and the causal model theory (Sloman et al., 2009), which explain causal composition on the basis of mental models and structural equations, respectively. In one experiment, the force theory was uniquely able to account for people's ability to compose causal relationships from complex animations of real-world events. In three additional experiments, the force theory did as well as or better than the other two theories in explaining the causal compositions people generated from linguistically presented causal relations. Implications for causal learning and the hierarchical structure of causal knowledge are discussed. PMID:25653611
Stomach curvature is generated by left-right asymmetric gut morphogenesis.
Davis, Adam; Amin, Nirav M; Johnson, Caroline; Bagley, Kristen; Ghashghaei, H Troy; Nascone-Yoder, Nanette
2017-04-15
Left-right (LR) asymmetry is a fundamental feature of internal anatomy, yet the emergence of morphological asymmetry remains one of the least understood phases of organogenesis. Asymmetric rotation of the intestine is directed by forces outside the gut, but the morphogenetic events that generate anatomical asymmetry in other regions of the digestive tract remain unknown. Here, we show in mouse and Xenopus that the mechanisms that drive the curvature of the stomach are intrinsic to the gut tube itself. The left wall of the primitive stomach expands more than the right wall, as the left epithelium becomes more polarized and undergoes radial rearrangement. These asymmetries exist across several species, and are dependent on LR patterning genes, including Foxj1, Nodal and Pitx2. Our findings have implications for how LR patterning manifests distinct types of morphological asymmetries in different contexts. © 2017. Published by The Company of Biologists Ltd.
Sada, Kiminori; Nishikawa, Takeshi; Kukidome, Daisuke; Yoshinaga, Tomoaki; Kajihara, Nobuhiro; Sonoda, Kazuhiro; Senokuchi, Takafumi; Motoshima, Hiroyuki; Matsumura, Takeshi; Araki, Eiichi
2016-01-01
We previously proposed that hyperglycemia-induced mitochondrial reactive oxygen species (mtROS) generation is a key event in the development of diabetic complications. Interestingly, some aspects are common to hyperglycemia- and hypoxia-induced phenomena. Thus, hyperglycemia may induce cellular hypoxia, and this phenomenon may also be involved in the pathogenesis of diabetic complications. In endothelial cells (ECs), cellular hypoxia increased after incubation with high glucose (HG). A similar phenomenon was observed in glomeruli of diabetic mice. HG-induced cellular hypoxia was suppressed by mitochondrial blockade or by overexpression of manganese superoxide dismutase (MnSOD), the SOD specific for mtROS. Overexpression of MnSOD also increased the expression of aquaporin-1 (AQP1), a water and oxygen channel. AQP1 overexpression in ECs suppressed hyperglycemia-induced cellular hypoxia, endothelin-1 and fibronectin overproduction, and apoptosis. Therefore, hyperglycemia-induced cellular hypoxia and mtROS generation may promote hyperglycemic damage in a coordinated manner.
Rogue wave generation by inelastic quasi-soliton collisions in optical fibres
NASA Astrophysics Data System (ADS)
Eberhard, M.; Savojardo, A.; Maruta, A.; Römer, R. A.
2017-11-01
We demonstrate a simple cascade mechanism that drives the formation and emergence of rogue waves in the generalized non-linear Schrödinger equation with third-order dispersion. This conceptually novel generation mechanism is based on inelastic collisions of quasi-solitons and is well described by a resonant-like scattering behaviour for the energy transfer in pair-wise quasi-soliton collisions. Our results demonstrate a threshold for rogue wave emergence and the existence of a period of reduced amplitudes, a "calm before the storm", preceding the arrival of a rogue wave event. Comparing with ultra-long time window simulations of 3.865 × 10^6 ps, we observe the statistics of rogue waves in optical fibres with an unprecedented level of detail and accuracy, unambiguously establishing the long-ranged character of the rogue wave power-distribution function over seven orders of magnitude.
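For readers who want to experiment with this class of models, the following is a minimal split-step Fourier sketch for a generalized NLSE with third-order dispersion. The sign conventions, parameters, and first-order operator splitting are assumptions of the sketch and should be checked against the authors' exact equation; this is not their simulation code.

```python
import numpy as np

def ssfm_nlse(A0, dz, nz, beta2, beta3, gamma, t_window):
    """Split-step Fourier integration of a generalized NLSE with
    third-order dispersion (one common fiber-optics sign convention).
    The dispersive step applies exp(i*(beta2/2*w^2 + beta3/6*w^3)*dz)
    in Fourier space; the nonlinear step applies exp(i*gamma*|A|^2*dz).
    """
    n = A0.size
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=t_window / n)
    lin = np.exp(1j * (beta2 / 2.0 * omega**2 + beta3 / 6.0 * omega**3) * dz)
    A = A0.astype(complex)
    for _ in range(nz):
        A = np.fft.ifft(lin * np.fft.fft(A))          # dispersion
        A *= np.exp(1j * gamma * np.abs(A)**2 * dz)   # Kerr nonlinearity
    return A

# Example: propagate a weak sech pulse (illustrative parameters only).
t = np.linspace(-20.0, 20.0, 1024, endpoint=False)
A = ssfm_nlse(1.0 / np.cosh(t), dz=1e-3, nz=2000,
              beta2=-1.0, beta3=0.05, gamma=1.0, t_window=40.0)
```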
Search for neutrino-generated air shower candidates with energy ≥ 10^19 eV and zenith angle θ ≥ 55°
NASA Astrophysics Data System (ADS)
Knurenko, Stanislav; Petrov, Igor; Sabourov, Artem
2017-06-01
The methodology and results of a search for air showers generated by neutral particles, such as high-energy gamma quanta and astroneutrinos, are presented. For this purpose, we conducted a comprehensive analysis of the data: the electron and muon components and the EAS Cherenkov light, and their response times in scintillation and Cherenkov detectors. Air showers with energy above 5·10^18 eV and zenith angle θ ≥ 55° were selected and analyzed. The search results indicate a lack of air shower events formed by gamma rays or high-energy neutrinos, but this does not mean that such air showers do not exist in nature; for example, experiments have recorded showers with a markedly low muon content, i.e. "muonless" showers, which are likely candidates for showers produced by neutral primary particles.
Controlled decoherence in a quantum Lévy kicked rotator
NASA Astrophysics Data System (ADS)
Schomerus, Henning; Lutz, Eric
2008-06-01
We develop a theory describing the dynamics of quantum kicked rotators (modeling cold atoms in a pulsed optical field) which are subjected to combined amplitude and timing noise generated by a renewal process (acting as an engineered reservoir). For waiting-time distributions of variable exponent (Lévy noise), we demonstrate the existence of a regime of nonexponential loss of phase coherence. In this regime, the momentum dynamics is subdiffusive, which also manifests itself in a non-Gaussian limiting distribution and a fractional power-law decay of the inverse participation ratio. The purity initially decays with a stretched exponential which is followed by two regimes of power-law decay with different exponents. The averaged logarithm of the fidelity probes the sprinkling distribution of the renewal process. These analytical results are confirmed by numerical computations on quantum kicked rotators subjected to noise events generated by a Yule-Simon distribution.
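A simple way to reproduce the flavor of such heavy-tailed renewal noise is inverse-transform sampling of power-law waiting times, as sketched below. The Pareto-type tail is a stand-in with an assumed form; the paper's numerics use a Yule-Simon distribution, which is not reproduced here.

```python
import numpy as np

def levy_waiting_times(alpha, t0, size, seed=0):
    """Draw heavy-tailed waiting times with P(T > t) = (t/t0)^(-alpha),
    t >= t0, via inverse-transform sampling; a stand-in for the renewal
    process in the text (not the exact Yule-Simon sampler used there).
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=size)
    return t0 * u ** (-1.0 / alpha)   # alpha <= 2 gives infinite variance

# Kick times of the renewal process are cumulative sums of waiting times.
kick_times = np.cumsum(levy_waiting_times(alpha=1.5, t0=1.0, size=10_000))
```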
Direct generation of event-timing equations for generalized flow shop systems
NASA Astrophysics Data System (ADS)
Doustmohammadi, Ali; Kamen, Edward W.
1995-11-01
Flow shop production lines are very common in manufacturing systems such as car assembly and the manufacturing of electronic circuits. In this paper, a systematic procedure is given for generating event-timing equations directly from the machine interconnections for a generalized flow shop system. The events considered here correspond to completion times of machine operations. It is assumed that the scheduling policy is cyclic (periodic). For a given flow shop system, the open connection dynamics of the machines are derived first. Then interconnection matrices characterizing the routing of parts in the system are obtained from the given system configuration. The open connection dynamics of the machines and the interconnection matrices are then combined to obtain the overall system dynamics, given by an equation of the form X(k+1) = A(k)X(k) ⊕ B(k)V(k+1) defined over the max-plus algebra. Here the state X(k) is the vector of completion times and V(k+1) is an external input vector consisting of the arrival times of parts. It is shown that if the machines are numbered in an appropriate way and the states are selected according to certain rules, the matrix A(k) will be in a special (canonical) form. The model obtained here is useful for the analysis of system behavior and for carrying out simulations. In particular, the canonical form of A(k) enables one to study system bottlenecks and the minimal cycle time during steady-state operation. The approach presented in this paper is believed to be more straightforward than existing max-plus algebra formulations of flow shop systems. In particular, three advantages of the proposed approach are: (1) it yields timing equations directly from the system configuration, and hence there is no need to first derive a Petri net or digraph equivalent of the system; (2) a change in the system configuration only affects the interconnection matrices and hence does not require rederiving the entire set of equations; (3) the system model is easily put into code using existing software packages such as MATLAB.
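As an illustration of the recursion, a small max-plus sketch in Python follows, where ⊕ is max and multiplication is ⊗ (addition of times); the two-machine matrices are invented for the example and are not taken from the paper.

```python
import numpy as np

NEG_INF = -np.inf   # the max-plus "zero": identity for max, absorbing for +

def mp_matvec(M, x):
    """Max-plus matrix-vector product: (M ⊗ x)_i = max_j (M_ij + x_j)."""
    return np.max(M + x[None, :], axis=1)

def timing_step(A, B, x, v):
    """One update X(k+1) = A X(k) ⊕ B V(k+1): new completion times are the
    later of internal precedence constraints (A) and new part arrivals (B)."""
    return np.maximum(mp_matvec(A, x), mp_matvec(B, v))

# Hypothetical two machines in series: machine 1 takes 3 time units,
# machine 2 takes 2 and can only start once machine 1 has finished.
A = np.array([[3.0, NEG_INF],
              [5.0, 2.0]])
B = np.array([[3.0],
              [5.0]])               # processing of a newly arrived part
x = np.array([0.0, 2.0])            # completion times at cycle k
v = np.array([1.0])                 # arrival time of the next part
print(timing_step(A, B, x, v))      # -> [4. 6.]
```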
McClintock, Carlee S; Hettich, Robert L.
2012-01-01
Oxidative protein surface mapping has become a powerful approach for measuring the solvent accessibility of folded protein structures. A variety of techniques exist for generating the key reagent – hydroxyl radicals – for these measurements; however, these approaches range significantly in their complexity and expense of operation. This research expands upon earlier work to enhance the controllability of boron-doped diamond (BDD) electrochemistry as an easily accessible tool for producing hydroxyl radicals in order to oxidize a range of intact proteins. Efforts to modulate oxidation level while minimizing the adsorption of protein to the electrode involved the use of relatively high flow rates to reduce protein residence time inside the electrochemical flow chamber. Additionally, a different cell activation approach using variable voltage to supply a controlled current allowed us to precisely tune the extent of oxidation in a protein-dependent manner. In order to gain perspective on the level of protein adsorption onto the electrode surface, studies were conducted to monitor protein concentration during electrolysis and gauge changes in the electrode surface between cell activation events. This report demonstrates the successful use of BDD electrochemistry for greater precision in generating a target number of oxidation events upon intact proteins. PMID:23210708
Yonelinas, Andrew P.
2013-01-01
It is well established that the hippocampus plays a critical role in our ability to recollect past events. A number of recent studies have indicated that the hippocampus may also play a critical role in working memory and perception, but these results have been highly controversial because other similar studies have failed to find evidence for hippocampal involvement. Thus, the precise role that the hippocampus plays in cognition is still debated. In the current paper, I propose that the hippocampus supports the generation and utilization of complex high-resolution bindings that link together the qualitative aspects that make up an event; these bindings are essential for recollection, and they can also contribute to performance across a variety of tasks including perception and working memory. An examination of the existing patient literature provides support for this proposal by showing that hippocampal damage leads to impairments on perception and working memory tasks that require complex high-resolution bindings. Conversely, hippocampal damage is much less likely to lead to impairments on tasks that require only low-resolution or simple associations/relations. The current proposal can be distinguished from earlier accounts of hippocampal function, and it generates a number of novel predictions that can be tested in future studies. PMID:23721964
Constraint-based Temporal Reasoning with Preferences
NASA Technical Reports Server (NTRS)
Khatib, Lina; Morris, Paul; Morris, Robert; Rossi, Francesca; Sperduti, Alessandro; Venable, K. Brent
2005-01-01
Often we need to work in scenarios where events happen over time and preferences are associated with event distances and durations. Soft temporal constraints allow one to describe in a natural way problems arising in such scenarios. In general, solving soft temporal problems requires exponential time in the worst case, but there are interesting subclasses of problems which are polynomially solvable. In this paper we identify one such subclass and give tractability results. Moreover, we describe two solvers for this class of soft temporal problems, and we show some experimental results. The random generator used to build the problems on which the tests are performed is also described. We also compare the two solvers, highlighting the tradeoff between performance and robustness. Sometimes, however, local temporal preferences are difficult to set, and it may be easier instead to associate preferences with some complete solutions of the problem. To model everything in a uniform way via local preferences only, and also to take advantage of existing constraint solvers which exploit only local preferences, we show that machine learning techniques can be useful. In particular, we present a learning module based on a gradient descent technique which induces local temporal preferences from global ones. We also show the behavior of the learning module on randomly generated examples.
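To illustrate the ingredients, the sketch below encodes a hypothetical soft temporal constraint as a trapezoidal preference over an event distance and scores a complete schedule with a fuzzy min-combination; the preference shape, parameters, and semiring choice are illustrative assumptions, not the solvers described in the paper.

```python
def trapezoid_pref(d, a, b, c, e):
    """Preference in [0, 1] for an event distance d: rises from a to b,
    is ideal on [b, c], and falls to zero at e (a hypothetical soft
    temporal constraint on the distance between two events)."""
    if d <= a or d >= e:
        return 0.0
    if b <= d <= c:
        return 1.0
    return (d - a) / (b - a) if d < b else (e - d) / (e - c)

def solution_pref(times, constraints):
    """Fuzzy-semiring value of a complete schedule: the minimum local
    preference over all soft distance constraints (i, j, pref_fn)."""
    return min(pref(times[j] - times[i]) for i, j, pref in constraints)

# Example: event 1 should follow event 0 by roughly 5-10 time units.
constraints = [(0, 1, lambda d: trapezoid_pref(d, 2, 5, 10, 15))]
print(solution_pref({0: 0.0, 1: 7.0}, constraints))   # -> 1.0
```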
Actionable Capability for Social and Economic Systems (ACSES)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandez, Steven J; Brecke, Peter K; Carmichael, Theodore D
The foundation of the Actionable Capability for Social and Economic Systems (ACSES) project is a useful regional-scale social-simulation system. This report is organized into five chapters that describe insights gained concerning the five key feasibility questions pertaining to such a system: (1) Should such a simulation system exist, would the current state of data sets or collectible data sets be adequate to support it? (2) By comparing different agent-based simulation systems, is it feasible to compare simulation systems and select one appropriate for a given application, with agents behaving according to modern social theory rather than ad hoc rule sets? (3) Provided that a simulation system for a region of interest could be constructed, can it be updated with new and changing conditions so that the universe of potential outcomes is constrained by events on the ground as they evolve? (4) As the results are constrained by evolving events on the ground, is it feasible to still generate surprise and emergent behavior to suggest outcomes from novel courses of action? (5) As these systems may for the first time require large numbers (hundreds of millions) of agents operating with the complexities demanded of modern social theories, can results still be generated within actionable decision cycles?
Airborne soil organic particles generated by precipitation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Bingbing; Harder, Tristan H.; Kelly, Stephen T.
Airborne organic particles play a critical role in the Earth's climate, public health, air quality, and the hydrological and carbon cycles. These particles exist in liquid, amorphous semi-solid, or solid (glassy) phase states depending on their composition and ambient conditions. However, the sources and formation mechanisms of semi-solid and solid organic particles are poorly understood and typically neglected in atmospheric models. Here we report field evidence for airborne solid organic particles generated by a "raindrop" mechanism pertinent to atmosphere-land surface interactions. We find that after rain events at the Southern Great Plains site, Oklahoma, USA, submicron solid particles with a composition consistent with soil organic matter contributed up to 60% of atmospheric particles by number. Subsequent experiments indicate that airborne soil organic particles are ejected from the surface of soils by intensive rain or irrigation. Our observations suggest that the formation of these particles may be a widespread phenomenon in ecosystems where soils are exposed to strong, episodic precipitation events, such as agricultural systems and grasslands. Chemical imaging and micro-spectroscopy analysis of their physico-chemical properties suggests that airborne soil organic particles may have important impacts on cloud formation, efficiently absorb solar radiation, and hence are an important type of particle.
Climatology of atmospheric circulation patterns of Arabian dust in western Iran.
Najafi, Mohammad Saeed; Sarraf, B S; Zarrin, A; Rasouli, A A
2017-08-28
Lying in the vicinity of vast deserts, the west and southwest of Iran are characterized by high levels of dust events, which have adverse consequences for human health, ecosystems, and the environment. Using a ground-based dataset of dust events in western Iran and NCEP/NCAR reanalysis data, the atmospheric circulation patterns of dust events in the Arabian region and western Iran are identified. These patterns were classified into two main categories: the Shamal dust events, which occur in the warm period of the year, and the frontal dust events, the cold-period pattern. In frontal dust events, a western trough or blocking pattern at mid-levels leads to frontogenesis, instability, and air uplift in the lower troposphere over southwest Asia. The non-frontal pattern is a second cold-period pattern, in which dust generation is due to regional circulation systems in the lower troposphere. In the Shamal wind pattern, the Saudi Arabian anticyclone, the Turkmenistan anticyclone, and the Zagros thermal low play the key roles in formation of the pattern. Summer and transitional patterns are two sub-categories of the summer Shamal wind pattern. In the summer trough pattern, a mid-tropospheric trough intensifies the surface thermal systems in the Middle East and causes instability and rising wind speeds in the region. In the combined pattern of Shamal wind and summer trough, dust is raised through the impact of a mid-tropospheric trough together with the regional systems that contribute to the formation of the summer Shamal wind.
Characteristics that Produce White-light Enhancements in Solar Flares Observed by Hinode/SOT
NASA Astrophysics Data System (ADS)
Watanabe, Kyoko; Kitagawa, Jun; Masuda, Satoshi
2017-12-01
To understand the conditions that produce white-light (WL) enhancements in solar flares, a statistical analysis of visible continuum data as observed by Hinode/Solar Optical Telescope (SOT) was performed. In this study, approximately 100 flare events from M- and X-class flares were selected. The time period during which the data were recorded spans from 2011 January to 2016 February. Of these events, approximately half are classified as white-light flares (WLFs), whereas the remaining events do not show any enhancements of the visible continuum (non-WLF; NWL). To determine the existence of WL emission, running difference images of not only the Hinode/SOT WL (G-band, blue, green, and red filter) data, but also the Solar Dynamics Observatory/Helioseismic and Magnetic Imager continuum data are used. A comparison between these two groups of WL data in terms of duration, temperature, emission measure of GOES soft X-rays, distance between EUV flare ribbons, strength of hard X-rays, and photospheric magnetic field strength was undertaken. In this statistical study, WLF events are characterized by a shorter timescale and shorter ribbon distance compared with NWL events. From the scatter plots of the duration of soft X-rays and the energy of non-thermal electrons, a clear distinction between WLF and NWL events can be made. It is found that the precipitation of large amounts of accelerated electrons within a short time period plays a key role in generating WL enhancements. Finally, it was demonstrated that the coronal magnetic field strength in the flare region is one of the most important factors that allow the individual identification of WLF events from NWL events.
A strong-motion database from the Central American subduction zone
NASA Astrophysics Data System (ADS)
Arango, Maria Cristina; Strasser, Fleur O.; Bommer, Julian J.; Hernández, Douglas A.; Cepeda, Jose M.
2011-04-01
Subduction earthquakes along the Pacific Coast of Central America generate considerable seismic risk in the region. The quantification of the hazard due to these events requires the development of appropriate ground-motion prediction equations, for which purpose a database of recordings from subduction events in the region is indispensable. This paper describes the compilation of a comprehensive database of strong ground-motion recordings obtained during subduction-zone events in Central America, focusing on the region from 8 to 14° N and 83 to 92° W, including Guatemala, El Salvador, Nicaragua and Costa Rica. More than 400 accelerograms recorded by the networks operating across Central America during the last decades have been added to data collected by NORSAR in two regional projects for the reduction of natural disasters. The final database consists of 554 triaxial ground-motion recordings from events of moment magnitudes between 5.0 and 7.7, including 22 interface and 58 intraslab-type events for the time period 1976-2006. Although the database presented in this study is not sufficiently complete in terms of magnitude-distance distribution to serve as a basis for the derivation of predictive equations for interface and intraslab events in Central America, it considerably expands the Central American subduction data compiled in previous studies and used in early ground-motion modelling studies for subduction events in this region. Additionally, the compiled database will allow the assessment of the existing predictive models for subduction-type events in terms of their applicability for the Central American region, which is essential for an adequate estimation of the hazard due to subduction earthquakes in this region.
Analyzing Responses of Chemical Sensor Arrays
NASA Technical Reports Server (NTRS)
Zhou, Hanying
2007-01-01
NASA is developing a third-generation electronic nose (ENose) capable of continuous monitoring of the International Space Station's cabin atmosphere for specific, harmful airborne contaminants. Previous generations of the ENose have been described in prior NASA Tech Briefs issues. Sensor selection is critical in both (pre-fabrication) sensor material selection and (post-fabrication) data analysis of the ENose, which detects several analytes that are difficult to detect or that are present at very low concentration ranges. Existing sensor selection approaches usually include limited statistical measures, where selectivity is more important but reliability and sensitivity are not of concern. When reliability and sensitivity can be major limiting factors in detecting target compounds reliably, the existing approach is not able to provide a meaningful selection that will actually improve data analysis results. The approach and software reported here consider more statistical measures (factors) than existing approaches for a similar purpose. The result is a more balanced and robust sensor selection from a less-than-ideal sensor array. The software offers quick, flexible, optimal sensor selection and weighting for a variety of purposes without a time-consuming, iterative search, by performing sensor calibrations to a known linear or nonlinear model, evaluating the individual sensors' statistics, scoring each sensor's overall performance, finding the best sensor array size to maximize class separation, finding optimal weights for the remaining sensor array, estimating limits of detection for the target compounds, evaluating fingerprint distance between group pairs, and finding the best event-detecting sensors.
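As a rough illustration of score-based sensor selection, the sketch below ranks sensors with a Fisher-like class-separation criterion and keeps the top k. The criterion and interface are generic assumptions for illustration, not the reported NASA software.

```python
import numpy as np

def sensor_scores(responses, labels):
    """Score each sensor by a Fisher-like class-separation criterion:
    between-class spread of its mean response over within-class variance.
    responses: (n_samples, n_sensors); labels: (n_samples,) analyte ids.
    This is a generic stand-in, not the reported scoring scheme.
    """
    classes = np.unique(labels)
    means = np.array([responses[labels == c].mean(axis=0) for c in classes])
    variances = np.array([responses[labels == c].var(axis=0) for c in classes])
    return means.var(axis=0) / (variances.mean(axis=0) + 1e-12)

def select_sensors(responses, labels, k):
    """Keep the k best-scoring sensors of a less-than-ideal array."""
    return np.argsort(sensor_scores(responses, labels))[::-1][:k]
```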
NASA Astrophysics Data System (ADS)
Yenier, E.; Baturan, D.; Karimi, S.
2016-12-01
Monitoring of seismicity related to oil and gas operations is routinely performed nowadays using a number of different surface and downhole seismic array configurations and technologies. Here, we provide a hydraulic fracture (HF) monitoring case study that compares the data set generated by a sparse local surface network of broadband seismometers to a data set generated by a single downhole geophone string. Our data were collected during a 5-day single-well HF operation by a temporary surface network consisting of 10 stations deployed within 5 km of the production well. The downhole data were recorded by a 20-geophone string deployed in an observation well located 15 m from the production well. Surface network data processing included standard STA/LTA event triggering enhanced by template-matching subspace detection, grid-search locations refined using the double-difference relocation technique, as well as Richter (ML) and moment (Mw) magnitude computations for all detected events. In addition, moment tensors were computed from first-motion polarities and amplitudes for the subset of highest-SNR events. The resulting surface event catalog shows a very weak spatio-temporal correlation to HF operations, with only 43% of recorded seismicity occurring during HF stage times. This, along with the source mechanisms, shows that the surface-recorded seismicity delineates the activation of several pre-existing structures striking NNE-SSW, consistent with regional stress conditions as indicated by the orientation of SHmax. Comparison of the sparse-surface and single downhole string datasets allows us to perform a cost-benefit analysis of the two monitoring methods. Our findings show that although the downhole array recorded ten times as many events, the surface network provides a more coherent delineation of the underlying structure and more accurate magnitudes for larger-magnitude events. We attribute this to the enhanced focal coverage provided by the surface network and the use of broadband instrumentation. The results indicate that sparse surface networks of high-quality instruments can provide rich and reliable datasets for evaluating the impact and effectiveness of hydraulic fracture operations in regions with favorable surface noise, local stress, and attenuation characteristics.
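For reference, a bare-bones version of the STA/LTA trigger mentioned above can be written in a few lines; the window lengths, centered moving averages, and threshold here are illustrative choices, not the study's processing parameters.

```python
import numpy as np

def sta_lta(trace, fs, sta_s=0.5, lta_s=10.0, threshold=4.0):
    """Classic STA/LTA event trigger on a single seismic trace: flag samples
    where the short-term average of |amplitude| exceeds `threshold` times
    the long-term average (window lengths here are illustrative).
    """
    nsta, nlta = int(sta_s * fs), int(lta_s * fs)
    x = np.abs(trace)
    sta = np.convolve(x, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(x, np.ones(nlta) / nlta, mode="same")
    ratio = sta / np.maximum(lta, 1e-12)
    return np.flatnonzero(ratio > threshold)   # candidate trigger indices
```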
Ambient Tremor, But No Triggered Tremor at the Northern Costa Rica Subduction Zone
NASA Astrophysics Data System (ADS)
Swiecki, Z.; Schwartz, S. Y.
2010-12-01
Non-volcanic tremor (NVT) has been found to be triggered during the passage of surface waves from various teleseismic events in locations around the world, including Cascadia, southwest Japan, Taiwan, and California. In this study we examine the northern Costa Rica subduction zone for evidence of triggered tremor. The Nicoya Peninsula segment of the northern Costa Rica margin experiences both slow slip and tremor and is thus a prime candidate for triggered tremor observations. Eleven teleseismic events with magnitudes (Mw) greater than 8 occurring between 2006 and 2010 were examined using data from both broadband and short-period sensors deployed on the Nicoya Peninsula, Costa Rica. Waveforms from several large regional events were also considered. The largest teleseismic and regional events (27 February 2010 Chile, Mw 8.8, and 28 May 2009 Honduras, Mw 7.3) induced peak ground velocities (PGV) at the Nicoya stations of ~2 and 6 mm/s, respectively; larger than PGVs in other locations that have triggered tremor. Many of the earthquakes examined occurred during small episodes of background ambient tremor. In spite of this, no triggered tremor was observed during the passage of seismic waves from any event. This is significant because other studies have demonstrated that NVT is not triggered everywhere by all events above some threshold magnitude, indicating that unique conditions are required for its occurrence. The lack of triggered tremor at the Costa Rica margin can help to better quantify the requisite conditions and triggering mechanisms. An inherent difference between the Costa Rica margin and the other subduction zones where triggered tremor exists is its erosional rather than accretionary nature. Its relatively low sediment supply likely results in a drier, lower pore-fluid-pressure, stronger and less compliant thrust interface that is less receptive to triggering tremor from external stresses generated by teleseismic or strong local earthquakes. Another important factor is Costa Rica's relatively cool subduction zone structure, where the temperatures required for the fluid-generating basalt-to-eclogite reaction are not reached until far below tremor-producing depths.
Time resolved measurements of the flow generated by suction feeding fish
NASA Astrophysics Data System (ADS)
Day, Steven W.; Higham, Timothy E.; Wainwright, Peter C.
2007-11-01
The majority of aquatic vertebrates are suction feeders: by rapidly expanding the mouth cavity they generate a fluid flow outside of their head in order to draw prey into their mouth. In addition to the biological relevance, the generated flow field is interesting fluid mechanically as it incorporates high velocities, is localized in front of the mouth, and is unsteady, typically lasting between 10 and 50 ms. Using manometry and high-speed particle image velocimetry, this is the first study to quantify pressure within and outside the mouth of a feeding fish while simultaneously measuring the velocity field outside the mouth. Measurements with a high temporal (2 ms) and spatial (<1 mm) resolution were made for several feeding events of a single largemouth bass (Micropterus salmoides). General properties of the flow were evaluated, including the transient velocity field, its relationship to pressure within the mouth and pressure at the prey. We find that throughout the feeding event a relationship exists for the magnitude of fluid speed as a function of distance from the predator mouth that is based on scaling the velocity field according to the size of the mouth opening and the magnitude of fluid speed at the mouth. The velocity field is concentrated within an area extending approximately one mouth diameter from the fish and the generated pressure field is even more local to the mouth aperture. Although peak suction pressures measured inside the mouth were slightly larger than those that were predicted using the equations of motion, we find that these equations give a very accurate prediction of the timing of peak pressure, so long as the unsteady nature of the flow is included.
NASA Technical Reports Server (NTRS)
Bourgeois, Joanne; Wiberg, Patricia L.
1988-01-01
Impulse-generated waves (tsunamis) may be produced, at varying scales and global recurrence intervals (RI), by several processes. Meteorite-water impacts will produce tsunamis, and asteroid-scale impacts with associated mega-tsunamis may occur. A bolide-water impact would undoubtedly produce a major tsunami, whose sedimentological effects should be recognizable. Even a bolide-land impact might trigger major submarine landslides and thus tsunamis. In all postulated scenarios for the K/T boundary event, then, tsunamis are expected; it must be determined where to look for them and how to distinguish deposits from different tsunamis. Also, because tsunamis decrease in height as they move away from their source, the proximal effects will differ by perhaps orders of magnitude from distal effects. Data on the characteristics of tsunamis at their origin are scarce. Some observations exist for tsunamis generated by thermonuclear explosions and for seismogenic tsunamis, and experimental work was conducted on impact-generated tsunamis. All tsunamis of interest have wavelengths of O(100) km and thus behave as shallow-water waves in all ocean depths. Typical wave periods are O(10 to 100) minutes. The effect of these tsunamis can be estimated in the marine and coastal realm by calculating boundary shear stresses (expressed as U*, the shear velocity). An event layer at the K/T boundary in Texas occurs in mid-shelf muds. Only a large, long-period wave with a wave height of O(50) m is deemed sufficient to have produced this layer. Such wave heights imply a nearby volcanic explosion on the scale of Krakatau or larger, a nearby submarine landslide also of great size, or a bolide-water impact in the ocean.
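A back-of-the-envelope version of the U* estimate can be sketched as follows, using linear long-wave theory and an assumed quadratic drag law; the drag coefficient and the example wave are illustrative, not values from the paper.

```python
import numpy as np

def tsunami_bed_shear(H, h, Cd=3e-3, g=9.81):
    """Rough bed shear velocity U* under a shallow-water (long) wave.
    For wavelength >> depth, the phase speed is c = sqrt(g*h); linear theory
    gives a near-bed orbital velocity u_b ~ (H/2)*sqrt(g/h), and a quadratic
    drag law gives U* = sqrt(Cd) * u_b. Cd is an assumed drag coefficient.
    """
    c = np.sqrt(g * h)
    u_b = (H / 2.0) * np.sqrt(g / h)
    return c, u_b, np.sqrt(Cd) * u_b

# A 50 m high, long-period wave over a 100 m deep shelf (illustrative):
c, u_b, u_star = tsunami_bed_shear(H=50.0, h=100.0)   # U* ~ 0.4 m/s
```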
Widjaja, Michael; Berry, Iain J.; Pont, Elsa J.; Padula, Matthew P.; Djordjevic, Steven P.
2015-01-01
Mycoplasma pneumoniae is a significant cause of community-acquired pneumonia globally. Despite having a genome less than 1 Mb in size, M. pneumoniae presents a structurally sophisticated attachment organelle that (i) provides cell polarity, (ii) directs adherence to receptors presented on respiratory epithelium, and (iii) plays a major role in cell motility. The major adhesins, P1 (Mpn141) and P30 (Mpn453), are localised to the tip of the attachment organelle by the surface-accessible cleavage fragments P90 and P40 derived from Mpn142. Two events play a defining role in the formation of P90 and P40: removal of a leader peptide at position 26 (23SLA↓NTY28) during secretion to the cell surface, and cleavage at amino acid 455 (452GPL↓RAG457), generating P40 and P90. Liquid chromatography tandem mass spectrometry (LC-MS/MS) analysis of tryptic peptides generated by digesting size-fractionated cell lysates of M. pneumoniae identified 15 cleavage fragments of Mpn142 ranging in mass from 9 to 84 kDa. Further evidence for the existence of cleavage fragments of Mpn142 was generated by mapping tryptic peptides to proteins recovered from size-fractionated eluents from affinity columns loaded with heparin, fibronectin, fetuin, actin, plasminogen and A549 surface proteins as bait. To define the sites of cleavage in Mpn142, neo-N-termini in cell lysates of M. pneumoniae were dimethyl-labelled and characterised by LC-MS/MS. Our data suggest that Mpn142 is cleaved to generate adhesins that are auxiliary to P1 and P30. PMID:28248283
NASA Astrophysics Data System (ADS)
Zaleski, Shawn
2017-01-01
A set of contact interaction (CI) Monte Carlo events, for which Standard Model Drell-Yan events are the background, is generated using a leading-order parton-shower generator, Pythia8. We consider three isoscalar models with three different helicity structures, left-left (LL), left-right/right-left (LR), and right-right (RR), each with destructive and constructive interference. For each of these models, 150,000 events are generated for analysis of CI interactions in the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) with a centre-of-mass energy of 13 TeV. This is a generator-level study, and detector effects are accounted for by applying kinematic cuts on generator-level quantities rather than by running a detailed detector simulation package (e.g. GEANT). Distributions of dilepton invariant mass, Collins-Soper angle, and the forward-backward asymmetry are compared with those arising from pure Drell-Yan events.
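As a small example of the last step, the forward-backward asymmetry can be computed from generator-level Collins-Soper angles as sketched below; the function is a generic illustration, not the analysis code used in the study.

```python
import numpy as np

def afb(cos_theta_cs, weights=None):
    """Forward-backward asymmetry from Collins-Soper angles:
    A_FB = (N_F - N_B) / (N_F + N_B), with 'forward' meaning
    cos(theta*) > 0. Intended for generator-level events after cuts.
    """
    c = np.asarray(cos_theta_cs, dtype=float)
    w = np.ones_like(c) if weights is None else np.asarray(weights, float)
    nf = w[c > 0].sum()   # forward events
    nb = w[c < 0].sum()   # backward events
    return (nf - nb) / (nf + nb)
```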
NASA Astrophysics Data System (ADS)
Cashman, Katharine V.; Cronin, Shane J.
2008-10-01
Volcanic eruptions can overwhelm all senses of observers in their violence, spectacle and sheer incredibility. When an eruption is catastrophic or unexpected, neither individuals nor communities can easily assimilate the event into their world view. Psychological studies of disaster aftermaths have shown that trauma can shake the very foundations of a person's faith and trigger a search - supernatural, religious, or scientific - for answers. For this reason, the ability to rapidly comprehend a traumatic event by "accepting" the catastrophe as part of the observer's world represents an important component of community resilience to natural hazards. A relationship with the event may be constructed by adapting existing cosmological, ancestral, or scientific frameworks, as well as through creative and artistic expression. In non-literate societies, communal perceptions of an event may be transformed into stories that offer myth-like explanations. As these stories make their way into oral traditions, they often undergo major changes to allow transmission through generations and, in some cases, to serve political or religious purposes. Disaster responses in literate societies are no different, except that they are more easily recorded and therefore are less prone to change over time. Here we explore ways in which the language, imagery and metaphor used to describe volcanic events may link disparate societies (both present and past) in their search for understanding of volcanic catastrophes. Responses to modern eruptions (1980 Mount St Helens, USA, and 1995-present Soufriere Hills, Montserrat) provide a baseline for examining the progression to older historic events that have already developed oral traditions (1886 Tarawera, New Zealand) and finally to oral traditions many hundreds of years old in both the Pacific Northwest US and New Zealand (NZ). We see that repeated volcanism over many generations produces rich webs of cosmology and history surrounding volcanoes. NZ Maori have incorporated volcanoes into the lineage of tribes and individuals; thus good and bad outcomes from volcanism are part of long-term cycles of reciprocity and equilibrium that link modern Maori to their ancestors. In both regions, cosmologies and mythologies not only document the attempts of past cultures to recover from the impacts of volcanic disasters, but also provide a means by which following generations can understand, contextualize, and therefore recover from, future volcanic catastrophes. We further suggest that such local traditions can provide a valuable community education tool as well as an important means of aiding the psychosocial recovery of individuals and communities after volcanic disasters.
Using diatoms, hydrochemical and stable isotope tracers to infer runoff generation processes
NASA Astrophysics Data System (ADS)
Martínez-Carreras, N.; Wetzel, C. E.; Frentress, J.; Hlúbiková, D.; Ector, L.; McDonnell, J. J.; Hoffmann, L.; Pfister, L.
2012-04-01
Imaginative techniques are needed to improve our understanding of runoff generation processes. In this context, the hydrological community calls for cutting across disciplines in search of new and exciting advances in knowledge. In this study, hydrologists and ecologists have worked together to use not only hydrochemical and stable isotope tracers, but also diatoms, to infer runoff generation processes. Diatoms, one of the most common and diverse algal groups, can be easily transported by flowing water due to their small size (~10-200 μm). They are present in most terrestrial habitats, and their diversified species distributions are largely controlled by physico-geographical factors (e.g. light, temperature, pH and moisture). Thus, hydrological systems largely control diatom species community composition and distribution. This study was conducted in the schistose Weierbach catchment (0.45 km2, NW Luxembourg). Its runoff regime is characterised by seasonal variation and a delayed shallow groundwater component originating from a saprolite zone. The catchment was instrumented with piezometers, suction cups, an automatic streamwater sampler, a sequential rainfall sampler, and soil moisture and temperature sensors. Samples collected bi-weekly and during storm runoff events allowed the characterisation of the different end-members. Chemical and isotopic hydrograph separations of stream discharge were used to determine not only the geographic sources of water, but also the fractions of old and new water contributing to streamflow. Diatom intra-storm variability was also analysed, and samples of diatoms from various terrestrial and subaerial substrates (bryophytes, litter and leaves), as well as from aquatic habitats (epilithon, epipelon and drift samples), were regularly collected. Diatoms were then used to constrain assumptions, to confirm or reject the hypothesis that surface runoff occurs during rainfall-runoff events, and to document the intermittent character of hydrological connectivity between upland, riparian and aquatic zones. Advantageously, diatoms do not seem to be subject to some inherent limitations of the classical tracer-based hydrograph separation techniques, such as unrealistic mixing assumptions, unstable end-member solutions and temporally varying input concentrations. Results suggested a substantial contribution of soil water during winter events in the Weierbach catchment, whereas groundwater played a more significant role during summer events. Even though overland flow remained insignificant during most of the sampled events, terrestrial diatom abundance increased with precipitation in all sampled events, suggesting rapid connectivity between the soil surface and stream water. We hypothesise that terrestrial diatoms are mobilized and flushed away through a subsurface network of macropores in the shallow soils.
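The classical two-component separation mentioned above reduces to a one-line mass balance; a sketch follows, with illustrative tracer values that are not data from the Weierbach catchment.

```python
def old_water_fraction(c_stream, c_new, c_old):
    """Two-component tracer mass-balance hydrograph separation: the
    fraction of streamflow from pre-event ('old') water is
    (C_stream - C_new) / (C_old - C_new), assuming two well-mixed
    end-members with constant, distinct tracer concentrations.
    """
    if c_old == c_new:
        raise ValueError("end-members must differ for the mixing model")
    return (c_stream - c_new) / (c_old - c_new)

# Example with delta-18O values (per mil, illustrative numbers only):
f_old = old_water_fraction(c_stream=-8.2, c_new=-11.0, c_old=-7.5)  # 0.8
```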
Elk River Watershed - Flood Study
NASA Astrophysics Data System (ADS)
Barnes, C. C.; Byrne, J. M.; MacDonald, R. J.; Lewis, D.
2014-12-01
Flooding has the potential to cause significant impacts to economic activities, as well as to disrupt or displace populations. Changing climate regimes, such as extreme precipitation events, increase flood vulnerability and put additional stresses on infrastructure. Potential flooding from just under 100 toxic tailings ponds located in Canada (2009 NPRI Reviewed Facility Data Release, Environment Canada) increases risk to human safety and the environment. One such geotechnical failure spilt billions of litres of toxic tailings into the Fraser River watershed, British Columbia, when a tailings pond dam breach occurred in August 2014. Damaged and washed-out roadways cut access to essential services, as seen in the extensive floods that occurred in Saskatchewan and Manitoba in July 2014, and in southern Alberta in 2013. Recovery efforts from events such as these can be lengthy, and have substantial social and economic impacts both in loss of revenue and cost of repair. The objective of this study is to investigate existing conditions in the Elk River watershed and model potential future hydrological changes that can increase flood hazards. By analyzing existing hydrology, meteorology, land cover, land use, economic, and settlement patterns, a baseline is established for existing conditions in the Elk River watershed. Coupling the Generate Earth Systems Science (GENESYS) high-resolution spatial hydrometeorological model with flood hazard analysis methodology, high-resolution flood vulnerability baseline maps are created using historical climate conditions. Further work in 2015 will examine possible impacts for a range of climate change and land use change scenarios to define changes to future flood risk and vulnerability.
BlackHoleCam: Fundamental physics of the galactic center
NASA Astrophysics Data System (ADS)
Goddi, C.; Falcke, H.; Kramer, M.; Rezzolla, L.; Brinkerink, C.; Bronzwaer, T.; Davelaar, J. R. J.; Deane, R.; de Laurentis, M.; Desvignes, G.; Eatough, R. P.; Eisenhauer, F.; Fraga-Encinas, R.; Fromm, C. M.; Gillessen, S.; Grenzebach, A.; Issaoun, S.; Janßen, M.; Konoplya, R.; Krichbaum, T. P.; Laing, R.; Liu, K.; Lu, R.-S.; Mizuno, Y.; Moscibrodzka, M.; Müller, C.; Olivares, H.; Pfuhl, O.; Porth, O.; Roelofs, F.; Ros, E.; Schuster, K.; Tilanus, R.; Torne, P.; van Bemmel, I.; van Langevelde, H. J.; Wex, N.; Younsi, Z.; Zhidenko, A.
Einstein's general theory of relativity (GR) successfully describes gravity. Although GR has been accurately tested in weak gravitational fields, it remains largely untested in the strong-field regime. One of the most fundamental predictions of GR is the existence of black holes (BHs). After the recent direct detection of gravitational waves by LIGO, there is now near-conclusive evidence for the existence of stellar-mass BHs. In spite of this exciting discovery, there is not yet direct evidence for the existence of BHs from astronomical observations in the electromagnetic spectrum. Are BHs observable astrophysical objects? Does GR hold in its most extreme limit, or are alternatives needed? The prime target for addressing these fundamental questions is the center of our own Milky Way, which hosts the closest and best-constrained supermassive BH candidate in the universe, Sagittarius A* (Sgr A*). Three different types of experiments hold the promise of testing GR in the strong-field regime using observations of Sgr A* with new-generation instruments. The first experiment consists of making a standard astronomical image of the synchrotron emission from the relativistic plasma accreting onto Sgr A*. This emission forms a "shadow" around the event horizon cast against the background, whose predicted size (~50 μas) can now be resolved by upcoming very long baseline radio interferometry experiments at mm-waves such as the Event Horizon Telescope (EHT). The second experiment aims to monitor stars orbiting Sgr A* with the next-generation near-infrared (NIR) interferometer GRAVITY at the Very Large Telescope (VLT). The third experiment aims to detect and study a radio pulsar in a tight orbit about Sgr A* using radio telescopes (including the Atacama Large Millimeter Array, ALMA). The BlackHoleCam project exploits the synergy between these three different techniques and contributes directly to them at different levels. These efforts will eventually enable us to measure fundamental BH parameters (mass, spin, and quadrupole moment) with sufficiently high precision to provide fundamental tests of GR (e.g. testing the no-hair theorem) and to probe the spacetime around a BH in any metric theory of gravity. Here, we review our current knowledge of the physical properties of Sgr A* as well as the current status of such experimental efforts towards imaging the event horizon, measuring stellar orbits, and timing pulsars around Sgr A*. We conclude that the Galactic center provides a unique fundamental-physics laboratory for experimental tests of BH accretion and theories of gravity in their most extreme limits.
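The quoted ~50 μas shadow size follows from a short calculation; the sketch below evaluates the Schwarzschild shadow diameter, 2·sqrt(27)·GM/(c²·D), for commonly quoted Sgr A* values (the mass and distance are assumptions of the example, not values from the text).

```python
import numpy as np

G, c = 6.674e-11, 2.998e8          # SI units
M_SUN, KPC = 1.989e30, 3.086e19    # kg, m

def shadow_diameter_uas(mass_msun, dist_kpc):
    """Angular diameter of a Schwarzschild black-hole shadow,
    theta = 2 * sqrt(27) * G*M / (c^2 * D), in microarcseconds.
    """
    r_g = G * mass_msun * M_SUN / c**2          # gravitational radius
    theta_rad = 2.0 * np.sqrt(27.0) * r_g / (dist_kpc * KPC)
    return theta_rad * (180.0 / np.pi) * 3600.0 * 1e6

# Sgr A* with commonly quoted values (~4.3e6 solar masses at ~8.3 kpc):
print(shadow_diameter_uas(4.3e6, 8.3))   # ~53 microarcseconds
```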
Epigenetic Regulation of Centromere Chromatin Stability by Dietary and Environmental Factors.
Hernández-Saavedra, Diego; Strakovsky, Rita S; Ostrosky-Wegman, Patricia; Pan, Yuan-Xiang
2017-11-01
The centromere is a genomic locus required for the segregation of the chromosomes during cell division. This chromosomal region, together with the pericentromeres, has been found to be susceptible to damage, and thus perturbation of the centromere could lead to the development of aneuploidic events. Metabolic abnormalities that underlie the generation of cancer include inflammation, oxidative stress, cell cycle deregulation, and numerous others. The micronucleus assay, an early clinical marker of cancer, has been shown to provide a reliable measure of genotoxic damage that may signal cancer initiation. In the current review, we discuss the events that lead to micronucleus formation and centromeric and pericentromeric chromatin instability, as well as transcripts emanating from these regions, which were previously thought to be inactive. Studies were selected in PubMed if they reported the effects of nutritional status (macro- and micronutrients) or environmental toxicant exposure on micronucleus frequency or any other chromosomal abnormality in humans, animals, or cell models. Mounting evidence from epidemiologic, environmental, and nutritional studies provides a novel perspective on the origin of aneuploidic events. Although substantial evidence exists describing the role that nutritional status and environmental toxicants have in the generation of micronuclei and other nuclear aberrations, limited information is available on the importance of macro- and micronutrients for centromeric and pericentromeric chromatin stability. Moving forward, studies that specifically address the direct link between nutritional status, excess, or deficiency and the epigenetic regulation of the centromere will provide much-needed insight into the nutritional and environmental regulation of this chromosomal region and the initiation of aneuploidy. © 2017 American Society for Nutrition.
Koren, S A; Persinger, M A
2002-12-01
In 2002, Persinger, Roll, Tiller, Koren, and Cook considered whether there are physical processes by which recondite information exists within the space and time of objects or events. The stimuli that compose this information might be detected directly within the whole brain without being processed by the typical sensory modalities. We tested the artist Ingo Swann, who can reliably draw and describe randomly selected photographs sealed in envelopes in another room. In the present experiment the photographs were immersed continuously in repeated presentations (5 times per sec.) of one of two types of computer-generated complex magnetic field patterns whose intensities were less than 20 nT over most of the area. WINDOWS-generated, but not DOS-generated, patterns were associated with a marked decrease in Mr. Swann's accuracy. Whereas the DOS software generated exactly the same pattern, the WINDOWS software phase-modulated the actual waveform, resulting in effectively infinite bandwidth and complexity. We suggest that information obtained by processes attributed to "paranormal" phenomena has physical correlates that can be masked by weak, infinitely variable magnetic fields.
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Makarov, Yuri V.
This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system in which unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify the sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) a large mismatch between generation and load can be caused by load forecast error, wind forecast error, and generation scheduling control error on traditional generators, or a combination of all of the above; (2) the capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and resources should be procured accordingly to meet both requirements; the resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) unexpected ramps caused by load and wind can both lead to serious issues; (4) a look-ahead tool that evaluates the system balancing requirement during real-time operations and compares it with available system resources should be very helpful to system operators in predicting similar forthcoming events and planning ahead; and (5) demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and arrest frequency deviation in an emergency situation.
Impact of extreme precipitation events in the Miño-Sil river basin
NASA Astrophysics Data System (ADS)
Fernández-González, Manuel; Añel, Juan Antonio; de la Torre, Laura
2015-04-01
We herein research the impact of extreme rainfall events in the Miño-Sil basin, a heavily dammed basin located in the northwestern Iberian Peninsula. Extreme rainfall events are very important in this basin because, with 106 dams, it is the most dammed basin in Spain. These dams are used almost exclusively for hydropower generation; the installed generating capacity reaches more than 2700 MW and represents almost 9% of the total installed electrical generation capacity of the Iberian Peninsula, and therefore has a potential impact on the energy market. We study extreme rainfall events and their return periods, first reproducing past extreme events and their return periods to verify the proper functioning of the adapted model, so that future extreme rainfall events in the basin can be forecast. This research aims to optimize the storage of the dams and adapt their management to problems such as climate change. The results obtained are very relevant for hydroelectric generation because the operation of the hydropower system depends primarily on the availability of stored water.
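One simple, model-free way to attach return periods to observed extremes is a plotting-position estimate, sketched below with invented annual maxima; the paper's adapted model is not reproduced here.

```python
import numpy as np

def empirical_return_periods(annual_maxima):
    """Weibull plotting-position estimate of return periods from a series
    of annual maximum precipitation values: T_i = (n + 1) / rank_i, where
    rank 1 is the largest event. A simple empirical alternative to fitting
    an extreme-value distribution.
    """
    x = np.sort(np.asarray(annual_maxima, dtype=float))[::-1]  # descending
    ranks = np.arange(1, x.size + 1)
    return x, (x.size + 1) / ranks

values, T = empirical_return_periods([88, 61, 102, 74, 95, 55, 120, 67])
# The largest value, 120, gets an estimated return period of (8+1)/1 = 9 yr.
```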
ERIC Educational Resources Information Center
McConky, Katie Theresa
2013-01-01
This work covers topics in event coreference and event classification from spoken conversation. Event coreference is the process of identifying descriptions of the same event across sentences, documents, or structured databases. Existing event coreference work focuses on sentence similarity models or feature based similarity models requiring slot…
A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis
NASA Astrophysics Data System (ADS)
Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann
2017-04-01
The evaluation of the potential monetary damage of flooding is an essential part of flood risk management. One possibility for estimating the monetary risk is to analyze long time series of observed flood events and their corresponding damages. In reality, however, only few flood events are documented. This limitation can be overcome by generating a set of synthetic, physically and spatially plausible flood events and subsequently estimating the resulting monetary damages. In the present work, a set of synthetic flood events is generated by continuous rainfall-runoff simulation in combination with a coupled weather generator and a temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach. A synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-nearest-neighbor model. Following the event generation procedure, the negative consequences of flooding are analyzed. The corresponding flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss-probability relation for each community in the study area. The loss-probability relation is based on exposure and susceptibility analyses on a single-object basis (residential buildings) for certain return periods. For these impact analyses, official inundation maps of the study area are used. Finally, by analyzing the total event time series of damages, the expected annual damage or the losses associated with a certain probability of occurrence can be estimated for the entire study area.
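The two-step generation can be illustrated with a toy method-of-fragments k-NN disaggregator, as sketched below; the neighbour criterion and data shapes are simplifying assumptions, and real schemes typically also condition on season and the wet/dry state of adjacent days.

```python
import numpy as np

def knn_disaggregate(synth_daily, obs_daily, obs_hourly, k=5, seed=0):
    """Method-of-fragments k-NN disaggregation sketch: for each synthetic
    daily total, pick one of the k observed days with the closest totals
    and rescale its 24-hour profile so the hours sum to the synthetic
    total. obs_daily: (n_days,); obs_hourly: (n_days, 24).
    """
    rng = np.random.default_rng(seed)
    out = []
    for total in synth_daily:
        nn = np.argsort(np.abs(obs_daily - total))[:k]   # nearest days
        pick = rng.choice(nn)
        profile = obs_hourly[pick]
        frac = profile / max(profile.sum(), 1e-12)       # hourly fragments
        out.append(total * frac)
    return np.array(out)                                  # (n_days, 24)
```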
CHARYBDIS: a black hole event generator
NASA Astrophysics Data System (ADS)
Harris, Christopher M.; Richardson, Peter; Webber, Bryan R.
2003-08-01
CHARYBDIS is an event generator which simulates the production and decay of miniature black holes at hadronic colliders as might be possible in certain extra dimension models. It interfaces via the Les Houches accord to general purpose Monte Carlo programs like HERWIG and PYTHIA which then perform the parton evolution and hadronization. The event generator includes the extra-dimensional `grey-body' effects as well as the change in the temperature of the black hole as the decay progresses. Various options for modelling the Planck-scale terminal decay are provided.
NASA Astrophysics Data System (ADS)
Lee, C. C.; Chen, W. S.
2015-06-01
This study examines how the characteristics of the sporadic E-layer (Es-layer) affect the generation of spread-F in the nighttime ionosphere near the crest of the equatorial ionization anomaly during solar minimum. The data on Es-layer parameters and spread-F are obtained from the Chungli ionograms of 1996. The Es-layer parameters include foEs (critical frequency of the Es-layer), fbEs (blanketing frequency of the Es-layer), and Δf (≡foEs-fbEs). Results show that the nighttime variations of the foEs and fbEs medians (Δf medians) are different from (similar to) those of the occurrence probabilities of spread-F. Because the total number of Es-layer events is greater than that of spread-F events, the comparison between the medians of Es-layer parameters and the occurrence probabilities of spread-F may be inadequate. We therefore categorize the Es-layer and spread-F events into each frequency interval of the Es-layer parameters. For the occurrence probabilities of spread-F versus foEs, an increasing trend is found in the post-midnight hours of all three seasons. An increasing trend also exists in the pre-midnight hours of the J-months and in the post-midnight hours of all seasons for the occurrence probabilities of spread-F versus Δf. These results demonstrate that spread-F occurrence increases with increasing foEs and/or Δf. Moreover, the increasing trends indicate that polarization electric fields generated in the Es-layer help to produce spread-F through the electrodynamical coupling of the Es-layer and F-region. Regarding the occurrence probabilities of spread-F versus fbEs, a significant trend appears only in the post-midnight hours of the E-months. This implies that fbEs might not be a major factor in spread-F formation.
Three-dimensional simulation of a rock slide impact into water
NASA Astrophysics Data System (ADS)
Weaver, R.; Gisler, G.; Gittings, M.; Ranta, D.
2007-12-01
The steep-sided fjords of western Norway have experienced numerous rock slide events that sometimes produced devastating tsunamis. The 1934 slide in the Tafjord region, when some 3 million cubic meters of rock plunged into the water, resulted in waves tens of meters high that destroyed two villages and killed about 40 people. A similarly dangerous situation exists now in Sunnylvsfjord, where a major expanding crack in the fjord wall at Aknes threatens to release from 5 to 40 million cubic meters of rock into the water. Such an event would devastate a large region, including the Geiranger Fjord, a UNESCO World Heritage Site that is extremely popular with tourists. The Norwegian Government's Aknes-Tafjord project is responsible for studying and monitoring the potential slide area and for providing adequate warning to protect lives and property. In order to better understand tsunami generation from such events, we have performed three-dimensional, fully compressible hydrodynamical simulations of the impact of a large number of boulders from a steep slope into a deep body of water. We use the Los Alamos/SAIC adaptive-mesh-refinement SAGE code, previously used to model tsunamis from underwater explosions, asteroid impacts, and both subaqueous and subaerial landslide sources. We find the interaction of boulders and water to be extremely turbulent and dissipative. It differs markedly from simulations of large-block impacts in similar geometry. No more than about 15% of the potential energy of the boulders ends up in the water wave. The rest of the energy goes into heating the boulders (and presumably fragmenting them, though that physics is not included), generating winds, heating air and water, and generating turbulence. In the near field, the waves produced by the impact can be quite high, tens of meters, and have the potential to devastate coastlines at substantial distances from the site along a narrow fjord system.
DOE R&D Accomplishments Database
Perl, M. L.
1994-08-01
Several previous papers have given the history of the discovery of the τ lepton at the Stanford Linear Accelerator Center (SLAC). These papers emphasized (a) the experiments which led to our 1975 publication of the first evidence for the existence of the τ, (b) the subsequent experiments which confirmed the existence of the τ, and (c) the experiments which elucidated the major properties of the τ. That history will be summarized in Part 2 of this talk. In this Part 1, I describe the earlier thoughts and work of myself and my colleagues at SLAC in the 1960s and early 1970s which led to the discovery. I also describe the theoretical and experimental events in particle physics in the 1960s in which our work was immersed. I will also try to describe, for the younger generations of particle physicists, the atmosphere in the 1960s. That was before the elucidation of the quark model of hadrons and before the development of the concept of particle generations. The experimental paths to progress were not as clear as they are today, and we had to cast a wide experimental net.
Khachatryan, V.
2015-07-09
A search for pair production of third-generation scalar leptoquarks decaying to top quark and τ lepton pairs is presented using proton-proton collision data at a center-of-mass energy of √s = 8 TeV collected with the CMS detector at the LHC and corresponding to an integrated luminosity of 19.7 fb⁻¹. The search is performed using events that contain an electron or a muon, a hadronically decaying τ lepton, and two or more jets. The observations are found to be consistent with the standard model predictions. Assuming that all leptoquarks decay to a top quark and a τ lepton, the existence of pair-produced, charge −1/3, third-generation leptoquarks up to a mass of 685 GeV is excluded at 95% confidence level. This result constitutes the first direct limit for leptoquarks decaying into a top quark and a τ lepton, and may also be applied directly to the pair production of bottom squarks decaying predominantly via the R-parity violating coupling λ'₃₃₃.
Liu, Biao; Conroy, Jeffrey M.; Morrison, Carl D.; Odunsi, Adekunle O.; Qin, Maochun; Wei, Lei; Trump, Donald L.; Johnson, Candace S.; Liu, Song; Wang, Jianmin
2015-01-01
Somatic Structural Variations (SVs) are a complex collection of chromosomal mutations that could directly contribute to carcinogenesis. Next Generation Sequencing (NGS) technology has emerged as the primary means of interrogating the SVs of the cancer genome in recent investigations. Sophisticated computational methods are required to accurately identify the SV events and delineate their breakpoints from the massive amounts of reads generated by an NGS experiment. In this review, we provide an overview of current analytic tools used for SV detection in NGS-based cancer studies. We summarize the features of common SV groups and the primary types of NGS signatures that can be used in SV detection methods. We discuss the principles and key similarities and differences of existing computational programs and comment on unresolved issues related to this research field. The aim of this article is to provide a practical guide to relevant concepts, computational methods, software tools and important factors for analyzing and interpreting NGS data for the detection of SVs in the cancer genome. PMID:25849937
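As an illustration of the read-pair signature mentioned above, here is a hedged sketch that flags discordant pairs (mates on different chromosomes, or unusually long insert sizes) with pysam. The BAM path and threshold are hypothetical, and real callers go much further, clustering such signatures and combining them with split-read and read-depth evidence.

```python
import pysam

def discordant_pairs(bam_path, min_insert=1000):
    """Collect read pairs whose mapping signature hints at a structural variant:
    mates on different chromosomes, or an unexpectedly large insert size.
    Assumes a coordinate-sorted, indexed BAM file."""
    candidates = []
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam.fetch():
            if read.is_unmapped or read.mate_is_unmapped or not read.is_paired:
                continue
            interchrom = read.reference_name != read.next_reference_name
            long_insert = abs(read.template_length) > min_insert
            if interchrom or long_insert:
                candidates.append((read.query_name, read.reference_name,
                                   read.reference_start, read.next_reference_name))
    return candidates

# hypothetical input file; downstream steps would cluster these signatures
pairs = discordant_pairs("tumor.bam")
```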
Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes
NASA Astrophysics Data System (ADS)
Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana
2015-04-01
The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, which constitute a usual tool in Statistical Process Control (SPC) but an unusual one in climate studies. Here, daily precipitation series for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to determine whether there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme observed rainfall days and normal observed rainfall days. The autocorrelation amongst maximum precipitation is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate the attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes over the remaining years under study can then be monitored by such attributes control charts. The results of the application of this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
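To make the chart mechanics concrete, here is a minimal sketch of a classical p-chart for the annual fraction of extreme rainfall days. It assumes independent days, whereas the study's New Binomial Markov Extended Process explicitly models autocorrelation, so this is only the uncorrected baseline; all names and values are ours.

```python
import numpy as np

def p_chart(extreme_days, n_days=365, baseline_years=30):
    """Classical p-chart for the annual fraction of extreme rainfall days.

    extreme_days : (n_years,) counts of extreme days per year
    Returns per-year fractions, the centre line, and 3-sigma limits fixed
    from the baseline period (here, the first 30 years, as in the study).
    """
    p = np.asarray(extreme_days) / n_days
    p_bar = p[:baseline_years].mean()                 # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / n_days)     # binomial standard error
    ucl, lcl = p_bar + 3 * sigma, max(p_bar - 3 * sigma, 0.0)
    out_of_control = (p > ucl) | (p < lcl)            # signals a model change
    return p, p_bar, lcl, ucl, out_of_control
```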
NASA Technical Reports Server (NTRS)
Hurwitz, Margaret M.; Garfinkel, Chaim I.; Newman, Paul A.; Oman, Luke D.
2013-01-01
Warm pool El Nino (WPEN) events are characterized by positive sea surface temperature (SST) anomalies in the central equatorial Pacific. Under present-day climate conditions, WPEN events generate poleward propagating wavetrains and enhance midlatitude planetary wave activity, weakening the stratospheric polar vortices. The late 21st century extratropical atmospheric response to WPEN events is investigated using the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM), version 2. GEOSCCM simulations are forced by projected late 21st century concentrations of greenhouse gases (GHGs) and ozone-depleting substances (ODSs) and by SSTs and sea ice concentrations from an existing ocean-atmosphere simulation. Despite known ocean-atmosphere model biases, the prescribed SST fields represent a best estimate of the structure of late 21st century WPEN events. The future Arctic vortex response is qualitatively similar to that observed in recent decades but is weaker in late winter. This response reflects the weaker SST forcing in the Nino 3.4 region and subsequently weaker Northern Hemisphere tropospheric teleconnections. The Antarctic stratosphere does not respond to WPEN events in a future climate, reflecting a change in tropospheric teleconnections: The meridional wavetrain weakens while a more zonal wavetrain originates near Australia. Sensitivity simulations show that a strong poleward wavetrain response to WPEN requires a strengthening and southeastward extension of the South Pacific Convergence Zone; this feature is not captured by the late 21st century modeled SSTs. Expected future increases in GHGs and decreases in ODSs do not affect the polar stratospheric responses to WPEN.
Almendros, J.; Chouet, B.; Dawson, P.
2001-01-01
Array data from a seismic experiment carried out at Kilauea Volcano, Hawaii, in February 1997, are analyzed by the frequency-slowness method. The slowness vectors are determined at each of three small-aperture seismic antennas for the first arrivals of 1129 long-period (LP) events and 147 samples of volcanic tremor. The source locations are determined by using a probabilistic method which compares the event azimuths and slownesses with a slowness vector model. The results show that all the LP seismicity, including both discrete LP events and tremor, was generated in the same source region along the east flank of the Halemaumau pit crater, demonstrating the strong relation that exists between the two types of activities. The dimensions of the source region are approximately 0.6 × 1.0 × 0.5 km. For LP events we are able to resolve at least three different clusters of events. The most active cluster is centered ~200 m northeast of Halemaumau at depths shallower than 200 m beneath the caldera floor. A second cluster is located beneath the northeast quadrant of Halemaumau at a depth of ~400 m. The third cluster is <200 m deep and extends southeastward from the northeast quadrant of Halemaumau. Only one source zone is resolved for tremor. This zone is coincident with the most active source zone of LP events, northeast of Halemaumau. The location, depth, and size of the source region suggest a hydrothermal origin for all the analyzed LP seismicity. Copyright 2001 by the American Geophysical Union.
Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei
2017-05-01
Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically on SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Triggers and monitoring in intelligent personal health record.
Luo, Gang
2012-10-01
Although Web-based personal health records (PHRs) have been widely deployed, the existing ones have limited intelligence. Previously, we introduced expert system technology and Web search technology into the PHR domain and proposed the concept of an intelligent PHR (iPHR). iPHR provides personalized healthcare information to facilitate users' daily activities of living. The current iPHR is passive and follows the pull model of information distribution. This paper introduces triggers and monitoring into iPHR to make iPHR active. Our idea is to let medical professionals pre-compile triggers and store them in iPHR's knowledge base. Each trigger corresponds to an abnormal event that may have potential medical impact. iPHR keeps collecting, processing, and analyzing the user's medical data from various sources such as wearable sensors. Whenever an abnormal event is detected in the user's medical data, the corresponding trigger fires and the related personalized healthcare information is pushed to the user using natural language generation technology, expert system technology, and Web search technology.
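The trigger concept can be sketched compactly. Below is a minimal, hypothetical illustration (not the iPHR implementation) of pre-compiled triggers that fire when an abnormal event is detected in incoming user data and push the associated advice; all names and thresholds are invented.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Trigger:
    name: str
    condition: Callable[[dict], bool]   # fires when an abnormal event is detected
    advice: str                         # personalized information pushed to the user

# stand-in for the knowledge base of triggers pre-compiled by professionals
knowledge_base = [
    Trigger("tachycardia",
            lambda r: r.get("heart_rate", 0) > 120,
            "Resting heart rate is unusually high; consider contacting your clinician."),
    Trigger("hyperglycemia",
            lambda r: r.get("glucose_mgdl", 0) > 250,
            "Blood glucose is above the safe range; review your medication plan."),
]

def monitor(reading: dict):
    """Evaluate each pre-compiled trigger against an incoming sensor reading."""
    return [t.advice for t in knowledge_base if t.condition(reading)]

print(monitor({"heart_rate": 132, "glucose_mgdl": 140}))  # fires the first trigger
```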
Decoding the regulatory landscape of melanoma reveals TEADS as regulators of the invasive cell state
Verfaillie, Annelien; Imrichova, Hana; Atak, Zeynep Kalender; Dewaele, Michael; Rambow, Florian; Hulselmans, Gert; Christiaens, Valerie; Svetlichnyy, Dmitry; Luciani, Flavie; Van den Mooter, Laura; Claerhout, Sofie; Fiers, Mark; Journe, Fabrice; Ghanem, Ghanem-Elias; Herrmann, Carl; Halder, Georg; Marine, Jean-Christophe; Aerts, Stein
2015-01-01
Transcriptional reprogramming of proliferative melanoma cells into a phenotypically distinct invasive cell subpopulation is a critical event at the origin of metastatic spreading. Here we generate transcriptome, open chromatin and histone modification maps of melanoma cultures, and integrate these data with existing transcriptome and DNA methylation profiles from tumour biopsies to gain insight into the mechanisms underlying this key reprogramming event. This reveals thousands of genomic regulatory regions underlying the proliferative and invasive states, identifying SOX10/MITF and AP-1/TEAD as regulators, respectively. Knockdown of TEADs shows a previously unrecognized role in the invasive gene network and establishes a causative link between these transcription factors, cell invasion and sensitivity to MAPK inhibitors. Using regulatory landscapes and in silico analysis, we show that transcriptional reprogramming underlies the distinct cellular states present in melanoma. Furthermore, it reveals an essential role for the TEADs, linking them to clinically relevant mechanisms such as invasion and resistance. PMID:25865119
Major earthquakes recorded by Speleothems in Midwestern U.S. caves
Panno, S.V.; Lundstrom, C.C.; Hackley, Keith C.; Curry, B. Brandon; Fouke, B.W.; Zhang, Z.
2009-01-01
Historic earthquakes generated by the New Madrid seismic zone represent some of the largest recorded in the United States, yet prehistoric events are recognized only through deformation in late-Wisconsin to Holocene-age, near-surface sediments (liquefaction, monoclinal folding, and changes in river meanders). In this article, we show that speleothems in caves of southwestern Illinois and southeastern Missouri may constitute a previously unrecognized recorder of large earthquakes in the U.S. midcontinent region. The timing of the initiation and regrowth of stalagmites in southwestern Illinois and southeastern Missouri caves is consistent with the historic and prehistoric record of several known seismic events in the U.S. midcontinent region. We conclude that dating the initiation of original stalagmite growth and later postearthquake rejuvenation constitutes a new paleoseismic method that has the potential to be applied to any region around the world in the vicinity of major seismic zones where caves exist. Use of this technique could expand the geographical distribution of paleoseismic data, document prehistoric earthquakes, and help improve interpretations of paleoearthquakes.
Basic experiments during loss of vacuum event (LOVE) in fusion experimental reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogawa, Masuro; Kunugi, Tomoaki; Seki, Yasushi
If a loss of vacuum event (LOVE) occurs due to damage of the vacuum vessel of a nuclear fusion experimental reactor, some chemical reactions such as graphite oxidation and a buoyancy-driven exchange flow take place after equalization of the gas pressure between the inside and outside of the vacuum vessel. The graphite oxidation would generate inflammable carbon monoxide and release tritium retained in the graphite. The exchange flow through the breaches may transport the carbon monoxide and tritium out of the vacuum vessel. To add confidence to the safety evaluations and analyses, it is important to grasp the basic phenomena such as the exchange flow and the graphite oxidation. Experiments on the exchange flow and the graphite oxidation were carried out to obtain the exchange flow rate and the rate constant for the carbon monoxide combustion, respectively. These experimental results were compared with existing correlations. The authors plan a scaled-model test and a full-scale model test for the LOVE.
Generating functions and stability study of multivariate self-excited epidemic processes
NASA Astrophysics Data System (ADS)
Saichev, A. I.; Sornette, D.
2011-09-01
We present a stability study of the class of multivariate self-excited Hawkes point processes, that can model natural and social systems, including earthquakes, epileptic seizures and the dynamics of neuron assemblies, bursts of exchanges in social communities, interactions between Internet bloggers, bank network fragility and cascading of failures, national sovereign default contagion, and so on. We present the general theory of multivariate generating functions to derive the number of events over all generations of various types that are triggered by a mother event of a given type. We obtain the stability domains of various systems, as a function of the topological structure of the mutual excitations across different event types. We find that mutual triggering tends to provide a significant extension of the stability (or subcritical) domain compared with the case where event types are decoupled, that is, when an event of a given type can only trigger events of the same type.
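The stability criterion itself is easy to state: a multivariate Hawkes process is subcritical when the spectral radius of its branching matrix is below 1. The sketch below, with illustrative numbers of our own choosing, checks this condition and shows how coupling a type whose total offspring exceeds one to a strongly subcritical type can keep the whole system stable, in the spirit of the extended stability domain described above.

```python
import numpy as np

def is_subcritical(branching):
    """A multivariate Hawkes process is stable (subcritical) when the spectral
    radius of its branching matrix n[i, j] (the mean number of first-generation
    events of type j triggered by one event of type i) is below 1."""
    rho = np.max(np.abs(np.linalg.eigvals(np.asarray(branching, dtype=float))))
    return rho < 1.0, rho

# two decoupled event types, each individually near criticality
print(is_subcritical([[0.95, 0.00], [0.00, 0.95]]))  # stable, rho = 0.95

# a type producing 1.2 offspring in total (0.9 + 0.3 per event) can still sit
# inside the stability domain when it feeds a strongly subcritical type
print(is_subcritical([[0.90, 0.30], [0.10, 0.60]]))  # stable, rho ~= 0.98
```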
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birkholzer, Jens; Pruess, Karsten; Lewicki, Jennifer
2005-09-19
While the purpose of geologic storage of CO2 in deep saline formations is to trap greenhouse gases underground, the potential exists for CO2 to escape from the target reservoir, migrate upward along permeable pathways, and discharge at the land surface. Such discharge is not necessarily a serious concern, as CO2 is a naturally abundant and relatively benign gas in low concentrations. However, there is a potential risk to health, safety and environment (HSE) in the event that large localized fluxes of CO2 were to occur at the land surface, especially where CO2 could accumulate. In this paper, we develop possible scenarios for large CO2 fluxes based on the analysis of natural analogues, where large releases of gas have been observed. We are particularly interested in scenarios which could generate sudden, possibly self-enhancing, or even eruptive release events. The probability for such events may be low, but the circumstances under which they might occur and potential consequences need to be evaluated in order to design appropriate site selection and risk management strategies. Numerical modeling of hypothetical test cases is needed to determine critical conditions for such events, to evaluate whether such conditions may be possible at designated storage sites, and, if applicable, to evaluate the potential HSE impacts of such events and design appropriate mitigation strategies.
Molecular toolbox for the identification of unknown genetically modified organisms.
Ruttink, Tom; Demeyer, Rolinde; Van Gulck, Elke; Van Droogenbroeck, Bart; Querci, Maddalena; Taverniers, Isabel; De Loose, Marc
2010-03-01
Competent laboratories monitor genetically modified organisms (GMOs) and products derived thereof in the food and feed chain in the framework of labeling and traceability legislation. In addition, screening is performed to detect the unauthorized presence of GMOs including asynchronously authorized GMOs or GMOs that are not officially registered for commercialization (unknown GMOs). Currently, unauthorized or unknown events are detected by screening blind samples for commonly used transgenic elements, such as p35S or t-nos. If (1) positive detection of such screening elements shows the presence of transgenic material and (2) all known GMOs are tested by event-specific methods but are not detected, then the presence of an unknown GMO is inferred. However, such evidence is indirect because it is based on negative observations and inconclusive because the procedure does not identify the causative event per se. In addition, detection of unknown events is hampered in products that also contain known authorized events. Here, we outline alternative approaches for analytical detection and GMO identification and develop new methods to complement the existing routine screening procedure. We developed a fluorescent anchor-polymerase chain reaction (PCR) method for the identification of the sequences flanking the p35S and t-nos screening elements. Thus, anchor-PCR fingerprinting allows the detection of unique discriminative signals per event. In addition, we established a collection of in silico calculated fingerprints of known events to support interpretation of experimentally generated anchor-PCR GM fingerprints of blind samples. Here, we first describe the molecular characterization of a novel GMO, which expresses recombinant human intrinsic factor in Arabidopsis thaliana. Next, we purposefully treated the novel GMO as a blind sample to simulate how the new methods lead to the molecular identification of a novel unknown event without prior knowledge of its transgene sequence. The results demonstrate that the new methods complement routine screening procedures by providing direct conclusive evidence and may also be useful to resolve masking of unknown events by known events.
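The in silico fingerprint collection supports a simple matching step. The sketch below, with entirely hypothetical fragment sizes and event names, compares an observed anchor-PCR fingerprint against stored in silico fingerprints of known events; poor matches across all known events would point toward an unknown GMO.

```python
def match_fingerprint(observed, known_events, tol=2):
    """Compare an observed anchor-PCR fragment-size fingerprint (in bp) with
    in silico fingerprints of known GM events; low scores across all known
    events hint at an unknown event."""
    scores = {}
    for event, expected in known_events.items():
        hits = sum(any(abs(o - e) <= tol for e in expected) for o in observed)
        scores[event] = hits / len(observed)
    return scores

# hypothetical fragment sizes flanking the p35S screening element
known = {"EventA": [118, 231, 402], "EventB": [97, 231, 350]}
print(match_fingerprint([119, 230, 500], known))  # only partial matches
```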
Emotional intensity in episodic autobiographical memory and counterfactual thinking.
Stanley, Matthew L; Parikh, Natasha; Stewart, Gregory W; De Brigard, Felipe
2017-02-01
Episodic counterfactual thoughts-imagined alternative ways in which personal past events might have occurred-are frequently accompanied by intense emotions. Here, participants recollected positive and negative autobiographical memories and then generated better and worse episodic counterfactual events from those memories. Our results suggest that the projected emotional intensity during the simulated remembered/imagined event is significantly higher than but typically positively related to the emotional intensity while remembering/imagining the event. Furthermore, repeatedly simulating counterfactual events heightened the emotional intensity felt while simulating the counterfactual event. Finally, for both the emotional intensity accompanying the experience of remembering/imagining and the projected emotional intensity during the simulated remembered/imagined event, the emotional intensity of negative memories was greater than the emotional intensity of upward counterfactuals generated from them but lower than the emotional intensity of downward counterfactuals generated from them. These findings are discussed in relation to clinical work and functional theories of counterfactual thinking. Copyright © 2017 Elsevier Inc. All rights reserved.
Battaglia, J.; Got, J.-L.; Okubo, P.
2003-01-01
We present methods for improving the location of long-period (LP) events, deep and shallow, recorded below Kilauea Volcano by the permanent seismic network. LP events might be of particular interest to understanding eruptive processes as their source mechanism is assumed to directly involve fluid transport. However, it is usually difficult or impossible to locate their source using traditional arrival time methods because of emergent wave arrivals. At Kilauea, similar LP waveform signatures suggest the existence of LP multiplets. The waveform similarity suggests spatially close sources, while catalog solutions using arrival time estimates are widely scattered beneath Kilauea's summit caldera. In order to improve estimates of absolute LP location, we use the distribution of seismic amplitudes corrected for station site effects. The decay of the amplitude as a function of hypocentral distance is used for inferring LP location. In a second stage, we use the similarity of the events to calculate their relative positions. The analysis of the entire LP seismicity recorded between January 1997 and December 1999 suggests that a very large part of the LP event population, both deep and shallow, is generated by a small number of compact sources. Deep events are systematically composed of a weak high-frequency onset followed by a low-frequency wave train. Aligning the low-frequency wave trains does not lead to aligning the onsets indicating the two parts of the signal are dissociated. This observation favors an interpretation in terms of triggering and resonance of a magmatic conduit. Instead of defining fault planes, the precise relocation of similar LP events, based on the alignment of the high-energy low-frequency wave trains, defines limited size volumes. Copyright 2003 by the American Geophysical Union.
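The amplitude-based first stage can be illustrated with a toy grid search. The sketch below assumes a bare geometrical-spreading decay A = A0/r for site-corrected amplitudes (anelastic attenuation omitted) and is only a simplified stand-in for the probabilistic formulation used in the study; all names are ours.

```python
import numpy as np

def locate_by_amplitude(stations, amps, grid):
    """Grid-search source location from site-corrected amplitudes, assuming a
    simple A = A0 / r geometrical-spreading decay (anelastic term omitted).

    stations : (n, 3) station coordinates; amps : (n,) corrected amplitudes
    grid     : (m, 3) trial source positions
    Returns the trial point minimizing the scatter of log A + log r, which
    should be a constant (log A0) if the decay model holds.
    """
    best, best_misfit = None, np.inf
    for p in grid:
        r = np.linalg.norm(stations - p, axis=1)
        log_a0 = np.log(amps) + np.log(r)   # constant across stations if model holds
        misfit = log_a0.std()               # spread around a common source amplitude
        if misfit < best_misfit:
            best, best_misfit = p, misfit
    return best, best_misfit
```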
NASA Astrophysics Data System (ADS)
Boslough, M.
2011-12-01
Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot be measured directly, it cannot be predicted. However, changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based on global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative density function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g. agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.
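The contract-family construction can be sketched in a few lines: read each contract price as an exceedance probability, form the cumulative distribution, and interpolate the median and spread. The prices below are invented purely for illustration.

```python
import numpy as np

# hypothetical contract family: "anomaly > T" with last-trade prices in [0, 1]
thresholds = np.array([0.45, 0.55, 0.65, 0.75, 0.85])    # deg C
exceed_price = np.array([0.95, 0.80, 0.50, 0.20, 0.05])  # read as P(anomaly > T)

cdf = 1.0 - exceed_price                            # market-based cumulative distribution
best_estimate = np.interp(0.5, cdf, thresholds)     # median as the best estimate
spread = np.interp(0.95, cdf, thresholds) - np.interp(0.05, cdf, thresholds)
print(best_estimate, spread)                        # 0.65 and ~0.4 for these prices
```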
System on chip module configured for event-driven architecture
Robbins, Kevin; Brady, Charles E.; Ashlock, Tad A.
2017-10-17
A system on chip (SoC) module is described herein, wherein the SoC module comprises a processor subsystem and a hardware logic subsystem. The processor subsystem and hardware logic subsystem are in communication with one another and transmit event messages between one another. The processor subsystem executes software actors, while the hardware logic subsystem includes hardware actors. The software actors and hardware actors conform to an event-driven architecture, such that the software actors receive and generate event messages and the hardware actors receive and generate event messages.
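A software analogue of the described actor pattern is easy to sketch. The toy Python below, a stand-in for the SoC's software and hardware actors rather than the module itself, shows actors that receive event messages from a shared queue and generate new ones in response.

```python
import queue

class Actor:
    """Base actor: receives event messages and may generate new ones."""
    def __init__(self, name, bus):
        self.name, self.bus = name, bus
    def handle(self, event):
        raise NotImplementedError

class Doubler(Actor):
    def handle(self, event):
        # receiving one event message may generate another on the bus
        if event["type"] == "sample":
            self.bus.put({"type": "doubled", "value": 2 * event["value"]})

bus = queue.Queue()                       # stand-in for the inter-subsystem channel
actors = [Doubler("doubler", bus)]
bus.put({"type": "sample", "value": 21})  # an initial event message
while not bus.empty():
    ev = bus.get()
    print("event:", ev)
    for a in actors:
        a.handle(ev)
```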
Modeling of the Geosocial Process using GIS «Disasters»
NASA Astrophysics Data System (ADS)
Vikulina, Marina; Turchaninova, Alla; Dolgaya, Anna; Vikulin, Alexandr; Petrova, Elena
2016-04-01
Natural and social disasters generate huge stress in the world community. Most studies searching for relationships between different catastrophic events consider limited sets of disasters and do not take their size into account. This puts in doubt the completeness and statistical significance of such an approach. Thus the next indispensable step is to pass from narrow, subject-specific studies of disasters to more comprehensive ones. In order to study the relationships between Nature and Society, a database of natural disasters and dreadful social events that occurred during the last XXXVI (36) centuries of human history, weighted by magnitude, was created and became the core of the GIS «Disasters» (ArcGIS 10.0). To date the database includes more than 2500 of the most socially significant ("strong") catastrophic natural events (earthquakes, fires, floods, droughts, climatic anomalies, other natural disasters) as well as social ones (wars, revolts, genocide, epidemics, fires caused by human beings, other social disasters). So far, each event is presented as a point feature located in the center of the affected region on the world map. If the event affects several countries, it is placed in the approximate center of the affected area. Every event refers to the country or group of countries currently located in its zone of influence. The grade J (I, II and III) is specified for each event according to the disaster force assessment scale developed by the authors. A GIS with such a detailed database of disastrous events, weighted by magnitude over a long period of time, has been compiled for the first time and creates a fairly complete and statistically representative basis for studies of the distribution of natural and social disasters and their relationship. To date, the statistical analysis of the database, performed both for each aggregate (natural disasters and catastrophic social phenomena) and for particular statistically representative types of events, has led to the following conclusions: natural disasters and dreadful social events appear to be closely related to each other despite their apparently different nature. The numbers of events of different magnitude follow a logarithmic law: the larger the event, the less likely it is to happen. For each type of event and each aggregate, the existence of periodicities with periods of 280 ± 60 years was established. The identified properties of cyclicity, grouping and interaction create a basis for modeling an essentially unified Geosocial Process at a sufficiently high statistical level and support the existence of a uniform planetary Geosocial Process. The evidence of interaction between "lifeless" Nature and Society is fundamental and provides a new approach to forecasting demographic crises that takes into account both natural disasters and social phenomena. The idea of the interaction of Nature and Society through an «exchange» of disasters, as a uniform planetary Geosocial Process, is an essentially new statement.
76 FR 49769 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-11
... partner or limited partner, (4) adding event types to the 4(k) schedule, (5) requiring the reporting of...: FR Y-10, FR Y-6, and FR Y-7. OMB control number: 7100-0297. Frequency: FR Y-10: Event-generated; FR Y... an event generated information collection submitted by FBOs; top-tier BHCs; state member banks...
NASA Astrophysics Data System (ADS)
Kersevan, Borut Paul; Richter-Waş, Elzbieta
2013-03-01
The AcerMC Monte Carlo generator is dedicated to the generation of Standard Model background processes which were recognised as critical for the searches at the LHC, and whose generation was either unavailable or not straightforward so far. The program itself provides a library of the massive matrix elements (coded by MADGRAPH) and native phase space modules for generation of a set of selected processes. The hard process event can be completed by the initial and the final state radiation, hadronisation and decays through the existing interface with either PYTHIA, HERWIG or ARIADNE event generators and (optionally) TAUOLA and PHOTOS. Interfaces to all these packages are provided in the distribution version. The phase-space generation is based on the multi-channel self-optimising approach using the modified Kajantie-Byckling formalism for phase space construction, and further smoothing of the phase space was obtained by using a modified ac-VEGAS algorithm. An additional improvement in the recent versions is the inclusion of a consistent prescription for matching the matrix element calculations with parton showering for a selected list of processes. Catalogue identifier: ADQQ_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADQQ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3853309 No. of bytes in distributed program, including test data, etc.: 68045728 Distribution format: tar.gz Programming language: FORTRAN 77 with popular extensions (g77, gfortran). Computer: All running Linux. Operating system: Linux. Classification: 11.2, 11.6. External routines: CERNLIB (http://cernlib.web.cern.ch/cernlib/), LHAPDF (http://lhapdf.hepforge.org/) Catalogue identifier of previous version: ADQQ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 149(2003)142 Does the new version supersede the previous version?: Yes Nature of problem: Despite the large repertoire of processes implemented for generation in event generators like PYTHIA [1] or HERWIG [2], a number of background processes crucial for studying the expected physics of the LHC experiments are missing. For some of these processes the matrix element expressions are rather lengthy, and/or to achieve a reasonable generation efficiency it is necessary to tailor the phase space selection procedure to the dynamics of the process. That is why it is not practical to expect that any of the above general purpose generators will contain every process, or even every observable one, that will occur in LHC collisions. A more practical solution can be found in a library of dedicated matrix-element-based generators, with standardised interfaces like that proposed in [3], to the more universal one which is used to complete the event generation. Solution method: The AcerMC Event Generator provides a library of the matrix-element-based generators for several processes. The initial- and final-state showers, beam remnants and underlying events, fragmentation and remaining decays are supposed to be performed by the other universal generator to which this one is interfaced. We will call it a supervising generator. The interfaces to PYTHIA 6.4, ARIADNE 4.1 and HERWIG 6.5, as such generators, are provided.
Provided is also an interface to the TAUOLA [4] and PHOTOS [5] packages for τ-lepton decays (including spin correlations treatment) and QED radiation in decays of particles. At present, the following matrix-element-based processes have been implemented: gg,qq¯→tt¯bb¯; qq¯→W(→ℓν)bb¯; qq¯→W(→ℓν)tt¯; gg,qq¯→Z/γ∗(→ℓℓ)bb¯; gg,qq¯→Z/γ∗(→ℓℓ,νν,bb¯)tt¯; complete EW gg,qq¯→(Z/W/γ∗→)tt¯bb¯; gg,qq¯→tt¯tt¯; gg,qq¯→(tt¯→)ff¯bff¯b¯; gg,qq¯→(WWbb→)ff¯ff¯bb¯. Both interfaces allow the use of the LHAPDF/LHAGLUE library of parton density functions. Provided is also a set of control processes: qq¯→W→ℓν; qq¯→Z/γ∗→ℓℓ; gg,qq¯→tt¯ and gg→(tt¯→)WbWb¯. Reasons for new version: Implementation of several new processes and methods. Summary of revisions: Each version added new processes or functionalities; a detailed list is given in the section "Changes since AcerMC 1.0". Restrictions: The package is optimised for the 14 TeV pp collisions simulated in the LHC environment and also works at the achieved LHC energies of 7 TeV and 8 TeV. The consistency between results of the complete generation using the PYTHIA 6.4 or HERWIG 6.5 interfaces is technically limited by the different approaches taken in both these generators for evaluating the αQCD and αQED couplings and by the different models for fragmentation/hadronisation. For consistency checks, the AcerMC library contains natively coded definitions of the αQCD and αQED couplings. Using these native definitions leads to the same total cross-sections with both the PYTHIA 6.4 and HERWIG 6.5 interfaces.
Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services
NASA Astrophysics Data System (ADS)
Collins, Patrick; Bahr, Thomas
2016-04-01
The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has a significant potential for 3D topographic change detection. The present case study uses the latest point cloud generation and analysis capabilities to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m2. It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEMTM as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics runs via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEMTM product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets:
• A dense image-matching algorithm is used to identify corresponding points in the two images.
• A block adjustment is applied to refine the 3D coordinates that describe the scene geometry.
• Additionally, the WorldDEMTM was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line.
The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps:
• Adding the geoid component (EGM 2008) to the post-DEM.
• Pre-DEM reprojection to the UTM Zone 43N (WGS-84) coordinate system and resizing.
• Subtraction of the pre-DEM from the post-DEM.
• Filtering and threshold based classification of the DEM difference to analyze the surface changes in 3D.
The automated point cloud generation and analysis introduced here can be embedded in virtually any existing geospatial workflow for operational applications. Three integration options were implemented in this case study:
• Integration within any ArcGIS environment whether deployed on the desktop, in the cloud, or online. Execution uses a customized ArcGIS script tool. A Python script file retrieves the parameters from the user interface and runs the precompiled IDL code. That IDL code is used to interface between the Python script and the relevant ENVITasks.
• Publishing the point cloud processing tasks as services via the ENVI Services Engine (ESE). ESE is a cloud-based image analysis solution to publish and deploy advanced ENVI image and data analytics to existing enterprise infrastructures. For this purpose the entire IDL code can be capsuled in a single ENVITask.
• Integration in an existing geospatial workflow using the Python-to-IDL Bridge. This mechanism allows calling IDL code within Python on a user-defined platform.
The results of this case study allow a 3D estimation of the topographic changes within the tectonically active and anthropogenically invaded Malin area after the landslide event. Accordingly, the point cloud analysis was correlated successfully with modelled displacement contours of the slope.
Based on optical satellite imagery, such point clouds of high precision and density distribution can be obtained in a few minutes to support the operational monitoring of landslide processes.
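The DEM-differencing post-processing reduces to a few array operations once the rasters are co-registered. Below is a minimal numpy sketch with an invented change threshold, deliberately omitting the ENVI/IDL specifics; it stands in for the subtraction, filtering and threshold classification steps listed above.

```python
import numpy as np

def classify_change(pre_dem, post_dem, threshold=2.0):
    """Difference two co-registered DEMs (same grid, same vertical datum) and
    classify cells as erosion, deposition, or no significant change."""
    diff = post_dem - pre_dem
    classes = np.zeros(diff.shape, dtype=np.int8)   # 0 = no significant change
    classes[diff > threshold] = 1                   # deposition / accumulation
    classes[diff < -threshold] = -1                 # erosion / material loss
    return diff, classes

# toy 2x2 rasters standing in for the resampled pre- and post-event DEMs
pre = np.array([[100.0, 101.0], [102.0, 103.0]])
post = np.array([[100.5, 96.0], [102.2, 108.0]])
print(classify_change(pre, post)[1])   # [[0, -1], [0, 1]]
```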
Multiple runoff processes and multiple thresholds control agricultural runoff generation
NASA Astrophysics Data System (ADS)
Saffarpour, Shabnam; Western, Andrew W.; Adams, Russell; McDonnell, Jeffrey J.
2016-11-01
Thresholds and hydrologic connectivity associated with runoff processes are a critical concept for understanding catchment hydrologic response at the event timescale. To date, most attention has focused on single runoff response types, and the role of multiple thresholds and flow path connectivities has not been made explicit. Here we first summarise existing knowledge on the interplay between thresholds, connectivity and runoff processes at the hillslope-small catchment scale into a single figure and use it in examining how runoff response and the catchment threshold response to rainfall affect a suite of runoff generation mechanisms in a small agricultural catchment. A 1.37 ha catchment in the Lang Lang River catchment, Victoria, Australia, was instrumented and hourly data of rainfall, runoff, shallow groundwater level and isotope water samples were collected. The rainfall, runoff and antecedent soil moisture data together with water levels at several shallow piezometers are used to identify runoff processes in the study site. We use isotope and major ion results to further support the findings of the hydrometric data. We analyse 60 rainfall events that produced 38 runoff events over two runoff seasons. Our results show that the catchment hydrologic response was typically controlled by the Antecedent Soil Moisture Index and rainfall characteristics. There was a strong seasonal effect in the antecedent moisture conditions that led to marked seasonal-scale changes in runoff response. Analysis of shallow well data revealed that streamflows early in the runoff season were dominated primarily by saturation excess overland flow from the riparian area. As the runoff season progressed, the catchment soil water storage increased and the hillslopes connected to the riparian area. The hillslopes transferred a significant amount of water to the riparian zone during and following events. Then, during a particularly wet period, this connectivity to the riparian zone, and ultimately to the stream, persisted between events for a period of 1 month. These findings are supported by isotope results which showed the dominance of pre-event water, together with significant contributions of event water early (rising limb and peak) in the event hydrograph. Based on a combination of various hydrometric analyses and some isotope and major ion data, we conclude that event runoff at this site is typically a combination of subsurface event flow and saturation excess overland flow. However, during high intensity rainfall events, flashy catchment flow was observed even though the soil moisture threshold for activation of subsurface flow was not exceeded. We hypothesise that this was due to the activation of infiltration excess overland flow and/or fast lateral flow through preferential pathways on the hillslope and saturation overland flow from the riparian zone.
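The multiple-threshold behaviour summarized above can be caricatured as a small decision rule. The sketch below uses invented threshold values purely to illustrate the logic (moisture-controlled subsurface and saturation-excess response versus intensity-driven flashy response), not the catchment's fitted thresholds.

```python
def runoff_mechanism(asmi, rain_total_mm, rain_intensity_mmh,
                     asmi_thresh=0.35, intensity_thresh=10.0):
    """Toy classifier for the event-scale runoff mechanisms described above.
    All thresholds are illustrative, not values fitted to the study catchment."""
    if asmi > asmi_thresh and rain_total_mm > 5.0:
        # wet catchment: hillslopes connect to the riparian zone and the stream
        return "subsurface event flow + saturation excess overland flow"
    if rain_intensity_mmh > intensity_thresh:
        # dry catchment but intense rain: flashy response without the moisture threshold
        return "infiltration excess / fast preferential flow (flashy response)"
    return "little or no runoff"

print(runoff_mechanism(asmi=0.40, rain_total_mm=20, rain_intensity_mmh=4))
print(runoff_mechanism(asmi=0.20, rain_total_mm=8, rain_intensity_mmh=15))
```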
A trick to improve the efficiency of generating unweighted B events from BCVEGPY
NASA Astrophysics Data System (ADS)
Wang, Xian-You; Wu, Xing-Gang
2012-02-01
In the present paper, we provide an addendum to improve the efficiency of generating unweighted events within the PYTHIA environment for the generator BCVEGPY2.1 [C.H. Chang, J.X. Wang, X.G. Wu, Comput. Phys. Commun. 174 (2006) 241]. This trick is helpful for experimental simulation. Moreover, the BCVEGPY output has also been improved, i.e. one Les Houches Event common block has been added so as to generate a standard Les Houches Event file that contains the information of the generated B meson and the accompanying partons, which can be more conveniently used for further simulation. New version program summary. Title of program: BCVEGPY2.1a Catalogue identifier: ADTJ_v2_2 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTJ_v2_2.html Program obtained from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 166 133 No. of bytes in distributed program, including test data, etc.: 1 655 390 Distribution format: tar.gz Programming language used: FORTRAN 77/90 Computer: Any LINUX-based PC with FORTRAN 77 or FORTRAN 90 and the GNU C compiler Operating systems: LINUX RAM: About 2.0 MB Classification: 11.2, 11.5 Catalogue identifier of previous version: ADTJ_v2_1 Reference in CPC: Comput. Phys. Commun. 175 (2006) 624 Does the new version supersede the old program: No Nature of physical problem: Hadronic production of the B meson and its excited states. Method of solution: To generate weighted and unweighted B events within the PYTHIA environment effectively. Restrictions on the complexity of the problem: Hadronic production of (cb¯)-quarkonium via the gluon-gluon fusion mechanism is given by the 'complete calculation approach'. The simulation of B events is done within the PYTHIA environment. Reasons for new version: More and more data are accumulated at the large hadronic collider, making it possible to perform precise studies of B meson properties, such as its lifetime and mass spectrum. BCVEGPY has been adopted by several experimental groups due to its high efficiency in comparison to that of PYTHIA. However, generating unweighted events with the PYTHIA inner mechanism as programmed in the previous version is still time-consuming, so it is helpful to improve the efficiency of generating unweighted events within PYTHIA. Moreover, it is better to use a uniform and standard output format for further detector simulation. Typical running time: Typical running time is machine and user-parameter dependent. I) To generate 10^6 weighted S-wave (cb¯)-quarkonium events (IDWTUP = 3), it will take about 40 minutes on a 1.8 GHz Intel P4-processor machine. II) To generate unweighted S-wave (cb¯)-quarkonium events with the PYTHIA inner structure (IDWTUP = 1), it will take about 20 hours on a 1.8 GHz Intel P4-processor machine to generate 1000 events. III) To generate 10^6 unweighted S-wave (cb¯)-quarkonium events with the present trick (IDWTUP = 1), it will take 17 hours on a 3.16 GHz Intel E8500 processor machine. Moreover, it can be found that the running time for the P-wave (cb¯)-quarkonium production is about two times longer than for the S-wave production under the same conditions.
Keywords: Event generator; Hadronic production; B meson; Unweighted events. Summary of revisions: (1) The generator BCVEGPY [1-3] has been programmed to generate B events under the PYTHIA environment [4] and has been frequently adopted for theoretical and experimental studies, e.g. Refs. [5-18]. Each experimental group has its own simulation software architecture, and users would otherwise spend a lot of time writing an interface to implement BCVEGPY into their own software, so it is better to supply a standard output. The LHE format is a standard format [19] proposed to store process and event information from matrix-element-based generators; users can pass this parton-level information to general event generators like PYTHIA and HERWIG [20] for further simulation. For this purpose, we add two common blocks in genevent.F. One common block is called bcvegpy_pyupin and the other write_lhe. The bcvegpy_pyupin, which is similar to the PYUPIN subroutine in PYTHIA, stores the initialization information in the HEPRUP common block:

      INTEGER MAXPUP
      PARAMETER (MAXPUP = 100)
      INTEGER IDBMUP,PDFGUP,PDFSUP,IDWTUP,NPRUP,LPRUP
      DOUBLE PRECISION EBMUP,XSECUP,XERRUP,XMAXUP
      COMMON/HEPRUP/IDBMUP(2),EBMUP(2),PDFGUP(2),PDFSUP(2),
     &IDWTUP,NPRUP,XSECUP(MAXPUP),XERRUP(MAXPUP),
     &XMAXUP(MAXPUP),LPRUP(MAXPUP)

The write_lhe, which is similar to the PYUPEV subroutine in PYTHIA, stores the information of each separate event in the HEPEUP common block:

      INTEGER MAXNUP
      PARAMETER (MAXNUP = 500)
      INTEGER NUP,IDPRUP,IDUP,ISTUP,MOTHUP,ICOLUP
      DOUBLE PRECISION XWGTUP,SCALUP,AQEDUP,AQCDUP,PUP,VTIMUP,
     &SPINUP
      COMMON/HEPEUP/NUP,IDPRUP,XWGTUP,SCALUP,AQEDUP,AQCDUP,
     &IDUP(MAXNUP),ISTUP(MAXNUP),MOTHUP(2,MAXNUP),
     &ICOLUP(2,MAXNUP),PUP(5,MAXNUP),VTIMUP(MAXNUP),
     &SPINUP(MAXNUP)
Episodic and semantic content of memory and imagination: A multilevel analysis.
Devitt, Aleea L; Addis, Donna Rose; Schacter, Daniel L
2017-10-01
Autobiographical memories of past events and imaginations of future scenarios comprise both episodic and semantic content. Correlating the amount of "internal" (episodic) and "external" (semantic) details generated when describing autobiographical events can illuminate the relationship between the processes supporting these constructs. Yet previous studies performing such correlations were limited by aggregating data across all events generated by an individual, potentially obscuring the underlying relationship within the events themselves. In the current article, we reanalyzed datasets from eight studies using a multilevel approach, allowing us to explore the relationship between internal and external details within events. We also examined whether this relationship changes with healthy aging. Our reanalyses demonstrated a largely negative relationship between the internal and external details produced when describing autobiographical memories and future imaginations. This negative relationship was stronger and more consistent for older adults and was evident both in direct and indirect measures of semantic content. Moreover, this relationship appears to be specific to episodic tasks, as no relationship was observed for a nonepisodic picture description task. This negative association suggests that people do not generate semantic information indiscriminately, but do so in a compensatory manner, to embellish episodically impoverished events. Our reanalysis further lends support for dissociable processes underpinning episodic and semantic information generation when remembering and imagining autobiographical events.
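The within-event multilevel approach can be sketched with a random-intercept model: events are nested within participants, so a per-subject intercept absorbs between-person differences and the fixed slope captures the within-level internal-external relationship. The data and column names below are invented for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical long-format data: one row per described event
df = pd.DataFrame({
    "subject":  [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "internal": [12, 20, 8, 15, 9, 22, 7, 14, 18],   # episodic detail counts
    "external": [9, 4, 11, 6, 10, 3, 12, 8, 5],      # semantic detail counts
})

# events nested within participants: a random intercept per subject captures
# between-person differences, leaving the within-event association of interest
model = smf.mixedlm("external ~ internal", df, groups=df["subject"]).fit()
print(model.params["internal"])  # a negative slope mirrors the reported pattern
```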
Yokotsuka, M; Aoyama, M; Kubota, K
2000-07-01
The Medical Dictionary for Regulatory Activities Terminology (MedDRA) version 2.1 (V2.1) was released in March 1999 accompanied by the MedDRA/J V2.1J specifically for Japanese users. In prescription-event monitoring in Japan (J-PEM), we have employed the MedDRA/J for data entry, signal generation and event listing. In J-PEM, the lowest level terms (LLTs) in the MedDRA/J are used in data entry because the richness of LLTs is judged to be advantageous. A signal is generated normally at the preferred term (PT) level, but it has been found that various reporters describe the same event using descriptions that are potentially encoded by LLTs under different PTs. In addition, some PTs are considered too specific to generate the proper signal. In the system used in J-PEM, when an LLT is selected as a candidate to encode an event, another LLT under a different PT, if any, is displayed on the computer screen so that it may be coded instead of, or in addition to, the candidate LLT. The five-level structure of the MedDRA is used when listing events but some modification is required to generate a functional event list.
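The on-screen suggestion of alternative LLTs can be mimicked with a small lookup. The sketch below uses a toy LLT-to-PT map and synonym list (real MedDRA tables are licensed and far larger, and the term strings here are invented) to list synonymous LLTs filed under different PTs.

```python
# toy slice of an LLT -> PT map; all terms here are illustrative stand-ins
llt_to_pt = {
    "cardiac failure": "Cardiac failure",
    "heart failure": "Cardiac failure",
    "cardiac insufficiency": "Cardiac failure NOS",
}
# LLTs judged to describe the same clinical event despite sitting under other PTs
synonyms = {"cardiac failure": ["cardiac insufficiency"]}

def alternatives(candidate_llt):
    """For a candidate LLT, list synonymous LLTs filed under *different* PTs,
    mirroring the on-screen prompt used during J-PEM data entry."""
    pt = llt_to_pt[candidate_llt]
    return [(llt, llt_to_pt[llt]) for llt in synonyms.get(candidate_llt, [])
            if llt_to_pt[llt] != pt]

print(alternatives("cardiac failure"))  # [('cardiac insufficiency', 'Cardiac failure NOS')]
```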
Capture of complexity of specialty care in pediatric cardiology by work RVU measures.
Bergersen, Lisa; Gauvreau, Kimberlee; McElhinney, Doff; Fenwick, Sandra; Kirshner, David; Harding, Julie; Hickey, Patricia; Mayer, John; Marshall, Audrey
2013-02-01
We sought to determine the relationship between relative value units (RVUs) and intended measures of work in catheterization for congenital heart disease. RVU was determined by matching RVU values to Current Procedural Terminology codes generated for cases performed at a single institution. Differences in median case duration, radiation exposure, adverse events, and RVU values by risk category and case were assessed. Interventional case types were ranked from lowest to highest median RVU value, and correlations with case duration, radiation dose, and a case's predicted probability of an adverse event were quantified with the Spearman rank correlation coefficient. Between January 2008 and December 2010, 3557 of 4011 cases were identified with an RVU and risk category designation, of which 2982 were assigned a case type. Median RVU values, radiation dose, and case duration increased with procedure risk category. Although all diagnostic cases had similar RVU values (median 10), adverse event rates ranged from 6% to 21% by age group (P < .001). Median RVU values ranged from 9 to 54, with the lowest in diagnostic and biopsy cases, increasing with isolated and then multiple interventions. Among interventional cases, no correlation existed between ranked RVU value and case duration, radiation dose, or adverse event probability (P = .13, P = .62, and P = .43, respectively). Time, skill, and stress inherent to performing catheterization procedures for congenital heart disease are not captured by measurement of RVU alone.
O'Neel, Shad; Larsen, Christopher F.; Rupert, Natalia; Hansen, Roger
2010-01-01
Since the installation of the Alaska Regional Seismic Network in the 1970s, data analysts have noted nontectonic seismic events thought to be related to glacier dynamics. While loose associations with the glaciers of the St. Elias Mountains have been made, no detailed study of the source locations has been undertaken. We performed a two-step investigation of these events, beginning with manual locations that guided an automated detection and event sifting routine. Results from the manual investigation highlight characteristics of the seismic waveforms including single-peaked (narrowband) spectra, emergent onsets, lack of distinct phase arrivals, and a predominant cluster of locations near the calving termini of several neighboring tidewater glaciers. Based on these locations, comparison with previous work, and analyses of waveform characteristics, frequency-magnitude statistics, and temporal patterns in seismicity, we suggest calving as the source of the seismicity. Statistical properties and time series analysis of the event catalog suggest a scale-invariant process that has no single or simple forcing. These results support the idea that calving is often a response to short-lived or localized stress perturbations. Our results demonstrate the utility of passive seismic instrumentation to monitor relative changes in the rate and magnitude of iceberg calving at tidewater glaciers that may be volatile or susceptible to ensuing rapid retreat, especially when existing seismic infrastructure can be used.
The Search for Muon Neutrinos from Northern Hemisphere Gamma-Ray Bursts with AMANDA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achterberg, A.; Ackermann, M.; Bernardini, E.
2008-02-10
We present the results of the analysis of neutrino observations by the Antarctic Muon and Neutrino Detector Array (AMANDA) correlated with photon observations of more than 400 gamma-ray bursts (GRBs) in the northern hemisphere from 1997 to 2003. During this time period, AMANDA's effective collection area for muon neutrinos was larger than that of any other existing detector. After the application of various selection criteria to our data, we expect ~1 neutrino event and <2 background events. Based on our observations of zero events during and immediately prior to the GRBs in the data set, we set the most stringent upper limit on muon neutrino emission correlated with GRBs. Assuming a Waxman-Bahcall spectrum and incorporating all systematic uncertainties, our flux upper limit has a normalization at 1 PeV of E²Φ_ν ≤ 6.3 × 10⁻⁹ GeV cm⁻² s⁻¹ sr⁻¹, with 90% of the events expected within the energy range of ~10 TeV to ~3 PeV. The impact of this limit on several theoretical models of GRBs is discussed, as well as the future potential for detection of GRBs by next-generation neutrino telescopes. Finally, we briefly describe several modifications to this analysis in order to apply it to other types of transient point sources.
Optical turbulence and transverse rogue waves in a cavity with triple-quantum-dot molecules
NASA Astrophysics Data System (ADS)
Eslami, M.; Khanmohammadi, M.; Kheradmand, R.; Oppo, G.-L.
2017-09-01
We show that optical turbulence extreme events can exist in the transverse dynamics of a cavity containing molecules of triple quantum dots under conditions close to tunneling-induced transparency. These nanostructures, when coupled via tunneling, form a four-level configuration with tunable energy-level separations. We show that such a system exhibits multistability and bistability of Turing structures in instability domains with different critical wave vectors. By numerical simulation of the mean-field equation that describes the transverse dynamics of the system, we show that the simultaneous presence of two transverse solutions with opposite nonlinearities gives rise to a series of turbulent structures with the capability of generating two-dimensional rogue waves.
Marler, Thomas E
2011-11-01
The impact of Mount Pinatubo's 1991 eruption on the traditional use of natural resources by the indigenous Aeta was devastating. The damage resulted in the immediate and sustained disconnection of traditional knowledge from the biological resources integral to practice that knowledge. The relatively slow ecosystem recovery a full 20 years after the event hinders the transfer of traditional knowledge to younger generations of Aeta. Their traditional knowledge is at risk of disappearing from the cultural fabric of the Philippines. In seeking to adapt, decisions by the Aeta to accept the development of foreign-designed ecotourism enterprises may negatively affect natural ecosystem recovery. Alternatives to the existing ecotourism practices may be warranted to safeguard Aeta traditional knowledge.
Construction of a photovoltaic power system at Natural Bridges National Monument
NASA Astrophysics Data System (ADS)
Benoit, A. E.
1980-12-01
A 100 kW peak photovoltaic (PV) power system at Natural Bridges National Monument in Utah is described. This system is the largest of its kind in the world. The construction phases of the program are described, and a chronological history of the events and problems encountered when such a large and complex task is undertaken in a remote area with very limited fabrication facilities is given. This experiment demonstrates the application of solar energy to the variety of loads found in a small and remote community. This solar energy system was designed to meet all electrical requirements when there is no utility grid, with only occasional back-up from an existing diesel generator.
A Multi-Hazard Vulnerability Assessment of Coastal Landmarks along Cape Hatteras National Seashore
NASA Astrophysics Data System (ADS)
Flynn, M. J.
2015-12-01
Cape Hatteras National Seashore is located along the Outer Banks, a narrow string of barrier islands in eastern North Carolina. The seashore was established to preserve cultural and natural resources of national significance, yet these islands have shoreline rates of change that are predominantly erosional, frequently experience storm surge inundation driven by tropical and extra-tropical storm events, and are highly vulnerable to sea-level rise. The National Park Service staff were concerned about the vulnerability of historic structures located within the park, and recognized the utility of a coastal hazard risk assessment to assist park managers with long-term planning. They formed a cooperative agreement with researchers at East Carolina University to conduct the assessment, which primarily used GIS to evaluate the susceptibility of 27 historical structures to coastal erosion, storm surge, and sea-level rise. The Digital Shoreline Analysis System was used to calculate a linear regression rate of shoreline movement based on historical shorelines. Those rates were used to simulate the future position of the shoreline along transects. The SLOSH model output was downscaled to a DEM generated from the 2014 NC QL2 LiDAR collection to determine the extent and depth of inundation that would occur from storm events. Sea-level rise was modeled for various scenarios referenced to existing MHHW, and also added to each SLOSH model output to determine the effect of a storm event under those sea-level rise scenarios. Risk maps were developed to include not only areal coverage for existing structures and districts, but also identify potential areas of relocation or retreat in the long term. In addition to evaluating vulnerability, timelines for potential impacts provided scenarios for National Park Service staff to research adaptation and mitigation strategies.
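The linear regression rate used here reduces, per transect, to an ordinary least-squares fit of shoreline position against survey date, which can then be extrapolated. A minimal sketch with hypothetical data (not the study's measurements):

```python
import numpy as np

# Hypothetical shoreline positions (m from a fixed baseline) on one transect.
years = np.array([1949, 1962, 1980, 1998, 2009, 2014])
position = np.array([210.0, 204.5, 196.0, 187.5, 182.0, 179.0])

rate, intercept = np.polyfit(years, position, 1)  # m/yr; negative = erosion
forecast_2065 = rate * 2065 + intercept           # simple linear projection
print(f"LRR = {rate:.2f} m/yr; projected 2065 position = {forecast_2065:.1f} m")
```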
Chen, Xing-Jie; Liu, Lu-Lu; Cui, Ji-Fang; Wang, Ya; Shum, David H. K.; Chan, Raymond C. K.
2015-01-01
Mental time travel refers to the ability to recall episodic past and imagine future events. The present study aimed to investigate cultural differences in mental time travel between Chinese and Australian university students. A total of 231 students (108 Chinese and 123 Australians) participated in the study. Their mental time travel abilities were measured by the Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test (SCEFT). Results showed that there were no cultural differences in the number of specific events generated for the past or future. Significant differences between the Chinese and Australian participants were found mainly in the emotional valence and content of the events generated. Both Chinese and Australian participants generated more specific positive events compared to negative events when thinking about the future, and Chinese participants were more positive about their past than Australian participants when recalling specific events. For content, Chinese participants recalled more events about their interpersonal relationships, while Australian participants imagined more about personal future achievements. These findings shed some light on cultural differences in episodic past and future thinking. PMID:26167154
Cost-effective strategies for rural community outreach, Hawaii, 2010-2011.
Pellegrin, Karen L; Barbato, Anna; Holuby, R Scott; Ciarleglio, Anita E; Taniguchi, Ronald
2014-12-11
Three strategies designed to maximize attendance at educational sessions on chronic disease medication safety in older adults in rural areas were implemented sequentially and compared for cost-effectiveness: 1) existing community groups and events, 2) formal advertisement, and 3) employer-based outreach. Cost-effectiveness was measured by comparing overall cost per attendee recruited and number of attendees per event. The overall cost per attendee was substantially higher for the formal advertising strategy, which produced the lowest number of attendees per event. Leveraging existing community events and employers in rural areas was more cost-effective than formal advertisement for recruiting rural community members.
Multi-viewpoint Coronal Mass Ejection Catalog Based on STEREO COR2 Observations
NASA Astrophysics Data System (ADS)
Vourlidas, Angelos; Balmaceda, Laura A.; Stenborg, Guillermo; Dal Lago, Alisson
2017-04-01
We present the first multi-viewpoint coronal mass ejection (CME) catalog. The events are identified visually in simultaneous total brightness observations from the twin SECCHI/COR2 coronagraphs on board the Solar Terrestrial Relations Observatory mission. The Multi-View CME Catalog differs from past catalogs in three key aspects: (1) all events between the two viewpoints are cross-linked, (2) each event is assigned a physics-motivated morphological classification (e.g., jet, wave, and flux rope), and (3) kinematic and geometric information is extracted semi-automatically via a supervised image segmentation algorithm. The database extends from the beginning of the COR2 synoptic program (2007 March) to the end of dual-viewpoint observations (2014 September). It contains 4473 unique events with 3358 events identified in both COR2s. Kinematic properties currently exist for 1747 events (26% of COR2-A events and 17% of COR2-B events). We examine several issues, made possible by this cross-linked CME database, including the role of projection on the perceived morphology of events, the missing CME rate, the existence of cool material in CMEs, the solar-cycle dependence of CME rate, speed, and width, and the existence of flux ropes within CMEs. We discuss the implications for past single-viewpoint studies and for Space Weather research. The database is publicly available on the web, including all available measurements. We hope that it will become a useful resource for the community.
BEEC: An event generator for simulating the Bc meson production at an e+e- collider
NASA Astrophysics Data System (ADS)
Yang, Zhi; Wu, Xing-Gang; Wang, Xian-You
2013-12-01
The Bc meson is a doubly heavy quark-antiquark bound state and carries flavors explicitly, which provides a fruitful laboratory for testing potential models and understanding the weak decay mechanisms for heavy flavors. In view of the prospects in Bc physics at the hadronic colliders such as Tevatron and LHC, Bc physics is attracting more and more attention. It has been shown that a high luminosity e+e- collider running around the Z0-peak is also helpful for studying the properties of the Bc meson and has its own advantages. For this purpose, we write down an event generator for simulating Bc meson production through e+e- annihilation according to relevant publications. We name it BEEC, in which the color-singlet S-wave and P-wave (cb¯)-quarkonium states together with the color-octet S-wave (cb¯)-quarkonium states can be generated. BEEC can also be adopted to generate the similar charmonium and bottomonium states via the semi-exclusive channels e++e-→|(QQ¯)[n]>+Q+Q¯ with Q=b and c respectively. To increase the simulation efficiency, we make the amplitude as compact as possible by using the improved trace technology. BEEC is a Fortran program written in a PYTHIA-compatible format with a modular structure; one may conveniently adapt it to various situations or experimental environments by using GNU make. A method to improve the efficiency of generating unweighted events within the PYTHIA environment is proposed. Moreover, BEEC will generate a standard Les Houches Event data file that contains useful information on the meson and its accompanying partons, which can be conveniently imported into PYTHIA to do further hadronization and decay simulation. Catalogue identifier: AEQC_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 114868 No. of bytes in distributed program, including test data, etc.: 963939 Distribution format: tar.gz Programming language: FORTRAN 77/90. Computer: Any computer with a Fortran compiler; the program has been tested with the GNU Fortran compiler and the Intel Fortran compiler. Operating system: UNIX, Linux and Windows. RAM: About 2.0 MB. Classification: 11.2. Nature of problem: Production of charmonium, (cb¯)-quarkonium and bottomonium via the e+e- annihilation channel around the Z0 peak. Solution method: The production of heavy (QQ)-quarkonium (Q,Q‧=b,c) via e+e- annihilation is estimated by using the improved trace technology. The (QQ)-quarkonium in the color-singlet 1S-wave state, 1P-wave state, and the color-octet 1S-wave states has been studied within the framework of non-relativistic QCD. The code can conveniently generate weighted and unweighted events via an option; in particular, the unweighted events are generated by using an improved hit-and-miss approach so as to improve the generating efficiency. Restrictions: The generator is aimed at the production of double heavy quarkonium through e+e- annihilation at the Z0 peak. The considered processes are those that are associated with two heavy quark jets, which could provide sizable quarkonium events around the Z0 peak. Running time: It depends on which option one chooses to match PYTHIA when generating the heavy quarkonium events.
Typically, for the production of the S-wave quarkonium states, if setting IDPP=2 (unweighted events), it takes about 2 h on a 2.9 GHz AMD Athlon (tm) II×4 635 Processor machine to generate 10^5 events; if setting IDPP=3 (weighted events), it takes only ~16 min to generate 10^5 events. For the production of the P-wave quarkonium states, the time will be almost one hundred times longer than for the S-wave quarkonium states.
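Since BEEC emits a standard Les Houches Event (LHE) file, downstream tools can read the event records directly. A minimal sketch that counts events and pulls the PDG particle IDs from such a file (the file name is hypothetical; only the generic LHE layout is assumed):

```python
# Minimal LHE reader sketch: each <event> block starts with a header line
# (NUP IDPRUP XWGTUP ...) followed by NUP particle lines whose first column
# is the PDG ID. The file name below is a hypothetical placeholder.
def pdg_ids_per_event(path="beec_events.lhe"):
    events, inside, lines = [], False, []
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if line.startswith("<event"):
                inside, lines = True, []
            elif line.startswith("</event>"):
                inside = False
                nup = int(lines[0].split()[0])  # particle count from header
                events.append([int(l.split()[0]) for l in lines[1:1 + nup]])
            elif inside and line and not line.startswith("#"):
                lines.append(line)
    return events

evts = pdg_ids_per_event()
print(len(evts), "events; first event PDG IDs:", evts[0] if evts else None)
```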
Stochastic Generation of Spatiotemporal Rainfall Events for Flood Risk Assessment
NASA Astrophysics Data System (ADS)
Diederen, D.; Liu, Y.; Gouldby, B.; Diermanse, F.
2017-12-01
Current flood risk analyses that only consider peaks of hydrometeorological forcing variables have limitations regarding their representation of reality. Simplistic assumptions regarding antecedent conditions are required; often, different sources of flooding are considered in isolation, and the complex temporal and spatial evolution of the events is not considered. Mid-latitude storms, governed by large-scale climatic conditions, for example, often exhibit a high degree of temporal dependency. For sustainable flood risk management that accounts appropriately for climate change, it is desirable for flood risk analyses to reflect reality more appropriately. Analysis of risk mitigation measures and comparison of their relative performance is therefore likely to be more robust and lead to improved solutions. We provide a new framework for the provision of boundary conditions to flood risk analyses that more appropriately reflects reality. The boundary conditions capture the temporal dependencies of complex storms whilst preserving the extreme values and associated spatial dependencies. We demonstrate the application of this framework to generate a synthetic rainfall event time series boundary condition set from reanalysis rainfall data (CFSR) on the continental scale. We define spatiotemporal clusters of rainfall as events, extract hydrological parameters for each event, generate synthetic parameter sets with a multivariate distribution with a focus on the joint tail probability [Heffernan and Tawn, 2004], and finally create synthetic events from the generated synthetic parameters. We highlight the stochastic integration of (a) spatiotemporal features, e.g. event occurrence intensity over space-time, or time to previous event, which we use for the spatial placement and sequencing of the synthetic events, and (b) value-specific parameters, e.g. peak intensity and event extent. We contrast this with more traditional approaches to highlight the significant improvements in terms of representing the reality of extreme flood events.
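The pipeline sketched above (cluster wet spells into events, summarize each with a few parameters, fit a parametric model, resample) can be condensed in outline. A minimal single-site sketch, in which simple exponential fits stand in for the multivariate Heffernan-Tawn joint tail model used in the study, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
rain = np.maximum(rng.normal(0, 2, 5000), 0)  # hypothetical hourly series (mm)

# 1) Define events as contiguous wet runs; extract (duration, peak) per event.
wet = rain > 0.1
edges = np.flatnonzero(np.diff(np.concatenate(([0], wet.view(np.int8), [0]))))
starts, ends = edges[::2], edges[1::2]
durations = ends - starts
peaks = np.array([rain[s:e].max() for s, e in zip(starts, ends)])

# 2) Fit simple marginal models (exponential) -- a stand-in for the joint
#    tail model of Heffernan & Tawn (2004) used in the actual framework.
# 3) Generate synthetic event parameters and simple triangular hyetographs.
n_syn = 100
syn_dur = np.maximum(rng.exponential(durations.mean(), n_syn).round(), 1).astype(int)
syn_peak = rng.exponential(peaks.mean(), n_syn)
synthetic = [p * np.bartlett(d + 2)[1:-1] for d, p in zip(syn_dur, syn_peak)]
print(len(synthetic), "synthetic events; first peak =", synthetic[0].max())
```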
Imagining the Future in Children with Severe Traumatic Brain Injury.
Lah, Suncica; Gott, Chloe; Epps, Adrienne; Parry, Louise
2018-06-12
Imagining future events is thought to rely on recombination and integration of past episodic memory traces into future events. Future and past events contain episodic and nonepisodic details. Children with severe traumatic brain injury (TBI) were found to have impaired recall of past episodic (but not semantic) event details. Here, we examined whether severe TBI impairs construction of future events. Children with severe TBI (n = 15) and healthy controls (NC; n = 33) 1) completed tests of anterograde (narrative and relational) memory and executive skills, 2) recalled past events and generated future events, and 3) rated events' phenomenological qualities. Events were scored for episodic (internal) and semantic (external) details. The groups did not differ in generating details of future events, although children with TBI recalled significantly fewer past internal (but not external) event details relative to NCs. Moreover, the number of past internal details relative to future internal details was significantly higher in the NC group, but not in the TBI group. Significant correlations between past and future were found for 1) internal details in both groups and 2) external details in the NC group. The TBI group rated their events as being less significant than did the NC group. The groups did not differ on ratings of visual intensity and rehearsal. Our study has shown that children who have sustained severe TBI had impoverished recall of past, but not generation of future, events. This unexpected dissociation between past and future event construction requires further research.
Kinnell, P I A
2017-10-15
Traditionally, the Universal Soil Loss Equation (USLE) and the revised version of it (RUSLE) have been applied to predicting the long-term average soil loss produced by rainfall erosion in many parts of the world. Over time, it has been recognized that there is a need to predict soil losses over shorter time scales, and this has led to the development of WEPP and RUSLE2, which can be used to predict soil losses generated by individual rainfall events. Data currently exist that enable RUSLE2, WEPP and the USLE-M to estimate historic soil losses from bare fallow runoff and soil loss plots recorded in the USLE database. Comparisons of the abilities of the USLE-M and RUSLE2 to estimate event soil losses from bare fallow were undertaken under circumstances where both models produced the same total soil loss as observed for sets of erosion events on 4 different plots at 4 different locations. Likewise, comparisons of the abilities of the USLE-M and WEPP to model event soil loss from bare fallow were undertaken for sets of erosion events on 4 plots at 4 different locations. Despite being calibrated specifically for each plot, WEPP produced the worst estimates of event soil loss for all 4 plots. Generally, the USLE-M using measured runoff to calculate the product of the runoff ratio, storm kinetic energy and the maximum 30-minute rainfall intensity produced the best estimates. As is to be expected, the ability of the USLE-M to estimate event soil loss was reduced when runoff predicted by either RUSLE2 or WEPP was used. Despite this, the USLE-M using runoff predicted by WEPP estimated event soil loss better than WEPP. RUSLE2 also outperformed WEPP.
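In the USLE-M variant referred to above, the classic EI30 event erosivity is multiplied by the runoff ratio. A minimal sketch of that event erosivity term under commonly used units (the input values are hypothetical):

```python
# USLE-M style event erosivity: R_e = Q_R * E * I30, where Q_R is the runoff
# ratio (runoff/rainfall), E the storm kinetic energy (MJ/ha), and I30 the
# maximum 30-minute rainfall intensity (mm/h). Input values are hypothetical.
def usle_m_erosivity(runoff_mm, rain_mm, energy_mj_ha, i30_mm_h):
    q_r = runoff_mm / rain_mm
    return q_r * energy_mj_ha * i30_mm_h  # MJ.mm/(ha.h)

print(usle_m_erosivity(runoff_mm=12.0, rain_mm=40.0,
                       energy_mj_ha=9.5, i30_mm_h=35.0))
```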
NASA Astrophysics Data System (ADS)
Hopp, C. J.; Savage, M. K.; Townend, J.; Sherburn, S.
2016-12-01
Monitoring patterns in local microseismicity gives clues to the existence and location of subsurface structures. In the context of a geothermal reservoir, subsurface structures often indicate areas of high permeability and are vitally important in understanding fluid flow within the geothermal resource. Detecting and locating microseismic events within an area of power generation, however, is often challenging due to high levels of noise associated with nearby power plant infrastructure. In this situation, matched filter detection improves drastically upon standard earthquake detection techniques, specifically when events are likely induced by fluid injection and are therefore near-repeating. Using an earthquake catalog of 637 events that occurred between 1 January and 18 November 2015 as our initial dataset, we implemented a matched filtering routine for the Mighty River Power (MRP) geothermal fields at Rotokawa and Ngatamariki, central North Island, New Zealand. We detected nearly 21,000 additional events across both geothermal fields, a roughly 30-fold increase from the original catalog. On average, each of the 637 template events detected 45 additional events throughout the study period, with a maximum number of additional detections for a single template of 359. Cumulative detection rates for all template events, in general, do not mimic large-scale changes in injection rates within the fields; however, we do see indications of an increase in detection rate associated with power plant shutdown at Ngatamariki. Locations of detected events follow established patterns of historic seismicity at both Ngatamariki and Rotokawa. One large cluster of events persists in the southeastern portion of Rotokawa and is likely bounded to the northwest by a known fault dividing the injection and production sections of the field. Two distinct clusters of microseismicity occur in the north and south of Ngatamariki, the latter appearing to coincide with a structure dividing the production zone and the southern injection zone.
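Matched filtering of this kind slides each template over the continuous record and declares a detection when the normalized cross-correlation exceeds a noise-based threshold. A minimal single-channel sketch (the threshold choice and synthetic data are illustrative, not the study's processing chain):

```python
import numpy as np

def normalized_xcorr(template, data):
    """Sliding normalized cross-correlation (values in [-1, 1])."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(cc.size):
        w = data[i:i + n]
        s = w.std()
        cc[i] = t @ (w - w.mean()) / s if s > 0 else 0.0
    return cc

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 20, 200)) * np.hanning(200)
data = rng.normal(0, 1, 20000)
data[5000:5200] += 3 * template              # a buried repeat of the template

cc = normalized_xcorr(template, data)
thresh = 8 * np.median(np.abs(cc - np.median(cc)))  # 8 x MAD-style threshold
print("detections near samples:", np.flatnonzero(cc > thresh))
```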
Grade 1 to 6 Thai Students' Existing Ideas about Energy
ERIC Educational Resources Information Center
Yuenyong, Chokchai; Yuenyong, Jirakarn
2007-01-01
This study explored 30 Grade 1 to 6 (6-12 years old) Thai students' existing ideas about energy. The study employed the Interview about Event (IAE) approach. During IAE, cards depicting an event or objects were shown to students in order to probe their views of energy concepts. Findings indicated that young students held various alternative…
Realistic training scenario simulations and simulation techniques
Dunlop, William H.; Koncher, Tawny R.; Luke, Stanley John; Sweeney, Jerry Joseph; White, Gregory K.
2017-12-05
In one embodiment, a system includes a signal generator operatively coupleable to one or more detectors; and a controller, the controller being both operably coupled to the signal generator and configured to cause the signal generator to: generate one or more signals each signal being representative of at least one emergency event; and communicate one or more of the generated signal(s) to a detector to which the signal generator is operably coupled. In another embodiment, a method includes: receiving data corresponding to one or more emergency events; generating at least one signal based on the data; and communicating the generated signal(s) to a detector.
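In software terms, the claimed controller/signal-generator arrangement reduces to: encode incoming event data as signals, then push each signal to the coupled detectors. A minimal object sketch of that flow (all class and field names are hypothetical, not taken from the patent):

```python
# Hypothetical sketch of the claimed arrangement: a controller drives a
# signal generator, which feeds simulated emergency-event signals to detectors.
class Detector:
    def __init__(self, name):
        self.name = name
    def receive(self, signal):
        print(f"{self.name} received {signal['event']} at level {signal['level']}")

class SignalGenerator:
    def __init__(self, detectors):
        self.detectors = detectors
    def generate(self, event_data):
        # Encode one emergency event as a signal representative of it.
        return {"event": event_data["type"], "level": event_data["intensity"]}
    def communicate(self, signal):
        for d in self.detectors:
            d.receive(signal)

class Controller:
    def __init__(self, generator):
        self.generator = generator
    def run(self, events):
        for ev in events:  # one generated signal per emergency event
            self.generator.communicate(self.generator.generate(ev))

ctrl = Controller(SignalGenerator([Detector("smoke-1"), Detector("rad-2")]))
ctrl.run([{"type": "smoke", "intensity": 0.8},
          {"type": "radiation", "intensity": 0.3}])
```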
NASA Astrophysics Data System (ADS)
Marani, M.; Zorzetto, E.; Hosseini, S. R.; Miniussi, A.; Scaioni, M.
2017-12-01
The Generalized Extreme Value (GEV) distribution is widely adopted irrespective of the properties of the stochastic process generating the extreme events. However, GEV presents several limitations, both theoretical (asymptotic validity for a large number of events/year or hypothesis of Poisson occurrences of Generalized Pareto events) and practical (fitting uses just yearly maxima or a few values above a high threshold). Here we describe the Metastatistical Extreme Value Distribution (MEVD, Marani & Ignaccolo, 2015), which relaxes asymptotic or Poisson/GPD assumptions and makes use of all available observations. We then illustrate the flexibility of the MEVD by applying it to daily precipitation, hurricane intensity, and storm surge magnitude. Application to daily rainfall from a global raingauge network shows that MEVD estimates are 50% more accurate than those from GEV when the recurrence interval of interest is much greater than the observational period. This makes MEVD suited for application to satellite rainfall observations (~20 yr length). Use of MEVD on TRMM data yields extreme event patterns that are in better agreement with surface observations than corresponding GEV estimates. Applied to the HURDAT2 Atlantic hurricane intensity dataset, MEVD significantly outperforms GEV estimates of extreme hurricanes. Interestingly, the Generalized Pareto distribution used for "ordinary" hurricane intensity points to the existence of a maximum limit wind speed that is significantly smaller than corresponding physically-based estimates. Finally, we applied the MEVD approach to water levels generated by tidal fluctuations and storm surges at a set of coastal sites spanning different storm-surge regimes. MEVD yields accurate estimates of large quantiles and inferences on tail thickness (fat vs. thin) of the underlying distribution of "ordinary" surges. In summary, the MEVD approach presents a number of theoretical and practical advantages, and outperforms traditional approaches in several applications. We conclude that the MEVD is a significant contribution to further generalize extreme value theory, with implications for a broad range of Earth Sciences.
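Under the MEVD of Marani and Ignaccolo (2015), the yearly-maximum distribution is the average, over the observed years, of the per-year "ordinary event" distribution raised to that year's event count. A minimal sketch with Weibull ordinary events (the per-year fits below are hypothetical values, not the paper's data):

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical per-year Weibull fits to "ordinary" daily rainfall events:
# n[j] events in year j, with scale C[j] (mm) and shape w[j].
n = np.array([95, 110, 87, 102, 99])
C = np.array([8.1, 9.4, 7.6, 8.8, 9.0])
w = np.array([0.78, 0.82, 0.75, 0.80, 0.79])

def mev_cdf(x):
    """MEVD: average over years of F_ordinary(x)^n_j."""
    return np.mean((1.0 - np.exp(-(x / C) ** w)) ** n)

def mev_quantile(p):
    """Invert the MEVD numerically for the p-quantile."""
    return brentq(lambda x: mev_cdf(x) - p, 1e-3, 1e4)

print(f"100-yr daily rainfall estimate: {mev_quantile(1 - 1/100):.1f} mm")
```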
NASA Astrophysics Data System (ADS)
Viegas, G. F.; Urbancic, T.; Baig, A. M.
2014-12-01
In hydraulic fracturing completion programs fluids are injected under pressure into fractured rock formations to open escape pathways for trapped hydrocarbons along pre-existing and newly generated fractures. To characterize the failure process, we estimate static and dynamic source and rupture parameters, such as dynamic and static stress drop, radiated energy, seismic efficiency, failure modes, failure plane orientations and dimensions, and rupture velocity to investigate the rupture dynamics and scaling relations of micro-earthquakes induced during a hydraulic fracturing shale completion program in NE British Columbia, Canada. The relationships between the different parameters combined with the in-situ stress field and rock properties provide valuable information on the rupture process giving insights into the generation and development of the fracture network. Approximately 30,000 micro-earthquakes were recorded using three multi-sensor arrays of high frequency geophones temporarily placed close to the treatment area at reservoir depth (~2km). On average the events have low radiated energy, low dynamic stress and low seismic efficiency, consistent with the obtained slow rupture velocities. Events fail in overshoot mode (slip weakening failure model), with fluids lubricating faults and decreasing friction resistance. Events occurring in deeper formations tend to have faster rupture velocities and are more efficient in radiating energy. Variations in rupture velocity tend to correlate with variation in depth, fault azimuth and elapsed time, reflecting a dominance of the local stress field over other factors. Several regions with different characteristic failure modes are identifiable based on coherent stress drop, seismic efficiency, rupture velocities and fracture orientations. Variations of source parameters with rock rheology and hydro-fracture fluids are also observed. Our results suggest that the spatial and temporal distribution of events with similar characteristic rupture behaviors can be used to determine reservoir geophysical properties, constrain reservoir geo-mechanical models, classify dynamic rupture processes for fracture models and improve fracture treatment designs.
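Several of the source parameters listed above follow from standard spectral relations: a Brune/Madariaga-type corner frequency gives a source radius, and the radius plus seismic moment gives a static stress drop. A minimal sketch of those textbook relations (constants and inputs hypothetical; not necessarily the estimators used in this study):

```python
# Textbook source-parameter relations; inputs are hypothetical.
def source_radius(fc_hz, beta_m_s, k=0.32):
    """Madariaga-style radius r = k * beta / fc (k depends on rupture model)."""
    return k * beta_m_s / fc_hz

def static_stress_drop(m0_nm, r_m):
    """Eshelby circular-crack stress drop: 7*M0 / (16*r^3), in Pa."""
    return 7.0 * m0_nm / (16.0 * r_m ** 3)

# Hypothetical microseismic event: M0 = 1e9 N.m (Mw ~ -0.1), fc = 150 Hz.
r = source_radius(fc_hz=150.0, beta_m_s=3200.0)
print(f"r = {r:.1f} m, stress drop = {static_stress_drop(1e9, r) / 1e6:.2f} MPa")
```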
Chen, Xing-jie; Liu, Lu-lu; Cui, Ji-fang; Wang, Ya; Chen, An-tao; Li, Feng-hua; Wang, Wei-hong; Zheng, Han-feng; Gan, Ming-yuan; Li, Chun-qiu; Shum, David H. K.; Chan, Raymond C. K.
2016-01-01
Mental time travel refers to the ability to recall past events and to imagine possible future events. Schizophrenia (SCZ) patients have problems in remembering specific personal experiences in the past and imagining what will happen in the future. This study aimed to examine episodic past and future thinking in SCZ spectrum disorders including SCZ patients and individuals with schizotypal personality disorder (SPD) proneness who are at risk for developing SCZ. Thirty-two SCZ patients, 30 SPD proneness individuals, and 33 healthy controls participated in the study. The Sentence Completion for Events from the Past Test (SCEPT) and the Sentence Completion for Events in the Future Test were used to measure past and future thinking abilities. Results showed that SCZ patients had significantly reduced specificity in recalling past and imagining future events; they generated a smaller proportion of specific and extended events compared to healthy controls. SPD proneness individuals only generated fewer extended events compared to healthy controls. The reduced specificity was mainly manifested in imagining future events. Both SCZ patients and SPD proneness individuals generated fewer positive events than controls. These results suggest mental time travel impairments in SCZ spectrum disorders and have implications for understanding their cognitive and emotional deficits. PMID:27507958
Eastwick, Paul W
2009-09-01
Evolutionary psychologists explore the adaptive function of traits and behaviors that characterize modern Homo sapiens. However, evolutionary psychologists have yet to incorporate the phylogenetic relationship between modern Homo sapiens and humans' hominid and pongid relatives (both living and extinct) into their theorizing. By considering the specific timing of evolutionary events and the role of evolutionary constraint, researchers using the phylogenetic approach can generate new predictions regarding mating phenomena and derive new explanations for existing evolutionary psychological findings. Especially useful is the concept of the adaptive workaround: an adaptation that manages the maladaptive elements of a pre-existing evolutionary constraint. The current review organizes 7 features of human mating into their phylogenetic context and presents evidence that 2 adaptive workarounds played a critical role as Homo sapiens's mating psychology evolved. These adaptive workarounds function in part to mute or refocus the effects of older, previously evolved adaptations and highlight the layered nature of humans' mating psychology.
Orhan, U.; Erdogmus, D.; Roark, B.; Oken, B.; Purwar, S.; Hild, K. E.; Fowler, A.; Fried-Oken, M.
2013-01-01
RSVP Keyboard™ is an electroencephalography (EEG) based brain-computer interface (BCI) typing system, designed as an assistive technology for the communication needs of people with locked-in syndrome (LIS). It relies on rapid serial visual presentation (RSVP) and does not require precise eye gaze control. Existing BCI typing systems that use event-related potentials (ERPs) in EEG suffer from low accuracy due to low signal-to-noise ratio. RSVP Keyboard™ therefore utilizes context-based decision making, incorporating a language model to improve the accuracy of letter decisions. To further improve the contributions of the language model, we propose recursive Bayesian estimation, which relies on non-committing string decisions, and conduct an offline analysis comparing it with the existing naïve Bayesian fusion approach. The results indicate the superiority of recursive Bayesian fusion, and we plan to incorporate this new approach in the next generation of RSVP Keyboard™. PMID:23366432
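Recursive Bayesian fusion of this sort maintains a posterior over candidate letters, multiplying a language-model prior by each RSVP sequence's EEG evidence instead of committing after a single pass. A minimal sketch (all probabilities hypothetical, not the system's actual models):

```python
import numpy as np

letters = np.array(list("ABCD"))            # toy alphabet
prior = np.array([0.50, 0.25, 0.15, 0.10])  # language-model prior (hypothetical)

def update(posterior, likelihood):
    """One recursive Bayesian step: fuse evidence from one RSVP sequence."""
    post = posterior * likelihood
    return post / post.sum()

posterior = prior.copy()
for likelihood in ([0.2, 0.5, 0.2, 0.1],     # ERP classifier scores per
                   [0.25, 0.55, 0.1, 0.1]):  # sequence (hypothetical values)
    posterior = update(posterior, np.array(likelihood))
    if posterior.max() > 0.9:                # commit only when confident
        break
print(dict(zip(letters, posterior.round(3))))
```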
Stainbrook, David P.; Diefenbach, Duane R.
2012-01-01
The mission at Gettysburg National Military Park and Eisenhower National Historic Site (GNMP-ENHS) is to preserve the historic character of the parks to enable current and future generations to understand and interpret the events that took place at each park. Management objectives include maintaining the landscape as it existed during the historic 1863 Civil War battle (e.g., dense understory in woodlots) in GNMP and as it existed during Eisenhower's occupancy (e.g., patchwork of cropfields) in ENHS. Browsing by white-tailed deer (Odocoileus virginianus) diminished regeneration of native trees in woodlots and prevented crops from reaching maturity. Thus, to increase regeneration in woodlots and reduce crop damage, the National Park Service (NPS) began culling deer in 1995 to reach a density goal of 10 deer/km2 of forest. However, park managers were interested in an accurate population estimate to determine if their management goal had been met, and in possible methods to monitor future abundance.
Enriching Great Britain's National Landslide Database by searching newspaper archives
NASA Astrophysics Data System (ADS)
Taylor, Faith E.; Malamud, Bruce D.; Freeborough, Katy; Demeritt, David
2015-11-01
Our understanding of where landslide hazard and impact will be greatest is largely based on our knowledge of past events. Here, we present a method to supplement existing records of landslides in Great Britain by searching an electronic archive of regional newspapers. In Great Britain, the British Geological Survey (BGS) is responsible for updating and maintaining records of landslide events and their impacts in the National Landslide Database (NLD). The NLD contains records of more than 16,500 landslide events in Great Britain. Data sources for the NLD include field surveys, academic articles, grey literature, news, public reports and, since 2012, social media. We aim to supplement the richness of the NLD by (i) identifying additional landslide events, (ii) acting as an additional source of confirmation of events existing in the NLD and (iii) adding more detail to existing database entries. This is done by systematically searching the Nexis UK digital archive of 568 regional newspapers published in the UK. In this paper, we construct a robust Boolean search criterion by experimenting with landslide terminology for four training periods. We then apply this search to all articles published in 2006 and 2012. This resulted in the addition of 111 records of landslide events to the NLD over the 2 years investigated (2006 and 2012). We also find that we were able to obtain information about landslide impact for 60-90% of landslide events identified from newspaper articles. Spatial and temporal patterns of additional landslides identified from newspaper articles are broadly in line with those existing in the NLD, confirming that the NLD is a representative sample of landsliding in Great Britain. This method could now be applied to more time periods and/or other hazards to add richness to databases and thus improve our ability to forecast future events based on records of past events.
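A Boolean criterion of the sort developed here amounts to requiring at least one landslide term while excluding known false-positive senses ("landslide victory" and the like). A minimal sketch (the term lists are illustrative placeholders, not the study's tuned criterion):

```python
# Illustrative Boolean filter for newspaper articles; the term lists are
# placeholders, not the final criterion from the study.
INCLUDE = ("landslide", "landslip", "mudslide", "rockfall")
EXCLUDE = ("landslide victory", "landslide win", "election landslide")

def matches(article: str) -> bool:
    text = article.lower()
    return any(t in text for t in INCLUDE) and not any(t in text for t in EXCLUDE)

articles = [
    "A landslip closed the A83 in Argyll after heavy rain.",
    "The candidate swept to a landslide victory on Thursday.",
]
print([matches(a) for a in articles])  # [True, False]
```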
NASA Technical Reports Server (NTRS)
Fensholt, R.; Anyamba, A.; Huber, S.; Proud, S. R.; Tucker, C. J.; Small, J.; Pak, E.; Rasmussen, M. O.; Sandholt, I.; Shisanya, C.
2011-01-01
Since 1972, satellite remote sensing of the environment has been dominated by polar-orbiting sensors providing useful data for monitoring the earth's natural resources. However, their observation and monitoring capacity is inhibited by daily to monthly looks at any given ground surface, which is often obscured by frequent and persistent cloud cover, creating large gaps in time series measurements. The launch of the Meteosat Second Generation (MSG) satellite into geostationary orbit has opened new opportunities for land surface monitoring. The Spinning Enhanced Visible and Infrared Imager (SEVIRI) instrument on-board MSG images every 15 minutes, a substantially higher temporal resolution than can be obtained from the polar operational environmental satellite (POES) systems currently in use for environmental monitoring. Different areas of the African continent were affected by droughts and floods in 2008 caused by periods of abnormally low and high rainfall, respectively. Based on the effectiveness of monitoring these events from Earth Observation (EO) data, the current analyses show that the new generation of geostationary remote sensing data can provide higher temporal resolution cloud-free (less than 5 days) measurements of the environment as compared to existing POES systems. SEVIRI MSG 5-day continental-scale composites will enable rapid assessment of environmental conditions and improved early warning of disasters for the African continent, such as flooding or droughts. The high temporal resolution geostationary data will complement existing higher spatial resolution polar-orbiting satellite data for various dynamic environmental and natural resource applications of terrestrial ecosystems.
NASA Astrophysics Data System (ADS)
Zhao, J.; Mangeney, A.; Moretti, L.; Stutzmann, E.; Calder, E. S.; Smith, P. J.; Capdeville, Y.; Le Friant, A.; Cole, P.; Luckett, R.; Robertson, R.
2011-12-01
Gravitational instabilities such as debris avalanches or pyroclastic flows represent one of the major natural hazards for populations who live in mountainous or volcanic areas. Detection and understanding of the dynamics of these events is crucial for risk assessment. Furthermore, during an eruption, a series of explosions and gravitational flows can occur, making it difficult to retrieve the characteristics of the individual gravitational events such as their volume, velocity, etc. In this context, the seismic signal generated by these events provides a unique tool to extract information on the history of the eruptive process and to validate gravitational flow models. We analyze here a series of events including explosions, a debris avalanche, and pyroclastic flows that occurred in Montserrat in December 1997. The resulting seismic signal is composed of six main pulses. The characteristics of the seismic signals generated by pyroclastic flows (amplitude, emergent onset, frequency spectrum, etc.) are described and linked to the volume of the individual events estimated from past field surveys. As a first step, we simulate the waveform of each event by assuming that the generation process reduces to a simple force applied at the surface of the topography. Going further, we perform detailed numerical simulation of the Boxing Day debris avalanche and of the following pyroclastic flow using a landslide model able to take into account the 3D topography. The stress field generated by the gravitational flows on the topography is then applied as a surface boundary condition in a wave propagation model, making it possible to simulate the seismic signal generated by the avalanche and pyroclastic flow. Comparison between the simulated signal and the seismic signal recorded at the Puerto Rico seismic station, located 450 km away from the source, shows that this method allows us to reproduce the low frequency seismic signal and to constrain the volume and frictional behavior of the individual events. As a result, simulation of seismic signals generated by gravitational flows provides insight into the history of eruptive sequences and into the characteristics of the individual events.
Novel high-fidelity realistic explosion damage simulation for urban environments
NASA Astrophysics Data System (ADS)
Liu, Xiaoqing; Yadegar, Jacob; Zhu, Youding; Raju, Chaitanya; Bhagavathula, Jaya
2010-04-01
Realistic building damage simulation has a significant impact on modern modeling and simulation systems, especially in the diverse panoply of military and civil applications where such systems are widely used for personnel training, critical mission planning, disaster management, etc. Realistic building damage simulation should incorporate accurate physics-based explosion models, rubble generation, rubble flyout, and interactions between flying rubble and surrounding entities. However, none of the existing building damage simulation systems sufficiently realizes the degree of realism required for effective military applications. In this paper, we present a novel physics-based, high-fidelity, and runtime-efficient explosion simulation system to realistically simulate destruction to buildings. In the proposed system, a family of novel blast models is applied to accurately and realistically simulate explosions based on static and/or dynamic detonation conditions. The system also takes account of rubble pile formation and applies a generic and scalable multi-component-based object representation to describe scene entities, and a highly scalable agent-subsumption architecture and scheduler to schedule clusters of sequential and parallel events. The proposed system utilizes a highly efficient and scalable tetrahedral decomposition approach to realistically simulate rubble formation. Experimental results demonstrate that the proposed system has the capability to realistically simulate rubble generation, rubble flyout, and their primary and secondary impacts on surrounding objects, including buildings, constructions, vehicles, and pedestrians, in clusters of sequential and parallel damage events.
EARS: Repositioning data management near data acquisition.
NASA Astrophysics Data System (ADS)
Sinquin, Jean-Marc; Sorribas, Jordi; Diviacco, Paolo; Vandenberghe, Thomas; Munoz, Raquel; Garcia, Oscar
2016-04-01
The EU FP7 projects Eurofleets and Eurofleets2 are a Europe-wide alliance of marine research centers that aim to share their research vessels, to improve information sharing on planned, current and completed cruises, on details of ocean-going research vessels and specialized equipment, and to durably improve cost-effectiveness of cruises. Within this context, logging how, when, and where anything happens on board the vessel provides crucial information for data users at a later stage. This forms an essential step in the process of data quality control, as it can assist in the understanding of anomalies and unexpected trends recorded in the acquired data sets. In this way, completeness of the metadata is improved, as it is recorded accurately at the origin of the measurement. The collection of this crucial information has been done in very different ways, using different procedures, formats and pieces of software in the context of the European Research Fleet. At the time the Eurofleets project started, every institution and country had adopted different strategies and approaches, which complicated the task of users who need to log general-purpose information and events on board whenever they access a different platform, losing the opportunity to produce this valuable metadata on board. Among the many goals of the Eurofleets project, a very important task is the development of event-log software called EARS (Eurofleets Automatic Reporting System) that enables scientists and operators to record what happens during a survey. EARS will allow users to fill, in a standardized way, the current gap in metadata description, which only very seldom links data with its history. Events generated automatically by acquisition instruments will also be handled, enhancing the granularity and precision of the event annotation. The adoption of a common procedure to log survey events and a common terminology to describe them is crucial to providing a friendly and successful on-board metadata creation procedure for the whole European Fleet. The possibility of automatically reporting metadata and general-purpose data will simplify the work of scientists and data managers with regards to data transmission. Improved accuracy and completeness of metadata are expected when events are recorded at acquisition time. This will also enhance multiple usages of the data, as it allows verification of the different requirements existing in different disciplines.
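Standardized on-board event annotations of this kind are easiest to picture as structured records keyed to time, position, and a shared vocabulary. A minimal sketch of what one logged event might look like (the field names and vocabulary are hypothetical, not the actual EARS schema):

```python
import json
from datetime import datetime, timezone

# Hypothetical on-board event record; field names and the controlled
# vocabulary values below are illustrative, not the actual EARS schema.
event = {
    "timestamp": datetime(2016, 4, 17, 6, 42, tzinfo=timezone.utc).isoformat(),
    "lat": 48.38, "lon": -4.62,        # vessel position at event time
    "actor": "CTD_operator",
    "action": "deployment_start",      # term from a shared vocabulary
    "tool": "CTD_rosette_01",
    "source": "manual",                # vs. "automatic" from an instrument
}
print(json.dumps(event, indent=2))
```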
The Boxing Day Tsunami: Could the Disaster have been Anticipated?
NASA Astrophysics Data System (ADS)
Cummins, P. R.; Burbidge, D.
2005-05-01
The occurrence of the 26 December, 2004 Sumatra-Andaman earthquake and the accompanying "Boxing Day" Tsunami, which killed over 280,000, has been described as one of the most lethal natural disasters in human history. Many lives could have been saved had a tsunami warning system, similar to that which exists for the Pacific Ocean, been in operation for the Indian Ocean. The former exists because great subduction zone earthquakes have generated destructive, Pacific-wide tsunami in the Pacific Ocean with some frequency. Prior to 26 December, 2004, all of the world's earthquakes with magnitude > 9 were widely thought to have occurred in the Pacific Ocean, where they caused destructive tsunami. Could the occurrence of similar earthquakes and tsunami in the Indian Ocean have been predicted prior to the 2004 Boxing Day tragedy? This presentation will argue that the answer is "Yes". Almost without exception (the exception being the 1952 Kamchatka earthquake) the massive subduction zone earthquakes and tsunami of the Pacific Ocean have been associated with the subduction of relatively young ocean lithosphere (< 60 Ma), and the theory for why this should be so seems well established. Although the eastern part of the Sunda Arc off Java does not meet this criterion, the western part of the Sunda Arc offshore Sumatra does. Although there appears to be no reference to the great earthquakes off Sumatra that occurred in 1833 and 1861 in widely used earthquake catalogs, these events have been reported in the literature and were the subject of recent research. In particular, research by Zachariasen et al. (1999 and 2000) had inferred that the magnitude of the 1833 event may have been as high as 9.2. Calculations for the tsunami that might have been associated with this event had shown, prior to 26 December, that it would affect the entire Indian Ocean basin, although due to the earthquake's location 1000 km southeast of the Boxing Day event, the effects in the Bay of Bengal would not have been as severe. Thus, it seems to this author that the Boxing Day event could and should have been anticipated. This presentation will further consider why it was not, and what steps can be taken to anticipate and mitigate the effects of future events that may occur in the Indian Ocean and elsewhere.
NASA Astrophysics Data System (ADS)
Singh, N.
2014-12-01
It is now widely recognized that superthermal electrons commonly exist with the thermal population in most space plasmas. When plasmas consisting of such an electron population expand, double layers (DLs) naturally form due to charge separation; the more mobile superthermal electrons march ahead of the thermal population, leaving a positive charge behind and generating electric fields. Under certain conditions such fields evolve into thin double layers or shocks. The double layers accelerate ions. Such double-layer formation was first invoked to explain expansion of laser-produced plasmas. Since then it has been studied in laboratory experiments, and applied to (i) polar wind acceleration, (ii) the existence of low-altitude double layers in the auroral acceleration, (iii) a possible mechanism for the origination of the solar wind, (iv) the helicon double layer thrusters, and (v) the deceleration of electrons after their acceleration in solar flare events. The role of superthermal-electron driven double layers, also known as the low-altitude auroral double layers in the upward current region, in the upward acceleration of ionospheric ions is well-known. In the auroral application the upward moving superthermal electrons consist of backscattered downgoing primary energetic electrons as well as the secondary electrons. Similarly, we suggest that such double layers might play roles in the acceleration of ions in the solar wind across the coronal transition region, where the superthermal electrons are supplied by magnetic reconnection events. We will present a unified theoretical view of the superthermal electron-driven double layers and their applications. We will summarize theoretical, experimental, simulation and observational results highlighting the common threads running through the various existing studies.
Do, Hoang Dang Khoa; Kim, Joo-Hwan
2017-01-01
Chloroplast genomes (cpDNA) are highly valuable resources for evolutionary studies of angiosperms, since they are highly conserved, are small in size, and play critical roles in plants. Slipped-strand mispairing (SSM) was assumed to be a mechanism for generating repeat units in cpDNA. However, research on the employment of different small repeated sequences through SSM events, which may induce the accumulation of distinct types of repeats within the same region in cpDNA, has not been documented. Here, we sequenced two chloroplast genomes from the endemic species Heloniopsis tubiflora (Korea) and Xerophyllum tenax (USA) to fill the gap in molecular data and explore "hot spots" for genomic events in Melanthiaceae. Comparative analysis of 23 complete cpDNA sequences revealed that there were different stages of deletion in the rps16 region across the Melanthiaceae. Based on the partial or complete loss of the rps16 gene in cpDNA, we report for the first time potential molecular markers for recognizing two sections (Veratrum and Fuscoveratrum) of Veratrum. Melanthiaceae exhibits a significant change in the junction between the large single-copy and inverted repeat regions, ranging from trnH_GUG to a part of rps3. Our results show an accumulation of tandem repeats in the rpl23-ycf2 regions of cpDNAs. Further observation of this region across most of the examined taxa of Liliales shows that small conserved sequences flank the tandem repeats. Therefore, we propose three scenarios in which different small repeated sequences were used during SSM events to generate newly distinct types of repeats. Occasionally, prior to the SSM process, point mutation events and double-strand break repair occurred and induced the formation of the initial repeat units that are indispensable in the SSM process. SSM likely occurred more frequently for short repeats than for long repeat sequences in tribe Parideae (Melanthiaceae, Liliales). Collectively, these findings add new evidence of dynamic results from SSM in chloroplast genomes, which can be useful for further evolutionary studies in angiosperms. Additionally, genomic events in cpDNA are potential resources for mining molecular markers in Liliales.
Spatio-temporal foreshock activity during stick-slip experiments of large rock samples
NASA Astrophysics Data System (ADS)
Tsujimura, Y.; Kawakata, H.; Fukuyama, E.; Yamashita, F.; Xu, S.; Mizoguchi, K.; Takizawa, S.; Hirano, S.
2016-12-01
Foreshock activity has sometimes been reported for large earthquakes, and has been roughly classified into the following two classes. For shallow intraplate earthquakes, foreshocks occurred in the vicinity of the mainshock hypocenter (e.g., Doi and Kawakata, 2012; 2013). And for interplate subduction earthquakes, foreshock hypocenters migrated toward the mainshock hypocenter (Kato et al., 2012; Yagi et al., 2014). To understand how foreshocks occur, it is useful to investigate the spatio-temporal activities of foreshocks in laboratory experiments under controlled conditions. We have conducted stick-slip experiments by using a large-scale biaxial friction apparatus at NIED in Japan (e.g., Fukuyama et al., 2014). Our previous results showed that stick-slip events repeatedly occurred in a run, but only the later events were preceded by foreshocks. Kawakata et al. (2014) inferred that the gouge generated during the run was an important key for foreshock occurrence. In this study, we carried out stick-slip experiments on large rock samples whose interface (fault plane) is 1.5 meters long and 0.5 meters wide, after some runs to generate fault gouge along the interface. In the current experiments, we investigated the spatio-temporal activity of foreshocks. We detected foreshocks from waveform records of a 3D array of piezo-electric sensors. Our new results showed that more than three foreshocks (typically about twenty) had occurred during each stick-slip event, in contrast to the few foreshocks observed during previous experiments without pre-existing gouge. Next, we estimated the hypocenter locations of the stick-slip events, and found that they were located near the opposite end to the loading point. In addition, we observed a migration of foreshock hypocenters toward the hypocenter of each stick-slip event. This suggests that the foreshock activity observed in our current experiments was similar to that for the interplate earthquakes in terms of the spatio-temporal pattern. This work was supported by NIED research project "Development of monitoring and forecasting technology for crustal activity", JSPS KAKENHI Grant Number 23340131, and MEXT of Japan, under its Earthquake and Volcano Hazards Observation and Research Program.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-28
... rare life-threatening events to more common events that are perceived to have less severe clinical sequelae. Many of these events were evident in the premarket studies; however, rare events such as erosion... events in the overall context of the disease and existing treatment options; (2) to discuss whether...
Situational Awareness from a Low-Cost Camera System
NASA Technical Reports Server (NTRS)
Freudinger, Lawrence C.; Ward, David; Lesage, John
2010-01-01
A method gathers scene information from a low-cost camera system. Existing surveillance systems using enough cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult when using low-cost, commercially available components. The newly developed system places cameras along a combined power and data wire, forming a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security cameras. Processing capabilities are built directly onto the camera backplane, which helps keep costs low. The low power requirements of each camera allow a single imaging system to comprise over 100 cameras. Each camera has built-in processing to detect events and cooperatively share this information with neighboring cameras. The location of each event is reported to the host computer in Cartesian coordinates computed from data correlation across multiple cameras. In this way, events in the field of view are presented to the host as low-bandwidth information rather than as a constant stream of high-bandwidth bitmap data from the cameras. Because it uses many small, low-cost cameras with overlapping fields of view, this approach offers greater flexibility than conventional systems without compromising performance: coverage increases significantly, and no surveillance area is ignored, as can happen when pan, tilt, and zoom cameras look away. Additionally, because a single cable is shared for power and data, installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/vehicular monitoring systems are also potential applications.
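The abstract notes that event locations are computed in Cartesian coordinates by correlating detections across cameras. One minimal way to illustrate that idea is a least-squares intersection of bearing rays from several cameras; the poses, geometry, and solver below are assumptions for illustration, not the system's firmware.

```python
import numpy as np

# Toy triangulation: each camera reports a bearing (direction) to a
# detected event; the host solves a least-squares intersection of the rays.

def locate_event(origins, directions):
    """Least-squares point closest to all rays (origin + t * direction)."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Three cameras along a line ("string of lights"), event at (4, 3).
event = np.array([4.0, 3.0])
origins = [np.array([x, 0.0]) for x in (0.0, 5.0, 10.0)]
directions = [event - o for o in origins]   # ideal, noise-free bearings
print(locate_event(origins, directions))    # approximately [4. 3.]
```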
NASA Astrophysics Data System (ADS)
Lu, C.
2017-12-01
This study utilized field outcrops, thin sections, geochemical data, and gamma-ray (GR) logging curves to investigate the development model of paleokarst within the Longwangmiao Formation in the Lower Cambrian, western Central Yangtze Block, SW China. The Longwangmiao Formation, which belongs to a third-order sequence, consists of four fourth-order sequences and is located in the uppermost part of the Lower Cambrian. The vertical variations of the δ13C and δ18O values indicate the existence of multi-stage eogenetic karst events. The eogenetic karst event in the uppermost part of the Longwangmiao Formation is recognized by dripstones developed within paleocaves; a vertical paleoweathering crust with four zones (bedrock, a weak weathering zone, an intense weathering zone, and a solution-collapsed zone); two generations of calcsparite cement showing bright luminescence and a zonation from nonluminescent to bright to nonluminescent; two types of breccias (matrix-rich clast-supported chaotic breccia and matrix-supported chaotic breccia); and rundkarren. The episodic vertical variation of stratiform dissolution vugs and breccias, together with facies-controlled dissolution and filling features, indicates the development of multi-stage eogenetic karst. Paleokarst development is controlled by multi-order sea-level changes. The long-eccentricity cycle dictates the fluctuations of the fourth-order sea level, generating multi-stage eogenetic karst events. The paleokarst model is an important step towards better understanding the link between probably orbitally forced sea-level oscillations and eogenetic karst in the Lower Cambrian. According to this paleokarst model, hydrocarbon exploration should focus on both the karst highlands and the karst transitional zone.
Hébert, Emily T; Vandewater, Elizabeth A; Businelle, Michael S; Harrell, Melissa B; Kelder, Steven H; Perry, Cheryl L
2017-10-01
Existing measures of tobacco marketing and messaging exposure are limited, relying on recall, recognition, or proxy measures. This study aimed to determine the feasibility and reliability of a mobile application for the measurement of tobacco and e-cigarette marketing and message exposure using ecological momentary assessment (EMA). Young adults from Austin, TX (n=181, ages 18-29) were instructed to use a mobile application to record all sightings of marketing or social media related to tobacco (including e-cigarettes) in real time for 28 days (Event EMAs). Tobacco product use and recall of message encounters were assessed daily using an app-initiated EMA (Daily EMAs). The mobile app was a feasible and acceptable method to measure exposure to tobacco messages. The largest share of messages (45.0%) was seen on the Internet, and many were user-generated. Thirty-day recall of messages at baseline was poorly correlated with messages reported via Event EMA during the study period; however, the correlation between post-study 30-day recall and Event EMA was much stronger (r=0.603 for industry-sponsored messages, r=0.599 for user-generated messages). Correlations between Daily EMAs and 30-day recall of message exposure (baseline and post-study) ranged from small (baseline: r=0.329-0.389) to large (post-study: r=0.656-0.766). These findings suggest that EMA is a feasible and reliable method for measuring tobacco message exposure, especially given the prevalence of messages encountered online and on social media. Recall measures are limited in their ability to accurately represent marketing exposure, but might be improved by a period of priming or clearer response categories. Copyright © 2017 Elsevier Ltd. All rights reserved.
Bambi, Cosimo
2013-01-01
Black holes have the peculiar and intriguing property of having an event horizon, a one-way membrane causally separating their internal region from the rest of the Universe. Today, astrophysical observations provide some evidence for the existence of event horizons in astrophysical black hole candidates. In this short paper, I compare the constraint we can infer from the nonobservation of electromagnetic radiation from the putative surface of these objects with the bound coming from the ergoregion instability, pointing out the respective assumptions and limitations.
Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.
Task report detailing low probability tail event analysis and mitigation in the BPA control area. A tail event refers to a situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, causing the imbalance between generation and load to become very significant.
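A minimal sketch of how such tail intervals could be screened from forecast-error and ramp data follows; the sign convention, percentile threshold, and synthetic series are assumptions for illustration, not the report's methodology.

```python
import numpy as np

# Illustrative screening for "tail events": intervals where combined load
# and wind forecast errors coincide with fast ramps, driving the
# generation-load imbalance into the far tail of its distribution.

def flag_tail_events(load_err_mw, wind_err_mw, ramp_mw, pct=99.5):
    # Invented sign convention: positive imbalance = system short of MW.
    imbalance = load_err_mw - wind_err_mw + ramp_mw
    cut = np.percentile(np.abs(imbalance), pct)
    return np.flatnonzero(np.abs(imbalance) >= cut), imbalance

rng = np.random.default_rng(1)
n = 10_000                       # e.g. 10-min intervals over ~10 weeks
load_err = rng.normal(0, 50, n)
wind_err = rng.normal(0, 80, n)
ramp = rng.normal(0, 60, n)
idx, imb = flag_tail_events(load_err, wind_err, ramp)
print(len(idx), "tail intervals, worst imbalance %.0f MW" % np.abs(imb[idx]).max())
```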
A Web Tool for Generating High Quality Machine-readable Biological Pathways.
Ramirez-Gaona, Miguel; Marcu, Ana; Pon, Allison; Grant, Jason; Wu, Anthony; Wishart, David S
2017-02-08
PathWhiz is a web server built to facilitate the creation of colorful, interactive, visually pleasing pathway diagrams that are rich in biological information. The pathways generated by this online application are machine-readable and fully compatible with essentially all web-browsers and computer operating systems. It uses a specially developed, web-enabled pathway drawing interface that permits the selection and placement of different combinations of pre-drawn biological or biochemical entities to depict reactions, interactions, transport processes and binding events. This palette of entities consists of chemical compounds, proteins, nucleic acids, cellular membranes, subcellular structures, tissues, and organs. All of the visual elements in it can be interactively adjusted and customized. Furthermore, because this tool is a web server, all pathways and pathway elements are publicly accessible. This kind of pathway "crowd sourcing" means that PathWhiz already contains a large and rapidly growing collection of previously drawn pathways and pathway elements. Here we describe a protocol for the quick and easy creation of new pathways and the alteration of existing pathways. To further facilitate pathway editing and creation, the tool contains replication and propagation functions. The replication function allows existing pathways to be used as templates to create or edit new pathways. The propagation function allows one to take an existing pathway and automatically propagate it across different species. Pathways created with this tool can be "re-styled" into different formats (KEGG-like or text-book like), colored with different backgrounds, exported to BioPAX, SBGN-ML, SBML, or PWML data exchange formats, and downloaded as PNG or SVG images. The pathways can easily be incorporated into online databases, integrated into presentations, posters or publications, or used exclusively for online visualization and exploration. This protocol has been successfully applied to generate over 2,000 pathway diagrams, which are now found in many online databases including HMDB, DrugBank, SMPDB, and ECMDB.
Initiation of a thrust fault revealed by analog experiments
NASA Astrophysics Data System (ADS)
Dotare, Tatsuya; Yamada, Yasuhiro; Adam, Juergen; Hori, Takane; Sakaguchi, Hide
2016-08-01
To reveal in detail the process of initiation of a thrust fault, we conducted analog experiments with dry quartz sand using a high-resolution digital image correlation technique to identify minor shear-strain patterns for every 27 μm of shortening (with an absolute displacement accuracy of 0.5 μm). The experimental results identified a number of "weak shear bands" and minor uplift prior to the initiation of a thrust in cross-section view. The observations suggest that the process is closely linked to the activity of an adjacent existing thrust, and can be divided into three stages. Stage 1 is characterized by a series of abrupt and short-lived weak shear bands at the location where the thrust will subsequently be generated. The area that will eventually be the hanging wall starts to uplift before the fault forms. The shear strain along the existing thrust decreases linearly during this stage. Stage 2 is defined by the generation of the new thrust and active displacements along it, identified by the shear strain along the thrust. The location of the new thrust may be constrained by its back-thrust, generally produced at the foot of the surface slope. The activity of the existing thrust falls to zero once the new thrust is generated, although these two events are not synchronous. Stage 3 of the thrust is characterized by a constant displacement that corresponds to the shortening applied to the model. Similar minor shear bands have been reported in the toe area of the Nankai accretionary prism, SW Japan. By comparing several transects across this subduction margin, we can classify the lateral variations in the structural geometry into the same stages of deformation identified in our experiments. Our findings may also be applied to the evaluation of fracture distributions in thrust belts during unconventional hydrocarbon exploration and production.
Hierarchy of temporal responses of multivariate self-excited epidemic processes
NASA Astrophysics Data System (ADS)
Saichev, Alexander; Maillart, Thomas; Sornette, Didier
2013-04-01
Many natural and social systems are characterized by bursty dynamics, for which past events trigger future activity. These systems can be modelled by so-called self-excited Hawkes conditional Poisson processes. It is generally assumed that all events have similar triggering abilities. However, some systems exhibit heterogeneity and clusters with possibly different intra- and inter-cluster triggering, which can be accounted for by generalizing to "multivariate" self-excited Hawkes conditional Poisson processes. We develop the general formalism of the multivariate moment generating function for the cumulative number of first-generation and of all-generation events triggered by a given mother event (the "shock") as a function of the current time t. This corresponds to studying the response function of the process. A variety of different systems have been analyzed. In particular, for systems in which triggering between events of different types proceeds through a one-dimensional directed or symmetric chain of influence in type space, we report a novel hierarchy of intermediate asymptotic power-law decays ~1/t^{1-(m+1)θ} of the rate of triggered events as a function of the distance m of the events from the initial shock in type space, where 0 < θ < 1 for the relevant long-memory processes characterizing many natural and social systems. The richness of the generated time dynamics comes from the cascades of intermediate events of possibly different kinds, unfolding via random changes of type along the genealogy.
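For readers unfamiliar with these processes, a univariate Hawkes process can be simulated with Ogata's thinning algorithm; the sketch below uses an exponential triggering kernel with invented parameters, whereas the paper's analysis concerns long-memory (power-law) kernels and multiple event types.

```python
import numpy as np

# Minimal univariate Hawkes simulation by Ogata's thinning algorithm.
# Intensity: lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i)).

def simulate_hawkes(mu, alpha, beta, t_max, rng):
    events, t = [], 0.0
    while t < t_max:
        # Intensity decays between events, so its value now bounds it ahead.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)           # candidate event time
        lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() <= lam_t / lam_bar and t < t_max:
            events.append(t)                          # accept candidate
    return np.array(events)

rng = np.random.default_rng(42)
ev = simulate_hawkes(mu=0.2, alpha=0.8, beta=1.0, t_max=200.0, rng=rng)
print(f"{ev.size} events; branching ratio alpha/beta = 0.8 (subcritical)")
```

With a branching ratio alpha/beta below one the cascades of triggered events die out, which is the regime in which the response functions discussed above are well defined.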
Data Prediction for Public Events in Professional Domains Based on Improved RNN-LSTM
NASA Astrophysics Data System (ADS)
Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan
2018-02-01
Traditional data services for predicting emergency or non-periodic events usually cannot generate satisfying results or fulfill their prediction purpose. However, such events are influenced by external causes, which means that certain a priori information about them can generally be collected through the Internet. This paper studied these problems and proposed an improved model for LSTM (Long Short-Term Memory) dynamic prediction and a priori information sequence generation, combining RNN-LSTM with a priori information about public events. In prediction tasks, the model is capable of determining trends, and its accuracy is validated. The model delivers better performance and prediction results than its predecessor: using a priori information increases prediction accuracy, LSTM adapts better to changes in the time sequence, and the approach can be widely applied to the same type of prediction task as well as to other time-sequence prediction tasks.
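The paper's exact architecture is not given here; a hedged sketch of the general idea, feeding the target series together with an a-priori-information sequence into an LSTM, might look as follows, with all shapes, sizes, and the toy data being assumptions.

```python
import numpy as np
import tensorflow as tf

# Sketch: concatenate an a-priori-information signal (e.g., web activity
# about an upcoming event) with the target series and let an LSTM predict
# the next value. Architecture and data are illustrative only.

T, F = 24, 2                      # 24 past steps; [series value, prior signal]
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(T, F)),
    tf.keras.layers.Dense(1),     # next-step prediction
])
model.compile(optimizer="adam", loss="mse")

# Toy data: the prior signal leads the series by one step.
rng = np.random.default_rng(0)
prior = rng.normal(size=5000)
series = np.concatenate([[0.0], 0.9 * prior[:-1]]) + rng.normal(0, 0.1, 5000)
X = np.stack([np.stack([series[i:i + T], prior[i:i + T]], axis=1)
              for i in range(5000 - T - 1)])
y = series[T:-1]                  # next value after each window
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
```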
Modeling tools for the assessment of microbiological risks during floods: a review
NASA Astrophysics Data System (ADS)
Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin
2015-04-01
Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas at high risk of floods, may exacerbate the future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges on infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering the fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of the microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities for mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities for generating outputs that describe physical and microbial conditions during floods, such as the concentration or load of non-cohesive sediments or pathogens, and the dynamics of high-flow conditions. Recommendations are presented for the application of specific modeling tools to particular flood-related microbial risks, and model improvements are suggested that may better characterize key microbial risks during flood events. The state of current tools is assessed in the context of a changing climate, where the frequency, intensity and duration of flooding are shifting in some areas.
On transient events in the upper atmosphere generated away from thunderstorm regions
NASA Astrophysics Data System (ADS)
Morozenko, V.; Garipov, G.; Khrenov, B.; Klimov, P.; Panasyuk, M.; Sharakin, S.; Zotov, M.
2011-12-01
Experimental data on transient events in the UV and red-IR ranges obtained in the MSU missions "Universitetsky-Tatiana" (wavelengths 300-400 nm) and "Universitetsky-Tatiana-2" (wavelengths 300-400 nm and 600-800 nm), published by Garipov et al. in 2010 at the COSPAR session (http://www.cospar2010.org) and at the TEPA conference (http://www.aragats.am/Conferences/tepa2010), and in 2011 by Sadovnichy et al., Solar System Research, 45, #1, 3-29 (2011) and Vedenkin et al., JETP, v. 140, issue 3(9), 1-11 (2011), demonstrated the existence of transients at large distances (up to thousands of km) away from thunderstorm cloud regions. Those "remote" transients are short (1-5 msec) and less luminous than the transients above thunderstorm regions. The ratio of red-IR to UV photon numbers in these transients indicates a high altitude of origin (~70 km). Other important observations are: 1. a change of the exponent of the transient luminosity distribution Q (from -1 for photon numbers Q = 10^20-10^23 to -2 for Q > 10^23); 2. a change of the global distribution of transients with luminosity (transients with Q > 10^23 are concentrated in the equatorial region above continents, while low-luminosity transients are distributed more uniformly); 3. the phenomenon of transient sequences within a single satellite orbit close to a geomagnetic meridian. In the present paper, the phenomenological features of transients are explained under the assumption that the observed transients fall into two classes: 1. transients related to local lightning, lower in the atmosphere, at distances of no more than a few hundred km from the satellite detector's field of view; and 2. transients generated by faraway lightning. Local transients are luminous and are presumably the events called transient luminous events (TLEs); their luminosity distribution has a threshold near Q ~ 10^23, and their differential luminosity distribution is approximated by a power law with exponent -2. Remote transients must be considered separately. Their origin may be related to electromagnetic pulses (EMP) or waves (whistlers, EMW) generated by lightning. The EMP-EMW is transmitted in the ionosphere-ground channel to large distances R with low absorption. The part of the EMP-EMW "visible" in the detector aperture diminishes with distance as R^{-1} due to observation geometry. The EMP-EMW triggers an electric discharge in the upper atmosphere (lower ionosphere, ~70 km). Estimates of the resulting transient luminosity and its correlation with the geomagnetic field are in progress.
Organizational Analysis of the United States Army Contracting Command-Kuwait
2008-09-01
incongruent, which may have contributed to documented organizational dysfunctions; 2) The command should initiate meaningful morale-building events into the command's schedule and encourage use of existing morale, welfare, and recreation activities; and 3) Recommend the command establish clear and...
NASA Astrophysics Data System (ADS)
Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph
2017-04-01
Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variability on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and inter-storm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new approach coupling the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscaling rainfall data with a constant time step, such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998). Firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions. Secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous-time-step rainfall generator has been developed and tested using 10 min and hourly rainfall data from four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities as well as wet-spell and dry-spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
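The first model stage can be sketched compactly: a Poisson rectangular pulse process draws inter-storm durations, event durations, and mean event intensities from exponential distributions. The parameter values below are invented for illustration; fitted values would come from the observed records.

```python
import numpy as np

# Poisson rectangular pulse process: alternating exponential dry periods
# and rectangular rain pulses (duration, constant mean intensity).

def poisson_rectangular_pulses(t_total_h, mean_dry_h=30.0,
                               mean_dur_h=6.0, mean_int_mmh=1.5, seed=0):
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < t_total_h:
        t += rng.exponential(mean_dry_h)           # inter-storm (dry) period
        dur = rng.exponential(mean_dur_h)          # event duration
        intensity = rng.exponential(mean_int_mmh)  # mean event intensity
        if t + dur < t_total_h:
            events.append((t, dur, intensity))
        t += dur
    return events

events = poisson_rectangular_pulses(t_total_h=365 * 24)
total = sum(d * i for _, d, i in events)
print(f"{len(events)} events, {total:.0f} mm total rainfall in one year")
```

Each generated (duration, intensity) pulse would then be handed to the modified cascade model for disaggregation to the user-specified resolution.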
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Chao; Singh, Vijay P.; Mishra, Ashok K.
2013-02-06
This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae, and a hybrid marginal distribution. The contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among the several applications of the improved distribution, we present here its utility for single-site daily rainfall simulation. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three alternative models: the conventional two-state Markov chain generator, the transition probability matrix model, and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that overall the developed generator is capable of reproducing the characteristics of historical extreme rainfall events and is adept at extrapolating rare values beyond the upper range of available observed data. Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way. Another interesting observation is that the recognized 'overdispersion' problem in daily rainfall simulation is attributable more to the loss of rainfall extremes than to the under-representation of first-order persistence. The developed generator appears to be a sound option for daily rainfall simulation, especially in hydrologic planning situations where rare rainfall events are of great importance.
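For reference, the first of the comparison models mentioned above, the conventional two-state Markov chain generator, can be sketched in a few lines; the transition probabilities and gamma parameters here are invented for illustration, not fitted values.

```python
import numpy as np

# Baseline generator: two-state (wet/dry) first-order Markov chain for
# occurrence, with wet-day amounts drawn independently from a gamma
# distribution. This is the independence assumption the improved
# copula-based generator replaces.

def markov_rainfall(n_days, p_wd=0.3, p_ww=0.6, shape=0.7, scale=8.0, seed=0):
    """p_wd: P(wet | dry); p_ww: P(wet | wet); amounts ~ Gamma(shape, scale)."""
    rng = np.random.default_rng(seed)
    rain = np.zeros(n_days)
    wet = False
    for d in range(n_days):
        wet = rng.uniform() < (p_ww if wet else p_wd)
        if wet:
            rain[d] = rng.gamma(shape, scale)
    return rain

r = markov_rainfall(3650)
print(f"wet fraction {np.mean(r > 0):.2f}, "
      f"mean wet-day amount {r[r > 0].mean():.1f} mm")
```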
NASA Astrophysics Data System (ADS)
Khan, Prosanta Kumar; Banerjee, Jayashree; Shamim, Sk; Mohanty, Manoranjan
2018-03-01
The present study investigates the temporal variation of several seismic parameters along the Myanmar (Zone I), Andaman-Nicobar-Northwest Sumatra (Zone II), Southeast Sumatra-West Indonesia (Zone III) and East Indonesia (Zone IV) converging boundaries in reference to the generation of the 26 December 2004 Mw > 9.0 off-Sumatra mega-earthquake. The four segments are distinguished based on tectonic parameters, distinct geological locations, great earthquake occurrences, and Wadati-Benioff zone characteristics. Two important seismic parameters, seismic energy release and b value, are computed over 6-month time windows for these segments during the entire 1976-2013 period. The b values show a constant decrease in Zones II, III, and IV, whereas Zone I does not show any such pattern prior to the 2004 mega-event. Seismic energy release also decreased gradually in Zones II and III until the 2004 event, and a weaker version of the same pattern was noted in Zone IV. This observation may indicate that stress accumulation was dominant near the Sumatra-Java area, towards the southeast of Zone II and the northwest of Zone III. The strain energy released during the 2004 event subsequently migrated northwards, rupturing 1300 km of the boundary between Northwest Sumatra and North Andaman. The occurrence of the 2004 mega-event was apparently concealed behind the long-term seismic quiescence near the Sumatra and Nicobar margin. A systematic study of the patterns of seismic energy release and b values, and long-term observation of the collective behaviour of the margin tectonics, might have given clues to the possibility of the 2004 mega-event.
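The b value in such studies is typically estimated with Aki's maximum-likelihood formula; the sliding-window sketch below illustrates the idea on a synthetic catalog, with the completeness magnitude, window size, and minimum event count all assumed for illustration rather than taken from the study.

```python
import numpy as np

# Aki (1965) maximum-likelihood b value in 6-month windows.
# For unbinned magnitudes: b = log10(e) / (mean(M) - Mc) for M >= Mc.
# (Binned catalogs would additionally need the Utsu dm/2 correction.)

def b_value(mags, mc):
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - mc)

def b_series(times_yr, mags, mc=4.5, window_yr=0.5, min_events=30):
    out = []
    for lo in np.arange(times_yr.min(), times_yr.max(), window_yr):
        sel = (times_yr >= lo) & (times_yr < lo + window_yr)
        if sel.sum() >= min_events:           # need enough events per window
            out.append((lo, b_value(mags[sel], mc)))
    return out

# Synthetic Gutenberg-Richter catalog with b = 1.0 above Mc = 4.5.
rng = np.random.default_rng(3)
n = 20_000
times = rng.uniform(1976, 2014, n)
mags = 4.5 + rng.exponential(1.0 / (1.0 * np.log(10)), n)
for t, b in b_series(times, mags)[:3]:
    print(f"{t:.1f}: b = {b:.2f}")            # each estimate close to 1.0
```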
NASA Astrophysics Data System (ADS)
Williams, C. A.; Wallace, L. M.; Bartlow, N. M.
2017-12-01
Slow slip events (SSEs) have been observed throughout the world, and the existence of these events has fundamentally altered our understanding of the possible range of slip behavior at subduction plate boundaries. In New Zealand, SSEs occur along the Hikurangi Margin, with shallower events in the north and deeper events to the south. In a recent study, Williams and Wallace (2015) found that static SSE inversions that consider elastic property variations provided significantly different results from those based on an elastic half-space. For deeper events, the heterogeneous models predicted smaller amounts of slip, while for shallower events the heterogeneous model predicted larger amounts of slip. In this study, we extend our initial work to examine the temporal variations in slip. We generate Green's functions using the PyLith finite element code (Aagaard et al., 2013) to allow consideration of the elastic property variations provided by the New Zealand-wide seismic velocity model (Eberhart-Phillips et al., 2010). These Green's functions are then integrated to provide Green's functions compatible with the Network Inversion Filter (NIF; Segall and Matthews, 1997; McGuire and Segall, 2003; Miyazaki et al., 2006). We examine 12 SSEs occurring along the Hikurangi Margin during 2010 and 2011, and compare the results using heterogeneous Green's functions with those of Bartlow et al. (2014), who examined the same set of SSEs with the NIF using a uniform elastic half-space model. The use of heterogeneous Green's functions should provide a more accurate picture of the slip distribution and evolution of the SSEs. This will aid in understanding the correlations between SSEs and seismicity and/or tremor, and the role of SSEs in the accommodation of plate motion budgets in New Zealand.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Jason P.; Carlson, Deborah K.; Ortiz, Anne
Accurate location of seismic events is crucial for nuclear explosion monitoring. There are several sources of error in seismic location that must be taken into account to obtain high-confidence results. Most location techniques account for uncertainties in the phase arrival times (measurement error) and the bias of the velocity model (model error), but they do not account for the uncertainty of the velocity model bias. By determining and incorporating this uncertainty in the location algorithm, we seek to improve the accuracy of the calculated locations and uncertainty ellipses. In order to correct for deficiencies in the velocity model, it is necessary to apply station-specific corrections to the predicted arrival times. Both master event and multiple event location techniques assume that the station corrections are known perfectly, when in reality there is an uncertainty associated with these corrections. For multiple event location algorithms that calculate station corrections as part of the inversion, it is possible to determine the variance of the corrections. The variance can then be used to weight the arrivals associated with each station, thereby giving more influence to stations with consistent corrections. We have modified an existing multiple event location program (based on PMEL; Pavlis and Booker, 1983). We are exploring weighting arrivals with the inverse of the station correction standard deviation, as well as using the conditional probability of the calculated station corrections. This is in addition to the weighting already given to the measurement and modeling error terms. We relocate a group of mining explosions that occurred at Black Thunder, Wyoming, and compare the results to those generated without accounting for station correction uncertainty.
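The weighting idea can be illustrated numerically: in a weighted least-squares location, each arrival residual is scaled by the combined pick, model, and station-correction variances, so stations with unstable corrections carry less weight. The values below are synthetic; the actual implementation modifies PMEL.

```python
import numpy as np

# Per-arrival weights = 1 / total variance of the travel-time residual,
# now including the station-correction uncertainty as a third term.

def arrival_weights(sigma_pick, sigma_model, sigma_corr):
    total_var = sigma_pick**2 + sigma_model**2 + sigma_corr**2
    return 1.0 / total_var

# Three stations: the third has a poorly constrained station correction.
sigma_pick = np.array([0.05, 0.05, 0.05])   # s, measurement error
sigma_model = np.array([0.10, 0.10, 0.10])  # s, velocity-model bias spread
sigma_corr = np.array([0.02, 0.03, 0.30])   # s, station-correction std dev
w = arrival_weights(sigma_pick, sigma_model, sigma_corr)
print(w / w.max())   # third station is strongly down-weighted
```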
High-definition reconstruction of clonal composition in cancer.
Fischer, Andrej; Vázquez-García, Ignacio; Illingworth, Christopher J R; Mustonen, Ville
2014-06-12
The extensive genetic heterogeneity of cancers can greatly affect therapy success due to the existence of subclonal mutations conferring resistance. However, the characterization of subclones in mixed-cell populations is computationally challenging due to the short length of the sequence reads generated by current sequencing technologies. Here, we report cloneHD, a probabilistic algorithm for subclone reconstruction from data generated by high-throughput DNA sequencing: read depth, B-allele counts at germline heterozygous loci, and somatic mutation counts. The algorithm can exploit the added information present in correlated longitudinal or multiregion samples and takes into account correlations along genomes caused by events such as copy-number changes. We apply cloneHD to two case studies: a breast cancer sample and time-resolved samples of chronic lymphocytic leukemia, where we demonstrate that monitoring the response of a patient to therapy regimens is feasible. Our work provides new opportunities for tracking cancer development. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
On a study of optically coupled memristive Chua circuits-rhythmogenesis and amplitude death
NASA Astrophysics Data System (ADS)
Chakraborty, Arindam; Ray, Anirban; Basak, Sankar; Roy Chowdhury, A.
2015-07-01
Properties of memristive inductorless Chua circuits are studied when they are coupled optically, to characterize the oscillation-quenching phenomenon of amplitude death (AD) and the oscillation-generating process of rhythmogenesis. Under coupling, these systems show new features not seen previously. Rhythmogenesis is notable here because oscillation is generated by the interaction of two systems that are each at their respective steady states. The other phenomenon, amplitude death, is observed as the coupling strength increases. The numerical simulations are supported by data obtained from an analogue circuit implementation of the system. Two circuits coupled through an LED (light-emitting diode) and LDR (photoresistor) pair show a transition to a chaotic state under parameter variation. The experimental data were collected with the help of a digital-to-analog converter system. Our data indicate that there exist two different routes to chaos: either through period doubling or without it.
A 3D human neural cell culture system for modeling Alzheimer’s disease
Kim, Young Hye; Choi, Se Hoon; D’Avanzo, Carla; Hebisch, Matthias; Sliwinski, Christopher; Bylykbashi, Enjana; Washicosky, Kevin J.; Klee, Justin B.; Brüstle, Oliver; Tanzi, Rudolph E.; Kim, Doo Yeon
2015-01-01
Stem cell technologies have facilitated the development of human cellular disease models that can be used to study pathogenesis and test therapeutic candidates. These models hold promise for complex neurological diseases such as Alzheimer’s disease (AD) because existing animal models have been unable to fully recapitulate all aspects of pathology. We recently reported the characterization of a novel three-dimensional (3D) culture system that exhibits key events in AD pathogenesis, including extracellular aggregation of β-amyloid and accumulation of hyperphosphorylated tau. Here we provide instructions for the generation and analysis of 3D human neural cell cultures, including the production of genetically modified human neural progenitor cells (hNPCs) with familial AD mutations, the differentiation of the hNPCs in a 3D matrix, and the analysis of AD pathogenesis. The 3D culture generation takes 1–2 days. Aggregation of β-amyloid is observed after 6 weeks of differentiation, followed by robust tau pathology after 10–14 weeks. PMID:26068894
Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.
2014-01-01
Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514
Tsunami Source Identification on the 1867 Tsunami Event Based on the Impact Intensity
NASA Astrophysics Data System (ADS)
Wu, T. R.
2014-12-01
The 1867 Keelung tsunami has drawn significant attention in Taiwan, not only because the location is very close to three nuclear power plants that are only about 20 km from Taipei, but also because of the ambiguity of the tsunami sources. This event is unique in many respects. First, it was documented in many sources, in several languages, with similar descriptions. Second, the tsunami deposit was discovered recently. According to these accounts, an earthquake, a 7-meter tsunami height, volcanic smoke, and oceanic smoke were observed. Previous studies concluded that this tsunami was generated by an earthquake with a magnitude around Mw 7.0 along the Shanchiao Fault. However, numerical results showed that even a Mw 8.0 earthquake was not able to generate a 7-meter tsunami. Considering the steep bathymetry and intense volcanic activity along the Keelung coast, one reasonable hypothesis is that different types of tsunami sources existed, such as a submarine landslide or a volcanic eruption. In order to examine this scenario, last year we proposed the Tsunami Reverse Tracing Method (TRTM) to find the possible locations of the tsunami sources. This method helped us rule out impossible far-field tsunami sources. However, the near-field sources remained unclear. This year, we further developed a new method named 'Impact Intensity Analysis' (IIA). In the IIA method, the study area is divided into a sequence of tsunami sources, and numerical simulations of each source are conducted with COMCOT (Cornell Multi-grid Coupled Tsunami Model). After that, the resulting wave height from each source to the study site is collected and plotted. This method successfully helped us identify the impact of each potential near-field source. The IIA result (Fig. 1) shows that the 1867 tsunami was a multi-source event. A mild tsunami was triggered by a Mw 7.0 earthquake, and then followed by submarine landslide or volcanic events. A near-field submarine landslide and a landslide at Mien-Hwa Canyon were the most plausible scenarios. As for the volcano scenarios, a volcanic eruption about 10 km from Keelung with a disturbed water volume of 2.5×10^8 m^3 might be a candidate. The detailed scenario results will be presented in the full paper.
Refining lunar impact chronology through high spatial resolution 40Ar/39Ar dating of impact melts
Mercer, Cameron M.; Young, Kelsey E.; Weirich, John R.; Hodges, Kip V.; Jolliff, Bradley L.; Wartho, Jo-Anne; van Soest, Matthijs C.
2015-01-01
Quantitative constraints on the ages of melt-forming impact events on the Moon are based primarily on isotope geochronology of returned samples. However, interpreting the results of such studies can often be difficult because the provenance region of any sample returned from the lunar surface may have experienced multiple impact events over the course of billions of years of bombardment. We illustrate this problem with new laser microprobe 40Ar/39Ar data for two Apollo 17 impact melt breccias. Whereas one sample yields a straightforward result, indicating a single melt-forming event at ca. 3.83 Ga, data from the other sample document multiple impact melt–forming events between ca. 3.81 Ga and at least as young as ca. 3.27 Ga. Notably, published zircon U/Pb data indicate the existence of even older melt products in the same sample. The revelation of multiple impact events through 40Ar/39Ar geochronology is likely not to have been possible using standard incremental heating methods alone, demonstrating the complementarity of the laser microprobe technique. Evidence for 3.83 Ga to 3.81 Ga melt components in these samples reinforces emerging interpretations that Apollo 17 impact breccia samples include a significant component of ejecta from the Imbrium basin impact. Collectively, our results underscore the need to quantitatively resolve the ages of different melt generations from multiple samples to improve our current understanding of the lunar impact record, and to establish the absolute ages of important impact structures encountered during future exploration missions in the inner Solar System. PMID:26601128
Incidence of Major Cardiovascular Events in Immigrants to Ontario, Canada
Chu, Anna; Rezai, Mohammad R.; Guo, Helen; Maclagan, Laura C.; Austin, Peter C.; Booth, Gillian L.; Manuel, Douglas G.; Chiu, Maria; Ko, Dennis T.; Lee, Douglas S.; Shah, Baiju R.; Donovan, Linda R.; Sohail, Qazi Zain; Alter, David A.
2015-01-01
Background— Immigrants from ethnic minority groups represent an increasing proportion of the population in many high-income countries, but little is known about the causes and amount of variation between various immigrant groups in the incidence of major cardiovascular events. Methods and Results— We conducted the Cardiovascular Health in Ambulatory Care Research Team (CANHEART) Immigrant Study, a big data initiative, linking information from Citizenship and Immigration Canada’s Permanent Resident database to 9 population-based health databases. A cohort of 824 662 first-generation immigrants aged 30 to 74 as of January 2002 from 8 major ethnic groups and 201 countries of birth who immigrated to Ontario, Canada between 1985 and 2000 were compared with a reference group of 5.2 million long-term residents. The overall 10-year age-standardized incidence of major cardiovascular events was 30% lower among immigrants than among long-term residents. East Asian immigrants (predominantly ethnic Chinese) had the lowest incidence overall (2.4 in males, 1.1 in females per 1000 person-years), but this increased with greater duration of stay in Canada. South Asian immigrants, including those born in Guyana, had the highest event rates (8.9 in males, 3.6 in females per 1000 person-years), along with immigrants born in Iraq and Afghanistan. Adjustment for traditional risk factors reduced but did not eliminate the differences in cardiovascular risk between various ethnic groups and long-term residents. Conclusions— Striking differences in the incidence of cardiovascular events exist among immigrants to Canada from different ethnic backgrounds. Traditional risk factors explain a part but not all of these differences. PMID:26324719
Vossen, Catherine J.; Vossen, Helen G. M.; Marcus, Marco A. E.; van Os, Jim; Lousberg, Richel
2013-01-01
In analyzing time-locked event-related potentials (ERPs), many studies have focused on specific peaks and their differences between experimental conditions. In theory, each latency point after a stimulus contains potentially meaningful information, regardless of whether it is peak-related. Based on this assumption, we introduce a new concept which allows for flexible investigation of the whole epoch and does not primarily focus on peaks and their corresponding latencies. For each trial, the entire epoch is partitioned into event-related fixed-interval areas under the curve (ERFIAs). These ERFIAs, obtained at single trial level, act as dependent variables in a multilevel random regression analysis. The ERFIA multilevel method was tested in an existing ERP dataset of 85 healthy subjects, who underwent a rating paradigm of 150 painful and non-painful somatosensory electrical stimuli. We modeled the variability of each consecutive ERFIA with a set of predictor variables among which were stimulus intensity and stimulus number. Furthermore, we corrected for latency variations of the P2 (260 ms). With respect to known relationships between stimulus intensity, habituation, and pain-related somatosensory ERP, the ERFIA method generated highly comparable results to those of commonly used methods. Notably, effects on stimulus intensity and habituation were also observed in non-peak-related latency ranges. Further, cortical processing of actual stimulus intensity depended on the intensity of the previous stimulus, which may reflect pain-memory processing. In conclusion, the ERFIA multilevel method is a promising tool that can be used to study event-related cortical processing. PMID:24224018
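The core ERFIA computation is simple to express: partition each single-trial epoch into fixed intervals and take the signed area under the curve in each, yielding one vector of predictors per trial for the multilevel regression. The interval width, sampling rate, and synthetic epoch below are assumptions for illustration.

```python
import numpy as np

# Sketch of the ERFIA idea: signed area under the curve in consecutive
# fixed intervals of a single-trial epoch (rectangle-rule integration).

def erfias(epoch_uv, fs_hz, interval_ms=20.0):
    """Return signed area (uV*ms) of each consecutive fixed interval."""
    samples_per = int(round(interval_ms * fs_hz / 1000.0))
    n_full = len(epoch_uv) // samples_per
    trimmed = np.asarray(epoch_uv)[:n_full * samples_per]
    dt_ms = 1000.0 / fs_hz
    return trimmed.reshape(n_full, samples_per).sum(axis=1) * dt_ms

# One synthetic 1-s epoch sampled at 500 Hz -> 50 ERFIAs of 20 ms each.
fs = 500
t = np.arange(fs) / fs
epoch = 5.0 * np.exp(-((t - 0.26) / 0.03) ** 2)   # P2-like peak at 260 ms
areas = erfias(epoch, fs)
print(len(areas), "ERFIAs; largest at interval", int(np.argmax(areas)))
```

In the study's design, each trial's ERFIA vector becomes the set of dependent variables in a random-effects regression on stimulus intensity, stimulus number, and related predictors.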
Revealing long-range multiparticle collectivity in small collision systems via subevent cumulants
NASA Astrophysics Data System (ADS)
Jia, Jiangyong; Zhou, Mingliang; Trzupek, Adam
2017-09-01
Multiparticle azimuthal cumulants, often used to study collective flow in high-energy heavy-ion collisions, have recently been applied in small collision systems such as pp and p+A to extract the second-order azimuthal harmonic flow v2. Recent observation of four-, six-, and eight-particle cumulants with "correct sign" c2{4} < 0, c2{6} > 0, c2{8} < 0 and approximate equality of the inferred single-particle harmonic flow, v2{4} ≈ v2{6} ≈ v2{8}, have been used as strong evidence for a collective emission of all the soft particles produced in the collisions. We show that these relations in principle could be violated due to the non-Gaussianity in the event-by-event fluctuation of flow and/or nonflow. Furthermore, we show, using pp events generated with the PYTHIA model, that c2{2k} obtained with the standard cumulant method are dominated by nonflow from dijets. An alternative cumulant method based on two or more η-separated subevents is proposed to suppress the dijet contribution. The new method is shown to be able to recover a flow signal as low as 4% imposed on the PYTHIA events, independently of how the event activity class is defined. Therefore the subevent cumulant method offers a more robust way of studying collectivity based on the existence of long-range azimuthal correlations between multiple distinct η ranges. The prospect of using the subevent cumulants to study collective flow in A+A collisions, in particular its longitudinal dynamics, is discussed.
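The two-particle estimators involved are compact: the standard cumulant is <2> = (|Q_n|^2 - M) / (M(M-1)) with flow vector Q_n = Σ exp(i n φ), while the two-subevent variant correlates η-separated groups, Re(Q_a Q_b*) / (M_a M_b). The toy below imposes a known v2 = 0.05 on synthetic events; multiplicities, the η split, and event counts are illustrative assumptions.

```python
import numpy as np

# Two-particle cumulants from flow vectors: standard vs two-subevent.

def q_vector(phis, n=2):
    return np.exp(1j * n * phis).sum()

rng = np.random.default_rng(7)
c2_std, c2_sub = [], []
for _ in range(2000):
    m, v2 = 60, 0.05
    # Sample angles with dN/dphi ~ 1 + 2 v2 cos(2 phi) by rejection.
    phis = []
    while len(phis) < m:
        p = rng.uniform(0, 2 * np.pi)
        if rng.uniform() < (1 + 2 * v2 * np.cos(2 * p)) / (1 + 2 * v2):
            phis.append(p)
    phis = np.array(phis)
    eta = rng.uniform(-2.5, 2.5, m)
    q = q_vector(phis)
    c2_std.append((abs(q) ** 2 - m) / (m * (m - 1)))       # standard <2>
    a, b = phis[eta < 0], phis[eta >= 0]                   # two subevents
    c2_sub.append((q_vector(a) * q_vector(b).conj()).real / (len(a) * len(b)))

print("v2{2} standard:", np.sqrt(np.mean(c2_std)))   # both ~ 0.05 here
print("v2{2} subevent:", np.sqrt(np.mean(c2_sub)))
```

In this toy there is no nonflow, so the two estimates agree; the paper's point is that with dijet-like short-range correlations the standard estimator is biased, while the η-separated subevents suppress that contribution.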
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
...--AA08 Special Local Regulations for Marine Events; Temporary Change of Dates for Recurring Marine Events... change of the enforcement period for a special local regulation of a recurring marine event in the Fifth... rulemaking will be initiated on this matter; rather, the event will be held as detailed in the existing...
Konishi, Hirokazu; Miyauchi, Katsumi; Dohi, Tomotaka; Tsuboi, Shuta; Ogita, Manabu; Naito, Ryo; Kasai, Takatoshi; Tamura, Hiroshi; Okazaki, Shinya; Isoda, Kikuo; Daida, Hiroyuki
2016-04-01
The aim of this study was to compare first- and new-generation drug-eluting stents (DESs) implanted in long lesions. Stent length is known to be a predictor of adverse events after percutaneous coronary intervention (PCI), even with first-generation DESs. The introduction of new-generation DESs has reduced the rates of adverse clinical events. However, the impact of stent length on long-term clinical outcomes is not well known. A total of 1181 consecutive patients who underwent PCI using either a first-generation DES (n = 885) or a new-generation DES (n = 296) between 2004 and 2011 were investigated. In each stent group, the patients were divided into two groups by stent length (>32 and ≤32 mm) and compared. During the follow-up period, the incidence of major adverse cardiac events (MACEs) was significantly higher for patients who received long stents than for those who received short stents (P < 0.01; log-rank test) in the first-generation DES group. However, there was no difference in the incidence of MACEs between the long- and short-stent groups in the new-generation DES group (P = 0.24; log-rank test). On multivariate Cox regression analysis, stent length was not associated with adverse events in the new-generation DES group [hazard ratio (HR) 0.87; 95 % confidence interval (95 % CI) 0.71-1.04; P = 0.14]. Implanted stent length was significantly associated with a higher risk of MACEs in patients who received first-generation DESs, but not in those who received new-generation DESs.
Towards an improved inventory of Glacial Lake Outburst Floods in the Himalayas
NASA Astrophysics Data System (ADS)
Veh, Georg; Walz, Ariane; Korup, Oliver; Roessner, Sigrid
2016-04-01
The retreat of glaciers in the Himalayas and the associated release of meltwater have prompted the formation and growth of thousands of glacial lakes in recent decades. More than 2,200 of these lakes have developed in unconsolidated moraine material. These lakes can drain in a single event, producing potentially destructive glacial lake outburst floods (GLOFs). Only 44 GLOFs in the Himalayas have been documented in detail since the 1930s, and evidence for a change, let alone an increase, in the frequency of these flood events remains elusive. The rarity of documented GLOFs is counterintuitive to our hypothesis that an increasing number of glacial lakes should be accompanied by a rising number of outburst floods. Censoring bias affects the GLOF record, such that mostly larger floods with commensurate impact have been registered. Existing glacial lake inventories are also of limited help for the identification of GLOFs, as they were created at irregular time steps, using different methodological approaches, and covering different regional extents. We discuss the key requirements for generating a more continuous, close-to-yearly time series of glacial lake evolution for the Himalayan mountain range using remote sensing data. To this end, we use sudden changes in glacial lake areas as the key diagnostic of dam breaks and outburst floods, employing the full archive of cloud-free Landsat data (L5, L7 and L8) from 1988 to 2015. SRTM and ALOS World 3D topographic data further improve the automatic detection of glacial lakes in an alpine landscape that is often difficult to access otherwise. Our workflow comprises expert-based classification of water bodies using thresholds and masks from different spectral indices and band ratios. A first evaluation of our mapping approach suggests that GLOFs reported during the study period could be tracked independently via a significant reduction of lake size between two subsequent Landsat scenes. This finding supports the feasibility of generating a continuous glacial lake database, and thus of an updated GLOF inventory. We discuss several challenges to our classification method, including complete or partial freezing of lake surfaces, as well as the effects of turbidity and mountain shadows. Our future work will use this new inventory to infer the key environmental parameters of GLOF events in the Himalayas and to estimate regional hazard potential from existing lakes.
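The diagnostic described above, a sudden drop in mapped lake area between consecutive cloud-free scenes, is straightforward to express in code. The 40% threshold and the area series below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Toy GLOF screen: flag a candidate outburst when a lake's mapped area
# drops sharply between consecutive cloud-free scenes.

def candidate_glofs(dates, areas_km2, min_drop_frac=0.4):
    """Return (date_before, date_after, drop_fraction) for sudden drops."""
    hits = []
    for i in range(1, len(areas_km2)):
        prev, cur = areas_km2[i - 1], areas_km2[i]
        if prev > 0 and (prev - cur) / prev >= min_drop_frac:
            hits.append((dates[i - 1], dates[i], (prev - cur) / prev))
    return hits

dates = ["1995-10", "1996-09", "1997-10", "1998-09", "1999-10"]
areas = [0.52, 0.55, 0.12, 0.14, 0.15]    # outburst between 1996 and 1997
print(candidate_glofs(dates, areas))      # flags the 1996->1997 drop
```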
Veenema, Tener Goodwin; Deruggiero, Katherine; Losinski, Sarah; Barnett, Daniel
Strong leadership is critical in disaster situations when "patient surge" challenges a hospital's capacity to respond and normally acceptable patterns of care are disrupted. Activation of the emergency operations plan triggers an incident command system structure for leadership decision making. Yet, implementation of the emergency operations plan and incident command system protocols is ultimately subject to nursing and hospital leadership at the service and unit levels. The results of these service- and unit-based leadership decisions have the potential to directly impact staff and patient safety, quality of care, and ultimately, patient outcomes. Despite the critical nature of these events, nurse leaders and administrators receive little education regarding leadership and decision making during disaster events. The purpose of this study is to identify essential competencies of nursing and hospital administrators' leadership during disaster events. An integrative mixed-methods design combining qualitative and quantitative approaches to data collection and analysis was used. Five focus groups were conducted with nurse leaders and hospital administrators at a large urban hospital in the Northeastern United States in a collaborative group process to generate relevant leadership competencies. Software from Concept Systems Incorporated was used to sort, prioritize, and analyze the data (http://conceptsystemsinc.com/). The results suggest that participants' institutional knowledge (of existing resources, communications, processes) and prior disaster experience increase leadership competence.
Application of data cubes for improving detection of water cycle extreme events
NASA Astrophysics Data System (ADS)
Teng, W. L.; Albayrak, A.
2015-12-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the data as archived, rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection, requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events, using the Global Land Data Assimilation Systems (GLDAS) data set.
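As a rough illustration of the SPRT component named above (not the project's actual implementation), the following sketch flags a sustained mean shift in a univariate time series. The hypothesized means, noise level, and error rates are all assumed values:

```python
import numpy as np

def sprt_flags(x, mu0=0.0, mu1=2.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Sequential probability ratio test over a time series.

    H0: samples ~ N(mu0, sigma); H1: samples ~ N(mu1, sigma).
    Returns indices where H1 (anomaly) is accepted; the statistic is
    reset after each decision so detection can continue.
    """
    upper = np.log((1 - beta) / alpha)   # accept H1 above this bound
    lower = np.log(beta / (1 - alpha))   # accept H0 below this bound
    llr, flags = 0.0, []
    for i, xi in enumerate(x):
        # log-likelihood ratio increment for one Gaussian observation
        llr += ((xi - mu0) ** 2 - (xi - mu1) ** 2) / (2.0 * sigma**2)
        if llr >= upper:
            flags.append(i)
            llr = 0.0
        elif llr <= lower:
            llr = 0.0
    return flags

rng = np.random.default_rng(1)
series = rng.normal(0.0, 1.0, 200)
series[120:140] += 3.0     # injected "extreme event"
print(sprt_flags(series))  # detections should cluster near indices 120-140
```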
Characteristics and Challenges of Open-Water Swimming Performance: A Review.
Baldassarre, Roberto; Bonifazi, Marco; Zamparo, Paola; Piacentini, Maria Francesca
2017-11-01
Although the popularity of open-water swimming (OWS) events has significantly increased in the last decades, specific studies regarding the performance of elite or age-group athletes in these events are scarce. To analyze the existing literature on OWS. Relevant literature was located via computer-generated citations. During August 2016, online computer searches on the PubMed and Scopus databases were conducted to locate published research. The number of participants in ultraendurance swimming events has substantially increased in the last 10 y. In elite athletes there is a higher overall competitive level of women than of men. The body composition of female athletes (different percentage and distribution of fat tissue) shows several advantages (more buoyancy and less drag) in aquatic conditions that determine the small difference between males and females. The main physiological characteristic of open-water swimmers (OW swimmers) is the ability to swim at a high percentage of VO2max (80-90%) for many hours. Furthermore, to sustain high velocity for many hours, endurance swimmers need a high propelling efficiency and a low energy cost. Open-water races may be characterized by extreme environmental conditions (water temperature, tides, currents, and waves) that have an overall impact on performance, influencing tactics and pacing. Future studies are needed to study OWS in both training and competition.
Application of Data Cubes for Improving Detection of Water Cycle Extreme Events
NASA Technical Reports Server (NTRS)
Albayrak, Arif; Teng, William
2015-01-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the data as archived, rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access. The gain from such reorganization is greater the larger the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection, requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).
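A minimal sketch of the SVM classification step follows. It is illustrative only: the two-feature construction is an assumption standing in for TRMM-derived features, and all numbers are synthetic:

```python
import numpy as np
from sklearn.svm import SVC

# Toy feature vectors: (precipitation anomaly, soil-moisture anomaly),
# labeled 1 for a water cycle extreme event, 0 otherwise. Purely
# synthetic stand-ins for satellite-derived features.
rng = np.random.default_rng(2)
normal = rng.normal(0.0, 1.0, (200, 2))
extreme = rng.normal(4.0, 1.0, (20, 2))
X = np.vstack([normal, extreme])
y = np.r_[np.zeros(200), np.ones(20)]

# class_weight="balanced" compensates for the rarity of extreme events.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
print(clf.predict([[0.2, -0.5], [4.5, 3.8]]))  # expect [0., 1.]
```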
76 FR 50669 - Safety Zones; Eleventh Coast Guard District Annual Fireworks Events
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-16
... occurring, add new unlisted annual fireworks events to the regulations, and standardize the format for all... to be added. In addition, information for those events that continue to occur has changed in some... sections will be updated or added as follows: update with current information existing events, add...
10 CFR 72.92 - Design basis external natural events.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 2 2013-01-01 2013-01-01 false Design basis external natural events. 72.92 Section 72.92... Evaluation Factors § 72.92 Design basis external natural events. (a) Natural phenomena that may exist or that... must be adopted for evaluating the design basis external natural events based on the characteristics of...
10 CFR 72.92 - Design basis external natural events.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 2 2014-01-01 2014-01-01 false Design basis external natural events. 72.92 Section 72.92... Evaluation Factors § 72.92 Design basis external natural events. (a) Natural phenomena that may exist or that... must be adopted for evaluating the design basis external natural events based on the characteristics of...
10 CFR 72.92 - Design basis external natural events.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Design basis external natural events. 72.92 Section 72.92... Evaluation Factors § 72.92 Design basis external natural events. (a) Natural phenomena that may exist or that... must be adopted for evaluating the design basis external natural events based on the characteristics of...
Isolation of Novel CreERT2-Driver Lines in Zebrafish Using an Unbiased Gene Trap Approach
Jungke, Peggy; Hammer, Juliane; Hans, Stefan; Brand, Michael
2015-01-01
Gene manipulation using the Cre/loxP-recombinase system has been successfully employed in zebrafish to study gene functions and lineage relationships. Recently, gene trapping approaches have been applied to produce large collections of transgenic fish expressing conditional alleles in various tissues. However, the limited number of available cell- and tissue-specific Cre/CreERT2-driver lines still constrains widespread application in this model organism. To enlarge the pool of existing CreERT2-driver lines, we performed a genome-wide gene trap screen using a Tol2-based mCherry-T2a-CreERT2 (mCT2aC) gene trap vector. This cassette consists of a splice acceptor and a mCherry-tagged variant of CreERT2 which enables simultaneous labeling of the trapping event, as well as CreERT2 expression from the endogenous promoter. Using this strategy, we generated 27 novel functional CreERT2-driver lines expressing in a cell- and tissue-specific manner during development and adulthood. This study summarizes the analysis of the generated CreERT2-driver lines with respect to functionality, expression, integration, as well as associated phenotypes. Our results significantly enlarge the existing pool of CreERT2-driver lines in zebrafish and combined with Cre-dependent effector lines, the new CreERT2-driver lines will be important tools to manipulate the zebrafish genome. PMID:26083735
Generalised synthesis of space-time variability in flood response: Dynamics of flood event types
NASA Astrophysics Data System (ADS)
Viglione, Alberto; Battista Chirico, Giovanni; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter
2010-05-01
An analytical framework is used to characterise five flood events of different type in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.
Quantifying space-time dynamics of flood event types
NASA Astrophysics Data System (ADS)
Viglione, Alberto; Chirico, Giovanni Battista; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter
2010-11-01
A generalised framework of space-time variability in flood response is used to characterise five flood events of different type in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.
Life-span retrieval of public events: Reminiscence bump for high-impact events, recency for others.
Tekcan, Ali I; Boduroglu, Aysecan; Mutlutürk, Aysu; Aktan Erciyes, Aslı
2017-10-01
Although substantial evidence exists showing a reliable reminiscence bump for personal events, data regarding retrieval distributions for public events have been equivocal. The primary aim of the present study was to address life-span retrieval distributions of different types of public events in comparison to personal events, and to test whether the existing accounts of the bump can explain the distribution of public events. We asked a large national sample to report the most important, happiest, and saddest personal events and the most important, happiest, saddest, most proud, most fearful, and most shameful public events. We found a robust bump corresponding to the third decade of life for the happiest and the most important positive but not for the saddest and most important negative personal events. For the most important public events, a bump emerged only for the two most frequently mentioned events. Distributions of public events cued with emotions were marked by recency. These results point to potential differences in retrieval of important personal and public events. While the life-script framework accounts well for the findings regarding important personal events, a chronologically retroactive search seems to guide retrieval of public events. The reminiscence bump observed for the two public events suggests that age-at-event affects recall of public events to the degree that the events are high-impact ones that dominate the nation's collective memory. Results provide further evidence that the bump is not unitary and point to the importance of event type and memory elicitation method with regard to competing explanations of the phenomenon.
Dune recovery after storm erosion on a high-energy beach: Vougot Beach, Brittany (France)
NASA Astrophysics Data System (ADS)
Suanez, Serge; Cariolet, Jean-Marie; Cancouët, Romain; Ardhuin, Fabrice; Delacourt, Christophe
2012-02-01
On 10th March 2008, the high-energy storm Johanna hit the French Atlantic coast, generating severe dune erosion on Vougot Beach (Brittany, France). In this paper, the recovery of the dune of Vougot Beach is analysed through a survey of morphological changes and hydrodynamic conditions. Data collection focused on the period immediately following storm Johanna until July 2010, i.e. over two and a half years. Results showed that the dune retreated by a maximum of almost 6 m where storm surge and wave attack were the most energetic. Dune retreat led to the creation of accommodation space for the storage of sediment by widening and elevating space between the pre- and post-storm dune toe, and reducing impacts of the storm surge. Dune recovery started in the month following the storm event and is still ongoing. It is characterised by the construction of "secondary" embryo dunes, which recovered at an average rate of 4-4.5 cm per month, although average monthly volume changes varied from −1 to 2 m³·m⁻¹. These embryo dunes accreted due to a large aeolian sand supply from the upper tidal beach to the existing foredune. These dune-construction processes were facilitated by growth of vegetation on low-profile embryo dunes promoting backshore accretion. After more than two years of survey, the sediment budget of the beach/dune system showed that more than 10,000 m³ has been lost by the upper tidal beach. We suggest that seaward return currents generated during the storm of 10th March 2008 are responsible for offshore sediment transport. Reconstitution of the equilibrium beach profile following the storm event may therefore have generated cross-shore sediment redistribution inducing net erosion in the tidal zone.
Generating survival times to simulate Cox proportional hazards models with time-varying covariates.
Austin, Peter C
2012-12-20
Simulations and Monte Carlo methods serve an important role in modern statistical research. They allow for an examination of the performance of statistical procedures in settings in which analytic and mathematical derivations may not be feasible. A key element in any statistical simulation is the existence of an appropriate data-generating process: one must be able to simulate data from a specified statistical model. We describe data-generating processes for the Cox proportional hazards model with time-varying covariates when event times follow an exponential, Weibull, or Gompertz distribution. We consider three types of time-varying covariates: first, a dichotomous time-varying covariate that can change at most once from untreated to treated (e.g., organ transplant); second, a continuous time-varying covariate such as cumulative exposure at a constant dose to radiation or to a pharmaceutical agent used for a chronic condition; third, a dichotomous time-varying covariate with a subject being able to move repeatedly between treatment states (e.g., current compliance or use of a medication). In each setting, we derive closed-form expressions that allow one to simulate survival times so that survival times are related to a vector of fixed or time-invariant covariates and to a single time-varying covariate. We illustrate the utility of our closed-form expressions for simulating event times by using Monte Carlo simulations to estimate the statistical power to detect as statistically significant the effect of different types of binary time-varying covariates. This is compared with the statistical power to detect as statistically significant a binary time-invariant covariate. Copyright © 2012 John Wiley & Sons, Ltd.
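The flavor of the closed-form inversion can be shown for the simplest of the three settings described: an exponential baseline hazard with a dichotomous covariate that switches once from untreated to treated at a fixed time t0. This is a sketch in the spirit of the paper rather than its exact algebra, and the parameter values are illustrative:

```python
import numpy as np

def sim_event_time(lam, beta, t0, rng):
    """Simulate one event time under an exponential baseline hazard lam,
    with a binary time-varying covariate that switches on at time t0
    (e.g., organ transplant), multiplying the hazard by exp(beta).

    Inverts the cumulative hazard:
      H(t) = lam*t                          for t <  t0
      H(t) = lam*t0 + lam*exp(beta)*(t-t0)  for t >= t0
    """
    e = -np.log(rng.uniform())        # unit-exponential deviate, e = H(T)
    if e < lam * t0:
        return e / lam                # event occurs before the switch
    return t0 + (e - lam * t0) / (lam * np.exp(beta))

rng = np.random.default_rng(3)
times = [sim_event_time(lam=0.1, beta=0.7, t0=5.0, rng=rng) for _ in range(10000)]
# A positive beta raises the post-switch hazard, shortening survival.
print(np.mean(times))
```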
Analysis of Emergency Diesel Generators Failure Incidents in Nuclear Power Plants
NASA Astrophysics Data System (ADS)
Hunt, Ronderio LaDavis
In early years of operation, emergency diesel generators had a minimal rate of demand failures. Emergency diesel generators are designed to operate as a backup when the main source of electricity has been disrupted. Recently, EDGs (emergency diesel generators) have been failing at NPPs (nuclear power plants) around the United States, causing either station blackouts or loss of onsite and offsite power. These failures were of a specific type called demand failures. This thesis evaluated a problem of concern in the nuclear industry: the rate rose from an average of 1 EDG demand failure per year in 1997 to an excessive event of 4 EDG demand failures in a single year in 2011. To determine the next occurrence of the extreme event and a possible cause of such an event, two analyses were conducted: a statistical analysis and a root cause analysis. The statistical analysis applied an extreme event probability approach to determine the year of the next excessive event, as well as the probability of that excessive event occurring. The root cause analysis evaluated potential causes of the excessive event by examining the EDG manufacturers, aging, policy changes/maintenance practices, and failure components, and investigated the correlation between demand failure data and historical data. Final results from the statistical analysis placed the expected occurrence of an excessive event within a fixed range of probability, with a wider range of probability from the extreme event probability approach. The root cause analysis showed that the demand failure data followed historical statistics for EDG manufacturer, aging, and policy changes/maintenance practices, but pointed to the failure components as a possible cause of the excessive event. Conclusions showed that predicting the next excessive demand failure year and its probability with an acceptable confidence level was difficult, but that this type of failure is likely not a 100-year event. Notably, as of 2005 the majority of the EDG demand failures occurred within the main components. The percentages derived in the overall analysis indicate that the excessive event was caused by the overall age (wear and tear) of the emergency diesel generators in nuclear power plants. Future work will be to better determine the return period of the excessive event, once the occurrence has happened for a second time, by implementing the extreme event probability approach.
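The abstract does not spell out the extreme event probability approach, but a minimal stand-in consistent with its conclusions treats annual demand-failure counts as Poisson and asks how rare a 4-failure year is. The rate of 1 failure per year comes from the abstract; the Poisson assumption is mine:

```python
import math

def prob_at_least(k, lam):
    """P(N >= k) for annual failure counts N ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k))

lam = 1.0                      # historical average: ~1 demand failure/year
p = prob_at_least(4, lam)      # probability of a 4-failure year like 2011
print(p, 1.0 / p)              # ~0.019/year, i.e. a ~50-year return period,
                               # consistent with "not a 100-year event"
```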
Epigenetics and Future Generations.
Del Savio, Lorenzo; Loi, Michele; Stupka, Elia
2015-10-01
Recent evidence of intergenerational epigenetic programming of disease risk broadens the scope of public health preventive interventions to future generations, i.e. non existing people. Due to the transmission of epigenetic predispositions, lifestyles such as smoking or unhealthy diet might affect the health of populations across several generations. While public policy for the health of future generations can be justified through impersonal considerations, such as maximizing aggregate well-being, in this article we explore whether there are rights-based obligations supervening on intergenerational epigenetic programming despite the non-identity argument, which challenges this rationale in case of policies that affect the number and identity of future people. We propose that rights based obligations grounded in the interests of non-existing people might fall upon existing people when generations overlap. In particular, if environmental exposure in F0 (i.e. existing people) will affect the health of F2 (i.e. non-existing people) through epigenetic programming, then F1 (i.e. existing and overlapping with both F0 and F2) might face increased costs to address F2's condition in the future: this might generate obligations upon F0 from various distributive principles, such as the principle of equal opportunity for well being. © 2015 John Wiley & Sons Ltd.
Recent applications of THERMUS
NASA Astrophysics Data System (ADS)
Wheaton, S.; Hauer, M.
2011-12-01
Some of the most recent applications of the statistical-thermal model package, THERMUS, are reviewed. These applications focus on fluctuation and correlation observables in an ideal particle and anti-particle gas in limited momentum space segments, as well as in a hadron resonance gas. In the case of the latter, a Monte Carlo event generator, utilising THERMUS functionality and assuming thermal production of hadrons, is discussed. The system under consideration is sampled grand canonically in the Boltzmann approximation. A re-weighting scheme is then introduced to account for conservation of charges (baryon number, strangeness, electric charge) and energy and momentum, effectively allowing for extrapolation of grand canonical results to the microcanonical limit. The approach utilised in this and other applications suggests improvements to existing THERMUS calculations.
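A toy version of the re-weighting idea (sample grand canonically, then restrict to events that conserve a charge) can be written in a few lines. This is not THERMUS code: the two-species Poisson model below is an invented stand-in used only to show how enforcing exact charge conservation shifts ensemble averages:

```python
import numpy as np

# Two toy species: pi+ (charge +1) and pi- (charge -1), with Poisson
# yields as in a grand-canonical Boltzmann gas.
rng = np.random.default_rng(4)
mean_plus, mean_minus, n_events = 5.0, 5.0, 100_000
n_plus = rng.poisson(mean_plus, n_events)
n_minus = rng.poisson(mean_minus, n_events)

# Grand-canonical ensemble: net charge fluctuates event by event.
net = n_plus - n_minus
print("GC <Q>, Var(Q):", net.mean(), net.var())

# Re-weight/reject: keep only events with exactly zero net charge,
# mimicking extrapolation toward a charge-conserving ensemble.
mask = net == 0
print("accepted fraction:", mask.mean())
print("charge-conserving <n_pi+>:", n_plus[mask].mean())  # shifted vs. GC
```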
Water Ice Clouds in the Martian Atmosphere: A Comparison of Two Methods
NASA Technical Reports Server (NTRS)
Hale, A. S.; Tamppari, L. K.; Christensen, P. R.; Smith, M. D.; Bass, Deborah; Pearl, J. C.
2003-01-01
To date, only two data sets offer the potential to examine year-to-year changes in cloud features over an entire Martian year: the Viking Infrared Thermal Mapper (IRTM) data set and the Mars Global Surveyor (MGS) Thermal Emission Spectrometer (TES) data set. We have examined the TES data in the same way in which we examined the Viking IRTM data. This provides water-ice cloud information separated in time by 12 Martian years. Since the data are analyzed with the same method, we obtain a very accurate 'apples to apples' comparison, and can generate a historical record of the subtleties of this annual event. Consequently, it is desirable to compare their results to ours to see what differences exist.
Heusser, Linda E.; Hendy, Ingrid L.; Barron, John A.
2015-01-01
The presence of xeric vegetation in the Santa Barbara Basin (SBB) coincides with major drought events recorded in tree rings and low lake levels elsewhere in California, except for the brief drought between AD 1130 and 1160. Correlative diatom and terrigenous sediment input proxy records from SBB are largely supportive of the pollen record predominantly linking the MCA with drought and La Niña-like conditions and the LIA with wetter (more El Niño-like) conditions. Differences between paleoclimate proxies (pollen, diatoms, and terrigenous sediment) in SBB exist, however, possibly reflecting the temporal and spatial differences in the generation of each proxy record, as well as their individual sensitivity to climate change.
Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy
2016-01-01
Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.
Partnering for Vaccine Emerging Markets—Berlin, June 10–11, 2013
Onraedt, Annelies
2013-01-01
Phacilitate's 1st Partnering event for Vaccine Emerging Markets brought together approximately 100 attendees from developed and developing world vaccine manufacturers, leading non-profit organizations and industry suppliers. The goal was to discuss the vaccine needs in the developing world and how these needs can be met by leveraging collaboration and partnership models, by improving access to existing, new and next generation vaccines, by using novel technologies to drive competitive advantage and economics of vaccine manufacturing and by investing in localized capacity, including capacity for pandemic vaccines. The present article summarizes insights from 30 oral contributions on how quality and capacity requirements can be balanced with cost by using novel manufacturing technologies and operating models. PMID:23966097
Fulton, John W.; Wagner, Chad R.; Rogers, Megan E.; Zimmerman, Gregory F.
2010-01-01
Based on the statistical targets established, the hydraulic model results suggest that an additional 2,428 m², or a 30-percent increase, in suitable mussel habitat could be generated at the replacement-bridge site when compared to the baseline condition associated with the existing bridge at that same location. The study did not address the influences of substrate, acid mine drainage, sediment loads from tributaries, and surface-water/ground-water exchange on mussel habitat. Future studies could include methods for quantifying (1) channel-substrate composition and distribution using tools such as hydroacoustic echosounders specifically designed and calibrated to identify bed composition and mussel populations, (2) surface-water and ground-water interactions, and (3) a high-streamflow event.
Directional Antineutrino Detection
NASA Astrophysics Data System (ADS)
Safdi, Benjamin R.; Suerfu, Burkhant
2015-02-01
We propose the first event-by-event directional antineutrino detector using inverse beta decay (IBD) interactions on hydrogen, with potential applications including monitoring for nuclear nonproliferation, spatially mapping geoneutrinos, characterizing the diffuse supernova neutrino background and searching for new physics in the neutrino sector. The detector consists of adjacent and separated target and capture scintillator planes. IBD events take place in the target layers, which are thin enough to allow the neutrons to escape without scattering elastically. The neutrons are detected in the thicker boron-loaded capture layers. The location of the IBD event and the momentum of the positron are determined by tracking the positron's trajectory through the detector. Our design is a straightforward modification of existing antineutrino detectors; a prototype could be built with existing technology.
Episodic simulation of future events is impaired in mild Alzheimer's disease
Addis, Donna Rose; Sacchetti, Daniel C.; Ally, Brandon A.; Budson, Andrew E.; Schacter, Daniel L.
2009-01-01
Recent neuroimaging studies have demonstrated that both remembering the past and simulating the future activate a core neural network including the medial temporal lobes. Regions of this network, in particular the medial temporal lobes, are prime sites for amyloid deposition and are structurally and functionally compromised in Alzheimer's disease (AD). While we know some functions of this core network, specifically episodic autobiographical memory, are impaired in AD, no study has examined whether future episodic simulation is similarly impaired. We tested the ability of sixteen AD patients and sixteen age-matched controls to generate past and future autobiographical events using an adapted version of the Autobiographical Interview. Participants also generated five remote autobiographical memories from across the lifespan. Event transcriptions were segmented into distinct details, classified as either internal (episodic) or external (non-episodic). AD patients exhibited deficits in both remembering past events and simulating future events, generating fewer internal and external episodic details than healthy older controls. The internal and external detail scores were strongly correlated across past and future events, providing further evidence of the close linkages between the mental representations of past and future. PMID:19497331
A global flash flood forecasting system
NASA Astrophysics Data System (ADS)
Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin
2016-04-01
The sudden and devastating nature of flash flood events means it is imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist on basin, national and continental scales in Europe, North America and Australia but rely on high resolution NWP forecasts or rainfall-radar nowcasting, neither of which have global coverage. To produce global flash flood forecasts this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting and discuss strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher resolution COSMO-LEPS limited area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output we experiment with a dynamic region-growing algorithm. This automatically clusters regions of similar return period exceedance probabilities, thus presenting the at-risk areas at a spatial resolution appropriate to the NWP system. We then demonstrate how these warning areas could eventually complement existing global systems such as the Global Flood Awareness System (GloFAS), to give warnings of flash floods. This work demonstrates the possibility of creating a global flash flood forecasting system based on forecasts from existing global NWP systems. Future developments, in post-processing for example, will need to address an under-prediction bias, for extreme point rainfall, that is innate to current-generation global models.
Using critical realism as a framework in pharmacy education and social pharmacy research.
Oltmann, Carmen; Boughey, Chrissie
2012-01-01
This article challenges the idea that positivism is capable of representing the complexity of social pharmacy and pharmacy education. It is argued that critical realism provides a framework that allows researchers to look at the nature of reality and at mechanisms that produce, or have the tendency to produce, events and experiences of those events. Critical realism is a framework, not a method. It allows researchers to make observations about phenomena and explain the relationships and connections involved. The researcher has to look for mechanisms and structures that could explain why the phenomena, the connections, and the relationships exist (or do not) and then try to show that these mechanisms do exist. This article first contextualizes critical realism, then briefly describes it, and lastly exemplifies the use of critical realism in a discussion of a research project conducted in pharmacy education. Critical realism may be particularly useful in interdisciplinary research, for example, where practitioners and researchers are working together in a social pharmacy or pharmacy education setting. Critical realism requires the practitioners and the researchers to question and make known their assumptions about their own realities and to think of a complex problem or phenomenon in terms of a stratified reality, generative mechanisms, and tendencies. Critical realism may make research more rigorous and also allow researchers to conceive of a greater breadth of research designs for their work. Copyright © 2012 Elsevier Inc. All rights reserved.
CREM monitoring: a wireless RF application
NASA Astrophysics Data System (ADS)
Valencia, J. D.; Burghard, B. J.; Skorpik, J. R.; Silvers, K. L.; Schwartz, M. J.
2005-05-01
Recent security lapses within the Department of Energy laboratories prompted the establishment and implementation of additional procedures and training for operations involving classified removable electronic media (CREM) storage. In addition, the definition of CREM has been expanded and the number of CREM has increased significantly. Procedures now require that all CREM be inventoried and accounted for on a weekly basis. Weekly inventories consist of a physical comparison of each item against the reportable inventory listing. Securing and accounting for CREM is a continuous challenge for existing security systems. To address this challenge, an innovative framework, encompassing a suite of technologies, has been developed by Pacific Northwest National Laboratory (PNNL) to monitor, track, and locate CREM in safes, vaults, and storage areas. This Automated Removable Media Observation and Reporting (ARMOR) framework, described in this paper, is an extension of an existing PNNL program, SecureSafe. The key attributes of systems built around the ARMOR framework include improved accountability, reduced risk of human error, improved accuracy and timeliness of inventory data, and reduced costs. ARMOR solutions require each CREM to be tagged with a unique electronically readable ID code. Inventory data is collected from tagged CREM at regular intervals and upon detection of an access event. Automated inventory collection and report generation eliminates the need for hand-written inventory sheets and allows electronic transfer of the collected inventory data to a modern electronic reporting system. An electronic log of CREM access events is maintained, providing enhanced accountability for daily/weekly checks, routine audits, and follow-up investigations.
Knowledge Representation Standards and Interchange Formats for Causal Graphs
NASA Technical Reports Server (NTRS)
Throop, David R.; Malin, Jane T.; Fleming, Land
2005-01-01
In many domains, automated reasoning tools must represent graphs of causally linked events. These include fault-tree analysis, probabilistic risk assessment (PRA), planning, procedures, medical reasoning about disease progression, and functional architectures. Each of these fields has its own requirements for the representation of causation, events, actors and conditions. The representations include ontologies of function and cause, data dictionaries for causal dependency, failure and hazard, and interchange formats between some existing tools. In none of the domains has a generally accepted interchange format emerged. The paper makes progress towards interoperability across the wide range of causal analysis methodologies. We survey existing practice and emerging interchange formats in each of these fields. Setting forth a set of terms and concepts that are broadly shared across the domains, we examine the several ways in which current practice represents them. Some phenomena are difficult to represent or to analyze in several domains. These include mode transitions, reachability analysis, positive and negative feedback loops, conditions correlated but not causally linked and bimodal probability distributions. We work through examples and contrast the differing methods for addressing them. We detail recent work in knowledge interchange formats for causal trees in aerospace analysis applications in early design, safety and reliability. Several examples are discussed, with a particular focus on reachability analysis and mode transitions. We generalize the aerospace analysis work across the several other domains. We also recommend features and capabilities for the next generation of causal knowledge representation standards.
Single-photon technique for the detection of periodic extraterrestrial laser pulses.
Leeb, W R; Poppe, A; Hammel, E; Alves, J; Brunner, M; Meingast, S
2013-06-01
To draw humankind's attention to its existence, an extraterrestrial civilization could well direct periodic laser pulses toward Earth. We developed a technique capable of detecting a quasi-periodic light signal with an average of less than one photon per pulse within a measurement time of a few tens of milliseconds in the presence of the radiation emitted by an exoplanet's host star. Each of the electronic events produced by one or more single-photon avalanche detectors is tagged with precise time-of-arrival information and stored. From this we compute a histogram displaying the frequency of event-time differences in classes with bin widths on the order of a nanosecond. The existence of periodic laser pulses manifests itself in histogram peaks regularly spaced at multiples of the (a priori unknown) pulse repetition period. With laser sources simulating both the pulse source and the background radiation, we tested a detection system in the laboratory at a wavelength of 850 nm. We present histograms obtained from various recorded data sequences with the number of photons per pulse, the background photons per pulse period, and the recording time as main parameters. We then simulated a periodic signal hypothetically generated on a planet orbiting a G2V-type star (distance to Earth 500 light-years) and show that the technique is capable of detecting the signal even if the received pulses carry as little as one photon on average on top of the star's background light.
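A simplified numerical version of the technique: tag photon arrival times, histogram pairwise time differences with nanosecond-scale bins, and look for regularly spaced peaks. All rates, the pulse period, and the detection probability below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
period, span = 1e-6, 0.05                 # 1 us pulse period, 50 ms record
pulse_times = np.arange(0.0, span, period)
# Each pulse yields on average 0.3 detected photons (< 1 photon/pulse):
pulses = pulse_times[rng.uniform(size=pulse_times.size) < 0.3]
background = rng.uniform(0.0, span, 10_000)   # stellar background events
t = np.sort(np.concatenate([pulses, background]))

# Frequency of event-time differences, ~1 ns bins, out to 10 periods.
max_lag = 10 * period
diffs = []
for i, ti in enumerate(t):
    j = i + 1
    while j < t.size and t[j] - ti < max_lag:
        diffs.append(t[j] - ti)
        j += 1
counts, edges = np.histogram(diffs, bins=int(max_lag / 1e-9))

# The fullest bins should sit near integer multiples of the period:
top = np.sort(edges[np.argsort(counts)[-5:]])
print(top / period)   # approximately integer values reveal the periodicity
```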
The event-related potential effects of cognitive conflict in a Chinese character-generation task.
Qiu, Jiang; Zhang, Qinglin; Li, Hong; Luo, Yuejia; Yin, Qinging; Chen, Antao; Yuan, Hong
2007-06-11
High-density event-related potentials were recorded to examine the electrophysiologic correlates of the evaluation of possible answers provided during a Chinese character-generation task. We examined three conditions: the character given was what participants initially generated (Consistent answer), the character given was correct (Unexpected Correct answer), or it was incorrect (Unexpected Incorrect answer). Results showed that Unexpected Correct and Incorrect answers elicited a more negative event-related potential deflection (N320) than did Consistent answers between 300 and 400 ms. Dipole source analysis of difference waves (Unexpected Correct or Incorrect minus Consistent answers) localized the generator of the N320 in the anterior cingulate cortex. The N320 therefore likely reflects the cognitive change or conflict between old and new ways of thinking while identifying and judging characters.
Exploring Statistical Characterizations of Morphologic Change and Variability: Fire Island, New York
NASA Astrophysics Data System (ADS)
Lentz, E. E.; Hapke, C. J.
2012-12-01
A comprehensive understanding of coastal barrier behavior requires high-resolution observations that capture a wide range of morphological changes occurring over a range of spatial and temporal scales. Fire Island National Seashore, located along the coast of Long Island, New York, is a well studied barrier island coast where understanding how morphological changes contribute to barrier island vulnerability has important implications for coastal land management. Previous work has shown that morphologic differences in eastern and western reaches are attributable to the underlying geology and variations in sediment transport in the system. In this study, we further explore western and eastern differences and variability with lidar-derived topographic surfaces to provide a unique and comprehensive investigation of dune-beach change at Fire Island, New York. Continuous topographic surfaces generated from 12 lidar surveys collected between 1998 and 2011 are used to examine the three-dimensional variability over a range of time periods along the 50 km long island. Because surveys were collected over a range of seasons and in response to a number of storm events, we explore morphologic configurations reflecting the seasonality, post-storm configuration, and replenishment response of the system through the generation of a representative or average surface. These averaged surfaces provide the context for what would be an expected or typical coastal configuration under certain conditions, and through comparison with an individual event, can be used to derive an event-specific spatial-change signature. To investigate anthropogenic influences, differences in morphology between a survey collected after a substantial beach replenishment project and a typical fair-weather configuration averaged from six surveys are determined. Storm response variations are also explored by assessing differences between Tropical Storm Irene (2011), Nor'Ida (2009), and a typical post-storm configuration averaged from five post-storm surveys. In addition to averaged surfaces, surveys are combined to generate a new raster surface reflecting cell by cell standard deviations over a defined period. Standard deviation surfaces are generated to highlight 1) where areas of highest and lowest morphologic variation are located over the entire period, and 2) whether spatial similarities exist in variability between storm and non-storm morphologies. Results show there are distinct and variable responses in eastern and western reaches attributable to wave climate, profile gradient, and offshore bathymetry, as well as to a general along-coast increase in sediment availability.
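The two derived products described, an averaged "representative" surface and a cell-by-cell standard deviation surface, reduce to simple array operations once the surveys are gridded and co-registered. A minimal sketch with synthetic elevation grids (shapes and values are stand-ins, not the Fire Island data):

```python
import numpy as np

# Stack of co-registered elevation grids, one per lidar survey. Real
# grids would carry NaNs in data gaps, hence the nan-aware reductions.
rng = np.random.default_rng(6)
surveys = rng.normal(3.0, 0.2, (12, 50, 50))        # 12 surveys, 50x50 cells
surveys[:, 10:20, :] += rng.normal(0.0, 1.0, (12, 10, 50))  # mobile dune zone

mean_surface = np.nanmean(surveys, axis=0)   # representative configuration
std_surface = np.nanstd(surveys, axis=0)     # cell-by-cell variability

# Event-specific spatial-change signature: one survey minus the average.
event_signature = surveys[-1] - mean_surface
print(std_surface[15, 25], std_surface[40, 25])  # high- vs low-variability cell
```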
Comprehensive Assessment of Models and Events based on Library tools (CAMEL)
NASA Astrophysics Data System (ADS)
Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.
2017-12-01
At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
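The metric suite named above is standard; a compact sketch of three of the scores, computed for one model-data time-series pair, might look as follows (synthetic data, not CCMC code):

```python
import numpy as np

def rmse(obs, mod):
    """Root Mean Square Error."""
    return float(np.sqrt(np.mean((mod - obs) ** 2)))

def prediction_efficiency(obs, mod):
    """1 - MSE/Var(obs); 1 is perfect, 0 is no better than the obs mean."""
    return 1.0 - np.mean((mod - obs) ** 2) / np.var(obs)

def heidke(obs, mod, threshold):
    """Heidke Skill Score from a 2x2 event/no-event contingency table."""
    o, m = obs >= threshold, mod >= threshold
    a = np.sum(o & m)      # hits
    b = np.sum(~o & m)     # false alarms
    c = np.sum(o & ~m)     # misses
    d = np.sum(~o & ~m)    # correct negatives
    den = (a + c) * (c + d) + (a + b) * (b + d)
    return 2.0 * (a * d - b * c) / den if den else 0.0

rng = np.random.default_rng(7)
obs = rng.normal(0, 1, 500)
mod = obs + rng.normal(0, 0.5, 500)   # a model correlated with the data
print(rmse(obs, mod), prediction_efficiency(obs, mod), heidke(obs, mod, 1.0))
```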
Lesson Plan Prototype for International Space Station's Interactive Video Education Events
NASA Technical Reports Server (NTRS)
Zigon, Thomas
1999-01-01
The outreach and education components of the International Space Station Program are creating a number of materials, programs, and activities that educate and inform various groups as to the implementation and purposes of the International Space Station. One of the strategies for disseminating this information to K-12 students involves an electronic class room using state of the art video conferencing technology. K-12 classrooms are able to visit the JSC, via an electronic field trip. Students interact with outreach personnel as they are taken on a tour of ISS mockups. Currently these events can be generally characterized as: Being limited to one-shot events, providing only one opportunity for students to view the ISS mockups; Using a "one to many" mode of communications; Using a transmissive, lecture based method of presenting information; Having student interactions limited to Q&A during the live event; Making limited use of media; and Lacking any formal, performance based, demonstration of learning on the part of students. My project involved developing interactive lessons for K-12 students (specifically 7th grade) that will reflect a 2nd generation design for electronic field trips. The goal of this design will be to create electronic field trips that will: Conform to national education standards; More fully utilize existing information resources; Integrate media into field trip presentations; Make support media accessible to both presenters and students; Challenge students to actively participate in field trip related activities; and Provide students with opportunities to demonstrate learning.
Estimating rare events in biochemical systems using conditional sampling.
Sundar, V S
2017-01-28
The paper focuses on the development of variance reduction strategies to estimate the probability of rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most of the problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
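A compact sketch of subset simulation for a generic rare event illustrates the product-of-conditional-probabilities idea. The one-step Metropolis move below is a simplified stand-in for the modified Metropolis-Hastings algorithm, and all tuning values (sample size, level probability p0, step scale) are assumptions:

```python
import numpy as np

def subset_simulation(g, n=1000, p0=0.1, max_levels=10, seed=8):
    """Estimate P(g(x) > 0) for x ~ N(0, 1) when the event is rare.

    The rare-event probability is written as a product of conditional
    probabilities P(F_i | F_{i-1}) ~= p0, with samples pushed from level
    to level by a Metropolis walk kept inside the current level.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    prob = 1.0
    for _ in range(max_levels):
        vals = g(x)
        thresh = np.quantile(vals, 1.0 - p0)
        if thresh >= 0.0:                     # failure domain reached
            return prob * np.mean(vals > 0.0)
        prob *= p0
        seeds = x[vals > thresh]              # top-p0 samples seed the chains
        x = seeds[rng.integers(len(seeds), size=n)]
        cand = x + rng.normal(scale=1.0, size=n)
        ratio = np.exp(0.5 * (x**2 - cand**2))        # N(0,1) density ratio
        ok = (rng.uniform(size=n) < ratio) & (g(cand) > thresh)
        x = np.where(ok, cand, x)
    return prob        # fell short of the failure domain; lower bound only

# Rare event: a standard normal exceeding 4; the exact value is ~3.2e-5.
print(subset_simulation(lambda x: x - 4.0))
```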
Agency alters perceptual decisions about action-outcomes.
Desantis, Andrea; Waszak, Florian; Gorea, Andrei
2016-10-01
Humans experience themselves as agents, capable of controlling their actions and the outcomes they generate (i.e., the sense of agency). Inferences of agency are not infallible. Research shows that we often attribute outcomes to our agency even though they are caused by another agent. Moreover, agents report the sensory events they generate to be less intense compared to events that are generated externally. These effects have been assessed using highly suprathreshold stimuli and subjective measurements. Consequently, it remains unclear whether experiencing oneself as an agent leads to a decision criterion change and/or a sensitivity change. Here, we investigate this issue. Participants were told that their key presses generated an upward dot motion but that on 30 % of the trials the computer would take over and display a downward motion. The upward/downward dot motion was presented at each participant's discrimination threshold. Participants were asked to indicate whether they (upward motion) or the computer (downward motion) generated the motion. This group of participants was compared with a 'no-agency' group who performed the same task except that subjects did not execute any actions to generate the dot motion. We observed that the agency group reported seeing more frequently the motion they expected to generate (i.e., upward motion) than the no-agency group. This suggests that agency distorts our experience of (allegedly) caused events by altering perceptual decision processes, so that, in ambiguous contexts, externally generated events are experienced as the outcomes of one's actions.
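The distinction the study draws, a criterion change versus a sensitivity change, is the standard signal detection theory decomposition. A minimal sketch computing sensitivity d' and criterion c from hit and false-alarm rates (the rates below are invented, not the study's data):

```python
from statistics import NormalDist

def dprime_criterion(hit_rate, fa_rate):
    """Signal detection theory indices: sensitivity d' and criterion c.

    A shift in c with unchanged d' indicates a decision-criterion change;
    a change in d' indicates a sensitivity change.
    """
    z = NormalDist().inv_cdf
    d = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d, c

# Illustrative rates for the agency vs. no-agency groups (made up):
print(dprime_criterion(0.80, 0.30))  # similar d', more liberal criterion
print(dprime_criterion(0.72, 0.20))  # similar d', stricter criterion
```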
NASA Astrophysics Data System (ADS)
Birch, A. L.; Stallard, R. F.; Barnard, H. R.
2017-12-01
While relationships between land use/land cover and hydrology are well studied and understood in temperate parts of the world, little research exists in the humid tropics, where hydrologic research is often decades behind. Specifically, quantitative information on how physical and biological differences across varying land covers influence runoff generation and hydrologic flowpaths in the humid tropics is scarce, frequently leading to poorly informed hydrologic modelling and water policy decision making. This research effort seeks to quantify how tropical land cover change may alter physical hydrologic processes in the economically important Panama Canal Watershed (Republic of Panama) by separating streamflow into its different runoff components using end member mixing analysis. The samples collected for this project come from small headwater catchments of four varying land covers (mature tropical forest, young secondary forest, active pasture, recently clear-cut tropical forest) within the Smithsonian Tropical Research Institute's Agua Salud Project. During the past three years, samples have been collected at the four study catchments from streamflow and from a number of water sources within hillslope transects, and have been analyzed for stable water isotopes, major cations, and major anions. Major ion analysis of these samples has shown distinct geochemical differences for the potential runoff generating end members sampled (soil moisture/preferential flow, groundwater, overland flow, throughfall, and precipitation). Based on this finding, an effort was made from May-August 2017 to intensively sample streamflow during wet season storm events, yielding a total of 5 events of varying intensity in each land cover/catchment, with sampling intensity ranging from sub-hourly to sub-daily. The focus of this poster presentation will be to present the results of hydrograph separations performed using end member mixing analysis on this May-August 2017 storm dataset. The results are expected to improve quantitative understanding of how land cover may influence physical hydrologic flowpaths and runoff generation in the humid tropics.
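End member mixing analysis itself reduces to solving a small linear system: observed stream chemistry is expressed as a tracer-weighted mixture of source-water end members whose fractions sum to one. A sketch with invented tracer concentrations (not the Agua Salud data):

```python
import numpy as np

# Rows are tracers (two ions plus an isotope), columns are end members
# (groundwater, soil water, overland flow). All values are invented.
E = np.array([[120.0, 40.0, 5.0],     # tracer 1 in each end member
              [ 30.0, 10.0, 2.0],     # tracer 2
              [ -6.0, -4.0, -9.0]])   # delta-18O
stream = np.array([70.0, 18.0, -6.2]) # observed streamflow composition

# Solve E @ f = stream subject to sum(f) = 1 via augmented least squares.
A = np.vstack([E, np.ones(3)])
b = np.append(stream, 1.0)
f, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f)  # fractional contribution of each end member, summing to ~1
```

In practice each end member's contribution to the storm hydrograph is then obtained by multiplying these fractions by the measured discharge at each sampling time.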
Reconstructing the 2015 Flash Flood event of Salgar Colombia, The Case of a Poor Gauged Basin
NASA Astrophysics Data System (ADS)
Velasquez, N.; Zapata, E.; Hoyos Ortiz, C. D.; Velez, J. I.
2017-12-01
Flash flood events associated with severe precipitation are highly destructive, often resulting in significant human and economic losses. Due to their nature, flash floods tend to occur in medium to small basins located within complex high mountainous regions. In the Colombian Andean region these basins are very common, with the aggravating factor that vulnerability is considerably high, as some important human settlements are located within these basins, frequently occupying flood plains and other flash-flood-prone areas. During the dawn of May 18, 2015, two severe rainfall events generated a flash flood in the municipality of Salgar, La Liboriana basin, located in the northwestern Colombian Andes, resulting in more than 100 human casualties and significant economic losses. The present work reconstructs the hydrological processes that took place before and during the La Liboriana flash flood, analyzed as a case of a poorly gauged basin. The event conditions were recreated based on radar retrievals and a distributed hydrological model, linked with a proposed 1D hydraulic model and a simple shallow-landslide model. Results suggest that the flash flood was caused by the occurrence of two successive severe convective events over the same basin, with an important modulation associated with soil characteristics and water storage. Despite its simplicity, the proposed hydraulic model achieves a good representation of the flooded area during the event, with limitations due to the adopted spatial scale (12.7 m, from ALOS PALSAR images). Observed landslides were mapped from satellite images; the model skillfully simulates the landslide occurrence regions, with small differences in the exact locations. For this case, radar data proved key, enabling the location of specific convective cores and the estimation of rainfall intensity. In mountainous regions there is a significant number of settlements with similar vulnerability and the same gauging conditions; a low-cost modelling strategy such as this could be a good risk management tool in regions with low planning capabilities.
Shaw, B E; Chapman, J; Fechter, M; Foeken, L; Greinix, H; Hwang, W; Phillips-Johnson, L; Korhonen, M; Lindberg, B; Navarro, W H; Szer, J
2013-11-01
Safety of living donors is critical to the success of blood, tissue and organ transplantation. Structured and robust vigilance and surveillance systems exist as part of some national entities, but historically no global systems are in place to ensure conformity, harmonisation and the recognition of rare adverse events (AEs). The World Health Assembly has recently resolved to require AE/reaction (AE/R) reporting both nationally and globally. The World Marrow Donor Association (WMDA) is an international organisation promoting the safety of unrelated donors and progenitor cell products for use in haematopoietic progenitor cell (HPC) transplantation. To address this issue, we established a system for collecting, collating, analysing, distributing and reacting to serious adverse events and reactions (SAE/R) in unrelated HPC donors. The WMDA successfully instituted this reporting system with 203 SAE/R reported in 2011. The committee generated two rapid reports, reacting to specific SAE/R, resulting in practice changing policies. The system has a robust governance structure, formal feedback to the WMDA membership and transparent information flows to other agencies, specialist physicians and transplant programs and the general public.
PODIO: An Event-Data-Model Toolkit for High Energy Physics Experiments
NASA Astrophysics Data System (ADS)
Gaede, F.; Hegner, B.; Mato, P.
2017-10-01
PODIO is a C++ library that supports the automatic creation of event data models (EDMs) and efficient I/O code for HEP experiments. It is developed as a new EDM toolkit for future particle physics experiments in the context of the AIDA2020 EU programme. Experience from the LHC and the linear collider community shows that existing solutions partly suffer from overly complex data models with deep object hierarchies or from unfavorable I/O performance. The PODIO project was created to address these problems. PODIO is based on the idea of employing plain-old-data (POD) structures wherever possible, while avoiding deep object hierarchies and virtual inheritance. At the same time it provides the necessary high-level interface for the physicist developer, such as support for inter-object relations and automatic memory management, as well as a Python interface. To simplify the creation of efficient data models, PODIO employs code generation from a simple YAML-based markup language. In addition, it was developed with concurrency in mind in order to support modern CPU features, for example by giving basic support for vectorization techniques.
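The POD-centric layout described above can be illustrated schematically. The following Python sketch is purely conceptual and is not the PODIO API: all class names are hypothetical. It shows the design idea the abstract describes, flat POD payloads with relations expressed as index ranges into collections rather than pointers or deep hierarchies.

```python
# Conceptual sketch (not the actual PODIO API): event data live in flat
# plain-old-data records, and inter-object relations are index ranges
# into collections. All names here are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class HitData:            # POD payload: trivially copyable, easy to serialize
    cell_id: int
    energy: float

@dataclass
class ClusterData:        # relation stored as an index range, not pointers
    energy: float
    hits_begin: int       # first index into the hit collection
    hits_end: int         # one past the last index

class ClusterCollection:
    """Thin high-level wrapper resolving index-based relations."""
    def __init__(self, clusters: List[ClusterData], hits: List[HitData]):
        self.clusters, self.hits = clusters, hits

    def hits_of(self, i: int) -> List[HitData]:
        c = self.clusters[i]
        return self.hits[c.hits_begin:c.hits_end]

hits = [HitData(1, 0.5), HitData(2, 1.2), HitData(7, 0.3)]
clusters = [ClusterData(1.7, 0, 2), ClusterData(0.3, 2, 3)]
coll = ClusterCollection(clusters, hits)
assert [h.cell_id for h in coll.hits_of(0)] == [1, 2]
```

Because the payloads are flat and the relations are plain indices, collections can be written as contiguous buffers, which is the property that makes this kind of layout I/O-friendly.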
Sixteen-month-olds can use language to update their expectations about the visual world.
Ganea, Patricia A; Fitch, Allison; Harris, Paul L; Kaldy, Zsuzsa
2016-11-01
The capacity to use language to form new representations and to revise existing knowledge is a crucial aspect of human cognition. Here we examined whether infants can use language to adjust their representation of a recently encoded scene. Using an eye-tracking paradigm, we asked whether 16-month-old infants (N=26; mean age=16;0 [months;days], range=14;15-17;15) can use language about an occluded event to inform their expectation about what the world will look like when the occluder is removed. We compared looking time to outcome scenes that matched the language input with looking time to those that did not. Infants looked significantly longer at the event outcome when the outcome did not match the language input, suggesting that they generated an expectation of the outcome based on that input alone. This effect was unrelated to infants' vocabulary size. Thus, using language to adjust expectations about the visual world is present at an early developmental stage even when language skills are rudimentary. Copyright © 2016 Elsevier Inc. All rights reserved.
Chloroplast DNA footprints of postglacial recolonization by oaks
Petit, Rémy J.; Pineau, Emmanuel; Demesure, Brigitte; Bacilieri, Roberto; Ducousso, Alexis; Kremer, Antoine
1997-01-01
Recolonization of Europe by forest tree species after the last glaciation is well documented in the fossil pollen record. This spread may have been achieved at low densities by rare events of long-distance dispersal, rather than by a compact wave of advance, generating a patchy genetic structure through founder effects. In long-lived oak species, this structure could still be discernible by using maternally transmitted genetic markers. To test this hypothesis, a fine-scale study of chloroplast DNA (cpDNA) variability of two sympatric oak species was carried out in western France. The distributions of six cpDNA length variants were analyzed at 188 localities over a 200 × 300 km area. A cpDNA map was obtained by applying geostatistics methods to the complete data set. Patches of several hundred square kilometers exist which are virtually fixed for a single haplotype for both oak species. This local systematic interspecific sharing of the maternal genome strongly suggests that long-distance seed dispersal events followed by interspecific exchanges were involved at the time of colonization, about 10,000 years ago. PMID:11038572
Cancer Progenitor Cells: The Result of an Epigenetic Event?
Lapinska, Karolina; Faria, Gabriela; McGonagle, Sandra; Macumber, Kate Morgan; Heerboth, Sarah; Sarkar, Sibaji
2018-01-01
The concept of cancer stem cells was proposed in the late 1990s. Although initially the idea seemed controversial, the existence of cancer stem cells is now well established. However, the process leading to the formation of cancer stem cells is still not clear and thus requires further research. This article discusses epigenetic events that possibly produce cancer progenitor cells from predisposed cells by the influence of their environment. Every somatic cell possesses an epigenetic signature in terms of histone modifications and DNA methylation, which are obtained during lineage-specific differentiation of pluripotent stem cells, which is specific to that particular tissue. We call this signature an epigenetic switch. The epigenetic switch is not fixed. Our epigenome alters with aging. However, depending on the predisposition of the cells of a particular tissue and their microenvironment, the balance of the switch (histone modifications and the DNA methylation) may be tilted to immortality in a few cells, which generates cancer progenitor cells. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
An impact-driven dynamo for the early Moon.
Le Bars, M; Wieczorek, M A; Karatekin, O; Cébron, D; Laneuville, M
2011-11-09
The origin of lunar magnetic anomalies remains unresolved after their discovery more than four decades ago. A commonly invoked hypothesis is that the Moon might once have possessed a thermally driven core dynamo, but this theory is problematical given the small size of the core and the required surface magnetic field strengths. An alternative hypothesis is that impact events might have amplified ambient fields near the antipodes of the largest basins, but many magnetic anomalies exist that are not associated with basin antipodes. Here we propose a new model for magnetic field generation, in which dynamo action comes from impact-induced changes in the Moon's rotation rate. Basin-forming impact events are energetic enough to have unlocked the Moon from synchronous rotation, and we demonstrate that the subsequent large-scale fluid flows in the core, excited by the tidal distortion of the core-mantle boundary, could have powered a lunar dynamo. Predicted surface magnetic field strengths are on the order of several microteslas, consistent with palaeomagnetic measurements, and the duration of these fields is sufficient to explain the central magnetic anomalies associated with several large impact basins.
NASA Astrophysics Data System (ADS)
Chang, Chao-Hsi; Wang, Jian-Xiong; Wu, Xing-Gang
2010-06-01
An upgraded (second) version of the package GENXICC (A Generator for Hadronic Production of the Double Heavy Baryons Ξcc, Ξbc and Ξbb by C.H. Chang, J.X. Wang and X.G. Wu [its first version in: Comput. Phys. Comm. 177 (2007) 467]) is presented. Users, with this version implemented in PYTHIA and a GNU C compiler, may conveniently simulate full events of these processes in various experimental environments. In comparison with the previous version, in order to implement it in PYTHIA properly, a subprogram for the fragmentation of the produced double heavy diquark into the relevant baryon is supplied, and the interface of the generator to PYTHIA is changed accordingly. In the subprogram, with explanation, certain necessary assumptions (approximations) are made in order to conserve the momenta and the QCD 'color' flow during the fragmentation.
Program summary
Program title: GENXICC2.0
Catalogue identifier: ADZJ_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZJ_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 102 482
No. of bytes in distributed program, including test data, etc.: 1 469 519
Distribution format: tar.gz
Programming language: Fortran 77/90
Computer: Any Linux-based PC with FORTRAN 77 or FORTRAN 90 and a GNU C compiler
Operating system: Linux
RAM: About 2.0 MByte
Classification: 11.2
Catalogue identifier of previous version: ADZJ_v1_0
Journal reference of previous version: Comput. Phys. Comm. 177 (2007) 467
Does the new version supersede the previous version?: No
Nature of problem: Hadronic production of the double heavy baryons Ξcc, Ξbc and Ξbb.
Solution method: The code is based on the NRQCD framework. With proper options, it can generate weighted and un-weighted events of hadronic double heavy baryon production. When the hadronizations of the produced jets and the double heavy diquark are taken into account, the upgraded version, with a proper interface to PYTHIA, can generate full events.
Reasons for new version: Responding to feedback from users, we improve the generator mainly by carefully completing the 'final non-perturbative process', i.e. the formation of the double heavy baryon from the relevant intermediate diquark. In the present version, the information about momentum flow and color flow that PYTHIA needs to generate full events is retained, although reasonable approximations are made. In comparison with the original version, the upgraded one can be implemented in PYTHIA properly to do full event simulation of double heavy baryon production.
Summary of revisions: We explain the treatment of the momentum distribution of the process more clearly than in the original version, and show precisely how the final baryon is generated through the typical intermediate diquark. We present the color flow of the involved processes precisely; the corresponding changes to the program are explained in the paper.
Restrictions: The color flow, particularly in the piece of code programming the fragmentation of the produced colorful double heavy diquark into the relevant double heavy baryon, is treated carefully so as to implement it in PYTHIA properly.
Running time: It depends on which option is chosen to configure PYTHIA when generating full events and also on which mechanism is chosen to generate the events. Typically, for the most complicated case of the gluon-gluon fusion mechanism generating mixed events via the intermediate diquark in the (cc)[3S1] and (cc)[1S0] states, generating 1000 events under the option IDWTUP=1 takes about 20 hours on a 1.8 GHz Intel P4 processor, whereas under the option IDWTUP=3, even generating 10^6 events takes about 40 minutes on the same machine.
ERIC Educational Resources Information Center
Mahon, Karen L.; Shores, Richard E.; Buske, Carla J.
1999-01-01
This article reviews experiments that do not demonstrate the existence of setting events and some that include appropriate procedures for investigating setting events. It considers setting events to be a fourth term of an operant, and suggests that precise measurement and control of the three-term contingency is necessary. (Author/CR)
Method for early detection of cooling-loss events
Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.
2015-06-30
A method of detecting cooling-loss events early is provided. The method includes defining a relative humidity limit and a change threshold for a given space, measuring relative humidity in the given space, determining, with a processing unit, whether the measured relative humidity is within the defined limit, generating a warning in the event that the measured relative humidity is outside the defined limit, determining whether a change in the measured relative humidity exceeds the defined change threshold for the given space, and generating an alarm in the event that the change is greater than the defined change threshold.
Method for early detection of cooling-loss events
Bermudez, Sergio A.; Hamann, Hendrik F.; Marianno, Fernando J.
2015-12-22
A method of detecting cooling-loss events early is provided. The method includes defining a relative humidity limit and a change threshold for a given space, measuring relative humidity in the given space, determining, with a processing unit, whether the measured relative humidity is within the defined limit, generating a warning in the event that the measured relative humidity is outside the defined limit, determining whether a change in the measured relative humidity exceeds the defined change threshold for the given space, and generating an alarm in the event that the change is greater than the defined change threshold.
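Both patent records above describe the same two-threshold scheme: a hard limit on relative humidity that raises a warning, and a rate-of-change threshold that raises an alarm. A minimal sketch of that logic follows; the threshold values, names and sampling assumptions are ours, not the patents'.

```python
# Illustrative sketch of the two-threshold cooling-loss check described
# above; the limits and names are assumptions, not values from the patents.
def check_cooling(rh_now: float, rh_prev: float,
                  rh_limit: tuple = (20.0, 80.0),
                  change_threshold: float = 5.0) -> list:
    """Return the list of notifications raised by one measurement."""
    notifications = []
    lo, hi = rh_limit
    if not (lo <= rh_now <= hi):                  # RH outside defined limit
        notifications.append("WARNING: relative humidity out of limits")
    if abs(rh_now - rh_prev) > change_threshold:  # rapid change in RH
        notifications.append("ALARM: rapid relative-humidity change")
    return notifications

print(check_cooling(rh_now=85.0, rh_prev=70.0))   # both conditions fire
```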
A Probabilistic Model of Global-Scale Seismology with Veith-Clawson Amplitude Corrections
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.
2013-12-01
We present a probabilistic generative model of global-scale seismology, NET-VISA, that is designed to address the event detection and location problem of seismic monitoring. The model is based on a standard Bayesian framework with prior probabilities for event generation and propagation as well as likelihoods of detection and arrival (or onset) parameters. The model is supplemented with a greedy search algorithm that iteratively improves the predicted bulletin with respect to the posterior probability. Our prior model incorporates both seismic theory and empirical observations as appropriate. For instance, we use empirical observations for the expected rates of earthquakes at each point on the Earth, while we use the Gutenberg-Richter law for the expected magnitude distribution of these earthquakes. In this work, we describe an extension of our model in which we include the Veith-Clawson (1972) amplitude decline curves in our empirically calibrated arrival amplitude model. While this change does not alter the overall event-detection results, we have chosen to keep the Veith-Clawson curves since they are more seismically accurate. We also describe a recent change to our search algorithm, whereby we now consider multiple hypotheses when we encounter a series of closely spaced arrivals that could be explained by either a single event or multiple co-located events. This change has led to a sharp improvement in our results on large aftershock sequences. We use the analyst-curated LEB bulletin or the REB bulletin, which is the published product of the IDC, as a reference, and measure the overlap (percentage of reference events that are matched) and inconsistency (percentage of test bulletin events that don't match anything in the reference) of a one-to-one matching between the test and reference bulletins. Comparing NET-VISA with SEL3, which is produced by the existing GA software, for the whole of 2009, the results show that NET-VISA, which is restricted to using arrivals with a 6 hour lag (in order to be comparable to SEL3), reduces the number of missed events by a factor of 2.5 while simultaneously reducing the rate of spurious events. Further, these "spurious" NET-VISA events in fact include many real events that are missed by the human analysts. When we compare the NET-VISA events with arrivals from at least 3 stations (to be comparable to LEB) against NEIC events (in the ISC catalog) over the continental United States, as well as NNC events over Central Asia, we find that NET-VISA identifies 1.5 to 2 times the number of events that the IDC analysts find. Most of these additional events are in the 2-4 mb or ML range. Our experiments also confirm that NET-VISA accurately located each of the recent nuclear explosions to within 5 km of the LEB location. For large aftershock sequences, NET-VISA has been shown to be very efficient as well as accurate. For example, on the Tohoku sequence (March 10-14, 2011), NET-VISA (running time 2.57 days) had an overlap of 82.7% with LEB and an inconsistency of 26.8%, versus SEL3's overlap of 71.9% and inconsistency of 40%.
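The scoring idea can be sketched compactly: an event hypothesis earns a log prior from an empirical spatial rate map and a Gutenberg-Richter magnitude term, plus log likelihoods for its associated arrivals. The Python below is a toy illustration under those assumptions; the density shapes and parameter names are ours, not NET-VISA's calibrated models.

```python
import numpy as np

# Toy sketch of generative Bayesian scoring: log prior (spatial rate and
# Gutenberg-Richter magnitude term) plus arrival-residual log likelihoods.
def log_score(event, arrivals, rate_map, b=1.0):
    lon, lat, mag = event["lon"], event["lat"], event["mag"]
    lp = np.log(rate_map(lon, lat))                       # empirical rate
    lp += np.log(b * np.log(10)) - b * np.log(10) * mag   # G-R prior density
    for arr in arrivals:                                  # Gaussian residuals
        resid = arr["time"] - arr["predicted_time"]
        lp += -0.5 * (resid / arr["sigma"]) ** 2 - np.log(arr["sigma"])
    return lp

rate = lambda lon, lat: 1e-3                              # uniform toy rate map
ev = {"lon": 130.0, "lat": 44.0, "mag": 4.0}
arrs = [{"time": 10.2, "predicted_time": 10.0, "sigma": 0.5}]
print(log_score(ev, arrs, rate))
```

In a greedy search of this kind, candidate events that raise the total log score are added to the bulletin and the process repeats until no improving hypothesis remains.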
Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs
NASA Astrophysics Data System (ADS)
Purba, J. H.
2018-02-01
Initiating events of a nuclear power plant under evaluation need to be identified before probabilistic safety assessment is applied to that plant. Various types of master logic diagrams (MLDs) have been proposed for searching for initiating events of the next generation of nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis for developing them. This study proposes another type of MLD approach to find high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps, starting from the top event, representing the final objective of the safety functions, down to the basic event, representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to the search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirm that the proposed MLD is feasible for finding HTGR initiating events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, Andy; Butterworth, Jonathan
We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard-scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond-Standard-Model processes. We describe the principal features of the Ariadne, Herwig++, Pythia 8 and Sherpa generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists wanting a deeper insight into the tools available for signal and background simulation at the LHC.
Magrabi, Farah; Liaw, Siaw Teng; Arachi, Diana; Runciman, William; Coiera, Enrico; Kidd, Michael R
2016-11-01
Objective: To identify the categories of problems with information technology (IT) which affect patient safety in general practice. Design: General practitioners (GPs) reported incidents online or by telephone between May 2012 and November 2013. Incidents were reviewed against an existing classification for problems associated with IT and the clinical process impacted. Participants: 87 GPs across Australia. Main outcome measures: Types of problems, consequences and clinical processes. Results: GPs reported 90 incidents involving IT which had an observable impact on the delivery of care, including actual patient harm as well as near miss events. Practice systems and medications were the most affected clinical processes. Problems with IT disrupted clinical workflow, wasted time and caused frustration. Issues with user interfaces, routine updates to software packages and drug databases, and the migration of records from one package to another generated clinical errors that were unique to IT; some could affect many patients at once. Human factors issues gave rise to some errors that have always existed with paper records but are more likely to occur and cause harm with IT. Such errors were linked to slips in concentration, multitasking, distractions and interruptions. Problems with patient identification and hybrid records generated errors that were in principle no different to paper records. Conclusions: Problems associated with IT include perennial risks with paper records, but also additional disruptions in workflow and hazards for patients unique to IT, occasionally affecting multiple patients. Surveillance for such hazards may have general utility, particularly in the context of migrating historical records to new systems and software updates to existing systems. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Acoustic emission during fatigue of porous-coated Ti-6Al-4V implant alloy.
Kohn, D H; Ducheyne, P; Awerbuch, J
1992-01-01
Acoustic emission (AE) events and event intensities (e.g., event amplitude, counts, duration, and energy counts) were recorded and analyzed during fatigue loading of uncoated and porous-coated Ti-6Al-4V. AE source location, spatial filtering, event, and event intensity distributions were used to detect, monitor, analyze, and predict failures. AE provides the ability to spatially and temporally locate multiple fatigue cracks, in real time. Fatigue of porous-coated Ti-6Al-4V is governed by a sequential, multimode fracture process of: transverse fracture in the porous coating; sphere/sphere and sphere/substrate debonding; substrate fatigue crack initiation; slow and rapid substrate fatigue crack propagation. Because of the porosity of the coating, the different stages of fracture within the coating occur in a discontinuous fashion. Therefore, the AE events generated are intermittent and the onset of each mode of fracture in the porous coating can be detected by increases in AE event rate. Changes in AE event rate also correspond to changes in crack extension rate, and may therefore be used to predict failure. AE offers two distinct advantages over conventional optical and microscopic methods of analyzing fatigue cracks--it is more sensitive and it can determine the time history of damage progression. The magnitude of the AE event intensities increased with increasing stress. Failure mechanisms are best differentiated by analyzing AE event amplitudes. Intergranular fracture and microvoid coalescence generated the highest AE event amplitudes (100 dB), whereas, plastic flow and friction generated the lowest AE event amplitudes (55-65 dB). Fractures in the porous coating were characterized by AE event amplitudes of less than 80 dB.
A model of human event detection in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1978-01-01
It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described, and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Providing the event detection model with the information displayed to the experimental subjects allows comparison of the model's performance with the performance of the subjects.
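The modeling idea, using discriminant analysis to map displayed process information to a probability that an event has occurred, can be sketched with modern tooling. The synthetic data and feature choices below are our assumptions, not the paper's experimental displays.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Sketch of the abstract's modelling idea: discriminant analysis maps
# displayed process features to P(event). Data here are synthetic stand-ins.
rng = np.random.default_rng(0)
n = 500
features = rng.normal(size=(n, 3))              # e.g., level, trend, variance
event = (features @ np.array([1.0, 2.0, 0.5])   # hidden rule plus noise
         + rng.normal(scale=1.0, size=n)) > 1.0

model = LinearDiscriminantAnalysis().fit(features, event)
p_event = model.predict_proba(features)[:, 1]   # P(event | displayed features)
print("mean P(event) when event present:", p_event[event].mean().round(2))
```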
Learning Orthographic Structure with Sequential Generative Neural Networks
ERIC Educational Resources Information Center
Testolin, Alberto; Stoianov, Ivilin; Sperduti, Alessandro; Zorzi, Marco
2016-01-01
Learning the structure of event sequences is a ubiquitous problem in cognition and particularly in language. One possible solution is to learn a probabilistic generative model of sequences that allows making predictions about upcoming events. Though appealing from a neurobiological standpoint, this approach is typically not pursued in…
Stormflow generation: a meta-analysis of field studies and research catchments
NASA Astrophysics Data System (ADS)
Barthold, Frauke; Elsenbeer, Helmut
2014-05-01
Runoff characteristics are expressions of runoff generation mechanisms. In this study, we test the hypothesis that storm hydrographs of catchments with prevailing near-surface flow paths are dominated by new water, using published data from the scientific literature. We developed a classification system based on three runoff characteristics: (1) hydrograph response (HR: slow vs. quick), (2) the temporal source of the water that dominates the hydrograph (TS: pre-event vs. event water) and (3) the flow paths that the water takes until it is released to the stream (FP: subsurface vs. surface). We then performed a literature survey to collect information on these runoff characteristics for small, forested headwater catchments that served as study areas in runoff generation studies, and assigned each study catchment to one of the 8 resulting classes. For this purpose, we designed a procedure to objectively diagnose the predominant conceptual model of stormflow generation in each catchment and assess its temporal and spatial relevance for the catchment. Finally, we performed an explorative analysis of the classified research catchments and summarized field evidence. Our literature survey yielded a sample of 22 research catchments that met our criteria (small, naturally forested catchments that served as study areas in stormflow generation studies), and we applied our classification procedure to all of them. For 14 of these catchments, the meta-analysis yielded a complete set of stormflow characteristics corresponding to one of the 8 model concepts, and these were assigned to our classification scheme. Of the 14 classified research catchments, 10 were dominated by subsurface flow paths while 4 were dominated by overland flow. The data also indicate that the spatial and temporal relevance is high for catchments with subsurface flow paths but often weak for catchments dominated by surface flow paths. The catalogue of catchments supports our hypothesis; however, it carries a relatively high degree of uncertainty. Two theories may explain the imbalance between surface- and subsurface-dominated catchments: (1) the selection of research sites for stormflow generation studies was guided by the leading research question in hydrology, i.e. addressing the "old water paradox", and (2) catchments with prevailing subsurface flow paths are much more common in nature. In a next step, the proposed catalogue of research catchments allows correlation of environmental characteristics with runoff characteristics to address questions of catchment organization and similarity. However, the successful application and relevance of such an approach depend on the range of conceptual models for which field support exists. Our results prompt us to highlight future research needs: (1) a careful selection of research sites is necessary in order to cover a broader range of combinations of runoff characteristics, and (2) guidelines for field studies are needed in order to achieve higher comparability of the resulting conceptual models of research sites and to increase the spatial and temporal relevance of the dominant conceptual model.
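The 8 classes arise as the Cartesian product of the three binary characteristics. A trivial Python enumeration, with labels and ordering chosen by us for illustration:

```python
from itertools import product

# The three binary runoff characteristics from the abstract (HR, TS, FP);
# the label wording is ours.
HR = ("slow", "quick")            # hydrograph response
TS = ("pre-event", "event")       # temporal source of stormflow water
FP = ("subsurface", "surface")    # dominant flow paths

classes = {i + 1: combo for i, combo in enumerate(product(HR, TS, FP))}
for k, (hr, ts, fp) in classes.items():
    print(f"class {k}: HR={hr}, TS={ts}, FP={fp}")
```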
Synthesizing Dynamic Programming Algorithms from Linear Temporal Logic Formulae
NASA Technical Reports Server (NTRS)
Rosu, Grigore; Havelund, Klaus
2001-01-01
The problem of testing a linear temporal logic (LTL) formula on a finite execution trace of events, generated by an executing program, occurs naturally in runtime analysis of software. We present an algorithm which takes an LTL formula and generates an efficient dynamic programming algorithm. The generated algorithm tests whether the LTL formula is satisfied by a finite trace of events given as input. The generated algorithm runs in linear time, with a constant that depends on the size of the LTL formula. The memory required is constant with respect to the trace length, again depending only on the size of the formula.
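The underlying dynamic programming scheme can be sketched directly: walk the trace right-to-left, and at each position compute the truth of every subformula from its values at the next position, keeping only two columns so memory is independent of trace length. The Python below is our own illustration of that idea under finite-trace semantics, not the authors' generated code.

```python
# Minimal DP evaluator for LTL over a finite trace. Formulas are nested
# tuples, e.g. ("until", ("atom", "p"), ("atom", "q")); each trace position
# is the set of atoms true there.
def evaluate(formula, trace):
    nxt = {}                                  # subformula values at i+1
    for i in range(len(trace) - 1, -1, -1):
        cur = {}
        def val(f):
            if f in cur:
                return cur[f]
            op = f[0]
            if op == "atom":
                r = f[1] in trace[i]
            elif op == "not":
                r = not val(f[1])
            elif op == "and":
                r = val(f[1]) and val(f[2])
            elif op == "next":                # false at the final position
                r = nxt.get(f[1], False) if i < len(trace) - 1 else False
            elif op == "until":               # phi U psi, finite-trace form
                r = val(f[2]) or (val(f[1]) and nxt.get(f, False))
            cur[f] = r
            return r
        def close(f):                         # evaluate every subformula
            val(f)
            for sub in f[1:]:
                if isinstance(sub, tuple):
                    close(sub)
        close(formula)
        nxt = cur                             # slide the two-column window
    return nxt[formula]

trace = [{"p"}, {"p"}, {"q"}]
print(evaluate(("until", ("atom", "p"), ("atom", "q")), trace))  # True
```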
The mission events graphic generator software: A small tool with big results
NASA Technical Reports Server (NTRS)
Lupisella, Mark; Leibee, Jack; Scaffidi, Charles
1993-01-01
Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer based software tool that implements straight-forward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.
NASA Astrophysics Data System (ADS)
Mladenović, Ana; Trivić, Branislav; Cvetković, Vladica
2015-04-01
In this study, we report evidence of coupling between tectonic and magmatic processes in a complex orogenic system. The study focuses on the Kopaonik Mts., situated between the Dinarides and the Carpatho-Balkanides (southern Serbia), a perfect area for investigating tectono-magmatic evolution. We combine a new data set of tectonic paleostress tensors with the existing information on Cenozoic magmatic rocks in the wider Kopaonik Mts. area. The paleostress study revealed the presence of four brittle deformational phases. The established link between fault mechanisms and igneous processes suggests that two large tectono-magmatic events occurred in this area. The Late Eocene-Early Miocene tectono-magmatic event was generally characterized by transpressional tectonics that provided conditions for basaltic underplating and subsequent lower crustal melting and generation of I-type magmas. Due to predominant compression in the first half of this event, these magmas could not reach the upper crustal levels. Later on, limited extensional pulses that occurred before the end of this event opened pathways for newly formed mantle melts to reach shallower crustal levels and mix with the evolving I-type magmas. The second event is Middle-Late Miocene in age. It was first associated with clearly extensional conditions that caused basaltic melts to advance to mid-crustal levels. This, in turn, induced the elevation of geotherms, melting of the shallow crust and S-type granite formation. This event terminated with transpression that produced small volumes of basaltic melts and finally closed the igneous scene in this part of the Balkan Peninsula. Although we agree that the growth of igneous bodies is usually internally controlled and can be independent of the ambient structural pattern, we have strong reasons to believe that integrating regional-scale observations of fault kinematics with crucial petrogenetic information can be used to establish spatial-temporal relationships between brittle tectonics and magmatism.
Haigh, Ivan D.; Wadey, Matthew P.; Wahl, Thomas; Ozsoy, Ozgun; Nicholls, Robert J.; Brown, Jennifer M.; Horsburgh, Kevin; Gouldby, Ben
2016-01-01
In this paper we analyse the spatial footprint and temporal clustering of extreme sea level and skew surge events around the UK coast over the last 100 years (1915–2014). The vast majority of the extreme sea level events are generated by moderate, rather than extreme skew surges, combined with spring astronomical high tides. We distinguish four broad categories of spatial footprints of events and the distinct storm tracks that generated them. There have been rare events when extreme levels have occurred along two unconnected coastal regions during the same storm. The events that occur in closest succession (<4 days) typically impact different stretches of coastline. The spring/neap tidal cycle prevents successive extreme sea level events from happening within 4–8 days. Finally, the 2013/14 season was highly unusual in the context of the last 100 years from an extreme sea level perspective. PMID:27922630
Dalyander, P. Soupy; Butman, Bradford
2015-01-01
This study investigates the relationship between spatial and temporal patterns of wave-driven sediment mobility events on the U.S. East Coast continental shelf and the characteristics of the storms responsible for them. Mobility events, defined as seafloor wave stress exceedance of the critical stress of 0.35 mm diameter sand (0.2160 Pa) for 12 or more hours, were identified from surface wave observations at National Data Buoy Center buoys in the Middle Atlantic Bight (MAB) and South Atlantic Bight (SAB) over the period of 1997-2007. In water depths ranging from 36-48 m, there were 4-9 mobility events/year of 1-2 days duration. Integrated wave stress during events (IWAVES) was used as a combined metric of wave-driven mobility intensity and duration. In the MAB, over 67% of IWAVES was caused by extratropical storms, while in the SAB, greater than 66% of IWAVES was caused by tropical storms. On average, mobility events were caused by waves generated by storms located 800+ km away. Far-field hurricanes generated swell 2-4 days before the waves caused mobility on the shelf. Throughout most of the SAB, mobility events were driven by storms to the south, east, and west. In the MAB and near Cape Hatteras, winds from more northerly storms and low-pressure extratropical systems in the mid-western U.S. also drove mobility events. Waves generated by storms off the SAB generated mobility events along the entire U.S. East Coast shelf north to Cape Cod, while Cape Hatteras shielded the SAB area from swell originating to the north offshore of the MAB.
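The event definition above translates directly into a run-length computation over a stress time series. A minimal Python sketch under assumed hourly sampling; the synthetic series and names are ours, with IWAVES computed as stress integrated over each qualifying event:

```python
import numpy as np

# Sketch of the mobility-event definition: seafloor wave stress exceeding
# the critical stress of 0.35 mm sand (0.216 Pa) for >= 12 h; IWAVES is
# stress integrated over each event. Hourly sampling is our assumption.
TAU_CRIT = 0.216          # Pa
MIN_HOURS = 12

def mobility_events(stress):                         # stress: hourly, Pa
    events, start = [], None
    for i, s in enumerate(np.append(stress, 0.0)):   # sentinel ends last run
        if s > TAU_CRIT and start is None:
            start = i
        elif s <= TAU_CRIT and start is not None:
            if i - start >= MIN_HOURS:
                iwaves = stress[start:i].sum() * 3600.0   # Pa*s, hourly bins
                events.append((start, i, iwaves))
            start = None
    return events

stress = np.zeros(72); stress[10:30] = 0.4           # one 20-hour exceedance
print(mobility_events(stress))                       # [(10, 30, 28800.0)]
```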
Event-driven contrastive divergence for spiking neuromorphic systems.
Neftci, Emre; Das, Srinjoy; Pedroni, Bruno; Kreutz-Delgado, Kenneth; Cauwenberghs, Gert
2013-01-01
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetics which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train a RBM constructed with Integrate & Fire (I&F) neurons, that is constrained by the limitations of existing and near future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The recurrent activity of the network replaces the discrete steps of the CD algorithm, while Spike Time Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.
Event-driven contrastive divergence for spiking neuromorphic systems
Neftci, Emre; Das, Srinjoy; Pedroni, Bruno; Kreutz-Delgado, Kenneth; Cauwenberghs, Gert
2014-01-01
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetics which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train a RBM constructed with Integrate & Fire (I&F) neurons, that is constrained by the limitations of existing and near future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The recurrent activity of the network replaces the discrete steps of the CD algorithm, while Spike Time Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality. PMID:24574952
Todorovic Balint, Milena; Jelicic, Jelena; Mihaljevic, Biljana; Kostic, Jelena; Stanic, Bojana; Balint, Bela; Pejanovic, Nadja; Lucic, Bojana; Tosic, Natasa; Marjanovic, Irena; Stojiljkovic, Maja; Karan-Djurasevic, Teodora; Perisic, Ognjen; Rakocevic, Goran; Popovic, Milos; Raicevic, Sava; Bila, Jelena; Antic, Darko; Andjelic, Bosko; Pavlovic, Sonja
2016-01-01
The existence of a potential primary central nervous system lymphoma-specific genomic signature that differs from the systemic form of diffuse large B cell lymphoma (DLBCL) has been suggested, but is still controversial. We investigated 19 patients with primary DLBCL of central nervous system (DLBCL CNS) using the TruSeq Amplicon Cancer Panel (TSACP) for 48 cancer-related genes. Next generation sequencing (NGS) analyses have revealed that over 80% of potentially protein-changing mutations were located in eight genes (CTNNB1, PIK3CA, PTEN, ATM, KRAS, PTPN11, TP53 and JAK3), pointing to the potential role of these genes in lymphomagenesis. TP53 was the only gene harboring mutations in all 19 patients. In addition, the presence of mutated TP53 and ATM genes correlated with a higher total number of mutations in other analyzed genes. Furthermore, the presence of mutated ATM correlated with poorer event-free survival (EFS) (p = 0.036). The presence of the mutated SMO gene correlated with earlier disease relapse (p = 0.023), inferior event-free survival (p = 0.011) and overall survival (OS) (p = 0.017), while mutations in the PTEN gene were associated with inferior OS (p = 0.048). Our findings suggest that the TP53 and ATM genes could be involved in the molecular pathophysiology of primary DLBCL CNS, whereas mutations in the PTEN and SMO genes could affect survival regardless of the initial treatment approach. PMID:27164089
Automated event generation for loop-induced processes
Hirschi, Valentin; Mattelaer, Olivier
2015-10-22
We present the first fully automated implementation of cross-section computation and event generation for loop-induced processes. This work is integrated in the MadGraph5_aMC@NLO framework. We describe the optimisations implemented at the level of the matrix element evaluation, phase space integration and event generation, allowing for the simulation of large multiplicity loop-induced processes. Along with some selected differential observables, we illustrate our results with a table showing inclusive cross-sections for all loop-induced hadronic scattering processes with up to three final states in the SM as well as for some relevant 2 → 4 processes. Furthermore, many of these are computed here for the first time.
Applications of computer-graphics animation for motion-perception research
NASA Technical Reports Server (NTRS)
Proffitt, D. R.; Kaiser, M. K.
1986-01-01
The advantages and limitations of using computer animated stimuli in studying motion perception are presented and discussed. Most current programs of motion perception research could not be pursued without the use of computer graphics animation. Computer generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, the differences are assumed to be irrelevant to the questions under study, and that findings with computer generated stimuli will generalize to natural events.
Searches for new quarks and leptons in Z boson decays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Kooten, R.J.
1990-06-01
Searches for the decay of Z bosons into pairs of new quarks and leptons in a data sample including 455 hadronic Z decays are presented. The Z bosons were produced in electron-positron annihilations at the SLAC Linear Collider operating in the center-of-mass energy range from 89.2 to 93.0 GeV. The Standard Model provides no prediction for fermion masses and does not exclude new generations of fermions. The existence and masses of these new particles may provide valuable information to help understand the pattern of fermion masses, and physics beyond the Standard Model. Specific searches for top quarks and sequential fourth-generation charge −1/3 (b′) quarks are made considering a variety of possible standard and non-standard decay modes. In addition, searches for sequential fourth-generation massive neutrinos ν₄ and their charged lepton partners L⁻ are pursued. The ν₄ may be stable or decay through mixing to the lighter generations. The data sample is examined for new-particle topologies of events with high-momentum isolated tracks, high-energy isolated photons, spherical event shapes, and detached vertices. No evidence is observed for the production of new quarks and leptons. 95% confidence lower mass limits of 40.7 GeV/c² for the top quark and 42.0 GeV/c² for the b′ quark are obtained regardless of the branching fractions to the considered decay modes. A significant range of mixing matrix elements of ν₄ to other-generation neutrinos for a ν₄ mass from 1 GeV/c² to 43 GeV/c² is excluded at 95% confidence level. Measurements of the upper limit of the invisible width of the Z exclude additional values of the ν₄ mass and mixing matrix elements, and also permit the exclusion of a region in the L⁻ mass versus ν₄ mass plane.
A probabilistic framework for single-station location of seismicity on Earth and Mars
NASA Astrophysics Data System (ADS)
Böse, M.; Clinton, J. F.; Ceylan, S.; Euchner, F.; van Driel, M.; Khan, A.; Giardini, D.; Lognonné, P.; Banerdt, W. B.
2017-01-01
Locating the source of seismic energy from a single three-component seismic station is associated with large uncertainties, originating from challenges in identifying seismic phases, as well as inevitable pick and model uncertainties. The challenge is even higher for planets such as Mars, where interior structure is a priori largely unknown. In this study, we address the single-station location problem by developing a probabilistic framework that combines location estimates from multiple algorithms to estimate the probability density function (PDF) for epicentral distance, back azimuth, and origin time. Each algorithm uses independent and complementary information in the seismic signals. Together, the algorithms allow locating seismicity ranging from local to teleseismic quakes. Distances and origin times of large regional and teleseismic events (M > 5.5) are estimated from observed and theoretical body- and multi-orbit surface-wave travel times. The latter are picked from the maxima in the waveform envelopes in various frequency bands. For smaller events at local and regional distances, only first arrival picks of body waves are used, possibly in combination with fundamental Rayleigh R1 waveform maxima where detectable; depth phases, such as pP or PmP, help constrain source depth and improve distance estimates. Back azimuth is determined from the polarization of the Rayleigh- and/or P-wave phases. When seismic signals are good enough for multiple approaches to be used, estimates from the various methods are combined through the product of their PDFs, resulting in an improved event location and reduced uncertainty range estimate compared to the results obtained from each algorithm independently. To verify our approach, we use both earthquake recordings from existing Earth stations and synthetic Martian seismograms. The Mars synthetics are generated with a full-waveform scheme (AxiSEM) using spherically-symmetric seismic velocity, density and attenuation models of Mars that incorporate existing knowledge of Mars internal structure, and include expected ambient and instrumental noise. While our probabilistic framework is developed mainly for application to Mars in the context of the upcoming InSight mission, it is also relevant for locating seismic events on Earth in regions with sparse instrumentation.
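The combination step, taking the product of PDFs from independent algorithms and renormalizing, can be sketched in a few lines. The Gaussian component PDFs below are illustrative assumptions, not the mission's calibrated distance models:

```python
import numpy as np

# Sketch of the abstract's combination step: independent algorithms each
# return a PDF over epicentral distance; the combined estimate is their
# normalized product. Shapes and parameters here are illustrative only.
dist = np.linspace(0.0, 180.0, 1801)          # epicentral distance, degrees

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

pdf_body = gaussian(dist, 62.0, 8.0)          # from body-wave travel times
pdf_surf = gaussian(dist, 58.0, 4.0)          # from multi-orbit surface waves

combined = pdf_body * pdf_surf                # product of independent PDFs
combined /= np.trapz(combined, dist)          # renormalize to unit area

best = dist[np.argmax(combined)]
print(f"combined distance estimate: {best:.1f} deg")   # near 58.8 deg
```

Because the product concentrates probability where the estimates agree, the combined PDF is narrower than either input, which is the uncertainty reduction the abstract describes.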
Modelling the enigmatic Late Pliocene Glacial Event - Marine Isotope Stage M2
Dolan, Aisling M.; Haywood, Alan M.; Hunter, Stephen J.; Tindall, Julia C.; Dowsett, Harry J.; Hill, Daniel J.; Pickering, Steven J.
2015-01-01
The Pliocene Epoch (5.2 to 2.58 Ma) has often been targeted to investigate the nature of warm climates. However, climate records for the Pliocene exhibit significant variability and show intervals that apparently experienced a cooler-than-modern climate. Marine Isotope Stage (MIS) M2 (~3.3 Ma) is a globally recognisable cooling event that disturbs an otherwise relatively warm (compared to present-day) background climate state. It remains unclear whether this event corresponds to significant ice sheet build-up in the Northern and Southern Hemispheres. Estimates of sea level for this interval vary, ranging from modern values to a 65 m sea level fall with respect to present day. Here we implement plausible M2 ice sheet configurations in a coupled atmosphere–ocean climate model to test the hypothesis that larger-than-modern ice sheet configurations may have existed at M2. Climate model results are compared with proxy climate data available for M2 to assess the plausibility of each ice sheet configuration. Whilst the outcomes of our data/model comparisons are not in all cases straightforward to interpret, there is little indication that results from model simulations in which significant ice masses have been prescribed in the Northern Hemisphere are incompatible with proxy data from the North Atlantic, Northeast Arctic Russia, North Africa and the Southern Ocean. Therefore, our model results do not preclude the possibility of larger ice masses during M2 in the Northern or Southern Hemisphere. Specifically, they cannot discount the possibility of significant ice masses in the Northern Hemisphere during the M2 event, consistent with a global sea-level fall of between 40 m and 60 m. This study highlights the general need for more focused and coordinated data generation in the future to improve the coverage and consistency of proxy records for M2, which will allow these and future M2 sensitivity tests to be interrogated further.
NASA Astrophysics Data System (ADS)
Masaoka, Naoya; Kosugi, Ken'ichirou; Yamakawa, Yosuke; Mizuyama, Takahisa; Tsutsumi, Daizo
2013-04-01
Heterogeneous hydrological properties in the foot slope areas of mountainous hillslopes should be assessed to understand hydrological phenomena and their effects on discharge and sediment transport. In this study, we analyzed high-resolution, three-dimensional water movement data to clarify the hydrological processes, including heterogeneous phenomena, in detail. We continuously monitored the soil matric pressure head, psi, using 111 tensiometers installed at grid intervals of approximately 1 meter within the soil mantle at the study hillslope. Under a no-rainfall condition, the existence of perennial groundwater seepage flow was detected from exfiltration flux and temporal psi waveforms, which showed delayed responses only to heavy storm events and gradual recession limbs. The seepage water spread in the downslope direction and supplied water constantly to the lower section of the slope. At some points in the center of the slope, a perched saturated area was detected in the middle of the soil layer, while psi exhibited negative values above the bedrock surface. These phenomena could be inferred partly from the bedrock topography and the distribution of soil hydraulic conductivity assumed from the results of penetration tests. At the peak of a rainfall event, on the other hand, continuous high-pressure zones (i.e., psi > 50 cmH2O) were generated in the right and left sections of the slope. Both of these high-pressure zones converged in the lower region, showing a sharp psi spike of up to 100 cmH2O. Along the high-pressure zones, flux vectors showed large values and water exfiltration, indicating the occurrence of preferential flow. Moreover, the preferential flow occurred within the area beneath the perched water, indicating the existence of a weathered bedrock layer. This layer had low permeability, which prevented the vertical infiltration of water in its upper part, but also high permeability as a result of the fractures distributed heterogeneously inside it. These fractures acted as preferential flow channels and flushed the water derived from lateral flow accumulated from the upslope area during the rainfall event. These phenomena occurring at the peak of the rainfall event could not be inferred from the parameters derived from the penetration tests.
NASA Technical Reports Server (NTRS)
Steinman, Jeffrey S. (Inventor)
1998-01-01
The present invention is embodied in a method of performing object-oriented simulation and a system having interconnected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.
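The division of labor the patent describes, passive objects that only hold state, and active event objects that carry time stamps and are the only things allowed to query or change that state, can be sketched as a single-process discrete event loop. This is our illustrative reading; the patent's parallel message passing between processor nodes is omitted, and all names are ours.

```python
import heapq

# Illustrative sketch: passive simulation objects hold state only; active
# event objects carry time stamps and behavior, and may generate new events.
class SimulationObject:                     # passive: state only
    def __init__(self, name):
        self.name, self.state = name, 0

class EventObject:                          # active: behavior plus time stamp
    def __init__(self, time_stamp, target, delta):
        self.time_stamp, self.target, self.delta = time_stamp, target, delta
    def __lt__(self, other):                # heap ordering by time stamp
        return self.time_stamp < other.time_stamp
    def execute(self, schedule):
        self.target.state += self.delta     # change variables in the target
        if self.target.state < 3:           # produce a new event-defining message
            schedule(EventObject(self.time_stamp + 1.0, self.target, self.delta))

queue = []
schedule = lambda ev: heapq.heappush(queue, ev)
node_obj = SimulationObject("node-0")
schedule(EventObject(0.0, node_obj, 1))

while queue:                                # process events in time order
    ev = heapq.heappop(queue)
    ev.execute(schedule)
print(node_obj.name, node_obj.state)        # node-0 3
```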
BBN: Description of the PLUM System as Used for MUC-4
1992-01-01
in the MUC-4 corpus’. Here are the 8 parse fragments generated by FPP for the first sentence of TST2-MUC4-0048: ("SALVADORAN PRESIDENT-ELECT ALFREDO… extensive patterns for fragment combination. Figure 2 shows a graphical version of the semantics generated for the first fragment of S1 in TST2-MUC4… trigger. Following is the discourse event structure for the first event in TST2-MUC4-0048: Event MURDER; Trigger fragments: "SALVADORAN PRESIDENT
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, such as medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains that generate iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
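The event-based comparison idea can be illustrated with a deliberately simplified sketch (our own, not the authors' algorithms): treat events as contiguous regions where the signal leaves a quiet band, then compare two series by how many events they share at similar positions.

```python
import numpy as np

# Toy event extraction and comparison; thresholds and the synthetic
# signals are our assumptions, not the paper's EEG/stabilometric data.
def find_events(series, k=2.0):
    """Return (start, end) index pairs where |z-score| exceeds k."""
    z = (series - series.mean()) / series.std()
    active = np.abs(z) > k
    edges = np.flatnonzero(np.diff(active.astype(int)))
    bounds = np.r_[0, edges + 1, active.size]
    return [(a, b) for a, b in zip(bounds[:-1], bounds[1:]) if active[a]]

def shared_events(ev_a, ev_b, tol=10):
    """Count events in ev_a whose start matches one in ev_b within tol."""
    return sum(any(abs(a[0] - b[0]) <= tol for b in ev_b) for a in ev_a)

rng = np.random.default_rng(1)
x = rng.normal(size=300); x[100:120] += 5.0   # one injected event
y = rng.normal(size=300); y[105:125] += 5.0
print(shared_events(find_events(x), find_events(y)))   # expect 1
```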
The Evaluation of a Temporal Reasoning System in Processing Clinical Discharge Summaries
Zhou, Li; Parsons, Simon; Hripcsak, George
2008-01-01
Context TimeText is a temporal reasoning system designed to represent, extract, and reason about temporal information in clinical text. Objective To measure the accuracy of the TimeText for processing clinical discharge summaries. Design Six physicians with biomedical informatics training served as domain experts. Twenty discharge summaries were randomly selected for the evaluation. For each of the first 14 reports, 5 to 8 clinically important medical events were chosen. The temporal reasoning system generated temporal relations about the endpoints (start or finish) of pairs of medical events. Two experts (subjects) manually generated temporal relations for these medical events. The system and expert-generated results were assessed by four other experts (raters). All of the twenty discharge summaries were used to assess the system’s accuracy in answering time-oriented clinical questions. For each report, five to ten clinically plausible temporal questions about events were generated. Two experts generated answers to the questions to serve as the gold standard. We wrote queries to retrieve answers from system’s output. Measurements Correctness of generated temporal relations, recall of clinically important relations, and accuracy in answering temporal questions. Results The raters determined that 97% of subjects’ 295 generated temporal relations were correct and that 96.5% of the system’s 995 generated temporal relations were correct. The system captured 79% of 307 temporal relations determined to be clinically important by the subjects and raters. The system answered 84% of the temporal questions correctly. Conclusion The system encoded the majority of information identified by experts, and was able to answer simple temporal questions. PMID:17947618
The natural mathematics of behavior analysis.
Li, Don; Hautus, Michael J; Elliffe, Douglas
2018-04-19
Models that generate event records have very general scope regarding the dimensions of the target behavior that we measure. From a set of predicted event records, we can generate predictions for any dependent variable that we could compute from the event records of our subjects. In this sense, models that generate event records permit us a freely multivariate analysis. To explore this proposition, we conducted a multivariate examination of Catania's Operant Reserve on single VI schedules in transition using a Markov Chain Monte Carlo scheme for Approximate Bayesian Computation. Although we found systematic deviations between our implementation of Catania's Operant Reserve and our observed data (e.g., mismatches in the shape of the interresponse time distributions), the general approach that we have demonstrated represents an avenue for modelling behavior that transcends the typical constraints of algebraic models. © 2018 Society for the Experimental Analysis of Behavior.
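The fitting strategy can be illustrated with a minimal rejection-ABC loop (the study itself used an MCMC scheme for ABC): simulate event records from parameter draws, reduce them to summary statistics, and keep draws whose summaries fall close to the observed ones. The exponential interresponse-time model below is our stand-in, not Catania's Operant Reserve.

```python
import numpy as np

# Minimal rejection-ABC sketch for a model that generates event records.
rng = np.random.default_rng(0)

def simulate_event_record(rate, n=200):
    """Generate response times as cumulative exponential IRTs."""
    return np.cumsum(rng.exponential(1.0 / rate, size=n))

def summary(times):
    irts = np.diff(times)
    return np.array([irts.mean(), irts.std()])

observed = simulate_event_record(rate=2.0)     # pretend these are the data
s_obs = summary(observed)

accepted = []
for _ in range(5000):
    theta = rng.uniform(0.1, 5.0)              # draw from a flat prior
    s_sim = summary(simulate_event_record(theta))
    if np.linalg.norm(s_sim - s_obs) < 0.05:   # distance tolerance
        accepted.append(theta)

print(f"posterior mean rate ~ {np.mean(accepted):.2f} (true 2.0)")
```

Because the simulated event records are complete, any dependent variable computable from a subject's records (IRT distributions, run lengths, and so on) can be predicted from the same accepted draws, which is the "freely multivariate" point the abstract makes.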
NASA Astrophysics Data System (ADS)
Wicaksono, A. D.
2017-06-01
Over the last few years, Indonesia has experienced important events that have brought significant changes to social, political and economic life. These changes directly or indirectly impact the field of planning. Given increasingly fast-growing and more complex conditions ahead, and the greater demands on the role of planning, planning is required to be of higher quality. This paper seeks to answer the following questions: (i) how have the planning paradigm and the planning model developed in the current transition era? (ii) what is the best way to improve the quality of planning control in the last-generation planning model so as to realize sustainable cities? The analysis steps used to achieve these objectives are: (i) a review of planning and sustainable cities theory, (ii) pattern recognition, (iii) identification of control mechanisms and sustainable urban forms, and (iv) conceptualization. Based on the discussion of sustainable cities and control mechanisms, the following conclusions can be drawn: (i) the third-generation planning model is based on the theory of expanded systems, emphasizing the constraints of capacity and the ability of planners within the context of the larger environment; (ii) various theoretical studies recommend prescriptive models or solutions for sustainable urban form and structure. The concepts of sustainable cities can be grouped into Neotraditional Development, Urban Containment, Compact City and The Eco-City. These four models share the following criteria: (i) high density; (ii) a high level of diversity; (iii) mixed land use; (iv) compactness; (v) sustainable transport; (vi) passive solar design; (vii) greening ecological design. The three main activities in control mechanisms are: monitoring and recommendation, a comparative review of the facts (conditions that exist or are developing) against the purposes (expected conditions, set out in urban planning), together with recommendations; evaluation, a review of the intended purposes that can be followed up with revised purposes; and intervention, actions toward existing conditions.
Draelos, Timothy J.; Ballard, Sanford; Young, Christopher J.; ...
2015-10-01
Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with the highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. In addition, once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified.
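To make the association logic above concrete, here is a minimal sketch, not the authors' implementation: a brute-force scan over grid nodes and candidate origin times, a Gaussian station-wise fitness sum, and greedy removal of associated arrivals. All names (grid_nodes, travel_time) and parameters are illustrative assumptions.

```python
# Greedy grid-search association of seismic arrivals, sketched under
# assumed Gaussian fitness terms and a fixed association window.
import numpy as np

def associate(grid_nodes, arrivals, travel_time, sigma=2.0, min_fitness=3.0):
    """Greedy event association. arrivals: list of (station, time) pairs."""
    arrivals = list(arrivals)
    events = []
    while arrivals:
        best = (None, None, -np.inf)                 # (node, origin, fitness)
        for node in grid_nodes:
            for st0, t_arr in arrivals:              # candidate origin times
                origin = t_arr - travel_time(node, st0)
                # sum of station-wise Gaussian agreement terms
                fitness = sum(
                    np.exp(-0.5 * ((t - origin - travel_time(node, st)) / sigma) ** 2)
                    for st, t in arrivals)
                if fitness > best[2]:
                    best = (node, origin, fitness)
        node, origin, fitness = best
        if fitness < min_fitness:                    # nothing credible remains
            break
        # associate every arrival consistent with the accepted event
        assoc = [(st, t) for st, t in arrivals
                 if abs(t - origin - travel_time(node, st)) < 3 * sigma]
        events.append((node, origin, assoc))
        arrivals = [a for a in arrivals if a not in assoc]
    return events
```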
Bayesian Inference for Signal-Based Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.
2015-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods, including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and, in particular, the ability to locate events precisely from waveform correlation effects, including aftershock sequences that can tax analyst processing. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http
Code of Federal Regulations, 2014 CFR
2014-07-01
... to minimize emissions from startup, shutdown, and malfunction events. 270.235 Section 270.235... from startup, shutdown, and malfunction events. (a) Facilities with existing permits—(1) Revisions to... from startup, shutdown, and malfunction events under any of the following options when requesting...
Code of Federal Regulations, 2012 CFR
2012-07-01
... to minimize emissions from startup, shutdown, and malfunction events. 270.235 Section 270.235... from startup, shutdown, and malfunction events. (a) Facilities with existing permits—(1) Revisions to... from startup, shutdown, and malfunction events under any of the following options when requesting...
Code of Federal Regulations, 2011 CFR
2011-07-01
... to minimize emissions from startup, shutdown, and malfunction events. 270.235 Section 270.235... from startup, shutdown, and malfunction events. (a) Facilities with existing permits—(1) Revisions to... from startup, shutdown, and malfunction events under any of the following options when requesting...
Code of Federal Regulations, 2013 CFR
2013-07-01
... to minimize emissions from startup, shutdown, and malfunction events. 270.235 Section 270.235... from startup, shutdown, and malfunction events. (a) Facilities with existing permits—(1) Revisions to... from startup, shutdown, and malfunction events under any of the following options when requesting...
Non-symbolic halving in an Amazonian indigene group
McCrink, Koleen; Spelke, Elizabeth S.; Dehaene, Stanislas; Pica, Pierre
2014-01-01
Much research supports the existence of an Approximate Number System (ANS) that is recruited by infants, children, adults, and non-human animals to generate coarse, non-symbolic representations of number. This system supports simple arithmetic operations such as addition, subtraction, and ordering of amounts. The current study tests whether an intuition of a more complex calculation, division, exists in an indigene group in the Amazon, the Mundurucu, whose language includes no words for large numbers. Mundurucu children were presented with a video event depicting a division transformation of halving, in which pairs of objects turned into single objects, reducing the array's numerical magnitude. Then they were tested on their ability to calculate the outcome of this division transformation with other large-number arrays. The Mundurucu children effected this transformation even when non-numerical variables were controlled, performed above chance levels on the very first set of test trials, and exhibited performance similar to urban children who had access to precise number words and a surrounding symbolic culture. We conclude that a halving calculation is part of the suite of intuitive operations supported by the ANS. PMID:23587042
Lisiecki, R S; Voigt, H F
1995-08-01
A 2-channel action-potential generator system was designed for use in testing neurophysiologic data acquisition/analysis systems. The system consists of a personal computer controlling an external hardware unit. This system is capable of generating 2 channels of simulated action potential (AP) waveshapes. The AP waveforms are generated from the linear combination of 2 principal-component template functions. Each channel generates randomly occurring APs with a specified rate ranging from 1 to 200 events per second. The 2 trains may be independent of one another or the second channel may be made to be excited or inhibited by the events from the first channel with user-specified probabilities. A third internal channel may be made to excite or inhibit events in both of the 2 output channels with user-specified rate parameters and probabilities. The system produces voltage waveforms that may be used to test neurophysiologic data acquisition systems for recording from 2 spike trains simultaneously and for testing multispike-train analysis (e.g., cross-correlation) software.
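The generation scheme lends itself to a compact software analogue. The following is a minimal sketch, assuming Poisson event statistics and a 2 ms exponential excitation latency, of how channel-1 events can drive channel 2 with a user-specified probability; it is not the original instrument's firmware, and all rates and probabilities are illustrative.

```python
# Two simulated spike trains: channel 1 fires as a Poisson process and each
# of its events adds a channel-2 spike with probability p_excite.
import numpy as np

rng = np.random.default_rng(0)

def poisson_train(rate_hz, duration_s):
    """Event times (s) of a homogeneous Poisson process."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_hz)
        if t >= duration_s:
            return np.array(times)
        times.append(t)

duration = 10.0
ch1 = poisson_train(50.0, duration)                  # driver channel
ch2 = poisson_train(20.0, duration)                  # independent background

p_excite, mean_latency = 0.3, 0.002                  # probability, seconds (assumed)
mask = rng.random(ch1.size) < p_excite
driven = ch1[mask] + rng.exponential(mean_latency, size=mask.sum())
ch2 = np.sort(np.concatenate([ch2, driven[driven < duration]]))
```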
Applications of flood depth from rapid post-event footprint generation
NASA Astrophysics Data System (ADS)
Booth, Naomi; Millinship, Ian
2015-04-01
Immediately following large flood events, an indication of the area flooded (i.e. the flood footprint) can be extremely useful for evaluating potential impacts on exposed property and infrastructure. Specifically, such information can help insurance companies estimate overall potential losses, deploy claims adjusters and ultimately assists the timely payment of due compensation to the public. Developing these datasets from remotely sensed products seems like an obvious choice. However, there are a number of important drawbacks which limit their utility in the context of flood risk studies. For example, external agencies have no control over the region that is surveyed, the time at which it is surveyed (which is important as the maximum extent would ideally be captured), and how freely accessible the outputs are. Moreover, the spatial resolution of these datasets can be low, and considerable uncertainties in the flood extents exist where dry surfaces give similar return signals to water. Most importantly of all, flood depths are required to estimate potential damages, but generally cannot be estimated from satellite imagery alone. In response to these problems, we have developed an alternative methodology for developing high-resolution footprints of maximum flood extent which do contain depth information. For a particular event, once reports of heavy rainfall are received, we begin monitoring real-time flow data and extracting peak values across affected areas. Next, using statistical extreme value analyses of historic flow records at the same measured locations, the return periods of the maximum event flow at each gauged location are estimated. These return periods are then interpolated along each river and matched to JBA's high-resolution hazard maps, which already exist for a series of design return periods. The extent and depth of flooding associated with the event flow is extracted from the hazard maps to create a flood footprint. Georeferenced ground, aerial and satellite images are used to establish defence integrity, highlight breach locations and validate our footprint. We have implemented this method to create seven flood footprints, including river flooding in central Europe and coastal flooding associated with Storm Xaver in the UK (both in 2013). The inclusion of depth information allows damages to be simulated and compared to actual damage and resultant loss which become available after the event. In this way, we can evaluate depth-damage functions used in catastrophe models and reduce their associated uncertainty. In further studies, the depth data could be used at an individual property level to calibrate property type specific depth-damage functions.
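The return-period step of this workflow can be illustrated with a short sketch: fit an extreme value distribution to historic annual-maximum flows at a gauge, then convert the observed event peak into a return period. The synthetic flows, and the specific choice of a GEV fit, are assumptions for illustration rather than the authors' exact statistical procedure.

```python
# Estimate the return period of an event peak from historic annual maxima.
import numpy as np
from scipy.stats import genextreme

annual_maxima = np.array([120., 95., 140., 210., 160., 130., 180.,
                          150., 125., 170., 200., 110.])   # m³/s, synthetic
shape, loc, scale = genextreme.fit(annual_maxima)

event_peak = 230.0                                   # observed peak flow (assumed)
p_exceed = genextreme.sf(event_peak, shape, loc=loc, scale=scale)
return_period = 1.0 / p_exceed                       # years, for annual-maximum series
print(f"return period ≈ {return_period:.0f} years")
```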
Walter, Fabian; Amundson, Jason M.; O'Neel, Shad; Truffer, Martin; Fahnestock, Mark; Fricker, Helen A.
2012-01-01
We investigated seismic signals generated during a large-scale, multiple iceberg calving event that occurred at Jakobshavn Isbræ, Greenland, on 21 August 2009. The event was recorded by a high-rate time-lapse camera and five broadband seismic stations located within a few hundred kilometers of the terminus. During the event two full-glacier-thickness icebergs calved from the grounded (or nearly grounded) terminus and immediately capsized; the second iceberg to calve was two to three times smaller than the first. The individual calving and capsize events were well-correlated with the radiation of low-frequency seismic signals (<0.1 Hz) dominated by Love and Rayleigh waves. In agreement with regional records from previously published ‘glacial earthquakes’, these low-frequency seismic signals had maximum power and/or signal-to-noise ratios in the 0.05–0.1 Hz band. Similarly, full waveform inversions indicate that these signals were also generated by horizontal single forces acting at the glacier terminus. The signals therefore appear to be local manifestations of glacial earthquakes, although the magnitudes of the signals (twice-time integrated force histories) were considerably smaller than previously reported glacial earthquakes. We thus speculate that such earthquakes may be a common, if not pervasive, feature of all full-glacier-thickness calving events from grounded termini. Finally, a key result from our study is that waveform inversions performed on low-frequency, calving-generated seismic signals may have only limited ability to quantitatively estimate mass losses from calving. In particular, the choice of source time function has little impact on the inversion but dramatically changes the earthquake magnitude. Accordingly, in our analysis, it is unclear whether the smaller or larger of the two calving icebergs generated a larger seismic signal.
NASA Astrophysics Data System (ADS)
Chang, Chao-Hsi; Wang, Jian-Xiong; Wu, Xing-Gang
2006-02-01
The generator BCVEGPY is upgraded by improving some of its features and by adding the hadroproduction of the P-wave excited B_c states (denoted by B*_{cJ,L=1}, i.e. h_Bc and χ_Bc). In order to make the generator more efficient, special effort has been made to cast the amplitude in as compact a form as possible. The correctness of the program is tested by various checks. We denote it as BCVEGPY2.0. As for the added part of the P-wave production, only the dominant gluon-gluon fusion mechanism (gg → B*_{cJ,L=1} + c̄ + b) is taken into account. Moreover, in the program not only the contributions from the color-singlet components to the P-wave production but also the contributions from the color-octet components can be computed. With BCVEGPY2.0 the contributions from the two 'color components' to the production of each of the P-wave states may be computed separately by an option; furthermore, event samples of the S-wave and P-wave (cb̄) heavy quarkonium can be generated, individually or in various correct (realistic) mixtures, by the relevant options. Program summary: Title of program: BCVEGPY. Version: 2.0 (December 2004). Catalogue identifier: ADWQ. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWQ. Program obtained from: CPC Program Library, Queen's University of Belfast, N. Ireland. Reference to original program: ADTJ (BCVEGPY1.0). Reference in CPC: Comput. Phys. Comm. 159 (2004) 192. Does the new version supersede the old program: yes. Computer: any computer with a FORTRAN 77 or 90 compiler; the program has been tested on an HP-SC45 Sigma-X parallel computer, Linux PCs and Windows PCs with Visual Fortran. Operating systems: UNIX, Linux and Windows. Programming language used: FORTRAN 77/90. Memory required to execute with typical data: about 2.0 MB. No. of lines in distributed program, including test data, etc.: 124 297. No. of bytes in distributed program, including test data, etc.: 1 137 177. Distribution format: tar.gz. Nature of physical problem: hadronic production of the B_c meson itself and its excited states. Method of solution: the code, with options, can generate weighted and unweighted events; for jet hadronization, an interface to PYTHIA is provided. Reason for the new version: there are two reasons. One is to provide additional code for the hadronic production of the P-wave excited B_c states: the four produced via color-singlet P-wave states directly and the two via color-octet S-wave states accordingly. The other is to decompose the color-flow factor for the amplitude in an approximate way, as adopted in PYTHIA.
Summary of Revisions: (1) The integration efficiency over the momentum fractions x₁ and x₂ of the initial partons is improved; (2) the amplitudes for the hadronic production of the color-singlet components corresponding to the four P-wave states B*_{cJ,L=1}, i.e. ¹P₁ and ³P_J (J = 0, 1, 2), are included; (3) the amplitudes for P-wave production via the two color-octet components |(cb̄)(¹S₀)g⟩ and |(cb̄)(³S₁)g⟩ are included; (4) for comparison, the S-wave (¹S₀ and ³S₁) hadronic production via the light quark-antiquark annihilation mechanism is also included; (5) for convenience, 24 data files recording the information of the generated events in one run are added; (6) an additional file, parameter.for, is added to set the initial values of the parameters; (7) two new parameters, 'IMIX' (IMIX = 0 or 1) and 'IMIXTYPE' (IMIXTYPE = 1, 2 or 3), are added to meet the needs of generating 'mixed' or 'separate' event samples for the various B_c states and their excited states correctly; (8) one switch, 'IVEGGRADE', is added to determine whether to use an existing importance-sampling function to generate a more precise importance-sampling function; (9) two parameters, 'IOUTPDF' and 'IPDFNUM', are added to determine which type of PDFs to use; (10) the color-flow decomposition for the amplitudes is rewritten in an approximate way, as adopted in PYTHIA. Restrictions on the complexity of the problem: the hadronic production of (cb̄) quarkonium in S-wave and P-wave states via the gluon-gluon fusion mechanism is given by the 'complete calculation' approach at leading order in QCD; the comparatively small contributions from the other mechanisms for P-wave production are not included. Typical running time: generally speaking, it depends on which option is used to drive PYTHIA when generating the B_c events. Typically, for the hadronic production of the S-wave (cb̄) quarkonium, if the PYTHIA parameter IDWTUP = 1, it takes about 20 hours on a 1.8 GHz Intel P4 machine to generate 1000 events; if IDWTUP = 3, it takes only about 40 minutes to generate 10⁶ events. For the hadronic production of the P-wave (cb̄) quarkonium, the necessary time is almost twice that for S-wave quarkonium production.
From empirical data to time-inhomogeneous continuous Markov processes.
Lencastre, Pedro; Raischel, Frank; Rogers, Tim; Lind, Pedro G
2016-03-01
We present an approach for testing for the existence of continuous generators of discrete stochastic transition matrices. Typically, existing methods to ascertain the existence of continuous Markov processes are based on the assumption that only time-homogeneous generators exist. Here a systematic extension to time inhomogeneity is presented, based on new mathematical propositions incorporating necessary and sufficient conditions, which are then implemented computationally and applied to numerical data. A discussion concerning the bridge between rigorous mathematical results on the existence of generators and their computational implementation is presented. Our detection algorithm proves effective for more than 60% of tested matrices, typically 80% to 90%, and for those an estimate of the (nonhomogeneous) generator matrix follows. We also solve the embedding problem analytically for the particular case of three-dimensional circulant matrices. Finally, a discussion of possible applications of our framework to problems in different fields is briefly addressed.
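For orientation, the classic time-homogeneous version of this test, which the paper generalizes, can be sketched in a few lines: take the principal matrix logarithm of the empirical transition matrix and check the generator conditions (non-negative off-diagonal entries, zero row sums). The example matrix and tolerances below are illustrative assumptions.

```python
# Time-homogeneous embedding check: is P = expm(Q) for a valid generator Q?
import numpy as np
from scipy.linalg import logm

def homogeneous_generator(P, tol=1e-8):
    """Return Q with expm(Q) = P if Q is a valid generator, else None."""
    Q = np.real(logm(P))                       # principal matrix logarithm
    rows_sum_to_zero = np.allclose(Q.sum(axis=1), 0.0, atol=1e-6)
    off_diag = Q - np.diag(np.diag(Q))
    if rows_sum_to_zero and (off_diag >= -tol).all():
        return Q
    return None

P = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.90, 0.05],
              [0.02, 0.08, 0.90]])
Q = homogeneous_generator(P)
print("embeddable" if Q is not None else "no homogeneous generator")
```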
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
...The Nuclear Regulatory Commission (NRC) is amending its regulations to provide alternate fracture toughness requirements for protection against pressurized thermal shock (PTS) events for pressurized water reactor (PWR) pressure vessels. This final rule provides alternate PTS requirements based on updated analysis methods. This action is desirable because the existing requirements are based on unnecessarily conservative probabilistic fracture mechanics analyses. This action reduces regulatory burden for those PWR licensees who expect to exceed the existing requirements before the expiration of their licenses, while maintaining adequate safety, and may choose to comply with the final rule as an alternative to complying with the existing requirements.
Analysing home-ownership of couples: the effect of selecting couples at the time of the survey.
Mulder, C H
1996-09-01
"The analysis of events encountered by couple and family households may suffer from sample selection bias when data are restricted to couples existing at the moment of interview. The paper discusses the effect of sample selection bias on event history analyses of buying a home [in the Netherlands] by comparing analyses performed on a sample of existing couples with analyses of a more complete sample including past as well as current partner relationships. The results show that, although home-buying in relationships that have ended differs clearly from behaviour in existing relationships, sample selection bias is not alarmingly large." (SUMMARY IN FRE) excerpt
NASA Astrophysics Data System (ADS)
Tarbotton, C.; Walters, R. A.; Goff, J. R.; Dominey-Howes, D.; Turner, I. L.
2012-12-01
As communities become increasingly aware of the risks posed by tsunamis, it is important to develop methods for predicting the damage they can cause to the built environment. This will provide the information needed to make informed decisions regarding land use, building codes, and evacuation. At present, a number of tsunami-building vulnerability assessment models are available; however, the relative infrequency and destructive nature of tsunamis has long made it difficult to obtain the data necessary to adequately validate and compare them. Further complicating matters is that the inundation of a tsunami in the built environment is very difficult to model, as is the response of a building to the hydraulic forces that a tsunami generates. Variations in building design and condition will significantly affect a building's susceptibility to damage. Likewise, factors affecting the flow conditions at a building (i.e. surrounding structures and topography) will greatly affect its exposure. This presents significant challenges for practitioners, as they are often left in the dark on how to use hazard modeling and vulnerability assessment techniques together to conduct the community-scale impact studies required for tsunami planning. This paper presents the results of an in-depth case study of Yuriage, Miyagi Prefecture - a coastal city in Japan that was badly damaged by the 2011 Tohoku tsunami. The aim of the study was twofold: 1) to test and compare existing tsunami vulnerability assessment models and 2) to more effectively utilize hydrodynamic models in the context of tsunami impact studies. Following the 2011 Tohoku event, an unprecedented quantity of field data, imagery and video emerged. Yuriage, in particular, features a comprehensive set of street-level Google Street View imagery, available both before and after the event. This has enabled the collection of a large dataset describing the characteristics of the buildings existing before the event as well as the damage that they subsequently sustained. These data, together with the detailed results from hydrodynamic models, have been used to provide the building, damage and hazard data necessary to rigorously test and compare existing vulnerability assessment techniques. The result is a much-improved understanding of the capabilities of existing vulnerability assessment techniques, as well as important improvements to their assessment framework. This provides much needed guidance to practitioners on how to conduct tsunami impact assessments in the future. Furthermore, the study introduces some new methods of integrating hydrodynamic models into vulnerability assessment models, offering guidance on how to more effectively model tsunami inundation in the built environment.
Construction of regulatory networks using expression time-series data of a genotyped population.
Yeung, Ka Yee; Dombek, Kenneth M; Lo, Kenneth; Mittler, John E; Zhu, Jun; Schadt, Eric E; Bumgarner, Roger E; Raftery, Adrian E
2011-11-29
The inference of regulatory and biochemical networks from large-scale genomics data is a basic problem in molecular biology. The goal is to generate testable hypotheses of gene-to-gene influences and subsequently to design bench experiments to confirm these network predictions. Coexpression of genes in large-scale gene-expression data implies coregulation and potential gene-gene interactions, but provides little information about the direction of influences. Here, we use both time-series data and genetics data to infer directionality of edges in regulatory networks: time-series data contain information about the chronological order of regulatory events and genetics data allow us to map DNA variations to variations at the RNA level. We generate microarray data measuring time-dependent gene-expression levels in 95 genotyped yeast segregants subjected to a drug perturbation. We develop a Bayesian model averaging regression algorithm that incorporates external information from diverse data types to infer regulatory networks from the time-series and genetics data. Our algorithm is capable of generating feedback loops. We show that our inferred network recovers existing and novel regulatory relationships. Following network construction, we generate independent microarray data on selected deletion mutants to prospectively test network predictions. We demonstrate the potential of our network to discover de novo transcription-factor binding sites. Applying our construction method to previously published data demonstrates that our method is competitive with leading network construction algorithms in the literature.
Carr, Karen D.; Norman, John C.; Huye, Leslie; Hegde, Meenakshi
2015-01-01
Abstract Compensation is a critical process for the unbiased analysis of flow cytometry data. Numerous compensation strategies exist, including the use of bead-based products. The purpose of this study was to determine whether beads, specifically polystyrene microspheres (PSMS), are comparable to primary leukocytes for single-color-based compensation when conducting polychromatic flow cytometry. To do so, we stained individual tubes of both PSMS and leukocytes with panel-specific antibodies conjugated to fluorochromes corresponding to fluorescent channels FL1-FL10. We compared the matrix generated by PSMS to that generated using peripheral blood mononuclear cells (PBMC). Ideal for compensation is a sample with both a discrete negative population and a bright positive population. We demonstrate that PSMS display autofluorescence properties similar to PBMC. When comparing PSMS to PBMC for compensation, PSMS yielded more evenly distributed and discrete negative and positive populations to use for compensation. We analyzed three donors' PBMC stained with our 10-color T cell subpopulation panel using compensation generated by PSMS vs. PBMC and detected no significant differences in the population distribution. Panel-specific antibodies bound to PSMS represent an invaluable and valid tool for generating suitable compensation matrices, especially when sample material is limited and/or the sample requires analysis of dynamically modulated or rare events. © 2015 The Authors. Cytometry Part A Published by Wiley Periodicals, Inc. PMID:26202733
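As a generic illustration of the compensation step that such single-color controls feed, independent of the specific PSMS product evaluated here, a spillover matrix estimated from the controls can be inverted and applied to multicolor events. The 3×3 matrix and event values below are assumptions for illustration.

```python
# Applying a compensation (spillover) matrix estimated from single-color controls.
import numpy as np

# spillover matrix S: row i is the normalized signal pattern that
# single-color control i produces across the three detectors (assumed values)
S = np.array([[1.00, 0.12, 0.03],
              [0.08, 1.00, 0.10],
              [0.02, 0.15, 1.00]])

# raw detector readouts for two multicolor events (assumed values)
raw_events = np.array([[1500.0, 400.0, 120.0],
                       [ 300.0, 900.0, 250.0]])

# observed = true @ S, so compensation applies the inverse of S
compensated = raw_events @ np.linalg.inv(S)
print(compensated)
```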
Gallo, David A.; Korthauer, Laura E.; McDonough, Ian M.; Teshale, Salom; Johnson, Elizabeth L.
2013-01-01
This study investigated whether the age-related positivity effect strengthens specific event details in autobiographical memory. Participants retrieved past events or imagined future events in response to neutral or emotional cue words. Older adults rated each kind of event more positively than younger adults, demonstrating an age-related positivity effect. We next administered a source memory test. Participants were given the same cue words and tried to retrieve the previously generated event and its source (past or future). Accuracy on this source test should depend on the recollection of specific details about the earlier generated events, providing a more objective measure of those details than subjective ratings. We found that source accuracy was greater for positive than negative future events in both age groups, suggesting that positive future events were more detailed. In contrast, valence did not affect source accuracy for past events in either age group, suggesting that positive and negative past events were equally detailed. Although aging can bias people to focus on positive aspects of experience, this bias does not appear to strengthen the availability of details for positive relative to negative past events. PMID:21919591
Compact conscious animal positron emission tomography scanner
Schyler, David J.; O'Connor, Paul; Woody, Craig; Junnarkar, Sachin Shrirang; Radeka, Veljko; Vaska, Paul; Pratte, Jean-Francois; Volkow, Nora
2006-10-24
A method of serially transferring annihilation information in a compact positron emission tomography (PET) scanner includes generating a time signal for an event, generating an address signal representing a detecting channel, generating a detector channel signal including the time and address signals, and generating a composite signal including the channel signal and similarly generated signals. The composite signal includes events from detectors in a block and is serially output. An apparatus that serially transfers annihilation information from a block includes time signal generators for detectors in a block and an address and channel signal generator. The PET scanner includes a ring tomograph that mounts onto a portion of an animal, which includes opposing block pairs. Each of the blocks in a block pair includes a scintillator layer, detection array, front-end array, and a serial encoder. The serial encoder includes time signal generators and an address signal and channel signal generator.
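The serial-encoding idea in this design can be sketched abstractly: pack each event's time signal and detector-channel address into one word, then concatenate the words from a block into a composite frame. The field widths below are illustrative assumptions, not the patented circuit's actual format.

```python
# Packing event time and channel address into serial words for one block.
TIME_BITS, ADDR_BITS = 10, 6        # assumed widths; 6 address bits -> 64 channels

def channel_word(timestamp: int, address: int) -> int:
    """Combine an event's time signal and address signal into one word."""
    assert 0 <= timestamp < (1 << TIME_BITS)
    assert 0 <= address < (1 << ADDR_BITS)
    return (timestamp << ADDR_BITS) | address

def composite_frame(events) -> bytes:
    """Serialize (timestamp, address) events from one detector block, time-ordered."""
    words = [channel_word(t, a) for t, a in sorted(events)]
    return b"".join(w.to_bytes(2, "big") for w in words)

frame = composite_frame([(512, 3), (127, 41)])   # two events, serially output
```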
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madlazim, E-mail: m-lazim@physics.its.ac.id; Hariyono, E., E-mail: m-lazim@physics.its.ac.id
The purpose of the study was to estimate P-wave rupture durations (T_dur), dominant periods (T_d) and the exceedance parameter (T_50Ex) simultaneously for local events: shallow earthquakes that occurred off the coast of Indonesia. Although all of the earthquakes had magnitudes greater than 6.3 and depths of less than 70 km, some of them generated a tsunami while other events (Mw = 7.8) did not. Analysis of the above parameters using Joko Tingkir helped in understanding the tsunami generation of these earthquakes. Measurements from vertical-component broadband P-wave velocity records and determination of the above parameters can provide a direct procedure for rapidly assessing the potential for tsunami generation. The results of the present study and the analysis of the seismic parameters helped explain why some events generated a tsunami while the others did not.
NASA Astrophysics Data System (ADS)
Wang, Hui; di Gate, Russell J.; Seeman, Nadrian C.
1996-09-01
A synthetic strand of RNA has been designed so that it can adopt two different topological states (a circle and a trefoil knot) when ligated into a cyclic molecule. The RNA knot and circle have been characterized by their behavior in gel electrophoresis and sedimentation experiments. This system allows one to assay for the existence of an RNA topoisomerase, because the two RNA molecules can be interconverted only by a strand passage event. We find that the interconversion of these two species can be catalyzed by Escherichia coli DNA topoisomerase III, indicating that this enzyme can act as an RNA topoisomerase. The conversion of circles to knots is accompanied by a small amount of RNA catenane generation. These findings suggest that strand passage must be considered a potential component of the folding and modification of RNA structures.
Rare behavior of growth processes via umbrella sampling of trajectories
NASA Astrophysics Data System (ADS)
Klymko, Katherine; Geissler, Phillip L.; Garrahan, Juan P.; Whitelam, Stephen
2018-03-01
We compute probability distributions of trajectory observables for reversible and irreversible growth processes. These results reveal a correspondence between reversible and irreversible processes, at particular points in parameter space, in terms of their typical and atypical trajectories. Thus key features of growth processes can be insensitive to the precise form of the rate constants used to generate them, recalling the insensitivity to microscopic details of certain equilibrium behavior. We obtained these results using a sampling method, inspired by the "s-ensemble" large-deviation formalism, that amounts to umbrella sampling in trajectory space. The method is a simple variant of existing approaches, and applies to ensembles of trajectories controlled by the total number of events. It can be used to determine large-deviation rate functions for trajectory observables in or out of equilibrium.
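A toy version of event-count biasing in trajectory space can be written down for a plain Poisson process, where exponential tilting by e^(-sK) simply rescales the rate and every trajectory can be reweighted exactly. This is an illustration of the s-ensemble idea under stated assumptions, not the paper's growth models or its umbrella-sampling windows.

```python
# Exponentially tilted sampling of Poisson trajectories, reweighted exactly
# to recover tail probabilities of the event count K.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
lam, T, s = 1.0, 10.0, -0.5        # tilt e^{-sK}; s < 0 favors many events
lam_tilt = lam * np.exp(-s)        # for a Poisson process the tilt rescales the rate

n_traj = 100_000
K = rng.poisson(lam_tilt * T, n_traj)                        # events per trajectory
log_w = K * np.log(lam / lam_tilt) + (lam_tilt - lam) * T    # exact importance weights

k_star = 25                        # a rare event count under the true rate lam
p_tail = np.mean(np.exp(log_w) * (K >= k_star))
exact = poisson.sf(k_star - 1, lam * T)                      # analytic cross-check
print(f"estimated {p_tail:.3e} vs exact {exact:.3e}")
```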
Mu, Zhendong; Yin, Jinhai; Hu, Jianfeng
2018-01-01
In this paper, a person authentication system is presented that can effectively identify individuals by generating unique electroencephalogram signal features in response to self-face and non-self-face photos. In order to achieve good stability performance, the sequence of the self-face photo, including first-occurrence and non-first-occurrence positions, is taken into account in the serial presentation of visual stimuli. In addition, a Fisher linear classification method and an event-related potential technique for feature analysis are adopted, yielding remarkably better outcomes than most existing methods in the field. The results show that EEG-based person authentication via a brain-computer interface can be considered a suitable approach for biometric authentication systems.
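The classification step can be sketched with an off-the-shelf Fisher linear discriminant, assuming ERP feature vectors (for example, mean amplitudes per channel and time window) for self-face versus non-self-face epochs; the random features below merely stand in for real data.

```python
# Fisher linear discriminant separating ERP features of two stimulus classes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_epochs, n_features = 80, 24          # assumed epoch count and feature dimension
X_self = rng.normal(0.5, 1.0, (n_epochs, n_features))    # self-face ERPs (synthetic)
X_other = rng.normal(0.0, 1.0, (n_epochs, n_features))   # non-self-face ERPs

X = np.vstack([X_self, X_other])
y = np.array([1] * n_epochs + [0] * n_epochs)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print("training accuracy:", lda.score(X, y))
```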
Onraedt, Annelies
2013-09-01
Phacilitate's 1st Partnering event for Vaccine Emerging Markets brought together approximately 100 attendees from developed- and developing-world vaccine manufacturers, leading non-profit organizations and industry suppliers. The goal was to discuss vaccine needs in the developing world and how these needs can be met by leveraging collaboration and partnership models, by improving access to existing, new and next-generation vaccines, by using novel technologies to drive the competitive advantage and economics of vaccine manufacturing, and by investing in localized capacity, including capacity for pandemic vaccines. The present article summarizes insights from 30 oral contributions on how quality and capacity requirements can be balanced with cost by using novel manufacturing technologies and operating models.
Improving quality of science through better animal welfare: the NC3Rs strategy.
Prescott, Mark J; Lidster, Katie
2017-03-22
Good animal welfare is linked to the quality of research data derived from laboratory animals, their validity as models of human disease, the number of animals required to reach statistical significance and the reproducibility of in vivo studies. Identifying new ways of understanding and improving animal welfare, and promoting these in the scientific community, is therefore a key part of the work of the National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs). Our strategy for animal welfare includes funding research to generate an evidence base to support refinements, office-led data sharing to challenge existing practices, events and networks to raise awareness of the evidence base, and the creation of online and other resources to support practical implementation of refinement opportunities.
Triggering of the Largest Deccan Eruptions by the Chicxulub Impact
NASA Astrophysics Data System (ADS)
Richards, M. A.; Alvarez, W.; Self, S.; Karlstrom, L.; Renne, P. R.; Manga, M.; Sprain, C. J.; Smit, J.; Vanderkluysen, L.; Gibson, S. A.
2015-12-01
Modern constraints on the timing of the Cretaceous-Paleogene (K-Pg) mass extinction and the Chicxulub impact, together with a particularly voluminous and apparently brief eruptive pulse toward the end of the "main-stage" eruptions of the Deccan continental flood basalt province, suggest that these three events may have occurred within less than about a hundred thousand years of each other. Partial melting induced by the Chicxulub event does not provide an energetically plausible explanation for this remarkable coincidence, and both geochronologic and magnetic-polarity data show that Deccan volcanism was underway well before Chicxulub/K-Pg time. However, historical data show that in some cases eruptions from existing volcanic systems are triggered by earthquakes. Seismic modeling of the ground motion due to the Chicxulub impact suggests that the resulting Mw~11 earthquake could have generated seismic energy densities of at least 0.1-1.0 J/m³ throughout the upper ~200 km of the Earth's mantle, sufficient to trigger volcanic eruptions worldwide based upon comparison with historical examples. Triggering may have been caused by a transient increase in the effective permeability of the existing deep magmatic system beneath the Deccan province, or mantle plume "head." We suggest that the Chicxulub impact triggered the enormous Poladpur, Ambenali, and Mahabaleshwar (Wai sub-group) lava flows that may account for >70% of the Deccan Traps main-stage eruptions. This hypothesis is consistent with independent stratigraphic, geochronologic, geochemical, and tectonic constraints, which combine to indicate that at approximately Chicxulub/K-Pg time a huge pulse of mantle plume-derived magma passed through the crust with little interaction, and erupted to form the most extensive and voluminous lava flows known on Earth. This impact-induced pulse of volcanism may have enhanced the K-Pg extinction event, and/or suppressed post-extinction biotic recovery. High-precision radioisotopic dating of the main-phase Deccan lavas promises a direct test of this hypothesis.
Considering context: reliable entity networks through contextual relationship extraction
NASA Astrophysics Data System (ADS)
David, Peter; Hawes, Timothy; Hansen, Nichole; Nolan, James J.
2016-05-01
Existing information extraction techniques can only partially address the problem of exploiting unreadably large amounts of text. When discussion of events and relationships is limited to simple, past-tense, factual descriptions of events, current NLP-based systems can identify events and relationships and extract a limited amount of additional information. But the simple subset of available information that existing tools can extract from text is only useful to a small set of users and problems. Automated systems need to find and separate information based on what is threatened or planned to occur, has occurred in the past, or could potentially occur. We address the problem of advanced event and relationship extraction with our event and relationship attribute recognition system, which labels generic, planned, recurring, and potential events. The approach is based on a combination of new machine learning methods, novel linguistic features, and crowd-sourced labeling. The attribute labeler closes the gap between structured event and relationship models and the complicated and nuanced language that people use to describe them. Our operational-quality event and relationship attribute labeler enables Warfighters and analysts to more thoroughly exploit information in unstructured text. This is made possible through 1) more precise event and relationship interpretation, 2) more detailed information about extracted events and relationships, and 3) more reliable and informative entity networks that acknowledge the different attributes of entity-entity relationships.
Nonlinear scaling of the Unit Hydrograph Peaking Factor for dam safety
NASA Astrophysics Data System (ADS)
Pradhan, N. R.; Loney, D.
2017-12-01
Existing U.S. Army Corps of Engineers (USACE) policy suggests that the unit hydrograph peaking factor (UHPF), the ratio of an observed to a modeled event unit hydrograph peak, should range between 1.25 and 1.50 to ensure dam safety. It is pertinent to investigate the impact of extreme flood events on the validity of this range through physically based rainfall-runoff models not available during the planning and design of most USACE dams. The UHPF range was analyzed by deploying the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model in the Goose Creek, VA, watershed to develop a UHPF relationship with excess rainfall across various return-period events. An effective rainfall factor (ERF) is introduced to validate existing UHPF guidance as well as to provide a nonlinear UHPF scaling relation when effective rainfall does not match that of the UH design event.
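A minimal sketch of the UHPF ratio defined above, with illustrative numbers rather than Goose Creek results:

```python
# UHPF: peak of the observed-event unit hydrograph over the modeled peak.
import numpy as np

observed_uh = np.array([0., 12., 48., 95., 70., 40., 18., 5.])   # assumed ordinates
modeled_uh  = np.array([0., 10., 45., 72., 60., 35., 15., 4.])

uhpf = observed_uh.max() / modeled_uh.max()       # 95 / 72 ≈ 1.32
within_policy = 1.25 <= uhpf <= 1.50              # the USACE guidance range
print(uhpf, within_policy)
```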
Mesoscale Convective Complexes (MCCs) over the Indonesian Maritime Continent during the ENSO events
NASA Astrophysics Data System (ADS)
Trismidianto; Satyawardhana, H.
2018-05-01
This study analyzed mesoscale convective complexes (MCCs) over the Indonesian Maritime Continent (IMC) during El Niño/Southern Oscillation (ENSO) events for the 15-year period from 2001 to 2015. The MCCs were identified from infrared satellite imagery obtained from the Himawari generation of satellite data. The frequency of MCC occurrences during El Niño and La Niña conditions was higher than during neutral conditions in DJF. The peak of MCC occurrences during DJF falls in February under La Niña and neutral conditions, and in January under El Niño. ENSO strongly affects the occurrence of MCCs during the DJF season. MCC occurrences were also accompanied by increased rainfall intensity at the locations of the MCCs for all ENSO phases. During JJA seasons, MCC occurrences are found in the Indian Ocean under neutral, El Niño and La Niña conditions alike. MCCs occurring during the JJA season under El Niño and neutral conditions lasted, on average, much longer than during the DJF season. In contrast, MCCs occurring under La Niña conditions during the JJA season dissipate more rapidly than during DJF. This indicates that the influence of MCCs during La Niña is stronger in the DJF season than in the JJA season.
Event-based total suspended sediment particle size distribution model
NASA Astrophysics Data System (ADS)
Thompson, Jennifer; Sattar, Ahmed M. A.; Gharabaghi, Bahram; Warner, Richard C.
2016-05-01
One of the most challenging modelling tasks in hydrology is prediction of the total suspended sediment particle size distribution (TSS-PSD) in stormwater runoff generated from exposed soil surfaces at active construction sites and surface mining operations. The main objective of this study is to employ gene expression programming (GEP) and artificial neural networks (ANN) to develop a new model with the ability to more accurately predict the TSS-PSD by taking advantage of both event-specific and site-specific factors in the model. To compile the data for this study, laboratory scale experiments using rainfall simulators were conducted on fourteen different soils to obtain TSS-PSD. This data is supplemented with field data from three construction sites in Ontario over a period of two years to capture the effect of transport and deposition within the site. The combined data sets provide a wide range of key overlooked site-specific and storm event-specific factors. Both parent soil and TSS-PSD in runoff are quantified by fitting each to a lognormal distribution. Compared to existing regression models, the developed model more accurately predicted the TSS-PSD using a more comprehensive list of key model input parameters. Employment of the new model will increase the efficiency of deployment of required best management practices, designed based on TSS-PSD, to minimize potential adverse effects of construction site runoff on aquatic life in the receiving watercourses.
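The lognormal quantification step described above can be sketched directly, assuming a vector of measured particle diameters; the values below are synthetic, not the study's laboratory or field data.

```python
# Summarize a particle size distribution by a two-parameter lognormal fit.
import numpy as np
from scipy.stats import lognorm

diameters_um = np.array([2.1, 3.5, 5.0, 7.8, 12.0, 18.5, 26.0,
                         40.0, 4.4, 9.1, 15.2, 22.3])   # synthetic diameters, µm

# fix loc=0 so the fit is a pure two-parameter lognormal
shape, loc, scale = lognorm.fit(diameters_um, floc=0)

geometric_mean = scale          # exp(mu) of the underlying normal
geometric_sd = np.exp(shape)    # exp(sigma)
print(geometric_mean, geometric_sd)
```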
The 22 March 2014 Oso landslide, Washington, USA
NASA Astrophysics Data System (ADS)
Wartman, Joseph; Montgomery, David R.; Anderson, Scott A.; Keaton, Jeffrey R.; Benoît, Jean; dela Chapelle, John; Gilbert, Robert
2016-01-01
The Oso, Washington, USA, landslide occurred on the morning of Saturday, 22 March 2014 and claimed the lives of 43 people. The landslide began within an approximately 200-m-high hillslope composed of unconsolidated glacial and previous landslide/colluvial deposits; it continued as a debris avalanche/debris flow that rapidly inundated a neighborhood of 35 single-family residences. An intense three-week rainfall that immediately preceded the event most likely played a role in triggering the landslide; other factors that likely contributed to destabilization of the landslide mass include alteration of the local groundwater recharge and hydrogeological regime from previous landsliding, weakening and alteration of the landslide mass caused by previous landsliding, and changes in stress distribution resulting from removal and deposition of material from earlier landsliding. Field reconnaissance following the event revealed six distinctive zones and several subzones that are characterized on the basis of geomorphic expression, styles of deformation, geologic materials, and the types, size, and orientation of vegetation. Seismic recordings of the landslide indicate that the event was marked by several vibration-generating episodes of mass movement. We hypothesize that the landslide occurred in two stages, the first being a sequential remobilization of existing slide masses from the most recent (2006) landslide and from an ancient slide, which triggered a devastating debris avalanche/debris flow. The second stage involved headward extension into previously unfailed material in response to unloading and redirection of stresses.
High event rate ROICs (HEROICs) for astronomical UV photon counting detectors
NASA Astrophysics Data System (ADS)
Harwit, Alex; France, Kevin; Argabright, Vic; Franka, Steve; Freymiller, Ed; Ebbets, Dennis
2014-07-01
The next generation of astronomical photocathode / microchannel plate based UV photon counting detectors will overcome existing count rate limitations by replacing the anode arrays and external cabled electronics with anode arrays integrated into imaging Read Out Integrated Circuits (ROICs). We have fabricated a High Event Rate ROIC (HEROIC) consisting of a 32 by 32 array of 55 μm square pixels on a 60 μm pitch. The pixel sensitivity (threshold) has been designed to be globally programmable between 1 × 10³ and 1 × 10⁶ electrons. To achieve the sensitivity of 1 × 10³ electrons, parasitic capacitances had to be minimized, which was achieved by fabricating the ROIC in a 65 nm CMOS process. The ROIC has been designed to support pixel counts up to 4096 events per integration period at rates up to 1 MHz per pixel. Integration time periods can be controlled via an external signal with a time resolution of less than 1 microsecond, enabling temporally resolved imaging and spectroscopy of astronomical sources. An electrical injection port is provided to verify functionality and performance of each ROIC prior to vacuum integration with a photocathode and microchannel plate amplifier. Test results on the first ROICs using the electrical injection port demonstrate that sensitivities between 3 × 10³ and 4 × 10⁵ electrons are achieved. A number of fixes have been identified for a re-spin of this ROIC.
Helioviewer.org: An Open-source Tool for Visualizing Solar Data
NASA Astrophysics Data System (ADS)
Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.
2009-05-01
As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating this data quickly with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as feature/event catalog data from eight different catalogs, including active region, flare, coronal mass ejection, and type II radio burst data. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally-available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Future functionality will include: support for additional data sources including TRACE, SDO and STEREO, dynamic movie generation, a navigable timeline of recorded solar events, social annotation, and basic client-side image processing.
ERIC Educational Resources Information Center
Corning, Amy D.
2010-01-01
Research on memory of public events consistently reveals generational effects, where individuals remember best the events from their "critical years" of adolescence and early adulthood--a phenomenon attributed to privileged encoding or retrieval of memories due to primacy of experience. Prior research, however, has not decoupled the…
The Age Parameters of the Starting Demographic Events across Russian Generations
ERIC Educational Resources Information Center
Mitrofanova, E. S.
2016-01-01
This article presents comparisons of the ages and facts of starting demographic events in Russia based on the findings of three large-scale surveys: the European Social Survey, 2006; the Generations and Gender Survey, 2004, 2007, and 2011; and Person, Family, Society, 2013. This study focuses on the intergenerational and gender differences in the…
Contraceptive Hormone Use and Cardiovascular Disease
Shufelt, Chrisandra L.; Noel Bairey Merz, C.
2009-01-01
Contraceptive hormones, most commonly prescribed as oral contraceptives (OC), are a widely utilized method to prevent ovulation, implantation and therefore pregnancy. The Women’s Health Initiative demonstrated cardiovascular risk linked to menopausal hormone therapy among women without pre-existing cardiovascular disease, prompting review of the safety, efficacy and side effects of other forms of hormone therapy. A variety of basic science, animal and human data suggest that contraceptive hormones have anti-atheromatous effects, however relatively less is known regarding the impact on atherosclerosis, thrombosis, vasomotion and arrhythmogenesis. Newer generation OC formulations currently in use indicate no increased myocardial infarction (MI) risk for current users, but a persistent increased risk of venous thrombo-embolism (VTE). There are no cardiovascular data available for the newest generation contraceptive hormone formulations, including those that contain newer progestins that lower blood pressure, as well as the non-oral routes (topical and vaginal). Current guidelines indicate that, as with all medication, contraceptive hormones should be selected and initiated by weighing risks and benefits for the individual patient. Women 35 years and older should be assessed for cardiovascular risk factors including hypertension, smoking, diabetes, nephropathy and other vascular diseases including migraines, prior to use. Existing data are mixed with regard to possible protection from OC for atherosclerosis and cardiovascular events; longer-term cardiovascular follow-up of menopausal women with regard to prior OC use, including subgroup information regarding adequacy of ovulatory cycling, the presence of hyperandrogenic conditions, and the presence of prothrombotic genetic disorders is needed to address this important issue. PMID:19147038
Noise suppression in surface microseismic data
Forghani-Arani, Farnoush; Batzle, Mike; Behura, Jyoti; Willis, Mark; Haines, Seth S.; Davidson, Michael
2012-01-01
We introduce a passive noise suppression technique, based on the τ − p transform. In the τ − p domain, one can separate microseismic events from surface noise based on distinct characteristics that are not visible in the time-offset domain. By applying the inverse τ − p transform to the separated microseismic event, we suppress the surface noise in the data. Our technique significantly improves the signal-to-noise ratios of the microseismic events and is superior to existing techniques for passive noise suppression in the sense that it preserves the waveform.
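A compact sketch of the slant-stack (τ − p) separation idea follows, under simplifying assumptions (linear interpolation, an un-normalized adjoint for the inverse, and a user-chosen mute mask); it is not the authors' implementation.

```python
# Forward and adjoint slant stack for separating events by slowness.
import numpy as np

def tau_p(data, dt, offsets, slownesses):
    """Forward slant stack: sum each trace along lines t = tau + p*x."""
    nt, _ = data.shape
    out = np.zeros((nt, slownesses.size))
    t = np.arange(nt) * dt
    for j, p in enumerate(slownesses):
        for k, x in enumerate(offsets):
            shifted = np.interp(t + p * x, t, data[:, k], left=0.0, right=0.0)
            out[:, j] += shifted
    return out

def inv_tau_p(model, dt, offsets, slownesses):
    """Adjoint slant stack back to the time-offset domain (un-normalized)."""
    nt, _ = model.shape
    out = np.zeros((nt, offsets.size))
    t = np.arange(nt) * dt
    for k, x in enumerate(offsets):
        for j, p in enumerate(slownesses):
            shifted = np.interp(t - p * x, t, model[:, j], left=0.0, right=0.0)
            out[:, k] += shifted
    return out

# usage idea: mute the high-slowness region dominated by slow surface waves,
# then reconstruct, e.g.
# filtered = inv_tau_p(tau_p(d, dt, x, p) * mask, dt, x, p)
```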
NASA Astrophysics Data System (ADS)
Benjamin, J.; Rosser, N. J.; Dunning, S.; Hardy, R. J.; Karim, K.; Szczucinski, W.; Norman, E. C.; Strzelecki, M.; Drewniak, M.
2014-12-01
Risk assessments of the threat posed by rock avalanches rely upon numerical modelling of potential run-out and spreading, and are contingent upon a thorough understanding of the flow dynamics inferred from deposits left by previous events. Few records exist of multiple rock avalanches with boundary conditions sufficiently consistent to develop a set of more generalised rules for behaviour across events. A cluster of 20 large (3 × 10⁶ – 94 × 10⁶ m³) rock avalanche deposits along the Vaigat Strait, West Greenland, offers a unique opportunity to model a large sample of adjacent events sourced from a stretch of coastal mountains of relatively uniform geology and structure. Our simulations of these events were performed using VolcFlow, a geophysical mass flow code developed to simulate volcanic debris avalanches. Rheological calibration of the model was performed using a well-constrained event at Paatuut (AD 2000). The best-fit simulation assumes a constant retarding stress with a collisional stress coefficient (T0 = 250 kPa, ξ = 0.01), and simulates run-out to within ±0.3% of that observed. Despite being widely used to simulate rock avalanche propagation, other models, which assume either a Coulomb frictional or a Voellmy rheology, failed to reproduce the observed event characteristics and deposit distribution at Paatuut. We applied this calibration to 19 other events, simulating rock avalanche motion across 3D terrain of varying levels of complexity. Our findings illustrate the utility and sensitivity of modelling a single rock avalanche satisfactorily as a function of rheology, alongside the validity of applying the same parameters elsewhere, even within similar boundary conditions. VolcFlow can plausibly account for the observed morphology of a series of deposits emplaced by events of different types, although its performance is sensitive to a range of topographic and geometric factors. These exercises show encouraging results in the model's ability to simulate a series of events using a single set of parameters obtained by back-analysis of the Paatuut event alone. The results also hold important implications for our process understanding of rock avalanches in confined fjord settings, where correctly modelling material flux at the point of entry into the water is critical to tsunami generation.
Generating political will for safe motherhood in Indonesia.
Shiffman, Jeremy
2003-03-01
In 1987 an international conference brought global attention to an issue that previously had been ignored: the world's alarmingly high number of maternal deaths in childbirth. The conference ended with a declaration calling for a reduction in maternal mortality by at least half by the year 2000. As the deadline approached, safe motherhood activists lamented the fact that the world was nowhere near to achieving this objective. They attributed this failure to a variety of causes, but were in agreement that the medical technology was available to prevent maternal deaths in childbirth, and the key was generating the political will to make such technology widely available to women in developing countries. What 'political will' means, however, has been left as an unopened black box. What causes governments to give priority to the issue of safe motherhood, given that national political systems are burdened with thousands of issues to sort through each year? In marked contrast to our extensive knowledge about the medical interventions necessary to prevent maternal death, we know little about the political interventions necessary to increase the likelihood that national leaders pay meaningful attention to the issue. Drawing from a scholarly literature on agenda setting, this paper identifies four factors that heighten the likelihood that an issue will rise to national-level attention: the existence of clear indicators showing that a problem exists; the presence of effective political entrepreneurs to push the cause; the organization of attention-generating focusing events that promote widespread concern for the issue; and the availability of politically palatable policy alternatives that enable national leaders to understand that the problem is surmountable. The paper presents a case study of the emergence, waning and re-generation of political priority for safe motherhood in Indonesia over the decade 1987-1997, to highlight how these four factors interacted to raise safe motherhood from near obscurity in the country to national-level prominence. While there are contextual factors that make this case unique, some elements are applicable to all developing countries. The paper draws out these dimensions in the hope that greater knowledge surrounding how political will actually has been generated can help shape strategic action to address this much neglected global problem.
Query2Question: Translating Visualization Interaction into Natural Language.
Nafari, Maryam; Weaver, Chris
2015-06-01
Richly interactive visualization tools are increasingly popular for data exploration and analysis in a wide variety of domains. Existing systems and techniques for recording provenance of interaction focus either on comprehensive automated recording of low-level interaction events or on idiosyncratic manual transcription of high-level analysis activities. In this paper, we present the architecture and translation design of a query-to-question (Q2Q) system that automatically records user interactions and presents them semantically using natural language (written English). Q2Q takes advantage of domain knowledge and uses natural language generation (NLG) techniques to translate and transcribe a progression of interactive visualization states into a visual log of styled text that complements and effectively extends the functionality of visualization tools. We present Q2Q as a means to support a cross-examination process in which questions, rather than interactions, are the focus of analytic reasoning and action. We describe the architecture and implementation of the Q2Q system, discuss key design factors and variations that affect question generation, and present several visualizations that incorporate Q2Q for analysis in a variety of knowledge domains.
Mobile mapping of sporting event spectators using Bluetooth sensors: Tour of Flanders 2011.
Versichele, Mathias; Neutens, Tijs; Goudeseune, Stephanie; van Bossche, Frederik; van de Weghe, Nico
2012-10-22
Accurate spatiotemporal information on crowds is a necessity for better crowd management in general and for the mitigation of potential security risks. The large numbers of individuals involved and their mobility, however, make generation of this information non-trivial. This paper proposes a novel methodology to estimate and map crowd sizes using mobile Bluetooth sensors and examines to what extent this methodology represents a valuable alternative to existing traditional crowd density estimation methods. The proposed methodology is applied in a unique case study that uses Bluetooth technology for the mobile mapping of spectators of the Tour of Flanders 2011 road cycling race. The locations of nearly 16,000 cell phones of spectators along the race course were registered, and detailed views of the spatiotemporal distribution of the crowd were generated. Comparison with visual head counts from camera footage delivered a detection ratio of 13.0 ± 2.3%, making it possible to estimate the crowd size. To our knowledge, this is the first study that uses mobile Bluetooth sensors to count and map a crowd over space and time.
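The reported detection ratio makes the scaling step easy to reproduce. A sketch using only the numbers quoted in the abstract (13.0 ± 2.3% and roughly 16,000 detections); everything else is illustrative:

```python
# Scale unique Bluetooth detections to a crowd estimate with the detection
# ratio reported in the abstract (13.0 +/- 2.3%).
def crowd_estimate(detections, ratio=0.130, ratio_err=0.023):
    """Return (estimate, low, high) crowd sizes from unique detections."""
    est = detections / ratio
    low = detections / (ratio + ratio_err)   # larger ratio -> smaller crowd
    high = detections / (ratio - ratio_err)  # smaller ratio -> larger crowd
    return est, low, high

est, low, high = crowd_estimate(16000)
print(f"~{est:,.0f} spectators (range {low:,.0f}-{high:,.0f})")
# ~123,077 spectators (range 104,575-149,533)
```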
Directed Fluid Transport and Mixing with Biomimetic Cilia Arrays
NASA Astrophysics Data System (ADS)
Shields, A. R.; Evans, B. A.; Carstens, B. L.; Falvo, M. R.; Washburn, S.; Superfine, R.
2009-03-01
We present results on the long-range, directed fluid transport and fluidic mixing produced by the collective beating of arrays of biomimetic cilia. These artificial cilia are arrays of free-standing nanorods roughly the size of biological cilia, which we fabricate from a polymer-magnetic nanoparticle composite material and actuate with permanent magnets to mimic biological cilia. Biological cilia have evolved to produce microscale fluid transport and are increasingly being recognized as critical components in a wide range of biological systems. However, despite much effort, cilia-generated fluid flows remain an area of active study. In the last decade, cilia-driven fluid flow in the embryonic node of vertebrates has been implicated as the initial left-right symmetry breaking event in these embryos. With our cilia we generate directional fluid transport by mimicking the tilted conical beating of these nodal cilia. By seeding fluorescent microparticles into the fluid we have noted the existence of two distinct flow regimes. The fluid flow is directional and coherent above the cilia tips, while between the cilia tips and the floor particle motion is complicated and suggestive of chaotic advection.
Earthquake recording at the Stanford DAS Array with fibers in existing telecomm conduits
NASA Astrophysics Data System (ADS)
Biondi, B. C.; Martin, E. R.; Yuan, S.; Cole, S.; Karrenbach, M. H.
2017-12-01
The Stanford Distributed Acoustic Sensing Array (SDASA-1) has been continuously recording seismic data since September 2016 on 2.5 km of single-mode fiber optics in existing telecommunications conduits under Stanford's campus. The array is figure-eight shaped and roughly 600 m along its widest side, with a channel spacing of roughly 8 m. This array is easy to maintain and is nonintrusive, making it well suited to urban environments, but it sacrifices some cable-to-ground coupling compared to more traditional seismometers. We have been testing its utility for earthquake recording, active seismic, and ambient noise interferometry. This talk will focus on earthquake observations. We will show comparisons between the strain rates measured throughout the DAS array and the particle velocities measured at the nearby Jasper Ridge Seismic Station (JRSC). In some of these events, we will point out directionality features specific to DAS that can require slight modifications in data processing. We also compare the repeatability of DAS and JRSC recordings of blasts from a nearby quarry. Using existing earthquake databases, we have created a small catalog of DAS earthquake observations by pulling records of over 700 Northern California events spanning Sep. 2016 to Jul. 2017 from both the DAS data and JRSC. On these events we have tested common array methods for earthquake detection and location, including beamforming and STA/LTA analysis in time and frequency. We have analyzed these events to approximate thresholds on what distances and magnitudes are clearly detectable by the DAS array. Further analysis should be done on detectability with methods tailored to small events (for example, template matching). In creating this catalog, we have developed open-source software, available for free download, that can manage large sets of continuous seismic data files (both existing files and files as they stream in). This software can both interface with existing earthquake networks and efficiently extract earthquake recordings from the many continuous recordings saved on the user's machines.
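For the STA/LTA analysis mentioned above, here is a minimal time-domain sketch of the classic short-term/long-term average ratio; this is the textbook formulation, not necessarily the Stanford group's exact implementation, and the sampling rate and threshold are assumed:

```python
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
    """Classic STA/LTA energy ratio on a strain-rate (or velocity) trace."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = trace.astype(float) ** 2
    csum = np.cumsum(energy)
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n   # short-term running mean
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n   # long-term running mean
    n = min(len(sta), len(lta))
    return sta[-n:] / (lta[-n:] + 1e-20)           # align both at trace end

# Example: flag samples where the ratio exceeds a trigger threshold of 4
fs = 50.0                                # assumed channel sampling rate, Hz
trace = np.random.randn(int(600 * fs))   # stand-in for one DAS channel
triggers = np.where(sta_lta(trace, fs) > 4.0)[0]
```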
Investigation of runoff generation from anthropogenic sources with dissolved xenobiotics
NASA Astrophysics Data System (ADS)
Krein, A.; Pailler, J.; Guignard, C.; Iffly, J.; Pfister, L.; Hoffmann, L.
2009-04-01
In the experimental Mess basin (35 km², Luxembourg) dissolved xenobiotics in surface water are used to study the influences of anthropogenic sources like separated sewer systems on runoff generation. Emerging contaminants like pharmaceuticals are of growing interest because of their use in large quantities in human and veterinary medicine. The amounts reaching surface waters depend on rainfall patterns, hydraulic conditions, consumption, metabolism, degradation, and disposal. The behaviour of endocrine disruptors, including pharmaceuticals, in the aquatic environment is largely unknown. The twelve molecules analyzed belong to three families: the estrogens, the antibiotics (sulfonamides, tetracyclines), and the painkillers (ibuprofen, diclofenac). Xenobiotics can be used as potential environmental tracers for untreated sewerage. Our results show that the concentrations are highly variable during flood events. The highest concentrations are reached in the first flush period, mainly during the rising limb of the flood hydrographs. As a result of the kinematic wave effect, the concentration peak occurs in some cases a few hours after the discharge maximum. In floodwater (eleven floods, 66 samples) the highest concentrations were measured for ibuprofen (µg/l range), estrone, and diclofenac (both ng/l range). From the tetracycline group, essentially tetracycline itself is of relevance, while the sulfonamides are mainly represented by sulfamethoxazole (all in ng/l range). In the Mess River the pharmaceutical fluxes during flood events proved to be influenced by hydrological conditions. Different pharmaceuticals showed their concentration peaks during different times of a flood event. An example is the estrone peak, which during summer flash floods often occurred one to two hours before the largest concentrations of the painkillers. This points to more sources than the sole storm drainage through the spillway of the single sewage treatment plant, to different transport velocities for individual compounds, or to the existence of substance-separating buffer storage in the stream network. In conditions of low-intensity rainfall events and a few days of antecedent dry weather, acute peaks of pollution are discharged into the receiving waters. The influence of housing areas, main roads, and sewer systems is obvious; these sources are characterized by rapid depletion. Precipitation events of very small intensity and amount often appear as single-peak storm events, resulting predominantly from the sealed surfaces of the area. More accurate assessment of pollutant loads entering urban receiving water bodies is needed for improving urban storm water management and meeting water quality regulations.
NASA Astrophysics Data System (ADS)
Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.
2013-07-01
The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible but not-yet-occurred flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with the surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of insured flood losses are located inside floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of fictive river flows based on the historical records of the river gauge network, and generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with the address-located claim data on a small catchment (downstream Argens).
Systematic identification and analysis of frequent gene fusion events in metabolic pathways
Henry, Christopher S.; Lerma-Ortiz, Claudia; Gerdes, Svetlana Y.; ...
2016-06-24
Here, gene fusions are the most powerful type of in silico-derived functional associations. However, many fusion compilations were made when <100 genomes were available, and algorithms for identifying fusions need updating to handle the current avalanche of sequenced genomes. The availability of a large fusion dataset would help probe functional associations and enable systematic analysis of where and why fusion events occur. As a result, here we present a systematic analysis of fusions in prokaryotes. We manually generated two training sets: (i) 121 fusions in the model organism Escherichia coli; (ii) 131 fusions found in B vitamin metabolism. These sets were used to develop a fusion prediction algorithm that captured the training set fusions with only 7% false negatives and 50% false positives, a substantial improvement over existing approaches. This algorithm was then applied to identify 3.8 million potential fusions across 11,473 genomes. The results of the analysis are available in a searchable database. A functional analysis identified 3,000 reactions associated with frequent fusion events and revealed areas of metabolism where fusions are particularly prevalent. In conclusion, customary definitions of fusions were shown to be ambiguous, and a stricter one was proposed. Exploring the genes participating in fusion events showed that they most commonly encode transporters, regulators, and metabolic enzymes. The major rationales for fusions between metabolic genes appear to be overcoming pathway bottlenecks, avoiding toxicity, controlling competing pathways, and facilitating expression and assembly of protein complexes. Finally, our fusion dataset provides powerful clues to decipher the biological activities of domains of unknown function.
Downgrading, downsizing, degazettement, and reclassification of protected areas in Brazil.
Bernard, E; Penna, L A O; Araújo, E
2014-08-01
Protected areas (PAs) are key elements for biodiversity conservation and ecosystem services. Brazil has the largest PA system in the world, covering approximately 220 million ha. This system expanded rapidly from the mid-1990s to the mid-2000s. Recent events in Brazil, however, have led to an increase in PA downgrading, downsizing, and degazettement (PADDD). Does this reflect a shift in the country's PA policy? We analyzed the occurrence, frequency, magnitude, type, spatial distribution, and causes of changes in PA boundaries and categories in Brazil. We identified 93 PADDD events from 1981 to 2012. Such events have increased in frequency since 2008 and were ascribed primarily to generation and transmission of electricity in Amazonia. In Brazilian parks and reserves, 7.3 million ha were affected by PADDD events, and of these, 5.2 million ha were affected by downsizing or degazetting. Moreover, projects being considered by the Federal Congress may degazette 2.1 million ha of PA in Amazonia alone. Relaxing the protection status of existing PAs is proving to be politically easy in Brazil, and the recent increase in the frequency and extent of PADDD reflects a change in governmental policy. By taking advantage of chronic deficiencies in financial and personnel resources and surveillance, disputes over land tenure, and the slowness of the Brazilian justice system, government agencies have been implementing PADDD without consultation of civil society. If parks and reserves are to maintain their integrity, there will need to be investments in Brazilian PAs and a better understanding of the benefits PAs provide. © 2014 Society for Conservation Biology.
Public Outreach Guerilla Style: Just Add Science to Existing Events
NASA Astrophysics Data System (ADS)
Gelderman, Richard
2016-01-01
We report on a campaign to use the visual appeal of astronomy as a gateway drug to inject public outreach into settings where people aren't expecting an encounter with science. Our inspiration came from the team at guerillascience.org, who have earned a reputation for creating, at sites around the world, "experiences and events that are unexpected, thought-provoking, but, above all, that delight and entertain." Our goal is to insert astronomy into existing festivals of music, culture, and art; county and state fairs; sporting events; and local farmer's markets. With volunteers and near-zero budgets, we have been able to meaningfully engage with audience members who would never willingly attend an event advertised as science related. By purposefully relating astronomy to the non-science aspects of the event that caused the audience members to attend, new learning experiences are created that alter the often negative pre-conceived notions about science that many of them held before our encounter.
Complete event simulations of nuclear fission
NASA Astrophysics Data System (ADS)
Vogt, Ramona
2015-10-01
For many years, the state of the art for treating fission in radiation transport codes has involved sampling from average distributions. In these average fission models energy is not explicitly conserved and everything is uncorrelated because all particles are emitted independently. However, in a true fission event, the energies, momenta and multiplicities of the emitted particles are correlated. Such correlations are interesting for many modern applications. Event-by-event generation of complete fission events makes it possible to retain the kinematic information for all particles emitted: the fission products as well as prompt neutrons and photons. It is therefore possible to extract any desired correlation observables. Complete event simulations can be included in general Monte Carlo transport codes. We describe the general functionality of currently available fission event generators and compare results for several important observables. This work was performed under the auspices of the US DOE by LLNL, Contract DE-AC52-07NA27344. We acknowledge support of the Office of Defense Nuclear Nonproliferation Research and Development in DOE/NNSA.
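To make the contrast with average fission models concrete, here is a toy event-by-event sampler in which the sampled neutrons and photons share a fixed energy budget, so multiplicities and energies come out correlated. All distributions and numbers are illustrative stand-ins, not the physics of a production generator such as FREYA or CGMF:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_event(total_energy=20.0):        # MeV available after scission (toy)
    """Sample one complete fission event with explicit energy bookkeeping."""
    n_mult = rng.poisson(2.4)               # prompt neutron multiplicity (toy mean)
    event = {"neutrons": [], "photons": []}
    budget = total_energy
    for _ in range(n_mult):
        e = min(rng.exponential(1.3), budget)   # neutron energy, capped by budget
        event["neutrons"].append(e)
        budget -= e
    while budget > 0.1:                     # remaining energy radiated as photons
        e = min(rng.exponential(0.8), budget)
        event["photons"].append(e)
        budget -= e
    return event

ev = sample_event()
# Correlation observables stay accessible because the full event is retained:
print(len(ev["neutrons"]), sum(ev["neutrons"]), len(ev["photons"]))
```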
Measurement of Neutrino-Induced Coherent Pion Production and the Diffractive Background in MINERvA
NASA Astrophysics Data System (ADS)
Gomez, Alicia; Minerva Collaboration
2015-04-01
Neutrino-induced coherent charged pion production is a unique neutrino-nucleus scattering process in which a muon and pion are produced while the nucleus is left in its ground state. The MINERvA experiment has made a model-independent differential cross section measurement of this process on carbon by selecting events with a muon and a pion, no evidence of nuclear break-up, and small momentum transfer to the nucleus |t|. A background to the measurement on carbon is a similar process: diffractive pion production off the free protons in MINERvA's scintillator. This process is not modeled in the neutrino event generator GENIE. At low |t| these events have a final state similar to that of the aforementioned process. The diffractive contribution to the background is quantified by emulating diffractive events: all other GENIE-generated background events are reweighted to the predicted |t| distribution of diffractive events and then scaled to the diffractive cross section.
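The reweighting step lends itself to a short sketch: assign each simulated background event a weight equal to the ratio of the target |t| density to the current one. The model shapes below are invented placeholders, not GENIE or MINERvA distributions:

```python
import numpy as np

def reweight(t_values, current_pdf, target_pdf):
    """Per-event weights that morph the current |t| shape into the target."""
    w = target_pdf(t_values) / np.maximum(current_pdf(t_values), 1e-12)
    return w * len(t_values) / w.sum()      # keep total normalization fixed

current = lambda t: np.exp(-5.0 * t)        # placeholder background |t| shape
target = lambda t: np.exp(-20.0 * t)        # placeholder diffractive |t| shape

t = np.random.default_rng(1).exponential(0.2, size=10000)  # toy |t| sample
weights = reweight(t, current, target)
# Events at low |t| (diffractive-like) receive the largest weights; a final
# scaling to the diffractive cross section would follow, as in the abstract.
```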
"Smart" watchdog safety switch
Kronberg, James W.
1991-01-01
A method and apparatus for monitoring a process having a periodic output so that the process equipment is not damaged in the event of a controller failure, comprising a low-pass and peak clipping filter, an event detector that generates an event pulse for each valid change in magnitude of the filtered periodic output, a timing pulse generator, a counter that increments upon receipt of any timing pulse and resets to zero on receipt of any event pulse, an alarm that alerts when the count reaches some preselected total count, and a set of relays that opens to stop power to process equipment. An interface module can be added to allow the switch to accept a variety of periodic output signals.
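The counter-and-reset logic reads directly as code. A software sketch with our own names (the patent describes hardware, not software):

```python
# Sketch of the watchdog logic described in the abstract: timing pulses
# increment a counter, valid event pulses reset it, and reaching a preset
# count (no valid events for too long) trips the alarm and opens the relays.
class WatchdogSwitch:
    def __init__(self, trip_count=10):
        self.trip_count = trip_count
        self.count = 0
        self.relays_closed = True           # power flows while relays are closed

    def on_event_pulse(self):
        """A valid change in the monitored periodic output: reset the counter."""
        self.count = 0

    def on_timing_pulse(self):
        """A clock tick: increment, and trip if the process looks stalled."""
        self.count += 1
        if self.count >= self.trip_count and self.relays_closed:
            self.relays_closed = False      # alarm + cut power to the equipment
            print("ALARM: no valid events; relays opened")

wd = WatchdogSwitch(trip_count=3)
for tick in range(5):                       # controller failure: no event pulses
    wd.on_timing_pulse()
```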
Integrated visualization of remote sensing data using Google Earth
NASA Astrophysics Data System (ADS)
Castella, M.; Rigo, T.; Argemi, O.; Bech, J.; Pineda, N.; Vilaclara, E.
2009-09-01
The need for advanced visualization tools for meteorological data has led in recent years to the development of sophisticated software packages, either by observing-system manufacturers or by third-party solution providers. For example, manufacturers of remote sensing systems such as weather radars or lightning detection systems include zoom, product selection, and archive access capabilities, as well as quantitative tools for data analysis, as standard features which are highly appreciated in weather surveillance or post-event case study analysis. However, the fact that each manufacturer has its own visualization system and data formats hampers the usability and integration of different data sources. In this context, Google Earth (GE) offers the possibility of combining several graphical information types in a unique visualization system which can be easily accessed by users. The Meteorological Service of Catalonia (SMC) has been evaluating the use of GE as a visualization platform for surveillance tasks in adverse weather events. Our first experiences concern the real-time integration of remote sensing data: radar, lightning, and satellite. The tool animates the combined products over the last hour, giving a good picture of the meteorological situation. One of the main advantages of this product is that it is easy to install on many computers and does not require high computational resources. GE also highlights the areas most affected by heavy rain or other weather phenomena. The main disadvantage is that the product offers only qualitative information: quantitative data are only available through the graphical display (i.e., through color scales not associated with physical values that users can easily access). The real-time procedure is divided into three parts. First, a crontab file launches different applications depending on the data type (satellite, radar, or lightning) to be treated; the launch interval varies from 5 minutes (satellite and lightning) to 6 minutes (radar). The second part uses IDL and ENVI programs, which search each archive for the images from the last hour. In the case of lightning data, the files are generated for the procedure, while for the other data types the procedure searches for existing imagery. Finally, the procedure generates the metadata required by GE as KML files and sends them to the internal server. On the local computer where GE is running, KML files update their information from the server copies. Another application that has been evaluated is the analysis of past events; further work is devoted to developing access procedures to archived data via CGI scripts, in order to retrieve and convert the information into a format suitable for GE. The presentation includes examples of the evaluation of the use of GE and a brief comparison with other existing visualization systems available within the SMC.
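As an illustration of the KML-generation step, a minimal stand-in that wraps a freshly rendered image as a ground overlay that GE network links can refresh; the URL, bounds, and file names are placeholders, not the SMC's actual pipeline:

```python
# Wrap a rendered radar (or satellite/lightning) image as a KML ground overlay.
from xml.sax.saxutils import escape

def ground_overlay_kml(name, png_url, north, south, east, west):
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{escape(name)}</name>
    <Icon><href>{escape(png_url)}</href></Icon>
    <LatLonBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""

# Placeholder bounds roughly covering Catalonia
kml = ground_overlay_kml("radar last hour", "http://server.example/radar.png",
                         43.0, 40.2, 3.6, 0.0)
with open("radar.kml", "w") as f:
    f.write(kml)
```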
40 CFR 194.54 - Scope of compliance assessments.
Code of Federal Regulations, 2010 CFR
2010-07-01
... processes and events that may occur over the regulatory time frame; (2) Identifies the processes, events, or... effects on the disposal system of: (1) Existing boreholes in the vicinity of the disposal system, with...
Zhou, Qinghua; Xiao, Fuliang; Yang, Chang; ...
2017-05-22
Electrostatic electron cyclotron harmonic (ECH) waves generated by the electron loss cone distribution can produce efficient scattering loss of plasma sheet electrons, which has a significant effect on the dynamics in the outer magnetosphere. Here we report two ECH emission events around the same location, L ≈ 5.7–5.8, MLT ≈ 12, from the Van Allen Probes on 11 February (event A) and 9 January 2014 (event B), respectively. The spectrum of ECH waves was centered at the lower half of the harmonic bands during event A, but the upper half during event B. The observed electron phase space density in both events is fitted by the subtracted bi-Maxwellian distribution, and the fitting functions are used to evaluate the local growth rates of ECH waves based on a linear theory for homogeneous plasmas. ECH waves are excited by the loss cone instability of 50 eV–1 keV electrons in the lower half of the harmonic bands in the low-density plasmasphere in event A, and of 1–10 keV electrons in the upper half of the harmonic bands in a relatively high-density region in event B. The current results successfully explain the observations and provide first direct evidence of how ECH waves are generated in the lower and upper halves of the harmonic frequency bands.
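For readers wanting the fitting function in explicit form, one common parameterization of a subtracted bi-Maxwellian (loss-cone) distribution, after Ashour-Abdalla and Kennel (1978). We assume this is the form meant by the abstract; the parameter values below are arbitrary:

```python
import numpy as np

def subtracted_bimaxwellian(v_par, v_perp, n, th_par, th_perp, delta, beta):
    """Phase space density f(v_par, v_perp); beta in (0, 1) sets the loss-cone
    width and delta in [0, 1] its depth (delta = 1 recovers a bi-Maxwellian)."""
    norm = n / (np.pi**1.5 * th_par * th_perp**2)
    perp = (delta * np.exp(-v_perp**2 / th_perp**2)
            + (1 - delta) / (1 - beta)
            * (np.exp(-v_perp**2 / th_perp**2)
               - np.exp(-v_perp**2 / (beta * th_perp**2))))
    return norm * np.exp(-v_par**2 / th_par**2) * perp

# f -> 0 as v_perp -> 0 when delta = 0: an empty loss cone.
print(subtracted_bimaxwellian(0.0, 1e-3, n=1.0, th_par=1.0, th_perp=1.0,
                              delta=0.0, beta=0.5))
```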
Micro-evolution due to pollution: possible consequences for ecosystem responses to toxic stress.
Medina, Matías H; Correa, Juan A; Barata, Carlos
2007-05-01
Polluting events can change community structure and ecosystem functioning. Selection of genetically inherited tolerance in exposed populations, here referred to as micro-evolution due to pollution, has been recognized as one of the causes of these changes. However, there is a gap between studies addressing this process and those assessing effects at higher levels of biological organization. In this review we attempt to incorporate these evolutionary considerations into the ecological risk assessment (ERA) of polluting events and to trigger discussion about the consequences of this process for the ecosystem response to toxic stress. We provide clear evidence that pollution drives micro-evolutionary processes in several species. When this process occurs, populations inhabiting environments that become polluted may persist. However, due to the existence of ecological costs derived from the loss of genetic variability, negative pleiotropy with fitness traits, and/or physiological alterations, micro-evolution due to pollution may alter different properties of the affected populations. Despite the existence of empirical evidence showing that safety margins currently applied in the ERA process may account for pollution-induced genetic changes in tolerance, information regarding long-term ecological consequences at higher levels of biological organization due to ecological costs is not explicitly considered in these procedures. In relation to this, we present four testable hypotheses considering that micro-evolution due to pollution acts upon the variability of functional response traits of the exposed populations and generates changes in their functional effect traits, thereby modifying the way species exploit their ecological niches and participate in overall ecosystem functioning.
NASA Astrophysics Data System (ADS)
Dorato, Mauro
The literature on the compatibility between the time of our experience, characterized by passage or becoming, and time as it is represented within spacetime theories has been affected by a persistent failure to get a clear grasp of the notion of becoming, both in its relation to an ontology of events "spread" in a four-dimensional manifold and in relation to temporally asymmetric physical processes. In the first part of my paper I try to remedy this situation by offering what I consider a clear and faithful explication of becoming, valid independently of the particular spacetime setting in which we operate. Along the way, I will show why the metaphysical debate between the so-called "presentists" and "eternalists" is completely irrelevant to the question of becoming, as the debate itself is generated by a failure to distinguish between a tensed and a tenseless sense of "existence". After a much-needed distinction between absolute and relational becoming, I then show in what sense classical (non-quantum) spacetime physics presupposes both types of becoming, for the simple reason that spacetime physics presupposes an ontology of (timelike-separated) events. As a consequence, not only does it turn out that using physics to try to provide empirical evidence for the existence of becoming amounts to putting the cart before the horse, but also that the order imposed by "the arrow of becoming" is more fundamental than any other physical arrow of time, despite the fact that becoming cannot be used to explain why entropy grows, or why retarded electromagnetic radiation prevails over advanced radiation.
Zieff, Susan G; Kim, Mi-Sook; Wilson, Jackson; Tierney, Patrick
2014-02-01
Temporary parks such as the monthly event, Sunday Streets SF, support public health goals by using existing infrastructure and street closures to provide physical activity in neighborhoods underserved for recreational resources. Sunday Streets creates routes to enhance community connection. Six hundred and thirty-nine participants at 3 Sunday Streets events were surveyed using a 36-item instrument of open- and closed-ended questions about overall physical activity behavior, physical activity while at Sunday Streets, experience of the events, and demographic data. Overall, Sunday Streets participants are physically active (79% engage in activity 3-7 days/week) and approximately represent the ethnic minority distribution of the city. There were significant differences between first-time attendees and multiple-event attendees by duration of physical activity at the event (55.83 minutes vs. 75.13 minutes) and by frequency of physical activity bouts per week (3.69 vs. 4.22). Both groups emphasized the positive experience and safe environment as reasons to return to the event; for first-time attendees, the social environment was another reason to return. Temporary parks like Sunday Streets have the potential to provide healthful, population-wide physical activity using existing streets. The trend toward increased activity by multiple-event attendees suggests the importance of a regular schedule of events.
Radvansky, Gabriel A.; D’Mello, Sidney K.; Abbott, Robert G.; ...
2016-01-27
The Fluid Events Model is aimed at predicting changes in the actions people take on a moment-by-moment basis. In contrast with other research on action selection, this work does not investigate why some course of action was selected, but rather the likelihood of discontinuing the current course of action and selecting another in the near future. This is done using both task-based and experience-based factors. Prior work evaluated this model in the context of trial-by-trial, independent, interactive events, such as choosing how to copy a figure of a line drawing. In this paper, we extend this model to more covert event experiences, such as reading narratives, as well as to continuous interactive events, such as playing a video game. To this end, the model was applied to existing data sets of reading time and event segmentation for written and picture stories. It was also applied to existing data sets of performance in a strategy board game, an aerial combat game, and a first person shooter game in which a participant's current state was dependent on prior events. The results revealed that the model predicted behavior changes well, taking into account both the theoretically defined structure of the described events, as well as a person's prior experience. Hence, theories of event cognition can benefit from efforts that take into account not only how events in the world are structured, but also how people experience those events.
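A toy reading of the model's core prediction, phrased as a logistic function of task-based and experience-based factors; the feature names, weights, and functional form are ours, purely for illustration, not the published model equations:

```python
import numpy as np

def p_switch(event_boundary, time_on_action, past_success,
             w=(1.8, 0.05, -1.2), bias=-2.0):
    """Probability of abandoning the current action on the next step."""
    z = (bias + w[0] * event_boundary        # structural break in the event
         + w[1] * time_on_action             # seconds spent on current action
         + w[2] * past_success)              # prior success with this action
    return 1.0 / (1.0 + np.exp(-z))

# A structural boundary plus long time-on-action raises the switch probability.
print(p_switch(event_boundary=1, time_on_action=20.0, past_success=0.9))
```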
Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berk, Alexander; Hawes, Frederick; Fox, Marsha
2016-03-15
Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. The Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished (1) by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; (2) by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; (3) by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; (4) by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and (5) by planning a validation field measurement program in collaboration with the Remote Sensing and Exploitation group at Sandia National Laboratories (SNL), in which data from their ongoing polarimetric field and laboratory measurement program will be shared and, to the extent allowed, tailored for model validation, in exchange for model predictions under conditions and for geometries outside of their measurement domain.
NASA Astrophysics Data System (ADS)
Barnet, J.; Littler, K.; Kroon, D.; Leng, M. J.; Westerhold, T.; Roehl, U.; Zachos, J. C.
2017-12-01
The "greenhouse" world of the latest Cretaceous-Early Paleogene ( 70-34 Ma) was characterised by multi-million year variability in climate and the carbon-cycle. Throughout this interval the pervasive imprint of orbital-cyclicity, particularly eccentricity and precession, is visible in elemental and stable isotope data obtained from multiple deep-sea sites. Periodic "hyperthermal" events, occurring largely in-step with these orbital cycles, have proved particularly enigmatic, and may be the closest, albeit imperfect, analogues for anthropogenic climate change. This project utilises CaCO3-rich marine sediments recovered from ODP Site 1262 at a paleo-depth of 3600 m on the Walvis Ridge, South Atlantic, of late Maastrichtian-mid Paleocene age ( 67-60 Ma). We have derived high-resolution (2.5-4 kyr) carbon and oxygen isotope data from the epifaunal benthic foraminifera species Nuttallides truempyi. Combining the new record with the existing Late Paleocene-Early Eocene record generated from the same site by Littler et al. (2014), yields a single-site reference curve detailing 13.5 million years of orbital cyclicity in paleoclimate and carbon cycle from the latest Cretaceous to near the peak warmth of the Early Paleogene greenhouse. Spectral analysis of this new combined dataset allows us to identify long (405-kyr) eccentricity, short (100-kyr) eccentricity, and precession (19-23-kyr) as the principle forcing mechanisms governing pacing of the background climate and carbon-cycle during this time period, with a comparatively weak obliquity (41-kyr) signal. Cross-spectral analysis suggests that changes in climate lead the carbon cycle throughout most of the record, emphasising the role of the release of temperature-sensitive carbon stores as a positive feedback to an initial warming induced by changes in orbital configuration. The expression of comparatively understudied Early Paleocene events, including the Dan-C2 Event, Latest Danian Event, and Danian/Selandian Transition Event, are also identified within this new record, confirming the global nature and orbital pacing of the Latest Danian Event and Danian/Selandian Transition Event, but questioning the Dan-C2 event as a global hyperthermal.
Sun, Li; Wong, Ka Chun; Wei, Peng; Ye, Sheng; Huang, Hao; Yang, Fenhuan; Westerdahl, Dane; Louie, Peter K K; Luk, Connie W Y; Ning, Zhi
2016-02-05
This study presents the development and evaluation of a next-generation air monitoring system with both laboratory and field tests. A multi-parameter algorithm was used to correct for the impact of environmental conditions on the electrochemical sensors for carbon monoxide (CO) and nitrogen dioxide (NO2) pollutants. The field evaluation in an urban roadside environment showed good agreement with designated monitors, with measurement error within 5% of the pollutant concentrations. Multiple sets of the developed system were then deployed in the Hong Kong Marathon 2015, forming a sensor-based network along the marathon route. Real-time air pollution concentration data were wirelessly transmitted, and the Air Quality Health Index (AQHI) for the Green Marathon was calculated and broadcast to the public on an hourly basis. The route-specific sensor network showed somewhat different pollutant patterns than routine air monitoring, indicating the immediate impact of traffic control during the marathon on roadside air quality. The study is one of the first applications of a next-generation sensor network at international sporting events, and it demonstrated the usefulness of emerging sensor-based air monitoring technology in rapid network deployment to supplement existing air monitoring.
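A sketch of what a multi-parameter correction of this kind can look like: regress co-located reference concentrations on the raw electrochemical signal plus environmental covariates, then apply the fit in the field. The linear form and column meanings are our assumptions, not the authors' algorithm:

```python
import numpy as np

def fit_correction(raw, temp, rh, reference):
    """Least-squares fit of reference concentration on raw signal, T, and RH."""
    X = np.column_stack([raw, temp, rh, np.ones_like(raw)])
    coef, *_ = np.linalg.lstsq(X, reference, rcond=None)
    return coef

def apply_correction(coef, raw, temp, rh):
    X = np.column_stack([raw, temp, rh, np.ones_like(raw)])
    return X @ coef

rng = np.random.default_rng(2)
raw = rng.normal(300, 40, 500)                  # sensor output, mV (synthetic)
temp = rng.normal(25, 4, 500)                   # deg C
rh = rng.normal(60, 10, 500)                    # percent
truth = 0.5 * raw - 0.8 * temp + 0.1 * rh + 5   # synthetic "reference" ppb
coef = fit_correction(raw, temp, rh, truth + rng.normal(0, 1, 500))
co_ppb = apply_correction(coef, raw, temp, rh)  # corrected field readings
```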
Multi-Topic Tracking Model for dynamic social network
NASA Astrophysics Data System (ADS)
Li, Yuhua; Liu, Changzheng; Zhao, Ming; Li, Ruixuan; Xiao, Hailing; Wang, Kai; Zhang, Jun
2016-07-01
The topic tracking problem has attracted much attention in the last decades. However, existing approaches rarely consider network structures and textual topics together. In this paper, we propose a novel statistical model based on dynamic Bayesian networks, namely the Multi-Topic Tracking Model for Dynamic Social Network (MTTD). It takes the influence phenomenon, the selection phenomenon, the document generative process, and the evolution of textual topics into account. Specifically, in our MTTD model, a Gibbs random field is defined to model the influence of the historical status of users in the network and the interdependency between them, in order to capture the influence phenomenon. To address the selection phenomenon, a stochastic block model is used to model the link generation process based on users' interest in topics. Probabilistic Latent Semantic Analysis (PLSA) is used to describe the document generative process according to the users' interests. Finally, the dependence on the historical topic status is also considered, to ensure the continuity of topics in the topic evolution model. The Expectation Maximization (EM) algorithm is used to estimate the parameters of the proposed MTTD model. Empirical experiments on real datasets show that the MTTD model performs better than Popular Event Tracking (PET) and Dynamic Topic Model (DTM) in generalization performance, topic interpretability, topic content evolution, and topic popularity evolution.
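Of the model's ingredients, the PLSA component is compact enough to sketch. Below are textbook PLSA EM updates, not the full MTTD model (which adds the Gibbs random field and the stochastic block model):

```python
import numpy as np

def plsa(counts, k=5, iters=50, seed=0):
    """counts: (docs, words) term-count matrix. Returns P(z|d) and P(w|z)."""
    rng = np.random.default_rng(seed)
    n_d, n_w = counts.shape
    p_z_d = rng.dirichlet(np.ones(k), size=n_d)        # P(z|d), rows sum to 1
    p_w_z = rng.dirichlet(np.ones(n_w), size=k)        # P(w|z), rows sum to 1
    for _ in range(iters):
        # E-step: responsibility of topic z for each (doc, word) pair
        joint = p_z_d[:, :, None] * p_w_z[None, :, :]  # shape (d, z, w)
        post = joint / np.maximum(joint.sum(axis=1, keepdims=True), 1e-12)
        # M-step: re-estimate both distributions from expected counts
        exp_counts = counts[:, None, :] * post         # shape (d, z, w)
        p_z_d = exp_counts.sum(axis=2)
        p_z_d /= np.maximum(p_z_d.sum(axis=1, keepdims=True), 1e-12)
        p_w_z = exp_counts.sum(axis=0)
        p_w_z /= np.maximum(p_w_z.sum(axis=1, keepdims=True), 1e-12)
    return p_z_d, p_w_z

docs = np.array([[4, 0, 1], [0, 5, 2], [3, 1, 0]])     # toy 3 docs x 3 words
theta, phi = plsa(docs, k=2, iters=100)
```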
Investigation of reversed flow channel events by the ICI-3 sounding rocket
NASA Astrophysics Data System (ADS)
Moen, J. I.; Dabakk, Y.; Oksavik, K.; Bekkeng, T.; Bekkeng, J. K.; Lorentzen, D. A.; Baddeley, L. J.; Abe, T.; Saito, Y.; Ogawa, Y.; Robert, P.; Yoeman, T.
2012-12-01
Transient flow channel events are a key characteristic of solar wind - magnetosphere coupling to the cusp ionosphere. One class of flow channels, Reversed Flow Events (RFE), was first discovered by the EISCAT Svalbard Radar and later also documented by the SuperDARN radar system. An RFE is typically a 100-200 km wide, longitudinally elongated flow channel near the cusp inflow region, inside which the flow direction is opposite to the large-scale ionospheric background convection. These events are hence associated with strong flow shears, and this category of flow events has been attributed to Birkeland current arcs. There are two possible explanations for their existence: (1) the RFE channel may be a region where two MI current loops, forced by independent voltage generators, couple through a poorly conducting ionosphere; and (2) the reversed flow channel may be the ionospheric footprint of an inverted V-type coupling region. Electron beams of <1 keV will not give rise to significant conductivity gradients, and the form of a discontinuity in the magnetospheric electric field will be conserved when mapped down to the ionosphere, although reduced in amplitude. On 3 December 2011 the Investigation of Cusp Irregularities 3 (ICI-3) sounding rocket was successfully launched from Ny-Ålesund, Svalbard to intersect an RFE event. The payload was equipped with Langmuir probes, AC and DC electric field and magnetic field experiments, and a low-energy electron spectrometer (10 eV-10 keV). The auroral activity and flow context during the flight were provided by ground-based optics, the EISCAT Svalbard Radar and the SuperDARN HF radars. In this talk we will present the ICI-3 test of the two physical explanations given above for the RFE phenomenon, and we will provide a quantitative measure of the Kelvin-Helmholtz instability growth rate associated with the flow shears.
Denardo, Scott J; Vock, David M; Schmalfuss, Carsten M; Young, Gregory D; Tcheng, James E; O'Connor, Christopher M
2016-07-01
Contrast media administered during cardiac catheterization can affect hemodynamic variables. However, little is documented about the effects of contrast on hemodynamics in heart failure patients, or about the prognostic value of baseline and changes in hemodynamics for predicting subsequent adverse events. In this prospective study of 150 heart failure patients, we measured hemodynamics at baseline and after administration of iodixanol or iopamidol contrast. One-year Kaplan-Meier estimates of adverse event-free survival (death, heart failure hospitalization, and rehospitalization) were generated, grouping patients by baseline measures of pulmonary capillary wedge pressure (PCWP) and cardiac index (CI), and by changes in those measures after contrast administration. We used Cox proportional hazards modeling to assess sequentially adding baseline PCWP and change in CI to 5 validated risk models (Seattle Heart Failure Score, ESCAPE [Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness], CHARM [Candesartan in Heart Failure: Assessment of Reduction in Mortality and Morbidity], CORONA [Controlled Rosuvastatin Multinational Trial in Heart Failure], and MAGGIC [Meta-Analysis Global Group in Chronic Heart Failure]). Median contrast volume was 109 mL. Both contrast media caused similarly small but statistically significant changes in most hemodynamic variables. There were 39 adverse events (26.0%). Adverse event rates increased using the composite metric of baseline PCWP and change in CI (P<0.01); elevated baseline PCWP and decreased CI after contrast correlated with the poorest prognosis. Adding both baseline PCWP and change in CI to the 5 risk models universally improved their predictive value (P≤0.02). In heart failure patients, the administration of contrast causes small but significant changes in hemodynamics. Considering baseline PCWP together with the change in CI after contrast predicts adverse events and increases the predictive value of existing models. Patients with elevated baseline PCWP and decreased CI after contrast merit the greatest concern. © 2016 American Heart Association, Inc.
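The modeling step of adding the two hemodynamic covariates to a survival model can be illustrated with the lifelines package. The tiny dataset below is synthetic, and treating the covariates in isolation is a simplification of the paper's five risk-score comparisons:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic illustration: one-year follow-up with baseline PCWP (mmHg) and
# post-contrast change in CI (L/min/m^2) as Cox model covariates.
df = pd.DataFrame({
    "months_to_event": [12, 3, 12, 7, 12, 2, 9, 12],
    "event":           [0, 1, 0, 1, 0, 1, 1, 0],   # death/HF hospitalization
    "baseline_pcwp":   [12, 28, 10, 25, 26, 30, 22, 11],
    "delta_ci":        [0.1, -0.4, 0.2, -0.3, -0.1, -0.5, -0.2, 0.1],
})
cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_event", event_col="event")
cph.print_summary()   # hazard ratios for baseline PCWP and change in CI
```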
NASA Astrophysics Data System (ADS)
Maeda, Y.; Kumagai, H.; Londono, J. M.; Lopez, C. M.; Castaño, L. M.; Beatriz, B.; García, L.
2017-12-01
Nevado del Ruiz is an active volcano in Colombia, which remains eruptively active and has been monitored by 13 broadband and 3 short-period seismic stations. In 2015-2016, a joint Japan-Colombia team installed an automatic event detection and location system based on the amplitude source location (ASL) method. Kumagai et al. (IAVCEI, 2017) indicated the existence of a magma conduit extending from the NW flank to the summit based on ASL analyses of various seismic signals, including long-period (LP) and very long period (VLP) events and tremors in a 5-10 Hz frequency band. In this study, we analyzed the VLP events by waveform inversion using eight summit stations in a frequency band of 0.3-0.7 Hz. We selected 14 VLP events from May to December 2016 based on signal-to-noise ratios and simplicity of the waveforms. We assumed a homogeneous P-wave velocity of 3.5 km/s with topography in the calculation of the Green functions. We conducted frequency-domain waveform inversion assuming a tensile crack source and investigated the best location and orientation of the crack by a grid search. The inversion results pointed to a low-angle (~30°) NW-dipping crack near the top of the conduit (approximately 1 km below the summit). The estimated source time functions displayed two or three cycles of oscillations, with seismic moments of the order of 10¹⁰-10¹¹ N m. For these 14 events, the ASLs from the 5-10 Hz frequency band were also near the top of the conduit. These results suggest the VLP and high-frequency signals are generated by an oscillation of the crack-like conduit near the summit, which may be triggered by a volume change of magma ascending in the conduit.
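A skeleton of the grid search described here: for each candidate crack location and orientation, the Green functions predict data linearly from a source time function, so each candidate reduces to a least-squares solve. The arrays are random stand-ins for real Green functions and records:

```python
import numpy as np

rng = np.random.default_rng(3)
n_data, n_stf = 2400, 200          # stacked station samples, STF samples
data = rng.normal(size=n_data)     # observed (filtered) waveforms, stacked

best = (np.inf, None, None)
for candidate in range(50):        # grid over position x dip x strike
    # In a real inversion, G would be the Green-function matrix computed for
    # this candidate crack geometry; here it is a random placeholder.
    G = rng.normal(size=(n_data, n_stf))
    stf, res, *_ = np.linalg.lstsq(G, data, rcond=None)  # best-fitting STF
    misfit = np.linalg.norm(data - G @ stf)
    if misfit < best[0]:
        best = (misfit, candidate, stf)

print("best candidate:", best[1], "misfit:", round(best[0], 2))
```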
The generation of spring peak flows by short-term meteorological events
Harold F. Haupt
1968-01-01
Spring peak flows recorded over a 25-year period in Benton Creek, a small forested watershed in northern Idaho, were studied in relation to meteorological events. More peak flows were generated by rain-on-snow than by clear-weather snowmelt; the two types of peaks differ in magnitude and in other characteristics. Two rather simple techniques were used to...
Multi-Sensor Data Fusion Project
2000-02-28
seismic network by detecting T phases generated by underground events (generally earthquakes) and associating these phases to seismic events. The...between underwater explosions (H), underground sources, mostly earthquake-generated (T), and noise detections (N). The phases classified as H are the only...processing for infrasound sensors is most similar to seismic array processing, with the exception that the detections are based on a more sophisticated
ERIC Educational Resources Information Center
Starr, Lisa R.; Hammen, Constance; Brennan, Patricia A.; Najman, Jake M.
2013-01-01
Previous research demonstrates that carriers of the short allele of the serotonin transporter gene (5-HTTLPR) show both greater susceptibility to depression in response to stressful life events and higher rates of generation of stressful events in response to depression. The current study examines relational security (i.e., self-reported beliefs…
Jeunehomme, Olivier; D'Argembeau, Arnaud
2016-01-01
Recent research suggests that episodic future thoughts can be formed through the same dual mechanisms, direct and generative, as autobiographical memories. However, the prevalence and determinants of the direct production of future event representations remain unclear. Here, we addressed this issue by collecting self-reports of production modes, response times (RTs), and verbal protocols for the production of past and future events in the word cueing paradigm. Across three experiments, we found that both past and future events were frequently reported to come directly to mind in response to the cue, and RTs confirmed that events were produced faster for direct than for generative responses. When looking at the determinants of direct responses, we found that most past and future events that were directly produced had already been thought of on a previous occasion, and that the frequency of previous thoughts predicted the occurrence of direct access. The direct production of autobiographical thoughts was also more frequent for past and future events that were judged important and emotionally intense. Collectively, these findings provide novel evidence that the direct production of episodic future thoughts is frequent in the word cueing paradigm and often involves the activation of personally significant "memories of the future."
Predicting geomorphically-induced flood risk for the Nepalese Terai communities
NASA Astrophysics Data System (ADS)
Dingle, Elizabeth; Creed, Maggie; Attal, Mikael; Sinclair, Hugh; Mudd, Simon; Borthwick, Alistair; Dugar, Sumit; Brown, Sarah
2017-04-01
Rivers sourced from the Himalaya irrigate the Indo-Gangetic Plain via major river networks that support 10% of the global population. However, many of these rivers are also the source of devastating floods. During the 2014 Karnali River floods in west Nepal, the Karnali rose to around 16 m at Chisapani (where it enters the Indo-Gangetic Plain), 1 m higher than the previous record in 1983; the return interval for this event was estimated to be 1000 years. Flood risk may currently be underestimated in this region, primarily because changes to the channel bed are not included when identifying areas at risk of flooding from events of varying recurrence intervals. Our observations in the field, corroborated by satellite imagery, show that river beds are highly mobile and constantly evolve through each monsoon. Increased bed levels due to sediment aggradation decrease the capacity of the river, significantly increasing the risk of devastating flood events; we refer to these as 'geomorphically-induced floods'. Major, short-lived episodes of sediment accumulation in channels are caused by stochastic variability in sediment flux generated by storms, earthquakes and glacial outburst floods from upstream parts of the catchment. Here, we generate a field-calibrated, geomorphic flood risk model for varying upstream scenarios, and predict changing flood risk for the Karnali River. A numerical model is used to carry out a sensitivity analysis of changes in channel geometry (particularly aggradation or degradation) based on realistic flood scenarios. In these scenarios, water and sediment discharge are varied within a range of plausible values, up to extreme sediment and water fluxes caused by widespread landsliding and/or intense monsoon precipitation based on existing records. The results of this sensitivity analysis will be used to inform flood hazard maps of the Karnali River floodplain and assess the vulnerability of the populations in the region.
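The core sensitivity described here, reduced conveyance as the bed aggrades, can be illustrated with Manning's equation, Q = (1/n) A R^(2/3) S^(1/2), applied to a simplified rectangular channel. The sketch below is illustrative only; the width, slope, roughness and aggradation values are assumptions, not the paper's calibrated parameters.

```python
# Sketch: how bed aggradation erodes bankfull capacity under Manning's equation.
def manning_capacity(width_m, depth_m, slope, n=0.035):
    """Bankfull discharge (m^3/s) for a rectangular channel."""
    area = width_m * depth_m                   # flow cross-section A
    radius = area / (width_m + 2.0 * depth_m)  # hydraulic radius R = A / P
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5

bankfull_depth = 6.0                           # m, illustrative
for aggradation in (0.0, 0.5, 1.0, 2.0):       # m of bed-level rise
    q = manning_capacity(width_m=500.0,
                         depth_m=bankfull_depth - aggradation, slope=0.001)
    print(f"{aggradation:.1f} m aggradation -> capacity {q:,.0f} m^3/s")
```

Even modest aggradation removes a large fraction of the discharge a reach can convey within its banks, which is why the same meteorological flood can become far more damaging after a sediment pulse.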
Scott, W.E.; McGimsey, R.G.
1994-01-01
The 1989-1990 eruption of Redoubt Volcano spawned about 20 areally significant tephra-fall deposits between December 14, 1989 and April 26, 1990. Tephra plumes rose to altitudes of 7 to more than 10 km and were carried mainly northward and eastward by prevailing winds, where they substantially impacted air travel, commerce, and other activities. In comparison to notable eruptions of the recent past, the Redoubt events produced a modest amount of tephra-fall deposits - 6 × 10^7 to 5 × 10^10 kg for individual events and a total volume (dense-rock equivalent) of about 3-5 × 10^7 m^3 of andesite and dacite. Two contrasting tephra types were generated by these events. Pumiceous tephra-fall deposits of December 14 and 15 were followed on December 16 and all later events by fine-grained lithic-crystal tephra deposits, much of which fell as particle aggregates. The change in the character of the tephra-fall deposits reflects their fundamentally different modes of origin. The pumiceous deposits were produced by magmatically driven explosions. The fine-grained lithic-crystal deposits were generated by two processes. Hydrovolcanic vent explosions generated tephra-fall deposits of December 16 and 19. Such explosions continued as a tephra source, but apparently with diminishing importance, during events of January and February. Ash clouds of lithic pyroclastic flows generated by collapse of actively growing lava domes probably contributed to tephra-fall deposits of all events from January 2 to April 26, and were the sole source of tephra fall for at least the last 4 deposits. © 1994.
Lawler, D M; Petts, G E; Foster, I D L; Harper, S
2006-05-01
Turbidity is an important water quality variable, through its relation to light suppression, BOD impact, sediment-associated contaminant transport, and suspended sediment effects on organisms and habitats. Yet few published field investigations of wet-weather turbidity dynamics, through several individual and sequenced rainstorms in extremely urbanised headwater basins, have emerged. This paper aims to address this gap through a turbidity analysis of multiple storm events in spring 2001 in an urban headwater basin (57 km²) of the River Tame, central England, the most urbanised basin for its size in the UK (approximately 42%). Data were collected at 15-min frequency at automated monitoring stations for rainfall, streamflow and six water quality variables (turbidity, EC, temperature, DO, pH, ammonia). Disturbance experiments also allowed estimates of bed sediment storage to be obtained. Six important and unusual features of the storm event turbidity response were apparent: (1) sluggish early turbidity response, followed by a turbidity 'rush'; (2) quasi-coincident flow and turbidity peaks; (3) anti-clockwise hysteresis in the discharge-turbidity relationship on all but one event, resulting from Falling-LImb Turbidity Extensions (FLITEs); (4) increases in peak turbidity levels through storm sequences; (5) initial micro-pulses (IMP) in turbidity; and (6) secondary turbidity peaks (STP) or 'turbidity shoulders' (TS). These features provided very little evidence of a true 'first-flush' effect: instead, substantial suspended solids transport continued right through the flow recessions, and little storm-event sediment exhaustion was evident. A new, dimensionless hysteresis index, HI(mid), is developed to quantify the magnitude and direction of hysteresis in a simple, clear, direct and intuitive manner. This allowed the degree of departure from the classic 'first-flush', clockwise hysteresis models to be assessed. Of the 15 turbidity events considered, 10 coincided with ammonia spikes of up to 6.25 mg l(-1) at Water Orton (the downstream station): this suggests that spills from combined sewer overflows (CSOs) or waste water treatment works (WwTWs) are significant in the throughput of turbid waters here. Of the four rainfall variables considered, substantial ammonia peaks related most strongly to total storm rainfall receipt, and significant ammonia peaks were generated even by low-magnitude storms (rainfall totals <4 mm), indicating that spills are a frequent occurrence. Local bed sediment stores appear to be limited, suggesting that other, more distal sediment sources, such as road networks and old mineworkings, are possibly more important. Biofilms may also play a part in delaying sediment release until late in the hydrograph, and in suppressing late spring turbidity levels. Existing first-flush models appear to be an oversimplification here. Such urban headwater basin responses can provide useful insights into the generation of contaminant waves, and offer vital early-warning systems for pollution events propagating downstream.
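One reading of a mid-discharge hysteresis index is sketched below: turbidity is interpolated at the event's mid-range discharge on the rising and falling limbs, and the limb ratio is rescaled so that clockwise loops give positive values and anticlockwise loops (the FLITE behaviour reported here) give negative values. The exact rescaling used for HI(mid) should be checked against the source paper; treat the details as an assumption.

```python
# Sketch of a mid-discharge hysteresis index (assumed form; verify vs. source).
import numpy as np

def hi_mid(discharge, turbidity):
    ipeak = int(np.argmax(discharge))
    q_mid = discharge.min() + 0.5 * (discharge.max() - discharge.min())
    # Turbidity at mid-discharge on each limb (limbs assumed monotonic in Q)
    tu_rise = np.interp(q_mid, discharge[: ipeak + 1], turbidity[: ipeak + 1])
    tu_fall = np.interp(q_mid, discharge[ipeak:][::-1], turbidity[ipeak:][::-1])
    ratio = tu_rise / tu_fall
    # Symmetric rescaling: > 0 clockwise, < 0 anticlockwise, 0 no hysteresis
    return ratio - 1.0 if ratio >= 1.0 else 1.0 - 1.0 / ratio

# Illustrative anticlockwise event: the turbidity peak lags the flow peak
q  = np.array([2.0, 5.0, 9.0, 12.0, 9.0, 6.0, 3.0, 2.0])
tu = np.array([10., 20., 45., 80., 120., 90., 40., 15.])
print(hi_mid(q, tu))  # negative -> anticlockwise hysteresis (FLITE-like)
```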
ERIC Educational Resources Information Center
Anderson, Scott; Raasch, Kevin
2002-01-01
Provides an evaluation template for student activities professionals charged with evaluating competitive event scheduling software. Guides staff in making an informed decision on whether to retain event management technology provided through an existing vendor or choose "best-of-breed" scheduling software. (EV)
10 CFR 72.92 - Design basis external natural events.
Code of Federal Regulations, 2010 CFR
2010-01-01
§ 72.92 Design basis external natural events. (a) Natural phenomena that may exist or that... potential effects on the safe operation of the ISFSI or MRS. The important natural phenomena that affect the...
10 CFR 72.92 - Design basis external natural events.
Code of Federal Regulations, 2011 CFR
2011-01-01
§ 72.92 Design basis external natural events. (a) Natural phenomena that may exist or that... potential effects on the safe operation of the ISFSI or MRS. The important natural phenomena that affect the...
Risk Perceptions on Hurricanes: Evidence from the U.S. Stock Market.
Feria-Domínguez, José Manuel; Paneque, Pilar; Gil-Hurtado, María
2017-06-05
This article examines the market reaction of the main Property and Casualty (P & C) insurance companies listed on the New York Stock Exchange (NYSE) to the seven most recent hurricanes that hit the East Coast of the United States from 2005 to 2012. For this purpose, we run a standard short-horizon event study to test the existence of abnormal returns around the landfalls. P & C companies are among the sectors most affected by such events because of the huge losses incurred to rebuild, help, and compensate the inhabitants of the affected areas. From the financial investors' perspective, this kind of event implies severe losses, which could influence expected returns. Our research highlights the existence of significant cumulative abnormal returns around the landfall event window for most of the hurricanes analyzed, except for Hurricanes Katrina and Sandy.
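The short-horizon event-study machinery referenced here is standard and easy to sketch: fit a market model over a pre-event estimation window, compute abnormal returns in a window around the landfall date, and test their cumulative sum (CAR) against zero. The code below is a generic illustration on synthetic data, not the authors' exact specification; the window lengths and the single-event t-test are simplifying assumptions.

```python
# Sketch of a market-model event study: abnormal returns and a CAR t-test.
import numpy as np

def car(stock_ret, market_ret, event_idx, est_len=120, win=(-5, 5)):
    est = slice(event_idx + win[0] - est_len, event_idx + win[0])
    beta, alpha = np.polyfit(market_ret[est], stock_ret[est], 1)  # market model
    ev = slice(event_idx + win[0], event_idx + win[1] + 1)
    ar = stock_ret[ev] - (alpha + beta * market_ret[ev])    # abnormal returns
    sigma = np.std(stock_ret[est] - (alpha + beta * market_ret[est]), ddof=2)
    return ar.sum(), ar.sum() / (sigma * np.sqrt(ar.size))  # CAR, t-statistic

# Synthetic P & C insurer that dips around a "landfall" at t = 500
rng = np.random.default_rng(1)
mkt = rng.normal(0.0, 0.010, 1000)
stk = 0.9 * mkt + rng.normal(0.0, 0.008, 1000)
stk[498:503] -= 0.01                 # imposed negative market reaction
print(car(stk, mkt, event_idx=500))  # CAR and its t-statistic
```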
A model of human decision making in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1982-01-01
Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
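The paper's central modeling device, discriminant analysis as a generator of event-probability estimates, has a direct modern analogue: train a linear discriminant classifier on labeled observations, read off the posterior probability that an event has occurred in each monitored process, and allocate attention to the largest. The sketch below uses scikit-learn on synthetic data and is a schematic stand-in, not the original implementation.

```python
# Sketch: discriminant analysis as the source of per-process event probabilities.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic training data: 2-feature observations of a process, labeled
# no-event (0) / event (1); events shift the feature mean.
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),
               rng.normal(1.5, 1.0, (200, 2))])
y = np.repeat([0, 1], 200)
lda = LinearDiscriminantAnalysis().fit(X, y)

# Current observation from each of 4 monitored processes
current = rng.normal([[0.1, 0.0], [1.4, 1.6], [0.3, 0.2], [0.9, 1.1]], 0.5)
p_event = lda.predict_proba(current)[:, 1]  # estimated P(event) per process
attend = int(np.argmax(p_event))            # allocate attention here
print(p_event, "-> attend to process", attend)
```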