Science.gov

Sample records for event builder networks

  1. Network Performance Testing for the BaBar Event Builder

    SciTech Connect

    Pavel, Tomas J

    1998-11-17

    We present an overview of the design of event building in the BABAR Online, based upon TCP/IP and commodity networking technology. BABAR is a high-rate experiment to study CP violation in asymmetric e+e- collisions. In order to validate the event-builder design, an extensive program was undertaken to test the TCP performance delivered by various machine types with both ATM OC-3 and Fast Ethernet networks. The buffering characteristics of several candidate switches were examined and found to be generally adequate for our purposes. We highlight the results of this testing and present some of the more significant findings.
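
    The kind of point-to-point TCP throughput test described above can be sketched in a few lines. The loopback probe below is purely illustrative: the port, chunk size, and transfer volume are arbitrary assumptions, and it is not the BABAR test program, which exercised ATM OC-3 and Fast Ethernet hardware.

      # Minimal point-to-point TCP throughput probe (illustrative only; not the
      # BABAR test suite).  A receiver thread drains a fixed volume of data while
      # the sender pushes it; the elapsed time gives an application-level rate.
      import socket
      import threading
      import time

      PORT = 54321                 # arbitrary loopback port (assumption)
      CHUNK = 64 * 1024            # bytes per send() call (assumption)
      TOTAL = 256 * 1024 * 1024    # total bytes transferred per test (assumption)

      def receiver(ready):
          srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
          srv.bind(("127.0.0.1", PORT))
          srv.listen(1)
          ready.set()                          # tell the sender we are listening
          conn, _ = srv.accept()
          received = 0
          while received < TOTAL:
              data = conn.recv(CHUNK)
              if not data:
                  break
              received += len(data)
          conn.close()
          srv.close()

      ready = threading.Event()
      rx = threading.Thread(target=receiver, args=(ready,))
      rx.start()
      ready.wait()

      tx = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      tx.connect(("127.0.0.1", PORT))
      payload = b"\x00" * CHUNK
      start = time.time()
      sent = 0
      while sent < TOTAL:
          tx.sendall(payload)
          sent += len(payload)
      tx.close()
      rx.join()
      print(f"effective throughput: {sent / (time.time() - start) / 1e6:.1f} MB/s")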

  2. CMS DAQ event builder based on gigabit ethernet

    SciTech Connect

    Pieri, M.; Maron, G.; Brett, A.; Cano, E.; Cittolin, S.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Reino Garrido, R.; Gulmini, M.; Gutleber, J.; Jacobs, C.; Meijers, F.; Meschi, E.; Oh, A.; Orsini, L.; Pollet, L.; Racz, A.; Rosinsky, P.; Sakulin, H.; Schwick, C.; /UC, San Diego /INFN, Legnaro /CERN /UCLA /Santiago de Compostela U. /Lisbon, LIFEP /Fermilab /MIT /Boskovic Inst., Zagreb

    2006-06-01

    The CMS Data Acquisition system is designed to build and filter events originating from approximately 500 data sources from the detector at a maximum Level 1 trigger rate of 100 kHz and with an aggregate throughput of 100 GByte/s. For this purpose different architectures and switch technologies have been evaluated. Events will be built in two stages: the first stage, the FED Builder, will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The next stage, the Readout Builder, will perform the building of full events. The requirement of one Readout Builder is to build events at 12.5 kHz with average size of 16 kBytes from 64 sources. In this paper we present the prospects of a Readout Builder based on TCP/IP over Gigabit Ethernet. Various Readout Builder architectures that we are considering are discussed. The results of throughput measurements and scaling performance are outlined as well as the preliminary estimates of the final performance. All these studies have been carried out at our test-bed farms that are made up of a total of 130 dual Xeon PCs interconnected with Myrinet and Gigabit Ethernet networking and switching technologies.
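
    Reading the 16 kBytes above as the average fragment size contributed by each of the 64 sources (the companion CMS abstract below makes this explicit), the quoted figures are mutually consistent. A quick back-of-the-envelope check, using only numbers stated in the abstract:

      # Consistency check of the figures quoted in the abstract (no new data):
      # 64 sources x 16 kB fragments at 12.5 kHz per Readout Builder, with enough
      # builder slices to absorb the 100 kHz Level 1 trigger rate.
      l1_rate = 100e3          # Level 1 trigger rate, Hz
      rb_rate = 12.5e3         # events/s built by one Readout Builder
      n_sources = 64           # data sources feeding one Readout Builder
      fragment = 16e3          # average bytes contributed per source per event

      event_size = n_sources * fragment       # ~1 MB per full event
      per_builder = rb_rate * event_size      # bytes/s through one Readout Builder
      n_builders = l1_rate / rb_rate          # builder slices needed for 100 kHz
      aggregate = n_builders * per_builder

      print(f"event size            ~ {event_size / 1e6:.2f} MB")
      print(f"one Readout Builder   ~ {per_builder / 1e9:.1f} GB/s")
      print(f"builder slices needed : {n_builders:.0f}")
      print(f"aggregate throughput  ~ {aggregate / 1e9:.0f} GB/s")  # ~100 GB/s, as stated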

  3. A New Event Builder for CMS Run II

    NASA Astrophysics Data System (ADS)

    Albertsson, K.; Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  4. A new event builder for CMS Run II

    SciTech Connect

    Albertsson, K.; Andre, J-M; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G-L; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-01-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  5. A New Event Builder for CMS Run II

    SciTech Connect

    Albertsson, K.; et al.

    2015-12-23

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  6. CMS DAQ event builder based on Gigabit Ethernet

    SciTech Connect

    Bauer, G.; Boyer, V.; Branson, J.; Brett, A.; Cano, E.; Carboni, A.; Ciganek, M.; Cittolin, S.; Erhan, Samim; Gigi, D.; Glege, F.; /CERN /INFN, Legnaro /CERN /CERN /Kyungpook Natl. U. /MIT /UC, San Diego /CERN

    2007-04-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes the architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.
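
    As a schematic illustration of the second-stage assembly step described above, the sketch below groups incoming fragments by event number and declares an event complete once every source has reported. It is a generic toy, not the CMS Readout Builder software; the fragment payloads and arrival order are invented.

      # Generic event-assembly loop: collect one fragment per source for each
      # event number and emit complete events.  Purely illustrative; the real
      # Readout Builder streams fragments over TCP/IP sockets.
      from collections import defaultdict

      N_SOURCES = 72          # number of fragment sources (from the abstract)

      class EventAssembler:
          def __init__(self, n_sources):
              self.n_sources = n_sources
              self.pending = defaultdict(dict)   # event_id -> {source_id: payload}

          def add_fragment(self, event_id, source_id, payload):
              """Store one fragment; return the full event once all sources arrived."""
              frags = self.pending[event_id]
              frags[source_id] = payload
              if len(frags) == self.n_sources:
                  return self.pending.pop(event_id)   # complete event
              return None

      # toy usage: three events' worth of 16 kB dummy fragments, arriving interleaved
      assembler = EventAssembler(N_SOURCES)
      built = 0
      for source in range(N_SOURCES):
          for event in range(3):
              if assembler.add_fragment(event, source, b"\x00" * 16384) is not None:
                  built += 1
      print(f"complete events built: {built}")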

  7. A new event builder for CMS Run II

    DOE PAGES Beta

    Albertsson, K.; Andre, J-M; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G-L; Deldicque, C.; Dobson, M.; et al

    2015-01-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  8. Event builder and level 3 at the CDF experiment

    SciTech Connect

    G. Gomez-Ceballos, A. Belloni and A. Bolshov

    2003-10-30

    The Event Builder and Level3 systems constitute critical components of the DAQ in the CDF experiment at Fermilab. These systems are responsible for collecting data fragments from the front end electronics, assembling the data into complete event records, reconstructing the events, and forming the final trigger decision. With Tevatron Run IIa in progress, the systems have been running successfully at high throughput rates, the design utilizing a scalable architecture and distributed event processing to meet the requirements. A brief description of the current performance in Run IIa and of possible upgrades for Run IIb is presented.

  9. CDF DAQ upgrade and CMS DAQ R and D: event builder tests using an ATM switch

    SciTech Connect

    Bauer, G.; Daniels, T.; Kelley, K.

    1996-12-31

    The present data acquisition system of the CDF experiment has to be upgraded for the higher luminosities expected during the Run II (1999+) data-taking period. The core of the system, consisting of a control network based on reflective memories, will remain the same. The network used for data transfers, however, will have to be changed. We have investigated ATM as a possible replacement technology for the current Ultranet switch. We present preliminary results on this new ATM-based event builder system.

  10. Results from a data acquisition system prototype project using a switch-based event builder

    SciTech Connect

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D. ); Bowden, M.; Booth, A. ); Cancelo, G. )

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort. 5 refs., 6 figs.

  11. Portable software for distributed readout controllers and event builders in FASTBUS and VME

    SciTech Connect

    Pordes, R.; Berg, D.; Berman, E.; Bernett, M.; Brown, D.; Constanta-Fanourakis, P.; Dorries, T.; Haire, M.; Joshi, U.; Kaczar, K.; Mackinnon, B.; Moore, C.; Nicinski, T.; Oleynik, G.; Petravick, D.; Sergey, G.; Slimmer, D.; Streets, J.; Votava, M.; White, V.

    1989-12-01

    We report on software developed as part of the PAN-DA system to support the functions of front end readout controllers and event builders in multiprocessor, multilevel, distributed data acquisition systems. For the next generation data acquisition system we have undertaken to design and implement software tools that are easily transportable to new modules. The first implementation of this software is for Motorola 68K series processor boards in FASTBUS and VME and will be used in the Fermilab accelerator run at the beginning of 1990. We use a Real Time Kernel Operating System. The software provides general connectivity tools for control, diagnosis and monitoring. 17 refs., 7 figs.

  12. 76 FR 2145 - Masco Builder Cabinet Group Including On-Site Leased Workers From Reserves Network, Jackson, OH...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-12

    ... Federal Register on December 11, 2009 (74 FR 65798). The Department has received information that the... Reserves Network, Jackson, OH; Masco Builder Cabinet Group, Waverly, OH; Masco Builder Cabinet Group, Seal... workers of Masco Building Cabinet Group in Waverly, Ohio, Seal Township, Ohio, and Seaman, Ohio....

  13. RefNetBuilder: a platform for construction of integrated reference gene regulatory networks from expressed sequence tags

    PubMed Central

    2011-01-01

    Background Gene Regulatory Networks (GRNs) provide integrated views of gene interactions that control biological processes. Many public databases contain biological interactions extracted from experimentally validated literature reports, but most furnish only information for a few genetic model organisms. In order to provide a bioinformatic tool for researchers who work with non-model organisms, we developed RefNetBuilder, a new platform that allows construction of putative reference pathways or GRNs from expressed sequence tags (ESTs). Results RefNetBuilder was designed to have the flexibility to extract and archive pathway or GRN information from public databases such as the Kyoto Encyclopedia of Genes and Genomes (KEGG). It features sequence alignment tools such as BLAST to allow mapping ESTs to pathways and GRNs in model organisms. A scoring algorithm was incorporated to rank and select the best match for each query EST. We validated RefNetBuilder using DNA sequences of Caenorhabditis elegans, a model organism having manually curated KEGG pathways. Using the earthworm Eisenia fetida as an example, we demonstrated the functionalities and features of RefNetBuilder. Conclusions The RefNetBuilder provides a standalone application for building reference GRNs for non-model organisms on a number of operating system platforms with standard desktop computer hardware. As a new bioinformatic tool aimed for constructing putative GRNs for non-model organisms that have only ESTs available, RefNetBuilder is especially useful to explore pathway- or network-related information in these organisms. PMID:22166047
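
    The "best match per query EST" step described above can be sketched as follows. The snippet assumes BLAST's standard 12-column tabular output and ranks hits by bitscore alone, which is a simplification of RefNetBuilder's actual scoring algorithm; the EST and gene identifiers are invented.

      # Sketch of best-hit selection from BLAST tabular output: keep the
      # highest-scoring subject for each query EST.  Column layout assumes the
      # standard 12-column format (qseqid ... bitscore); scoring by bitscore
      # alone is a simplification of RefNetBuilder's ranking.
      def best_hits(blast_lines):
          best = {}
          for line in blast_lines:
              cols = line.rstrip("\n").split("\t")
              query, subject, bitscore = cols[0], cols[1], float(cols[11])
              if query not in best or bitscore > best[query][1]:
                  best[query] = (subject, bitscore)
          return best

      # toy input: two ESTs, the first with two competing hits
      rows = [
          "EST001\tcel-gene-A\t98.0\t200\t2\t0\t1\t200\t1\t200\t1e-50\t190.0",
          "EST001\tcel-gene-B\t80.0\t150\t20\t3\t1\t150\t1\t150\t1e-20\t90.0",
          "EST002\tcel-gene-C\t95.0\t180\t5\t1\t1\t180\t1\t180\t1e-40\t160.0",
      ]
      for est, (gene, score) in best_hits(rows).items():
          print(f"{est} -> {gene} (bitscore {score})")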

  14. The reef builder gastropod Dendropoma petreaum - A proxy of short and long term climatic events in the Eastern Mediterranean

    NASA Astrophysics Data System (ADS)

    Sisma-Ventura, Guy; Guzner, Barak; Yam, Ruth; Fine, Maoz; Shemesh, Aldo

    2009-08-01

    High-resolution δ18O and δ13C records were obtained from seven cores drilled from ledges of the reef builder gastropod Dendropoma petreaum and used to reconstruct variations in Levantine basin sea surface temperature, hydrology and productivity during the past 500 years. The δ18O of the aragonite shell of living D. petreaum indicates that skeletal deposition occurs under isotopic equilibrium and faithfully records the temperature and surface water δ18O during summer and autumn. The mean down-core δ18O record clearly captures global and local climatic events, such as the Little Ice Age (LIA) and the recent warming of surface waters in the Eastern Mediterranean. Comparison to the Western Mediterranean vermetid δ18O record reveals changes in the freshwater/evaporation budgets of the two basins during cold and warm periods. The Eastern basin had lower surface temperatures and excess evaporation during the LIA and experienced a relatively larger warming and/or a decrease in freshwater/evaporation during the past 70 years. The D. petraeum δ13C is strongly related to the δ13C of dissolved inorganic carbon and to the primary productivity of the surface water. The mean down-core δ13C record exhibits enrichment during the LIA maximum and a strong depletion trend during the last century. The LIA δ13C enrichment is attributed to an increase in primary production and high nutrient levels which resulted from increased vertical mixing and upwelling. The last century δ13C depletion is mostly related to the increased anthropogenic emissions of 13C-depleted carbon dioxide and to a certain decrease in primary production. The data indicate that D. petraeum isotopic signatures are unique proxies for high-resolution reconstruction of paleo-oceanographic environments over the last 500 years in the Mediterranean and potentially in the sub-tropical Atlantic region.

  15. Host Event Based Network Monitoring

    SciTech Connect

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and the effects on host performance. Current host based network monitoring tools work on polling which can miss activity if it occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. First is the new Windows Event Logging API, and, second, Windows 7 offers the ALE API within WFP. Any future work should focus on these methods.

  16. Climate Networks and Extreme Events

    NASA Astrophysics Data System (ADS)

    Kurths, J.

    2014-12-01

    We analyse some climate dynamics from a complex network approach. This leads to an inverse problem: Is there a backbone-like structure underlying the climate system? For this we propose a method to reconstruct and analyze a complex network from data generated by a spatio-temporal dynamical system. This approach enables us to uncover relations to global circulation patterns in oceans and atmosphere. The global scale view on climate networks offers promising new perspectives for detecting dynamical structures based on nonlinear physical processes in the climate system. Moreover, we evaluate different regional climate models from this aspect. This concept is also applied to Monsoon data in order to characterize the regional occurrence of extreme rain events and its impact on predictability. Changing climatic conditions have led to a significant increase in magnitude and frequency of spatially extensive extreme rainfall events in the eastern Central Andes of South America. These events impose substantial natural hazards for population, economy, and ecology by floods and landslides. For example, heavy floods in Bolivia in early 2007 affected more than 133,000 households and produced estimated costs of 443 million USD. Here, we develop a general framework to predict extreme events by combining a non-linear synchronization technique with complex networks. We apply our method to real-time satellite-derived rainfall data and are able to predict a large amount of extreme rainfall events. Our study reveals a linkage between polar and subtropical regimes as the responsible mechanism: Extreme rainfall in the eastern Central Andes is caused by the interplay of northward migrating frontal systems and a low-level wind channel from the western Amazon to the subtropics, providing additional moisture. Frontal systems from the Antarctic thus play a key role for sub-seasonal variability of the South American Monsoon System.

  17. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that implements a concept of enabling users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database but, rather, in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. The software resides on the server and does not require specific software on the user machine other than an Internet browser. e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. By means of this addition of "child processes" of nodes, a network that reflects the development of a project is generated.

  18. Controlling extreme events on complex networks

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-08-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network ``mobile'' can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed.

  19. Drastic events make evolving networks

    NASA Astrophysics Data System (ADS)

    Ausloos, M.; Lambiotte, R.

    2007-05-01

    Co-authorship networks of neighbouring scientific disciplines, i.e. granular (G) media and networks (N), are studied in order to observe drastic structural changes in evolving networks. The data are taken from arXiv. The system is described as coupled networks. By considering the 1995-2005 time interval and scanning the author-article network evolution with a mobile time window, we focus on the properties of the links, as well as on the time evolution of the nodes. They can be in three states, N, G or multi-disciplinary (M). This leads to drastic jumps in a so-called order parameter, i.e. the link proportion of a given type forming the main island, reminiscent of features appearing at percolation and during metastable (aggregation-desaggregation) processes. The data analysis also focuses on the way different kinds (N, G or M) of authors collaborate, and on the kind of the resulting collaboration.

  20. Controlling extreme events on complex networks

    PubMed Central

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-01-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network “mobile” can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed. PMID:25131344

  1. Controlling extreme events on complex networks.

    PubMed

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-01-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network "mobile" can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed. PMID:25131344

  2. Emergence of event cascades in inhomogeneous networks.

    PubMed

    Onaga, Tomokatsu; Shinomoto, Shigeru

    2016-01-01

    There is a commonality among contagious diseases, tweets, and neuronal firings: past events facilitate the future occurrence of events. The spread of events has been studied extensively, and such systems exhibit catastrophic chain reactions if the interaction, represented by the reproduction ratio, exceeds unity; however, their subthreshold states are not fully understood. Here, we report that these systems already exhibit nonstationary cascades of event occurrences in the subthreshold regime. Event cascades can be harmful in some contexts, as when peak demand causes vaccine shortages or heavy traffic on communication lines, but may be beneficial in others, as when spontaneous activity in neural networks is used to generate motion or store memory. Thus it is important to comprehend the mechanism by which such cascades appear, and to consider controlling a system to tame or facilitate fluctuations in the event occurrences. The critical interaction for the emergence of cascades depends greatly on the network structure in which individuals are connected. We demonstrate that we can predict whether cascades may emerge, given information about the interactions between individuals. Furthermore, we develop a method of reallocating connections among individuals so that event cascades may be either impeded or impelled in a network. PMID:27625183
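
    The statement above about the reproduction ratio can be illustrated with a toy Galton-Watson branching process in which each event independently spawns a Poisson-distributed number of offspring with mean R; cascades stay finite on average for R < 1 and blow up once R exceeds unity. This is a deliberately simplified stand-in for the network-structured model studied in the paper.

      # Toy branching-process view of the reproduction ratio R: each event
      # spawns Poisson(R) offspring, so cascades remain finite for R < 1 and
      # explode (on average) once R exceeds unity.  Illustrative only.
      import numpy as np

      def cascade_size(R, rng, cap=100000):
          """Total number of events triggered by one seed event (capped)."""
          total, frontier = 1, 1
          while frontier and total < cap:
              frontier = rng.poisson(R, size=frontier).sum()
              total += frontier
          return total

      rng = np.random.default_rng(0)
      for R in (0.5, 0.9, 1.1):
          sizes = [cascade_size(R, rng) for _ in range(200)]
          print(f"R = {R}: mean cascade size ~ {np.mean(sizes):.1f}")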

  3. Acoustic network event classification using swarm optimization

    NASA Astrophysics Data System (ADS)

    Burman, Jerry

    2013-05-01

    Classifying acoustic signals detected by distributed sensor networks is a difficult problem due to the wide variations that can occur in the transmission of terrestrial, subterranean, seismic and aerial events. An acoustic event classifier was developed that uses particle swarm optimization to perform a flexible time correlation of a sensed acoustic signature to reference data. In order to mitigate the effects from interference such as multipath, the classifier fuses signatures from multiple sensors to form a composite sensed acoustic signature and then automatically matches the composite signature with reference data. The approach can classify all types of acoustic events but is particularly well suited to explosive events such as gun shots, mortar blasts and improvised explosive devices that produce an acoustic signature having a shock wave component that is aperiodic and non-linear. The classifier was applied to field data and yielded excellent results in terms of reconstructing degraded acoustic signatures from multiple sensors and in classifying disparate acoustic events.

  4. Neural network classification of questionable EGRET events

    NASA Technical Reports Server (NTRS)

    Meetre, C. A.; Norris, J. P.

    1992-01-01

    High energy gamma rays (greater than 20 MeV) pair producing in the spark chamber of the Energetic Gamma Ray Telescope Experiment (EGRET) give rise to a characteristic but highly variable 3-D locus of spark sites, which must be processed to decide whether the event is to be included in the database. A significant fraction (about 15 percent, or 10^4 events/day) of the candidate events cannot be categorized (accept/reject) by an automated rule-based procedure; they are therefore tagged, and must be examined and classified manually by a team of expert analysts. We describe a feedforward, back-propagation neural network approach to the classification of the questionable events. The algorithm computes a set of coefficients using representative exemplars drawn from the preclassified set of questionable events. These coefficients map a given input event into a decision vector that, ideally, describes the correct disposition of the event. The net's accuracy is then tested using a different subset of preclassified events. Preliminary results demonstrate the net's ability to correctly classify a large proportion of the events for some categories of questionables. Current work includes the use of much larger training sets to improve the accuracy of the net.

  5. AGU Hosts Networking Event for Female Scientists

    NASA Astrophysics Data System (ADS)

    McEntee, Chris

    2013-01-01

    At Fall Meeting this year I had the pleasure of cohosting a new event, a Networking Reception for Early Career Female Scientists and Students, with Jane Lubchenco, under secretary of Commerce for Oceans and Atmosphere and National Oceanic and Atmospheric Administration administrator, and Marcia McNutt, director of the U.S. Geological Survey. AGU recognizes the importance of having a diverse pool of new researchers who can enrich Earth and space sciences with their skills and innovation. That's why one of our four strategic goals is to help build the global talent pool and provide early-career scientists with networking opportunities like this one.

  6. eProject Builder

    Energy Science and Technology Software Center (ESTSC)

    2014-06-01

    eProject Builder enables Energy Services Companies (ESCOs) and their contracting agencies to: (1) upload and track project-level information; (2) generate basic project reports required by local, state, and/or federal agencies; and (3) benchmark new Energy Savings Performance Contract (ESPC) projects against historical data.

  7. Algorithms for builder guidelines

    SciTech Connect

    Balcomb, J.D.; Lekov, A.B.

    1989-06-01

    The Builder Guidelines are designed to make simple, appropriate guidelines available to builders for their specific localities. Builders may select from passive solar and conservation strategies with different performance potentials. They can then compare the calculated results for their particular house design with a typical house in the same location. Algorithms used to develop the Builder Guidelines are described. The main algorithms used are the monthly solar load ratio (SLR) method for winter heating, the diurnal heat capacity (DHC) method for temperature swing, and a new simplified calculation method (McCool) for summer cooling. This paper applies the algorithms to estimate the performance potential of passive solar strategies, and the annual heating and cooling loads of various combinations of conservation and passive solar strategies. The basis of the McCool method is described. All three methods are implemented in a microcomputer program used to generate the guideline numbers. Guidelines for Denver, Colorado, are used to illustrate the results. The structure of the guidelines and worksheet booklets is also presented. 5 refs., 3 tabs.

  8. TrustBuilder2

    Energy Science and Technology Software Center (ESTSC)

    2007-07-20

    TrustBuilder2 is a flexible framework for supporting research in the area of trust negotiation protocols, designed to allow researchers to quickly prototype and experiment with various approaches to trust negotiation. In TrustBuilder2, the primary components of a trust negotiation system are represented using abstract interfaces. Any or all of these components can be implemented or extended by users of the TrustBuilder2 system, thereby making the system's functionality easily extensible. The TrustBuilder2 configuration files can be modified to load these custom components in place of the default system components; this facilitates the use of new features without modifications to the underlying runtime system. In our implementation, we provide support for one negotiation strategy, a policy compliance checker based on Jess (the Java Expert System Shell), query interfaces enabling access to disk-based credential and policy repositories, a credential chain construction algorithm, two credential chain verification routines, and both graphical and text-based logging facilities. TrustBuilder2 also supports the interposition of user-defined plug-ins at communication points between system components to allow for easy monitoring of system activity or the modification of messages passed between components.

  9. Digital Learning Network Event with Robotics Engineer Jonathan Rogers

    NASA Video Gallery

    Robotics engineer Jonathan Rogers and Public Affairs Officer Kylie Clem participate in a Digital Learning Network educational event, answering questions from students at Montgomery Middle School in...

  10. Composite beam builder

    NASA Technical Reports Server (NTRS)

    Poveromo, L. M.; Muench, W. K.; Marx, W.; Lubin, G.

    1981-01-01

    The building block approach to large space structures is discussed, and the progress made in constructing aluminum beams is noted. It is pointed out that composites will also be required in space structures because they provide minimal distortion characteristics during thermal transients. A composite beam builder currently under development is discussed, with attention given to cap forming and the fastening of cross-braces. The various composite materials being considered are listed, along with certain of their properties. The need to develop continuous forming stock up to 300 m long is stressed.

  11. Man-machine interface builders at the Advanced Photon Source

    SciTech Connect

    Anderson, M.D.

    1991-12-31

    Argonne National Laboratory is constructing a 7-GeV Advanced Photon Source for use as a synchrotron radiation source in basic and applied research. The controls and computing environment for this accelerator complex includes graphical operator interfaces to the machine based on Motif, X11, and PHIGS/PEX. Construction and operation of the control system for this accelerator relies upon interactive interface builder and diagram/editor type tools, as well as a run-time environment for the constructed displays which communicate with the physical machine via network connections. This paper discusses our experience with several commercial CUI builders, the inadequacies found in these, motivation for the development of an application-specific builder, and design and implementation strategies employed in the development of our own Man-Machine Interface builder. 5 refs.

  12. Man-machine interface builders at the Advanced Photon Source

    SciTech Connect

    Anderson, M.D.

    1991-01-01

    Argonne National Laboratory is constructing a 7-GeV Advanced Photon Source for use as a synchrotron radiation source in basic and applied research. The controls and computing environment for this accelerator complex includes graphical operator interfaces to the machine based on Motif, X11, and PHIGS/PEX. Construction and operation of the control system for this accelerator relies upon interactive interface builder and diagram/editor type tools, as well as a run-time environment for the constructed displays which communicate with the physical machine via network connections. This paper discusses our experience with several commercial CUI builders, the inadequacies found in these, motivation for the development of an application-specific builder, and design and implementation strategies employed in the development of our own Man-Machine Interface builder. 5 refs.

  13. Builders Challenge Quality Criteria Support Document

    SciTech Connect

    2009-06-01

    This document provides guidance to U.S. home builders participating in the Builders Challenge. To qualify for the Builders Challenge, a home must score 70 or less on the EnergySmart Home Scale (E-Scale). Homes also must meet the Builders Challenge Quality Criteria.

  14. Synthetic Event Reconstruction Experiments for Defining Sensor Network Characteristics

    SciTech Connect

    Lundquist, J K; Kosovic, B; Belles, R

    2005-12-15

    An event reconstruction technology system has been designed and implemented at Lawrence Livermore National Laboratory (LLNL). This system integrates sensor observations, which may be sparse and/or conflicting, with transport and dispersion models via Bayesian stochastic sampling methodologies to characterize the sources of atmospheric releases of hazardous materials. We demonstrate the application of this event reconstruction technology system to designing sensor networks for detecting and responding to atmospheric releases of hazardous materials. The quantitative measure of the reduction in uncertainty, or benefit of a given network, can be utilized by policy makers to determine the cost/benefit of certain networks. Herein we present two numerical experiments demonstrating the utility of the event reconstruction methodology for sensor network design. In the first set of experiments, only the time resolution of the sensors varies between three candidate networks. The most "expensive" sensor network offers few advantages over the moderately-priced network for reconstructing the release examined here. The second set of experiments explores the significance of the sensors' detection limit, which can have a significant impact on sensor cost. In this experiment, the expensive network can most clearly define the source location and source release rate. The other networks provide data insufficient for distinguishing between two possible clusters of source locations. When the reconstructions from all networks are aggregated into a composite plume, a decision-maker can distinguish the utility of the expensive sensor network.

  15. Signaling communication events in a computer network

    DOEpatents

    Bender, Carl A.; DiNicola, Paul D.; Gildea, Kevin J.; Govindaraju, Rama K.; Kim, Chulho; Mirza, Jamshed H.; Shah, Gautam H.; Nieplocha, Jaroslaw

    2000-01-01

    A method, apparatus and program product for detecting a communication event in a distributed parallel data processing system in which a message is sent from an origin to a target. A low-level application programming interface (LAPI) is provided which has an operation for associating a counter with a communication event to be detected. The LAPI increments the counter upon the occurrence of the communication event. The number in the counter is monitored, and when the number increases, the event is detected. A completion counter in the origin is associated with the completion of a message being sent from the origin to the target. When the message is completed, LAPI increments the completion counter such that monitoring the completion counter detects the completion of the message. The completion counter may be used to insure that a first message has been sent from the origin to the target and completed before a second message is sent.
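
    The counter mechanism described above lends itself to a compact illustration: a counter is associated with a communication event, the communication layer increments it when the event occurs, and a waiter detects the event by watching the value increase. A minimal thread-based sketch, not the LAPI implementation itself:

      # Illustrative counter-based event signaling: the sender increments a
      # completion counter when its message is done; a monitor detects the
      # completion by observing the counter increase.  Not the LAPI code.
      import threading
      import time

      class EventCounter:
          def __init__(self):
              self._value = 0
              self._cond = threading.Condition()

          def increment(self):
              with self._cond:
                  self._value += 1
                  self._cond.notify_all()

          def wait_for(self, target):
              """Block until the counter reaches at least `target`."""
              with self._cond:
                  while self._value < target:
                      self._cond.wait()

      completion = EventCounter()

      def send_message():
          time.sleep(0.1)            # stand-in for the actual data transfer
          completion.increment()     # message completed -> bump the counter

      threading.Thread(target=send_message).start()
      completion.wait_for(1)         # first message must complete before we go on
      print("first message completed; safe to send the second")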

  16. A convolutional neural network neutrino event classifier

    DOE PAGES Beta

    Aurisano, A.; Radovic, A.; Rocco, D.; Himmel, A.; Messier, M. D.; Niner, E.; Pawloski, G.; Psihas, F.; Sousa, A.; Vahle, P.

    2016-09-01

    Convolutional neural networks (CNNs) have been widely applied in the computer vision community to solve complex problems in image recognition and analysis. We describe an application of the CNN technology to the problem of identifying particle interactions in sampling calorimeters used commonly in high energy physics and high energy neutrino physics in particular. Following a discussion of the core concepts of CNNs and recent innovations in CNN architectures related to the field of deep learning, we outline a specific application to the NOvA neutrino detector. This algorithm, CVN (Convolutional Visual Network), identifies neutrino interactions based on their topology without the need for detailed reconstruction and outperforms algorithms currently in use by the NOvA collaboration.

  17. Seismic event classification using Self-Organizing Neural Networks

    SciTech Connect

    Maurer, W.J.; Dowla, F.U.; Jarpe, S.P.

    1991-10-15

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. We have studied Self Organizing Neural Networks (SONNs) for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs were developed and tested with a moderately large set of real seismic events. Given the detection of a seismic event and the corresponding signal, we compute the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This preprocessed input is fed into the SONNs. The overall results based on 111 events (43 training and 68 test events) show that SONNs are able to group events that "look" similar. We also find that the ART algorithm has an advantage in that the types of cluster groups do not need to be predefined. When a new type of event is detected, the ART network is able to handle the event rather gracefully. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. A strategy to integrate a SONN into the interpretation of seismic events is also proposed.
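
    The preprocessing chain described above (time-frequency distribution, its binary representation, then the magnitude of the 2-D FFT as a shift-invariant input) can be sketched as follows. The spectrogram parameters, threshold, and synthetic trace are arbitrary choices, not those of the original study.

      # Sketch of the shift-invariant preprocessing: spectrogram -> binary map
      # -> magnitude of the 2-D FFT.  Parameters are illustrative only.
      import numpy as np
      from scipy.signal import spectrogram

      rng = np.random.default_rng(0)
      fs = 100.0                                # assumed sampling rate, Hz
      t = np.arange(0, 60, 1 / fs)
      trace = np.sin(2 * np.pi * 5 * t) * np.exp(-0.1 * t) + 0.2 * rng.standard_normal(t.size)

      # 1. time-frequency distribution
      f, tt, Sxx = spectrogram(trace, fs=fs, nperseg=128)

      # 2. binary representation (1 where power exceeds a per-image threshold)
      binary = (Sxx > Sxx.mean() + Sxx.std()).astype(float)

      # 3. shift-invariant representation: magnitude of the 2-D Fourier transform
      feature = np.abs(np.fft.fft2(binary))

      print("feature map shape:", feature.shape)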

  18. System diagnostic builder

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Burke, Roger

    1992-01-01

    The System Diagnostic Builder (SDB) is an automated software verification and validation tool using state-of-the-art Artificial Intelligence (AI) technologies. The SDB is used extensively by project BURKE at NASA-JSC as one component of a software re-engineering toolkit. The SDB is applicable to any government or commercial organization which performs verification and validation tasks. The SDB has an X-window interface, which allows the user to 'train' a set of rules for use in a rule-based evaluator. The interface has a window that allows the user to plot up to five data parameters (attributes) at a time. Using these plots and a mouse, the user can identify and classify a particular behavior of the subject software. Once the user has identified the general behavior patterns of the software, he can train a set of rules to represent his knowledge of that behavior. The training process builds rules and fuzzy sets to use in the evaluator. The fuzzy sets classify those data points not clearly identified as a particular classification. Once an initial set of rules is trained, each additional data set given to the SDB will be used by a machine learning mechanism to refine the rules and fuzzy sets. This is a passive process and, therefore, it does not require any additional operator time. The evaluation component of the SDB can be used to validate a single software system, such as a simulator, using a number of different data sets. Moreover, it can be used to validate software systems which have been re-engineered from one language and design methodology to a totally new implementation.

  19. Extreme events in multilayer, interdependent complex networks and control

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Zhong; Huang, Zi-Gang; Zhang, Hai-Feng; Eisenberg, Daniel; Seager, Thomas P.; Lai, Ying-Cheng

    2015-11-01

    We investigate the emergence of extreme events in interdependent networks. We introduce an inter-layer traffic resource competing mechanism to account for the limited capacity associated with distinct network layers. A striking finding is that, when the number of network layers and/or the overlap among the layers are increased, extreme events can emerge in a cascading manner on a global scale. Asymptotically, there are two stable absorption states: a state free of extreme events and a state full of extreme events, and the transition between them is abrupt. Our results indicate that internal interactions in the multiplex system can yield qualitatively distinct phenomena associated with extreme events that do not occur for independent network layers. An implication is that, e.g., public resource competitions among different service providers can lead to a higher resource requirement than naively expected. We derive an analytical theory to understand the emergence of global-scale extreme events based on the concept of effective betweenness. We also articulate a cost-effective control scheme through increasing the capacity of very few hubs to suppress the cascading process of extreme events so as to protect the entire multi-layer infrastructure against global-scale breakdown.

  20. Network Event Recording Device: An automated system for Network anomaly detection, and notification. Draft

    SciTech Connect

    Simmons, D.G.; Wilkins, R.

    1994-09-01

    The goal of the Network Event Recording Device (NERD) is to provide a flexible autonomous system for network logging and notification when significant network anomalies occur. The NERD is also charged with increasing the efficiency and effectiveness of currently implemented network security procedures. While it has always been possible for network and security managers to review log files for evidence of network irregularities, the NERD provides real-time display of network activity, as well as constant monitoring and notification services for managers. Similarly, real-time display and notification of possible security breaches will provide improved effectiveness in combating resource infiltration from both inside and outside the immediate network environment.

  1. Capturing significant events with neural networks.

    PubMed

    Szu, Harold; Hsu, Charles; Jenkins, Jeffrey; Willey, Jefferson; Landa, Joseph

    2012-05-01

    Smartphone video capture and transmission to the Web contributes to data pollution. In contrast, mammalian eyes sense all but capture only significant events, allowing us to vividly recall the causalities. Likewise in our videos, we wish to skip redundancies and keep only significant differences, as determined by real-time local median filters. We construct a Picture Index (PI) of ones (center-of-gravity changes) among zeros (no changes) as Motion Organized Sparseness (MOS). Only non-overlapping, time-ordered PI pairs are admitted into the outer-product Associative Memory (AM). Another outer product, between a PI and its image, builds a Hetero-AM (HAM) for fault-tolerant retrievals. PMID:22402410
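
    A generic outer-product hetero-associative memory of the kind referred to above can be sketched in a few lines. Random bipolar vectors stand in for the sparse Picture Index and image vectors, and the dimension, number of stored pairs, and noise level are arbitrary assumptions.

      # Generic outer-product hetero-associative memory: store time-ordered
      # pattern pairs as a sum of outer products and recall the partner of a
      # noisy cue by thresholding the matrix-vector product.  Illustrative only.
      import numpy as np

      rng = np.random.default_rng(1)
      dim, n_pairs = 256, 5
      # bipolar (+1/-1) stand-ins for the sparse Picture Index vectors
      keys = rng.choice([-1.0, 1.0], size=(n_pairs, dim))
      values = rng.choice([-1.0, 1.0], size=(n_pairs, dim))

      # storage: M = sum_i outer(value_i, key_i)
      M = sum(np.outer(v, k) for k, v in zip(keys, values))

      # recall from a noisy cue (10% of components flipped)
      cue = keys[2].copy()
      flip = rng.choice(dim, size=dim // 10, replace=False)
      cue[flip] *= -1
      recalled = np.sign(M @ cue)

      print("recall matches stored pattern:", np.array_equal(recalled, values[2]))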

  2. Subsurface event detection and classification using Wireless Signal Networks.

    PubMed

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-01-01

    Subsurface environment sensing and monitoring applications such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). The wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by these properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator of geo-events, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. With the event detection, the window-based classifier classifies geo-events on the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data has been measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191

  3. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

    Subsurface environment sensing and monitoring applications such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). The wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by these properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator of geo-events, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. With the event detection, the window-based classifier classifies geo-events on the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data has been measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events. PMID:23202191
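
    A minimal sketch of the two-step, window-based minimum-distance scheme described above follows. The RSSI baseline, window length, deviation threshold, and per-class feature means are invented for illustration; they are not the calibrated values from the study.

      # Two-step sketch: (1) flag windows whose mean signal strength deviates
      # from a calibrated baseline, (2) assign flagged windows to the geo-event
      # class with the nearest mean feature vector.  Values are invented.
      import numpy as np

      def detect_windows(rssi, baseline, threshold=3.0, win=10):
          """Return start indices of windows whose mean deviates from baseline."""
          hits = []
          for start in range(0, len(rssi) - win + 1, win):
              window = rssi[start:start + win]
              if abs(window.mean() - baseline) > threshold:
                  hits.append(start)
          return hits

      def classify_window(window, class_means):
          """Minimum-distance rule: nearest class mean in feature space."""
          feats = np.array([window.mean(), window.std()])
          dists = {label: np.linalg.norm(feats - mu) for label, mu in class_means.items()}
          return min(dists, key=dists.get)

      # toy data: baseline RSSI around -60 dBm, one wet-soil episode around -75 dBm
      rng = np.random.default_rng(2)
      rssi = np.full(200, -60.0) + rng.normal(0, 1, 200)
      rssi[120:160] = -75.0 + rng.normal(0, 2, 40)

      class_means = {                     # illustrative per-class (mean, std) features
          "water intrusion": np.array([-75.0, 2.0]),
          "relative motion": np.array([-66.0, 5.0]),
      }
      for start in detect_windows(rssi, baseline=-60.0):
          label = classify_window(rssi[start:start + 10], class_means)
          print(f"window at {start}: {label}")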

  4. Drag and drop display & builder

    SciTech Connect

    Bolshakov, Timofei B.; Petrov, Andrey D.; /Fermilab

    2007-12-01

    The Drag and Drop (DnD) Display & Builder is a component-oriented system that allows users to create visual representations of data received from data acquisition systems. It is an upgrade of a Synoptic Display mechanism used at Fermilab since 2002. Components can be graphically arranged and logically interconnected in the web-startable Project Builder. Projects can be either lightweight AJAX- and SVG-based web pages, or they can be started as Java applications. The new version was initiated as a response to discussions between the LHC Controls Group and Fermilab.

  5. Builders Challenge High Performance Builder Spotlight - Artistic Homes, Albuquerque, NM

    SciTech Connect

    2009-01-01

    Building America Builders Challenge fact sheet on Artistic Homes of Albuquerque, New Mexico. Describes the first true zero E-scale home in a hot-dry climate with ducts inside, R-50 attic insulation, roof-mounted photovoltaic power system, and solar thermal water heating.

  6. Comparison of Event Detection Methods for Centralized Sensor Networks

    NASA Technical Reports Server (NTRS)

    Sauvageon, Julien; Agogiono, Alice M.; Farhang, Ali; Tumer, Irem Y.

    2006-01-01

    The development of an Integrated Vehicle Health Management (IVHM) for space vehicles has become a great concern. Smart Sensor Networks is one of the promising technologies that are catching a lot of attention. In this paper, we propose to a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared regarding their ability to locate and evaluate the event under noise and sensor failures. The purpose of this study is to check if the ratio performance/computational power of the Mote Fuzzy Validation and Fusion algorithm is relevant compare to simpler methods.

  7. Communication: Analysing kinetic transition networks for rare events.

    PubMed

    Stevenson, Jacob D; Wales, David J

    2014-07-28

    The graph transformation approach is a recently proposed method for computing mean first passage times, rates, and committor probabilities for kinetic transition networks. Here we compare the performance to existing linear algebra methods, focusing on large, sparse networks. We show that graph transformation provides a much more robust framework, succeeding when numerical precision issues cause the other methods to fail completely. These are precisely the situations that correspond to rare event dynamics for which the graph transformation was introduced. PMID:25084870
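
    For contrast with the graph transformation approach, the conventional linear-algebra route mentioned above can be written down directly: for a discrete-time chain with an absorbing target state, the mean first passage times from the transient states solve a small linear system. The 4-state chain below is an invented toy, not a real kinetic transition network.

      # Linear-algebra baseline for mean first passage times: with Q the
      # transient-to-transient block of the transition matrix, the MFPT vector
      # t to the absorbing state solves (I - Q) t = 1.  Toy example only.
      import numpy as np

      # full transition matrix; state 3 is the absorbing target
      P = np.array([
          [0.5, 0.3, 0.2, 0.0],
          [0.2, 0.5, 0.2, 0.1],
          [0.1, 0.2, 0.5, 0.2],
          [0.0, 0.0, 0.0, 1.0],
      ])

      Q = P[:3, :3]                          # transient-to-transient block
      t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
      for state, mfpt in enumerate(t):
          print(f"mean first passage time from state {state} to state 3: {mfpt:.2f} steps")

    It is exactly this kind of solve that loses numerical precision in the rare-event (metastable) regime, which is the situation the graph transformation method is designed to handle.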

  8. Automatic classification of seismic events within a regional seismograph network

    NASA Astrophysics Data System (ADS)

    Tiira, Timo; Kortström, Jari; Uski, Marja

    2015-04-01

    A fully automatic method for seismic event classification within a sparse regional seismograph network is presented. The tool is based on a supervised pattern recognition technique, the Support Vector Machine (SVM), trained here to distinguish weak local earthquakes from a bulk of human-made or spurious seismic events. The classification rules rely on differences in signal energy distribution between natural and artificial seismic sources. Seismic records are divided into four windows: P, P coda, S, and S coda. For each signal window, the short-term average (STA) is computed in 20 narrow frequency bands between 1 and 41 Hz. The 80 discrimination parameters are used as training data for the SVM. The SVM models are calculated for 19 on-line seismic stations in Finland. The event data are compiled mainly from fully automatic event solutions that are manually classified after the automatic location process. The station-specific SVM training events include 11-302 positive (earthquake) and 227-1048 negative (non-earthquake) examples. The best voting rules for combining results from different stations are determined during an independent testing period. Finally, the network processing rules are applied to an independent evaluation period comprising 4681 fully automatic event determinations, of which 98 % have been manually identified as explosions or noise and 2 % as earthquakes. The SVM method correctly identifies 94 % of the non-earthquakes and all the earthquakes. The results imply that the SVM tool can identify and filter out blasts and spurious events from fully automatic event solutions with a high level of confidence. The tool helps to reduce the work-load in manual seismic analysis by leaving only ~5 % of the automatic event determinations, i.e. the probable earthquakes, for more detailed seismological analysis. The approach presented is easy to adjust to the requirements of a denser or wider high-frequency network, once enough training examples for building a station-specific data set are available.
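
    The 80-parameter feature vector described above (four signal windows times 20 narrow bands between 1 and 41 Hz) can be sketched roughly as below. The FFT-based band averaging, fixed quarter-trace windows, synthetic waveforms, and SVM settings are simplifying assumptions, not the published recipe.

      # Rough sketch of the 4-window x 20-band feature vector feeding an SVM.
      # Band averaging via the FFT stands in for the actual STA filters, and
      # the trace is split into four equal windows for illustration.
      import numpy as np
      from sklearn.svm import SVC

      FS = 100.0                                    # assumed sampling rate, Hz
      BANDS = np.linspace(1.0, 41.0, 21)            # 20 bands between 1 and 41 Hz

      def band_averages(window):
          spec = np.abs(np.fft.rfft(window))
          freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
          return [spec[(freqs >= lo) & (freqs < hi)].mean()
                  for lo, hi in zip(BANDS[:-1], BANDS[1:])]

      def features(trace):
          quarters = np.array_split(trace, 4)       # stand-ins for P, P coda, S, S coda
          return np.concatenate([band_averages(w) for w in quarters])   # 80 values

      # toy training set: "earthquakes" richer in low frequencies, "blasts" in high
      rng = np.random.default_rng(3)
      def synth(low_weight):
          t = np.arange(0, 20, 1 / FS)
          return (low_weight * np.sin(2 * np.pi * 3 * t)
                  + (1 - low_weight) * np.sin(2 * np.pi * 25 * t)
                  + 0.1 * rng.standard_normal(t.size))

      X = np.array([features(synth(0.9)) for _ in range(20)]
                   + [features(synth(0.1)) for _ in range(20)])
      y = np.array([1] * 20 + [0] * 20)             # 1 = earthquake, 0 = non-earthquake

      clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
      print("classified as earthquake:", bool(clf.predict([features(synth(0.85))])[0]))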

  9. Automatic event detection based on artificial neural networks

    NASA Astrophysics Data System (ADS)

    Doubravová, Jana; Wiszniowski, Jan; Horálek, Josef

    2015-04-01

    The proposed algorithm was developed to be used for Webnet, a local seismic network in West Bohemia. The Webnet network was built to monitor the West Bohemia/Vogtland swarm area. During the earthquake swarms there is a large number of events which must be evaluated automatically to get a quick estimate of the current earthquake activity. Our focus is to get good automatic results prior to precise manual processing. With automatic data processing we may also reach a lower completeness magnitude. The first step of automatic seismic data processing is the detection of events. To get good detection performance we require a low number of false detections as well as a high number of correctly detected events. We used a single layer recurrent neural network (SLRNN) trained by manual detections from swarms in West Bohemia in the past years. As inputs of the SLRNN we use STA/LTA of a half-octave filter bank fed by the vertical and horizontal components of seismograms. All stations were trained together to obtain the same network with the same neuron weights. We tried several architectures - different numbers of neurons - and different starting points for training. Networks giving the best results for the training set need not be the optimal ones for unknown waveforms. Therefore we test each network on a test set from a different swarm (but still with similar characteristics, i.e. location, focal mechanisms, magnitude range). We also apply a coincidence verification for each event: we can lower the number of false detections by rejecting events detected at one station only, and declare a network event when two or more stations trigger in coincidence. In further work we would like to retrain the network for each station individually so each station will have its own set of coefficients (neural weights). We would also like to apply this method to data from the Reykjanet network located on the Reykjanes peninsula, Iceland. As soon as we have a reliable detection, we can proceed to
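
    A rough stand-in for the detection and coincidence logic described above: a classical STA/LTA trigger per station takes the place of the trained SLRNN, and a network event is declared only when two or more stations trigger within a small time tolerance. All window lengths, thresholds, and synthetic traces are assumptions.

      # Illustrative detection + coincidence sketch: a classic STA/LTA trigger
      # per station (a stand-in for the SLRNN) and a two-station coincidence
      # rule.  Parameters and traces are invented.
      import numpy as np

      def sta_lta(trace, fs, sta_win=1.0, lta_win=10.0):
          """Trailing short-term / long-term average ratio of |amplitude|."""
          env = np.abs(trace)
          ns, nl = int(sta_win * fs), int(lta_win * fs)
          sta = np.convolve(env, np.ones(ns) / ns)[:env.size]
          lta = np.convolve(env, np.ones(nl) / nl)[:env.size]
          ratio = sta / np.maximum(lta, 1e-12)
          ratio[:nl] = 0.0                      # ignore the LTA warm-up period
          return ratio

      def station_triggers(trace, fs, threshold=3.0):
          return np.flatnonzero(sta_lta(trace, fs) > threshold) / fs   # seconds

      def coincidence(trigger_lists, min_stations=2, tol=2.0):
          """Declare an event if >= min_stations trigger within +/- tol seconds."""
          for t in sorted(t for lst in trigger_lists for t in lst):
              hits = sum(any(abs(t - x) <= tol for x in lst) for lst in trigger_lists)
              if hits >= min_stations:
                  return t
          return None

      fs = 50.0
      rng = np.random.default_rng(4)
      t = np.arange(0, 120, 1 / fs)
      traces = []
      for delay in (60.0, 60.5, 61.2):                     # same event, small move-out
          x = 0.05 * rng.standard_normal(t.size)
          start, n = int(round(delay * fs)), int(5 * fs)   # 5 s burst at each station
          x[start:start + n] += np.sin(2 * np.pi * 8 * t[:n])
          traces.append(x)

      event_time = coincidence([station_triggers(x, fs) for x in traces])
      print("network event declared near t =", event_time, "s")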

  10. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394
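
    A minimal sketch of the coverage-and-source step described above, with hypothetical node ids, readings and normal range: deviating neighbour nodes form a weighted graph whose edge weights reflect the size of the deviation, and the event source is taken as the graph barycenter.

      import networkx as nx

      NORMAL_RANGE = (10.0, 20.0)                      # assumed normal sensing range
      readings = {"s1": 27.0, "s2": 25.0, "s3": 31.0, "s4": 15.0}
      links = [("s1", "s2"), ("s2", "s3"), ("s1", "s3"), ("s3", "s4")]

      def deviation(v, lo=NORMAL_RANGE[0], hi=NORMAL_RANGE[1]):
          """How far a reading lies outside the normal sensing range."""
          return max(0.0, lo - v, v - hi)

      # Keep only edges between nodes that deviate collectively; weight by joint deviation
      G = nx.Graph()
      for u, v in links:
          du, dv = deviation(readings[u]), deviation(readings[v])
          if du > 0 and dv > 0:
              G.add_edge(u, v, weight=du + dv)

      sources = nx.barycenter(G, weight="weight")      # candidate event source(s)
      print("event coverage:", list(G.nodes), "source:", sources)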

  11. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    PubMed Central

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394

  12. Seismic event interpretation using fuzzy logic and neural networks

    SciTech Connect

    Maurer, W.J.; Dowla, F.U.

    1994-01-01

    In the computer interpretation of seismic data, unknown sources of seismic events must be represented and reasoned about using measurements from the recorded signal. In this report, we develop the use of fuzzy logic to improve our ability to interpret weak seismic events. Processing strategies for the use of fuzzy set theory to represent vagueness and uncertainty, phenomena common in seismic data analysis, are developed. A fuzzy-assumption based truth-maintenance inferencing engine is also developed. Preliminary results in interpreting seismic events using the fuzzy neural network knowledge-based system are presented.

  13. Characterizing interactions in online social networks during exceptional events

    NASA Astrophysics Data System (ADS)

    Omodei, Elisa; De Domenico, Manlio; Arenas, Alex

    2015-08-01

    Nowadays, millions of people interact on a daily basis on online social media like Facebook and Twitter, where they share and discuss information about a wide variety of topics. In this paper, we focus on a specific online social network, Twitter, and we analyze multiple datasets, each one consisting of individuals' online activity before, during and after an exceptional event in terms of volume of the communications registered. We consider important events that occurred in different arenas that range from policy to culture or science. For each dataset, the users' online activities are modeled by a multilayer network in which each layer conveys a different kind of interaction, specifically: retweeting, mentioning and replying. This representation allows us to unveil that these distinct types of interaction produce networks with different statistical properties, in particular concerning the degree distribution and the clustering structure. These results suggest that models of online activity cannot discard the information carried by this multilayer representation of the system, and should account for the different processes generated by the different kinds of interactions. Secondly, our analysis unveils the presence of statistical regularities among the different events, suggesting that the non-trivial topological patterns that we observe may represent universal features of the social dynamics on online social networks during exceptional events.
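
    The multilayer representation described above can be approximated, for illustration only, as one directed graph per interaction type whose degree and clustering statistics are compared; the toy edge lists below are placeholders, not Twitter data.

      import networkx as nx

      layers = {
          "retweet": [("a", "b"), ("c", "b"), ("d", "b")],
          "mention": [("a", "c"), ("b", "c"), ("a", "d")],
          "reply":   [("b", "a"), ("c", "a")],
      }

      for name, edges in layers.items():
          G = nx.DiGraph(edges)                       # one layer per interaction type
          degrees = [d for _, d in G.degree()]
          clustering = nx.average_clustering(G.to_undirected())
          print(f"{name}: nodes={G.number_of_nodes()} "
                f"mean degree={sum(degrees) / len(degrees):.2f} clustering={clustering:.2f}")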

  14. Forecasting solar proton event with artificial neural network

    NASA Astrophysics Data System (ADS)

    Gong, J.; Wang, J.; Xue, B.; Liu, S.; Zou, Z.

    Solar proton events (SPEs), relatively rare but more frequent around solar maximum, can create hazardous conditions for spacecraft. As a special class of event, an SPE is always accompanied by a flare, which is then called a proton flare. To produce such an eruptive event, a large amount of energy must be accumulated within the active region. We can therefore investigate the character of the active region and its evolving trend, together with other indicators such as cm radio emission and the soft X-ray background, to evaluate the potential for an SPE in the chosen area. In order to capture the precursors of SPEs hidden behind the observed parameters of active regions, we employed AI technology. A fully connected neural network was chosen for this task. After constructing the network, we trained it with 13 parameters that characterize active regions and their evolution trend. More than 80 sets of event parameters were used to teach the neural network to identify whether an active region had the potential to produce an SPE. We then tested this model with a database consisting of SPE and non-SPE cases that were not used to train the neural network. The results showed that 75% of the model's choices were correct.
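
    A hedged sketch of the forecasting scheme described above: a small fully connected network trained on 13 active-region parameters to flag SPE potential. The data below are random placeholders, and scikit-learn's MLPClassifier stands in for the authors' network.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(1)
      X = rng.random((80, 13))            # 13 parameters per active region (placeholder)
      y = rng.integers(0, 2, 80)          # 1 = produced an SPE, 0 = did not

      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=1)
      clf.fit(X, y)

      X_test, y_test = rng.random((20, 13)), rng.integers(0, 2, 20)
      print("fraction of correct choices:", (clf.predict(X_test) == y_test).mean())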

  15. Mining the key predictors for event outbreaks in social networks

    NASA Astrophysics Data System (ADS)

    Yi, Chengqi; Bao, Yuanyuan; Xue, Yibo

    2016-04-01

    It will be beneficial to devise a method to predict a so-called event outbreak. Existing works mainly focus on exploring effective methods for improving the accuracy of predictions, while ignoring the underlying causes: What makes an event go viral? What factors significantly influence the prediction of an event outbreak in social networks? In this paper, we proposed a novel definition for an event outbreak, taking into account the structural changes to a network during the propagation of content. In addition, we investigated features that were sensitive to predicting an event outbreak. In order to investigate the universality of these features at different stages of an event, we split the entire lifecycle of an event into 20 equal segments according to the proportion of the propagation time. We extracted 44 features, including features related to content, users, structure, and time, from each segment of the event. Based on these features, we proposed a prediction method using supervised classification algorithms to predict event outbreaks. Experimental results indicate that, as time goes by, our method is highly accurate, with a precision rate ranging from 79% to 97% and a recall rate ranging from 74% to 97%. In addition, after applying a feature-selection algorithm, the top five selected features can considerably improve the accuracy of the prediction. Data-driven experimental results show that the entropy of the eigenvector centrality, the entropy of the PageRank, the standard deviation of the betweenness centrality, the proportion of re-shares without content, and the average path length are the key predictors for an event outbreak. Our findings are especially useful for further exploring the intrinsic characteristics of outbreak prediction.
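
    The prediction pipeline described above can be illustrated, under simplifying assumptions, as per-segment feature extraction over 20 lifecycle segments followed by feature selection and a supervised classifier; the toy feature and random labels below stand in for the paper's 44 features and real cascades.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.pipeline import make_pipeline

      def segment_features(timestamps, n_segments=20):
          """Per-segment share of activity over the event lifecycle (toy feature)."""
          t = np.asarray(timestamps, dtype=float)
          t = (t - t.min()) / max(t.max() - t.min(), 1e-9)
          hist, _ = np.histogram(t, bins=n_segments, range=(0, 1))
          return hist / max(hist.sum(), 1)

      rng = np.random.default_rng(2)
      events = [rng.random(rng.integers(50, 500)) for _ in range(100)]   # fake cascades
      X = np.array([segment_features(e) for e in events])
      y = rng.integers(0, 2, 100)                      # 1 = outbreak, 0 = no outbreak

      # Keep the 5 most informative features, then classify
      model = make_pipeline(SelectKBest(f_classif, k=5),
                            RandomForestClassifier(random_state=2))
      model.fit(X, y)
      print("training accuracy:", model.score(X, y))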

  16. Event management for large scale event-driven digital hardware spiking neural networks.

    PubMed

    Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean

    2013-09-01

    The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. PMID:23522624
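
    The role of the structured heap queue can be mimicked in software, for illustration only, by a priority queue keyed on spike time; the hardware design is pipelined and far more parallel than this heapq-based sketch.

      import heapq

      class EventQueue:
          """Priority queue of (time, neuron_id) spike events, earliest first."""
          def __init__(self):
              self._heap = []
          def push(self, time, neuron_id):
              heapq.heappush(self._heap, (time, neuron_id))
          def pop(self):
              return heapq.heappop(self._heap)
          def __bool__(self):
              return bool(self._heap)

      # Toy event-driven loop: each processed spike schedules one delayed follow-up.
      q = EventQueue()
      q.push(0.0, 0)
      processed = 0
      while q and processed < 5:
          t, nid = q.pop()
          print(f"t={t:.1f} ms: neuron {nid} spikes")
          q.push(t + 1.5, (nid + 1) % 4)        # assumed synaptic delay of 1.5 ms
          processed += 1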

  17. Event-triggered output feedback control for distributed networked systems.

    PubMed

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2016-01-01

    This paper addresses the problem of output-feedback communication and control within an event-triggered framework in the context of distributed networked control systems. The design problem of the event-triggered output-feedback control is posed as a linear matrix inequality (LMI) feasibility problem. The scheme is developed for the distributed system where only partial states are available. In this scheme, a subsystem uses local observers and shares its information with its neighbors only when the subsystem's local error exceeds a specified threshold. The developed method is illustrated by using a coupled cart example from the literature. PMID:26708304
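
    The triggering rule described above reduces, in a minimal sketch with assumed dynamics, threshold and dimensions, to transmitting the local observer state only when the local error exceeds a threshold.

      import numpy as np

      THRESHOLD = 0.05
      x_hat_last_sent = np.zeros(2)          # estimate currently known to the neighbours
      x_hat = np.zeros(2)                    # current local observer estimate

      transmissions = 0
      rng = np.random.default_rng(3)
      for k in range(100):
          x_hat = 0.95 * x_hat + 0.1 * rng.standard_normal(2)   # toy observer update
          error = np.linalg.norm(x_hat - x_hat_last_sent)
          if error > THRESHOLD:              # event condition: send only when needed
              x_hat_last_sent = x_hat.copy()
              transmissions += 1
      print(f"transmitted {transmissions} of 100 samples")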

  18. Information Spread of Emergency Events: Path Searching on Social Networks

    PubMed Central

    Hu, Hongzhi; Wu, Tunan

    2014-01-01

    Emergencies have attracted global attention from governments and the public, and an emergency can easily trigger a series of serious social problems if it is not supervised effectively during the dissemination process. In the Internet world, people communicate with each other and form various virtual communities based on social networks, which leads to a complex and fast information spread pattern for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help us to guide and control the information dissemination of emergency events for early warning. PMID:24600323

  19. Development and Evaluation of a Dynamic Moving Storm (DMS) Builder

    NASA Astrophysics Data System (ADS)

    Fang, N. Z.; Gao, S.

    2014-12-01

    The University of Texas at Arlington (UTA) developed a design rainfall generator - the Dynamic Moving Storm (DMS). DMS is a unique tool because it simultaneously accounts for three major factors of real rainfall events that other tools do not: (1) spatial variability, (2) temporal variability, and (3) directional movement. The rainfall intensity distribution within a storm is normally referred to as the spatial variability factor. The DMS builder takes into account storm sizes, shapes, and orientations (for non-circular storms) within the spatial variability module. Since rainfall intensity within the storm always varies with respect to time, the builder has the capability of specifying temporal distributions of rainfall intensities following linear or exponential patterns. To represent the dynamic motion of real storms, the researchers at UTA added a movement module to DMS to handle combinations of accelerations, decelerations, pauses, and turns. Typically, an idealized storm generated by DMS can be presented as a circular shape in 2-D and a conic shape in 3-D views. While it moves across a watershed, the rainfall pattern within the storm follows a certain temporal pattern. Once various combinations of spatial, temporal, and movement factors are input into the DMS builder, it can generate corresponding elliptical-shaped rainfall contours with rainfall hyetographs for each subbasin of a particular watershed. The resulting rainfall information can then be fed into hydrologic models to evaluate the spatiotemporal impacts for any watershed. This paper demonstrates a case study using the DMS builder to assess the vulnerability of the Brays Bayou watershed in Houston, Texas.

  20. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments), which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads, and performance tradeoffs between the quality of lookahead and the cost of computing lookahead are discussed.
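
    A minimal sketch of the lookahead idea under an assumed lower bound on service times: an FCFS server can promise (make an appointment) that it will emit no departure event earlier than its lookahead.

      def lookahead(now, busy_until, min_service):
          """Earliest time this FCFS server could emit its next departure event."""
          if busy_until > now:               # a job is in service; it departs first
              return busy_until
          return now + min_service           # idle: any future arrival still needs service

      MIN_SERVICE = 0.2                      # assumed lower bound on service times
      print(lookahead(now=5.0, busy_until=5.7, min_service=MIN_SERVICE))   # 5.7
      print(lookahead(now=6.0, busy_until=5.7, min_service=MIN_SERVICE))   # 6.2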

  1. Predicting adverse drug events using pharmacological network models.

    PubMed

    Cami, Aurel; Arnold, Alana; Manzi, Shannon; Reis, Ben

    2011-12-21

    Early and accurate identification of adverse drug events (ADEs) is critically important for public health. We have developed a novel approach for predicting ADEs, called predictive pharmacosafety networks (PPNs). PPNs integrate the network structure formed by known drug-ADE relationships with information on specific drugs and adverse events to predict likely unknown ADEs. Rather than waiting for sufficient post-market evidence to accumulate for a given ADE, this predictive approach relies on leveraging existing, contextual drug safety information, thereby having the potential to identify certain ADEs earlier. We constructed a network representation of drug-ADE associations for 809 drugs and 852 ADEs on the basis of a snapshot of a widely used drug safety database from 2005 and supplemented these data with additional pharmacological information. We trained a logistic regression model to predict unknown drug-ADE associations that were not listed in the 2005 snapshot. We evaluated the model's performance by comparing these predictions with the new drug-ADE associations that appeared in a 2010 snapshot of the same drug safety database. The proposed model achieved an AUROC (area under the receiver operating characteristic curve) statistic of 0.87, with a sensitivity of 0.42 given a specificity of 0.95. These findings suggest that predictive network methods can be useful for predicting unknown ADEs. PMID:22190238
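
    As an illustration only, the predictive idea can be sketched as a logistic regression over network-derived covariates of a (drug, ADE) pair; the two degree-based covariates and random labels below are simplified stand-ins for the paper's covariate set and database snapshots.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n_pairs = 500
      drug_degree = rng.integers(1, 100, n_pairs)       # known ADEs per drug (placeholder)
      ade_degree = rng.integers(1, 200, n_pairs)        # known drugs per ADE (placeholder)
      X = np.column_stack([np.log(drug_degree), np.log(ade_degree)])
      y = rng.integers(0, 2, n_pairs)                   # 1 = association appears later

      model = LogisticRegression().fit(X, y)
      new_pair = [[np.log(12), np.log(40)]]             # hypothetical unseen (drug, ADE) pair
      print("predicted association score:", model.predict_proba(new_pair)[0, 1])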

  2. Event Networks and the Identification of Crime Pattern Motifs.

    PubMed

    Davies, Toby; Marchione, Elio

    2015-01-01

    In this paper we demonstrate the use of network analysis to characterise patterns of clustering in spatio-temporal events. Such clustering is of both theoretical and practical importance in the study of crime, and forms the basis for a number of preventative strategies. However, existing analytical methods show only that clustering is present in data, while offering little insight into the nature of the patterns present. Here, we show how the classification of pairs of events as close in space and time can be used to define a network, thereby generalising previous approaches. The application of graph-theoretic techniques to these networks can then offer significantly deeper insight into the structure of the data than previously possible. In particular, we focus on the identification of network motifs, which have clear interpretation in terms of spatio-temporal behaviour. Statistical analysis is complicated by the nature of the underlying data, and we provide a method by which appropriate randomised graphs can be generated. Two datasets are used as case studies: maritime piracy at the global scale, and residential burglary in an urban area. In both cases, the same significant 3-vertex motif is found; this result suggests that incidents tend to occur not just in pairs, but in fact in larger groups within a restricted spatio-temporal domain. In the 4-vertex case, different motifs are found to be significant in each case, suggesting that this technique is capable of discriminating between clustering patterns at a finer granularity than previously possible. PMID:26605544
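
    A minimal sketch of the construction described above, with illustrative coordinates and thresholds: events close in both space and time are linked, and the 3-vertex motif (triangle) count is read off the resulting graph.

      import itertools
      import networkx as nx

      # (x, y, t) for each recorded incident; placeholder values
      events = [(0.0, 0.0, 0), (0.1, 0.0, 1), (0.0, 0.2, 2), (5.0, 5.0, 3), (5.1, 5.0, 4)]
      D_MAX, T_MAX = 0.5, 3                 # "close in space" and "close in time"

      G = nx.Graph()
      G.add_nodes_from(range(len(events)))
      for i, j in itertools.combinations(range(len(events)), 2):
          (x1, y1, t1), (x2, y2, t2) = events[i], events[j]
          if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 <= D_MAX and abs(t1 - t2) <= T_MAX:
              G.add_edge(i, j)

      n_triangles = sum(nx.triangles(G).values()) // 3   # each triangle is counted at 3 vertices
      print("close pairs:", G.number_of_edges(), "3-vertex motifs (triangles):", n_triangles)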

  3. Event Networks and the Identification of Crime Pattern Motifs

    PubMed Central

    2015-01-01

    In this paper we demonstrate the use of network analysis to characterise patterns of clustering in spatio-temporal events. Such clustering is of both theoretical and practical importance in the study of crime, and forms the basis for a number of preventative strategies. However, existing analytical methods show only that clustering is present in data, while offering little insight into the nature of the patterns present. Here, we show how the classification of pairs of events as close in space and time can be used to define a network, thereby generalising previous approaches. The application of graph-theoretic techniques to these networks can then offer significantly deeper insight into the structure of the data than previously possible. In particular, we focus on the identification of network motifs, which have clear interpretation in terms of spatio-temporal behaviour. Statistical analysis is complicated by the nature of the underlying data, and we provide a method by which appropriate randomised graphs can be generated. Two datasets are used as case studies: maritime piracy at the global scale, and residential burglary in an urban area. In both cases, the same significant 3-vertex motif is found; this result suggests that incidents tend to occur not just in pairs, but in fact in larger groups within a restricted spatio-temporal domain. In the 4-vertex case, different motifs are found to be significant in each case, suggesting that this technique is capable of discriminating between clustering patterns at a finer granularity than previously possible. PMID:26605544

  4. Builders Challenge High Performance Builder Spotlight: Yavapai College, Chino Valley, Arizona

    SciTech Connect

    2009-12-22

    Building America Builders Challenge fact sheet on Yavapai College of Chino Valley, Arizona. These college students built a Building America Builders Challenge house that achieved a remarkably low HERS score of -3 and a tight building envelope.

  5. Builders Challenge High Performance Builder Spotlight - Community Development Corporation of Utah

    SciTech Connect

    2008-01-01

    Building America/Builders Challenge fact sheet on Community Development Corp, an energy-efficient home builder in a cold climate using advanced framing and compact duct design. Evaluates impacts on cost.

  6. Builders Challenge High Performance Builder Spotlight - Martha Rose Construction, Inc., Seattle, Washington

    SciTech Connect

    2008-01-01

    Building America/Builders Challenge fact sheet on Martha Rose Construction, an energy-efficient home builder in a marine climate using the German Passiv Haus design, improved insulation, and solar photovoltaics.

  7. Event-driven approach of layered multicast to network adaptation in RED-based IP networks

    NASA Astrophysics Data System (ADS)

    Nahm, Kitae; Li, Qing; Kuo, C.-C. J.

    2003-11-01

    In this work, we investigate the congestion control problem for layered video multicast in IP networks with active queue management (AQM), using a simple random early detection (RED) queue model. AQM support from networks improves the visual quality of video streaming but makes network adaptation more difficult for existing layered video multicast protocols that use the event-driven timer-based approach. We perform a simplified analysis of the response of the RED algorithm to burst traffic. The analysis shows that the primary problem lies in the weak correlation between the network feedback and the actual network congestion status when the RED queue is driven by burst traffic. Finally, a design guideline for the layered multicast protocol is proposed to overcome this problem.

  8. A novel progressive signal association algorithm for detecting teleseismic/network-outside events using regional seismic networks

    NASA Astrophysics Data System (ADS)

    Jin, Ping; Pan, Changzhou; Zhang, Chengliu; Shen, Xufeng; Wang, Hongchun; Lu, Na

    2015-06-01

    Regional seismic networks may, and in some cases need to, be used to monitor teleseismic or network-outside events. For detecting and localizing teleseismic events automatically and reliably in this case, we present in this paper a novel progressive association algorithm for teleseismic signals recorded by a regional seismic network. The algorithm takes triangle station arrays as the starting point and searches for P waves of teleseismic events progressively, exploiting the fact that, when detections from different stations actually come from the same teleseismic event, their arrival times should be linearly related through the average slowness vector with which the signal propagates across the network, and that the slowness of the direct teleseismic P wave is basically different from that of other major seismic phases. We have tested this algorithm using data recorded by the Xinjiang Seismic Network of China (XJSN) for 16 days. The results show that the algorithm can effectively and reliably detect and localize earthquakes outside of the network. For the period of the test data, all mb 4.0+ events with Δc < 30° and all mb 4.5+ events with Δc < 60° listed in the International Data Center Reviewed Event Bulletin (IDC REB) were detected, where Δc is the epicentral distance relative to the network's geographical centre, while false events accounted for only 2.4 per cent, suggesting that the new association algorithm has good application prospects for situations where regional seismic networks need to be used to monitor teleseismic events.
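
    The linear relation exploited by the association step can be sketched as a plane-wave fit: arrival times satisfy t_i = t0 + s·x_i, so the average slowness vector s follows from least squares over station coordinates and picks. Coordinates, picks and noise below are illustrative.

      import numpy as np

      rng = np.random.default_rng(5)
      stations = rng.uniform(-100, 100, size=(6, 2))          # station x, y in km (placeholder)
      s_true = np.array([0.04, 0.06])                         # slowness in s/km, teleseismic-like
      t0 = 10.0
      picks = t0 + stations @ s_true + 0.05 * rng.standard_normal(6)   # noisy arrival picks

      # Least-squares fit of [t0, sx, sy] from t_i = t0 + s . x_i
      A = np.column_stack([np.ones(len(stations)), stations])
      coef, *_ = np.linalg.lstsq(A, picks, rcond=None)
      t0_hat, s_hat = coef[0], coef[1:]
      print("fitted slowness (s/km):", s_hat, "norm:", np.linalg.norm(s_hat))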

  9. Rome: sinkhole events and network of underground cavities (Italy)

    NASA Astrophysics Data System (ADS)

    Nisio, Stefania; Ciotoli, Giancarlo

    2016-04-01

    The anthropogenic sinkholes in the city of Rome are closely linked to the network of underground cavities produced by human activities over more than two thousand years of history. Over the past fifteen years, the increased frequency of intense rainfall events has favored sinkhole formation. Assessing the risk induced by anthropogenic sinkholes is very difficult. However, the susceptibility of the territory to sinkholes can be more easily determined as the probability that an event may occur in a given space, with given geological-morphological characteristics, over an indefinite time. A sinkhole susceptibility map of the Rome territory, up to the ring road, has been constructed by using the Geographically Weighted Regression technique and geostatistics. The spatial regression model includes the analysis of more than 2700 anthropogenic sinkholes (recorded from 1875 to 2015), as well as geological, morphological, hydrological and predisposing anthropogenic characteristics of the study area. The numerous available data (underground cavities, ancient quarry entrances, bunkers, etc.) facilitate the creation of a series of maps. The cavity density map, updated to 2015, shows that more than 20 km2 of the Roman territory is affected by underground cavities. The census of sinkholes (over 2700) shows that over 30 km2 has been affected by sinkholes. The final susceptibility map highlights that, inside the ring road, about 40 km2 of the territory (about 11%) has a very high probability of triggering a sinkhole event. The susceptibility map was also compared with ground subsidence data (InSAR) to obtain a predictive model.

  10. Management of a Complex Open Channel Network During Flood Events

    NASA Astrophysics Data System (ADS)

    Franchini, M.; Valiani, A.; Schippa, L.; Mascellani, G.

    2003-04-01

    Most of the area around Ferrara (Italy) is below mean sea level, and an extensive drainage system combined with several pump stations allows the use of this area for both urban development and industrial and agricultural activities. The three main channels of this hydraulic system constitute the Ferrara Inland Waterway (total length approximately 70 km), which connects the Po river near Ferrara to the sea. Because of the level difference between the upstream and downstream ends of the waterway, three locks are located along it, each of them combined with a set of gates to control the water levels. During rainfall events, most of the water of the basin flows into the waterway, and heavy precipitation sometimes causes flooding in several areas. This is due to the insufficient dimensions of the channel network and inadequate manual operation of the gates. This study presents a hydrological-hydraulic model for the entire Ferrara basin and a system of rules for operating the gates. In particular, their opening is designed to be regulated in real time by monitoring the water level at several sections along the channels. Besides flood peak attenuation, this operation strategy also contributes to the maintenance of a constant water level for irrigation and fluvial navigation during dry periods. With reference to the flood event of May 1996, it is shown that this floodgate operation policy, unlike the one actually adopted during that event, would lead to a significant flood peak attenuation, avoiding flooding in the area upstream of Ferrara.

  11. The CMS Remote Analysis Builder (CRAB)

    SciTech Connect

    Spiga, D.; Cinquilli, M.; Servoli, L.; Lacaprara, S.; Fanzago, F.; Dorigo, A.; Merlo, M.; Farina, F.; Fanfani, A.; Codispoti, G.; Bacchi, W.; /INFN, Bologna /Bologna U /CERN /INFN, CNAF /INFN, Trieste /Fermilab

    2008-01-22

    The CMS experiment will produce several PBytes of data every year, to be distributed over many computing centers geographically distributed in different countries. Analysis of this data will also be performed in a distributed way, using grid infrastructure. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that allows transparent access to distributed data for end physicists. Very limited knowledge of the underlying technicalities is required of the user. CRAB interacts with the local user environment, the CMS Data Management services and the Grid middleware. It is able to use WLCG, gLite and OSG middleware. CRAB has been in production and in routine use by end-users since Spring 2004. It has been extensively used in studies to prepare the Physics Technical Design Report (PTDR) and in the analysis of reconstructed event samples generated during the Computing Software and Analysis Challenge (CSA06). This involved generating thousands of jobs per day at peak rates. In this paper we discuss the current implementation of CRAB, the experience with using it in production and the plans to improve it in the immediate future.

  12. Wireless address event representation system for biological sensor networks

    NASA Astrophysics Data System (ADS)

    Folowosele, Fopefolu; Tapson, Jonathan; Etienne-Cummings, Ralph

    2007-05-01

    We describe wireless networking systems for close-proximity biological sensors, as would be encountered in artificial skin. The sensors communicate with a "base station" that interprets the data and decodes its origin. Using a large bundle of ultra-thin metal wires from the sensors to the "base station" introduces significant technological hurdles for both the construction and maintenance of the system. Fortunately, the Address Event Representation (AER) protocol provides an elegant and biomorphic method for transmitting many impulses (i.e. neural spikes) down a single wire/channel. However, AER does not communicate any sensory information within each spike, other than the address of the origin of the spike. Therefore, each sensor must provide a number of spikes to communicate its data, typically in the form of inter-spike intervals or spike rate. Furthermore, complex circuitry is required to arbitrate access to the channel when multiple sensors communicate simultaneously, which results in spike delay. This error is exacerbated as the number of sensors per channel increases, mandating more channels and more wires. We contend that despite the effectiveness of the wire-based AER protocol, its natural evolution will be the wireless AER protocol. A wireless AER system: (1) does not require arbitration to handle multiple simultaneous accesses of the channel, (2) uses cross-correlation delay to encode sensor data in every spike (eliminating the error due to arbitration delay), and (3) can be reorganized and expanded with little consequence to the network. The system uses spread spectrum communications principles, implemented with low-power integrate-and-fire neurons. This paper discusses the design, operation and capabilities of such a system. We show that integrate-and-fire neurons can be used to both decode the origin of each spike and extract the data contained within it. We also show that there are many technical obstacles to overcome before this version
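
    For illustration only, the integrate-and-fire building block mentioned above can be sketched as a leaky membrane that integrates its input and spikes at threshold; the constants are assumptions and the spread-spectrum correlation stage is omitted.

      import numpy as np

      def lif_spikes(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
          """Spike step indices of a leaky integrate-and-fire neuron (illustrative constants)."""
          v, spikes = 0.0, []
          for k, i_in in enumerate(input_current):
              v += (dt / tau) * (-v + i_in)      # leaky integration of the input
              if v >= v_thresh:                  # threshold crossing: emit a spike
                  spikes.append(k)
                  v = v_reset                    # reset the membrane
          return spikes

      drive = np.concatenate([np.zeros(50), 2.0 * np.ones(300)])   # step input (placeholder)
      print("spike steps:", lif_spikes(drive)[:5])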

  13. Builders Challenge High Performance Builder Spotlight - Masco Environments for Living, Las Vegas, Nevada

    SciTech Connect

    2009-01-01

    Building America Builders Challenge fact sheet on Masco’s Environments for Living Certified Green demo home at the 2009 International Builders Show in Las Vegas. The home has a Home Energy Rating System (HERS) index score of 44, a right-sized air conditi

  14. Builders Challenge High Performance Builder Spotlight: David Weekley Homes, Houston, Texas

    SciTech Connect

    2009-12-22

    Building America Builders Challenge fact sheet on David Weekley Homes of Houston, Texas. The builder plans homes as a "system," with features such as wood-framed walls that are air-sealed then insulated with R-13 unfaced fiberglass batts plus an external covering of R-2 polyisocyanurate rigid foam sheathing.

  15. Building America Best Practices Series Volume 8: Builders Challenge Quality Criteria Support Document

    SciTech Connect

    Baechler, Michael C.; Bartlett, Rosemarie; Gilbride, Theresa L.

    2010-11-01

    The U.S. Department of Energy (DOE) has posed a challenge to the homebuilding industry—to build 220,000 high-performance homes by 2012. Through the Builders Challenge, participating homebuilders will have an easy way to differentiate their best energy-performing homes from other products in the marketplace, and to make the benefits clear to buyers. This document was prepared by Pacific Northwest National Laboratory for DOE to provide guidance to U.S. home builders who want to accept the challenge. To qualify for the Builders Challenge, a home must score 70 or less on the EnergySmart Home Scale (E-Scale). The E-scale is based on the well-established Home Energy Rating System (HERS) index, developed by the Residential Energy Services Network (RESNET). The E-scale allows homebuyers to understand – at a glance – how the energy performance of a particular home compares with the performance of others. To learn more about the index and HERS Raters, visit www.natresnet.org. Homes also must meet the Builders Challenge criteria described in this document. To help builders meet the Challenge, guidance is provided in this report for each of the 29 criteria. Included with the guidance for each criterion are resources for more information and references to relevant codes and standards. The Builders Challenge Quality Criteria were originally published in Dec. 2008. They were revised and published as PNNL-18009 Rev 1.2 in Nov. 2009. This is version 1.3, published Nov 2010. Changes from the Nov 2009 version include adding a title page and updating the Energy Star windows criteria to the Version 5.0 criteria approved April 2009 and effective January 4, 2010. This document and other information about the Builders Challenge are available online at www.buildingamerica.gov/challenge.

  16. Interactive effects of elevation, species richness and extreme climatic events on plant-pollinator networks.

    PubMed

    Hoiss, Bernhard; Krauss, Jochen; Steffan-Dewenter, Ingolf

    2015-11-01

    Plant-pollinator interactions are essential for the functioning of terrestrial ecosystems, but are increasingly affected by global change. The risks to such mutualistic interactions from increasing temperature and more frequent extreme climatic events such as drought or advanced snow melt are assumed to depend on network specialization, species richness, local climate and associated parameters such as the amplitude of extreme events. Even though elevational gradients provide valuable model systems for climate change and are accompanied by changes in species richness, responses of plant-pollinator networks to climatic extreme events under different environmental and biotic conditions are currently unknown. Here, we show that elevational climatic gradients, species richness and experimentally simulated extreme events interactively change the structure of mutualistic networks in alpine grasslands. We found that the degree of specialization in plant-pollinator networks (H2') decreased with elevation. Nonetheless, network specialization increased after advanced snow melt at high elevations, whereas changes in network specialization after drought were most pronounced at sites with low species richness. Thus, changes in network specialization after extreme climatic events depended on climatic context and were buffered by high species richness. In our experiment, only generalized plant-pollinator networks changed in their degree of specialization after climatic extreme events. This indicates that contrary to our assumptions, network generalization may not always foster stability of mutualistic interaction networks. PMID:26332102

  17. Marketing Career Speed Networking: A Classroom Event to Foster Career Awareness

    ERIC Educational Resources Information Center

    Buff, Cheryl L.; O'Connor, Suzanne

    2012-01-01

    This paper describes the design, implementation, and evaluation of a marketing career speed networking event held during class time in two sections of the consumer behavior class. The event was coordinated through a partnering effort with marketing faculty and the college's Career Center. A total of 57 students participated in the event, providing…

  18. Impact assessment of extreme storm events using a Bayesian network

    USGS Publications Warehouse

    den Heijer, C.(Kees); Knipping, Dirk T.J.A.; Plant, Nathaniel G.; van Thiel de Vries, Jaap S. M.; Baart, Fedor; van Gelder, Pieter H. A. J. M.

    2012-01-01

    This paper describes an investigation on the usefulness of Bayesian Networks in the safety assessment of dune coasts. A network has been created that predicts the erosion volume based on hydraulic boundary conditions and a number of cross-shore profile indicators. Field measurement data along a large part of the Dutch coast has been used to train the network. Corresponding storm impact on the dunes was calculated with an empirical dune erosion model named duros+. Comparison between the Bayesian Network predictions and the original duros+ results, here considered as observations, results in a skill up to 0.88, provided that the training data covers the range of predictions. Hence, the predictions from a deterministic model (duros+) can be captured in a probabilistic model (Bayesian Network) such that both the process knowledge and uncertainties can be included in impact and vulnerability assessments.

  19. High Performance Builder Spotlight: Imagine Homes

    SciTech Connect

    2011-01-01

    Imagine Homes, working with the DOE's Building America research team member IBACOS, has developed a system that can be replicated by other contractors to build affordable, high-performance homes. Imagine Homes has used the system to produce more than 70 Builders Challenge-certified homes per year in San Antonio over the past five years.

  20. JOB BUILDER remote batch processing subsystem

    NASA Technical Reports Server (NTRS)

    Orlov, I. G.; Orlova, T. L.

    1980-01-01

    The functions of the JOB BUILDER remote batch processing subsystem are described. Instructions are given for using it as a component of a display system developed by personnel of the System Programming Laboratory, Institute of Space Research, USSR Academy of Sciences.

  1. High Performance Builder Spotlight: Cobblestone Homes

    SciTech Connect

    2011-01-01

    Cobblestone Homes of Freeland, MI: its quest to understand building science led to the construction in 2010 of the "Vision Zero Project," a demonstration home that earned a DOE Builders Challenge certification and achieved a HERS index of -4 with photovoltaics and 37 without PV.

  2. High Performance Builder Spotlight: Treasure Homes Inc.

    SciTech Connect

    2011-01-01

    Treasure Homes, Inc., achieved a HERS rating of 46 without PV on its prototype “Gem” home, located on the shores of Lake Michigan in northern Indiana, thanks in part to training received from a Building America partner, the National Association of Home Builders Research Center.

  3. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-12-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.

  4. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-07-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.

  5. Forecasting ENSO events: A neural network-extended EOF approach

    SciTech Connect

    Tangang, F.T.; Tang, B.; Monahan, A.H.; Hsieh, W.W.

    1998-01-01

    The authors constructed neural network models to forecast the sea surface temperature anomalies (SSTA) for three regions: Nino 4, Nino 3.5, and Nino 3, representing the western-central, the central, and the eastern-central parts of the equatorial Pacific Ocean, respectively. The inputs were the extended empirical orthogonal functions (EEOF) of the sea level pressure (SLP) field that covered the tropical Indian and Pacific Oceans and evolved for a duration of 1 yr. The EEOFs greatly reduced the size of the neural networks from those of the authors' earlier papers using EOFs. The Nino 4 region appeared to be the best forecasted region, with useful skills up to a year lead time for the 1982-93 forecast period. By network pruning analysis and spectral analysis, four important inputs were identified: modes 1, 2, and 6 of the SLP EEOFs and the SSTA persistence. Mode 1 characterized the low-frequency oscillation (LFO, with 4-5-yr period), and was seen as the typical ENSO signal, while mode 2, with a period of 2-5 yr, characterized the quasi-biennial oscillation (QBO) plus the LFO. Mode 6 was dominated by decadal and interdecadal variations. Thus, forecasting ENSO required information from the QBO, and the decadal-interdecadal oscillations. The nonlinearity of the networks tended to increase with lead time and to become stronger for the eastern regions of the equatorial Pacific Ocean. 35 refs., 14 figs., 4 tabs.

  6. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    SciTech Connect

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight in device performance and aids in topology and system optimization.

  7. Event-triggered synchronization strategy for complex dynamical networks with the Markovian switching topologies.

    PubMed

    Wang, Aijuan; Dong, Tao; Liao, Xiaofeng

    2016-02-01

    This paper concerns the synchronization problem of complex networks with random switching topologies. By modeling the switching of network topologies as a Markov process, a novel event-triggered synchronization strategy is proposed. Unlike the existing strategies, the event detection of this strategy only works at the network topology switching time instants, which can significantly decrease the communication frequency between nodes and save network resources. Under this strategy, the synchronization problem of the complex network is equivalently converted to the stability of a class of Markovian jump systems with a time-varying delay. By using the Lyapunov-Krasovskii functional method and the weak infinitesimal operation, a sufficient condition for the mean square synchronization of the complex networks subject to Markovian switching topologies is established. Finally, a numerical simulation example is provided to demonstrate the theoretical results. PMID:26650712

  8. Cough event classification by pretrained deep neural network

    PubMed Central

    2015-01-01

    Background: Cough is an essential symptom in respiratory diseases. In the measurement of cough severity, an accurate and objective cough monitor is expected by the respiratory disease community. This paper aims to introduce a better-performing algorithm, the pretrained deep neural network (DNN), to the cough classification problem, which is a key step in the cough monitor. Method: The deep neural network models are built in two steps, pretraining and fine-tuning, followed by a Hidden Markov Model (HMM) decoder to capture temporal information of the audio signals. By unsupervised pretraining of a deep belief network, a good initialization for a deep neural network is learned. The fine-tuning step is then a back-propagation tuning of the neural network so that it can predict the observation probability associated with each HMM state, where the HMM states are originally obtained by forced alignment with a Gaussian Mixture Model Hidden Markov Model (GMM-HMM) on the training samples. Three cough HMMs and one non-cough HMM are employed to model coughs and non-coughs respectively. The final decision is made based on the Viterbi decoding algorithm, which generates the most likely HMM sequence for each sample. A sample is labeled as cough if a cough HMM is found in the sequence. Results: The experiments were conducted on a dataset that was collected from 22 patients with respiratory diseases. Patient-dependent (PD) and patient-independent (PI) experimental settings were used to evaluate the models. Five criteria, sensitivity, specificity, F1, macro average and micro average, are shown to depict different aspects of the models. On the overall evaluation criteria, the DNN-based methods are superior to the traditional GMM-HMM based method on F1 and micro average, with maximal error reductions of 14% and 11% in PD and 7% and 10% in PI, while keeping similar performance on macro average. They also surpass the GMM-HMM model on specificity with a maximal error reduction of 14% on both PD and PI. Conclusions: In this paper, we

  9. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single-station models are detached from network hydraulic insights and as a result might be significantly exposed to false positive alarms. This work is aimed at decreasing this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploration of the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. PMID:25996752

  10. WebDB Component Builder - Lessons Learned

    SciTech Connect

    Macedo, C.

    2000-02-15

    Oracle WebDB is the easiest way to produce web-enabled, lightweight, enterprise-centric applications. This concept from Oracle has tantalized our taste for simple web development by using a purely web-based tool that lives nowhere else but in the database. The online wizards, templates, and query builders, which produce PL/SQL behind the curtains, can be used straight "out of the box" by both novice and seasoned developers. This presentation will introduce lessons learned from developing and deploying applications built using the WebDB Component Builder in conjunction with custom PL/SQL code to empower a hybrid application. There are two kinds of WebDB components: those that display data to end users via reporting, and those that let end users update data in the database via entry forms. The presentation will also discuss various methods within the Component Builder to enhance the applications pushed to the desktop. The demonstrated example is an application entitled HOME (Helping Others More Effectively) that was built to manage a yearly United Way Campaign effort. Our task was to build an end-to-end application which could manage approximately 900 non-profit agencies, an average of 4,100 individual contributions, and $1.2 million. Using WebDB, the shell of the application was put together in a matter of a few weeks. However, we did encounter some hurdles that WebDB, in its stage of infancy (v2.0), could not solve for us directly. Together with custom PL/SQL, WebDB's Component Builder became a powerful tool that enabled us to produce a very flexible hybrid application.

  11. XAL Application Framework and Bricks GUI Builder

    SciTech Connect

    Pelaia II, Tom

    2007-01-01

    The XAL [1] Application Framework is a framework for rapidly developing document based Java applications with a common look and feel along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC) compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.

  12. Network-based event-triggered filtering for Markovian jump systems

    NASA Astrophysics Data System (ADS)

    Wang, Huijiao; Shi, Peng; Agarwal, Ramesh K.

    2016-06-01

    The problem of event-triggered H∞ filtering for networked Markovian jump systems is studied in this paper. A dynamic discrete event-triggered scheme is designed to choose the transmitted data for different Markovian jumping modes. The time-delay modelling method is employed to incorporate the event-triggered scheme and the network-related behaviour, such as transmission delay, data packet dropout and disorder, into a networked Markovian time-delay jump system. Furthermore, a sufficient condition is derived to guarantee that the resulting filtering error system is stochastically stable with a prescribed performance index. A co-design method for the H∞ filter and the event-triggered scheme is then proposed. The effectiveness and potential of the theoretical results obtained are illustrated by a simulation example.

  13. Percolation Features on Climate Network under Attacks of El Niño Events

    NASA Astrophysics Data System (ADS)

    Lu, Z.

    2015-12-01

    Percolation theory under different attacks is one of the main research areas in complex networks, but it has never been applied to investigate climate networks. In this study, for the first time we construct a climate network of the surface air temperature field to analyze its percolation features. Here, we regard an El Niño event as a kind of natural attack generated in the Pacific Ocean on the climate network above it. We find that an El Niño event leads to an abrupt percolation phase transition in the climate network, which suddenly splits it and makes it unstable. Comparing the results for the climate network under three different forms of attack, namely the most-connected attack (MA), the localized attack (LA) and the random attack (RA), it is found that both MA and LA lead to a first-order transition while RA leads to a second-order transition in the climate network. Furthermore, we find that most real attacks consist of all three forms of attack. As an El Niño event emerges, the ratios of LA and MA increase and dominate the style of the attack while RA decreases. This means that the percolation phase transition due to El Niño events is close to a first-order transition, mostly driven by LA and MA. Our research may help us further understand two questions from the perspective of percolation on networks: (1) why El Niño events, rather than all warming in the Pacific Ocean, can affect the climate, and (2) why the climate affected by El Niño events changes abruptly.
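
    The percolation analysis described above can be illustrated, on a toy random graph rather than a real surface-air-temperature network, by removing a fraction of nodes under different attack rules and tracking the relative size of the giant component.

      import random
      import networkx as nx

      def giant_fraction(G, n_total):
          """Largest connected component size relative to the original network."""
          if G.number_of_nodes() == 0:
              return 0.0
          return max(len(c) for c in nx.connected_components(G)) / n_total

      def attack(G, fraction, rule):
          """Remove a fraction of nodes by the given rule and return the surviving giant fraction."""
          H = G.copy()
          k = int(fraction * H.number_of_nodes())
          if rule == "most_connected":
              targets = [n for n, _ in sorted(H.degree, key=lambda x: -x[1])[:k]]
          else:                                          # random attack
              targets = random.sample(list(H.nodes), k)
          H.remove_nodes_from(targets)
          return giant_fraction(H, G.number_of_nodes())

      random.seed(0)
      G = nx.erdos_renyi_graph(500, 0.01, seed=0)        # toy network, not climate data
      print("intact giant component:", round(giant_fraction(G, G.number_of_nodes()), 2))
      for rule in ("most_connected", "random"):
          print(rule, "-> giant component after removing 30% of nodes:",
                round(attack(G, 0.3, rule), 2))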

  14. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model.

    PubMed

    Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo

    2015-11-01

    Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed 'quasi-orbits', which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network's firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms. PMID:26558616

  15. A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks

    PubMed Central

    Zhang, Shukui; Chen, Hao; Zhu, Qiaoming

    2014-01-01

    Event detection is one of the fundamental research topics in wireless sensor networks (WSNs). Because it takes into account various properties that reflect an event's status, the composite event is more consistent with the objective world, and research on composite events is therefore more realistic. In this paper, we analyze the characteristics of the composite event; then we propose a criterion to determine the area of the composite event and put forward a dominating-set-based network topology construction algorithm under random deployment. Because of the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision based composite event decision mechanism. In the case that the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter out faulty node data effectively and reduce the influence of erroneous data on the event determination. The composite event judgment mechanism, which is based on fuzzy decisions, retains the advantages of fuzzy-logic based algorithms; moreover, it does not need the support of a huge rule base and its computational complexity is small. Compared to the CollECT algorithm and the CDS algorithm, this algorithm improves the detection accuracy and reduces the traffic. PMID:25136690

  16. Nonthreshold-based event detection for 3d environment monitoring in sensor networks

    SciTech Connect

    Li, M.; Liu, Y.H.; Chen, L.

    2008-12-15

    Event detection is a crucial task for wireless sensor network applications, especially environment monitoring. Existing approaches for event detection are mainly based on some predefined threshold values and, thus, are often inaccurate and incapable of capturing complex events. For example, in coal mine monitoring scenarios, gas leakage or water osmosis can hardly be described by the overrun of specified attribute thresholds but some complex pattern in the full-scale view of the environmental data. To address this issue, we propose a nonthreshold-based approach for the real 3D sensor monitoring environment. We employ energy-efficient methods to collect a time series of data maps from the sensor network and detect complex events through matching the gathered data to spatiotemporal data patterns. Finally, we conduct trace-driven simulations to prove the efficacy and efficiency of this approach on detecting events of complex phenomena from real-life records.

  17. You Never Walk Alone: Recommending Academic Events Based on Social Network Analysis

    NASA Astrophysics Data System (ADS)

    Klamma, Ralf; Cuong, Pham Manh; Cao, Yiwei

    Combining Social Network Analysis and recommender systems is a challenging research field. In scientific communities, recommender systems have been applied to provide useful tools for finding papers, books, and experts. However, academic events (conferences, workshops, international symposiums, etc.) are an important driving force for advancing cooperation among research communities. We present an SNA-based approach to the academic event recommendation problem. Scientific community analysis and visualization are performed to provide insight into the communities behind event series. A prototype is implemented based on data from DBLP and EventSeer.net, and the results are examined in order to validate the approach.

  18. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model

    PubMed Central

    Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo

    2015-01-01

    Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed ‘quasi-orbits’, which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network’s firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms. PMID:26558616

  19. 17. DETAIL OF BUILDER'S PLAQUE, LOOKING NORTH. Philadelphia & ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    17. DETAIL OF BUILDER'S PLAQUE, LOOKING NORTH. - Philadelphia & Reading Railroad, Wissahickon Creek Viaduct, Spanning Wissahickon Creek, north of Ridge Avenue Bridge, Philadelphia, Philadelphia County, PA

  20. 25. EAST KEYSTONE OF PORTE COCHERE INSCRIBED 'G. CAMERON. BUILDER.' ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. EAST KEYSTONE OF PORTE COCHERE INSCRIBED 'G. CAMERON. BUILDER.' - Smithsonian Institution Building, 1000 Jefferson Drive, between Ninth & Twelfth Streets, Southwest, Washington, District of Columbia, DC

  1. Probabilistic approaches to fault detection in networked discrete event systems.

    PubMed

    Athanasopoulou, Eleftheria; Hadjicostis, Christoforos N

    2005-09-01

    In this paper, we consider distributed systems that can be modeled as finite state machines with known behavior under fault-free conditions, and we study the detection of a general class of faults that manifest themselves as permanent changes in the next-state transition functionality of the system. This scenario could arise in a variety of situations encountered in communication networks, including faults that arise from design or implementation errors during the execution of communication protocols. In our approach, fault diagnosis is performed by an external observer/diagnoser that functions as a finite state machine and has access to the input sequence applied to the system but only limited access to the system state or output. In particular, we assume that the observer/diagnoser is only able to obtain partial information regarding the state of the given system at intermittent time intervals that are determined by certain synchronizing conditions between the system and the observer/diagnoser. By adopting a probabilistic framework, we analyze ways to optimally choose these synchronizing conditions and develop adaptive strategies that achieve a low probability of aliasing, i.e., a low probability that the external observer/diagnoser incorrectly declares the system as fault-free. An application of these ideas in the context of protocol testing/classification is provided as an example. PMID:16252815

  2. Novel algorithms for improved pattern recognition using the US FDA Adverse Event Network Analyzer.

    PubMed

    Botsis, Taxiarchis; Scott, John; Goud, Ravi; Toman, Pamela; Sutherland, Andrea; Ball, Robert

    2014-01-01

    The medical review of adverse event reports for medical products requires the processing of "big data" stored in spontaneous reporting systems, such as the US Vaccine Adverse Event Reporting System (VAERS). VAERS data are not well suited to traditional statistical analyses so we developed the FDA Adverse Event Network Analyzer (AENA) and three novel network analysis approaches to extract information from these data. Our new approaches include a weighting scheme based on co-occurring triplets in reports, a visualization layout inspired by the islands algorithm, and a network growth methodology for the detection of outliers. We explored and verified these approaches by analysing the historical signal of Intussusception (IS) after the administration of RotaShield vaccine (RV) in 1999. We believe that our study supports the use of AENA for pattern recognition in medical product safety and other clinical data. PMID:25160375
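
    The triplet-based weighting scheme can be sketched in a few lines. The snippet below is a generic reconstruction, not the AENA implementation: it counts how often pairs of reported terms appear together inside the same co-occurring triplet across reports and uses those counts as edge weights. The toy reports and the networkx dependency are assumptions.

```python
from itertools import combinations
from collections import Counter

import networkx as nx  # third-party; pip install networkx

# Toy adverse-event reports: each is a set of reported terms (product plus
# symptoms). Purely illustrative data, not VAERS records.
reports = [
    {"vaccineA", "fever", "intussusception"},
    {"vaccineA", "intussusception", "vomiting"},
    {"vaccineB", "fever", "headache"},
]

pair_weights = Counter()
for terms in reports:
    # every unordered triplet occurring in a report contributes to the
    # weight of each of its three internal pairs
    for triplet in combinations(sorted(terms), 3):
        for pair in combinations(triplet, 2):
            pair_weights[pair] += 1

G = nx.Graph()
for (a, b), w in pair_weights.items():
    G.add_edge(a, b, weight=w)

# heavier edges indicate term pairs embedded in many co-occurring triplets
for a, b, data in sorted(G.edges(data=True), key=lambda e: -e[2]["weight"]):
    print(f"{a} -- {b}: {data['weight']}")
```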

  3. Event Detection in Aerospace Systems using Centralized Sensor Networks: A Comparative Study of Several Methodologies

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.

    2006-01-01

    Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.

  4. Airway reopening through catastrophic events in a hierarchical network

    PubMed Central

    Baudoin, Michael; Song, Yu; Manneville, Paul; Baroud, Charles N.

    2013-01-01

    When you reach with your straw for the final drops of a milkshake, the liquid forms a train of plugs that flow slowly initially because of the high viscosity. They then suddenly rupture and are replaced with a rapid airflow with the characteristic slurping sound. Trains of liquid plugs also are observed in complex geometries, such as porous media during petroleum extraction, in microfluidic two-phase flows, or in flows in the pulmonary airway tree under pathological conditions. The dynamics of rupture events in these geometries play the dominant role in the spatial distribution of the flow and in determining how much of the medium remains occluded. Here we show that the flow of a train of plugs in a straight channel is always unstable to breaking through a cascade of ruptures. Collective effects considerably modify the rupture dynamics of plug trains: Interactions among nearest neighbors take place through the wetting films and slow down the cascade, whereas global interactions, through the total resistance to flow of the train, accelerate the dynamics after each plug rupture. In a branching tree of microchannels, similar cascades occur along paths that connect the input to a particular output. This divides the initial tree into several independent subnetworks, which then evolve independently of one another. The spatiotemporal distribution of the cascades is random, owing to strong sensitivity to the plug divisions at the bifurcations. PMID:23277557

  5. A hybrid adaptive routing algorithm for event-driven wireless sensor networks.

    PubMed

    Figueiredo, Carlos M S; Nakamura, Eduardo F; Loureiro, Antonio A F

    2009-01-01

    Routing is a basic function in wireless sensor networks (WSNs). For these networks, routing algorithms depend on the characteristics of the applications and, consequently, there is no self-contained algorithm suitable for every case. In some scenarios, such as event-driven applications, the network behavior (traffic load) may vary widely, favoring different algorithms at different instants. This work presents a hybrid and adaptive algorithm for routing in WSNs, called Multi-MAF, that adapts its behavior autonomously in response to the variation of network conditions. In particular, the proposed algorithm applies both reactive and proactive strategies for routing infrastructure creation, and uses an event-detection estimation model to switch between the strategies and save energy. To show the advantages of the proposed approach, it is evaluated through simulations. Comparisons with independent reactive and proactive algorithms show improvements in energy consumption. PMID:22423207

  6. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

    National security has gained vital importance due to the increasing number of suspicious and terrorist events across the globe. The use of different subfields of information technology has also attracted researchers and practitioners to design systems that can detect the main members actually responsible for such events. In this paper, we present a novel method to predict key players in a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier for the detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies, including two publicly available datasets and one local network. PMID:25136674
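
    A minimal version of the "centrality features plus classifier" pipeline might look like the sketch below. It is an assumption-laden illustration: the chosen centrality measures, the toy graph, the invented key-player labels and the logistic-regression stand-in for the paper's hybrid classifier are all placeholders.

```python
import networkx as nx                                  # pip install networkx
from sklearn.linear_model import LogisticRegression   # pip install scikit-learn

# Toy covert network; the "key player" ground truth is invented.
G = nx.karate_club_graph()
key_players = {0, 33}                      # pretend these are the known leaders

def centrality_features(graph):
    deg = nx.degree_centrality(graph)
    btw = nx.betweenness_centrality(graph)
    clo = nx.closeness_centrality(graph)
    eig = nx.eigenvector_centrality(graph, max_iter=1000)
    return {n: [deg[n], btw[n], clo[n], eig[n]] for n in graph.nodes}

feats = centrality_features(G)
X = [feats[n] for n in G.nodes]
y = [1 if n in key_players else 0 for n in G.nodes]

# stand-in for the hybrid classifier described in the paper
clf = LogisticRegression(class_weight="balanced").fit(X, y)
scores = clf.predict_proba(X)[:, 1]
ranked = sorted(zip(G.nodes, scores), key=lambda t: -t[1])[:5]
print("top candidate key players:", ranked)
```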

  7. A Markovian event-based framework for stochastic spiking neural networks.

    PubMed

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, the information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input they receive, and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks and, in particular, the ability to deduce the next spike time from a spike train, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e., one based on the computation of the spike times. We show that the firing times of the neurons in the networks constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived in such classical cases of neural networks as the linear integrate-and-fire neuron models with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks. PMID:21499739

  8. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    PubMed

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where the synchronization occurs within one group, while there is no synchronization among different groups. In this paper, under event-based mechanism, pinning cluster synchronization in an array of coupled neural networks is studied. A new event-triggered sampled-data transmission strategy, where only local and event-triggering states are utilized to update the broadcasting state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants. Hence, this will reduce the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results. PMID:26829603
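
    The event-triggered transmission idea (a node rebroadcasts its state only when it has drifted sufficiently from the last broadcast value) can be sketched generically. The scalar dynamics, the triangle-graph Laplacian, the coupling gain and the static triggering threshold below are invented; the paper's triggering condition, pinning gains and cluster structure are not reproduced.

```python
import numpy as np

# Three coupled scalar nodes; each node rebroadcasts its state only when the
# gap to its last broadcast exceeds a threshold (event-triggered sampling).
def simulate(T=2000, dt=0.005, threshold=0.05, coupling=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, 3)              # true states
    xb = x.copy()                          # last broadcast states
    L = np.array([[ 2, -1, -1],
                  [-1,  2, -1],
                  [-1, -1,  2]], float)    # Laplacian of a triangle graph
    events = 0
    for _ in range(T):
        # each node's controller uses only broadcast (sampled) information
        x += dt * (-0.5 * x - coupling * (L @ xb))
        drift = np.abs(x - xb) > threshold
        events += int(drift.sum())
        xb[drift] = x[drift]               # event: update the broadcast value
    return x, events

x_final, n_events = simulate()
print("final spread:", round(float(x_final.max() - x_final.min()), 4),
      "broadcast events:", n_events)
```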

  9. Proton Single Event Effects (SEE) Testing of the Myrinet Crossbar Switch and Network Interface Card

    NASA Technical Reports Server (NTRS)

    Howard, James W., Jr.; LaBel, Kenneth A.; Carts, Martin A.; Stattel, Ronald; Irwin, Timothy L.; Day, John H. (Technical Monitor)

    2002-01-01

    As part of the Remote Exploration and Experimentation Project (REE), work was performed to do a proton SEE (Single Event Effect) evaluation of the Myricom network protocol system (Myrinet). This testing included the evaluation of the Myrinet crossbar switch and the Network Interface Card (NIC). To this end, two crossbar switch devices and five components in the NIC were exposed to the proton beam at the University of California at Davis Crocker Nuclear Laboratory (CNL).

  10. A network of discrete events for the representation and analysis of diffusion dynamics

    NASA Astrophysics Data System (ADS)

    Pintus, Alberto M.; Pazzona, Federico G.; Demontis, Pierfranco; Suffritti, Giuseppe B.

    2015-11-01

    We developed a coarse-grained description of the phenomenology of diffusive processes, in terms of a space of discrete events and its representation as a network. Once a proper classification of the discrete events underlying the diffusive process is carried out, their transition matrix is calculated on the basis of molecular dynamics data. This matrix can be represented as a directed, weighted network where nodes represent discrete events, and the weight of edges is given by the probability that one follows the other. The structure of this network reflects dynamical properties of the process of interest in such features as its modularity and the entropy rate of nodes. As an example of the applicability of this conceptual framework, we discuss here the physics of diffusion of small non-polar molecules in a microporous material, in terms of the structure of the corresponding network of events, and explain on this basis the diffusivity trends observed. A quantitative account of these trends is obtained by considering the contribution of the various events to the displacement autocorrelation function.
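
    The construction of the event network and a per-node entropy rate can be sketched directly. The code below is a hedged illustration that assumes the discrete events have already been classified into a time-ordered sequence of labels (the labels themselves are invented); each node's entropy rate is taken as the Shannon entropy of its outgoing transition probabilities.

```python
import math
from collections import Counter, defaultdict

# Time-ordered sequence of classified discrete events (labels are invented).
events = ["hop", "rattle", "hop", "exit", "hop", "rattle", "rattle", "hop", "exit"]

# Count transitions between consecutive events.
counts = defaultdict(Counter)
for a, b in zip(events, events[1:]):
    counts[a][b] += 1

# Row-normalise into a directed, weighted network: weight = P(b follows a).
network = {
    a: {b: c / sum(row.values()) for b, c in row.items()}
    for a, row in counts.items()
}

# Entropy rate of each node = Shannon entropy of its outgoing edge weights.
entropy_rate = {
    a: -sum(p * math.log2(p) for p in row.values())
    for a, row in network.items()
}

for a, row in network.items():
    print(a, "->", row, " H =", round(entropy_rate[a], 3))
```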

  11. A Database of Tornado Events as Perceived by the USArray Transportable Array Network

    NASA Astrophysics Data System (ADS)

    Tytell, J. E.; Vernon, F.; Reyes, J. C.

    2015-12-01

    Over the course of the deployment of Earthscope's USArray Transportable Array (TA) network, there have been numerous tornado events within the changing footprint of the network. The Array Network Facility, based in San Diego, California, has compiled a database of these tornado events based on data provided by the NOAA Storm Prediction Center (SPC). The SPC data consist of parameters such as start-end point track data for each event, maximum EF intensities, and maximum track widths. Our database is Antelope driven and combines these data from the SPC with detailed station information from the TA network. We are now able to list all available TA stations during any specific tornado event date and also provide a single calculated "nearest" TA station per individual tornado event. We aim to provide this database as a starting resource for those with an interest in investigating tornado signatures within surface pressure and seismic response data. On a larger scale, the database may be of particular interest to the infrasound research community.
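
    Computing a single "nearest" TA station for a tornado track reduces to a short geodesic calculation. The sketch below uses the haversine formula with hypothetical station codes, coordinates and SPC track points; it is not the Antelope-based implementation used at the Array Network Facility.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical TA stations (code, lat, lon) and one SPC start-to-end track.
stations = [("X40A", 34.1, -97.5), ("Y41A", 33.4, -96.9), ("Z42A", 35.0, -98.2)]
track = [(34.0, -97.2), (34.05, -97.1), (34.1, -97.0)]

def nearest_station(track_points, stations):
    # minimum over stations of the minimum distance to any track point
    return min(
        ((code, min(haversine_km(lat, lon, plat, plon) for plat, plon in track_points))
         for code, lat, lon in stations),
        key=lambda t: t[1],
    )

print(nearest_station(track, stations))   # (station code, distance in km)
```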

  12. Builders Challenge High Performance Builder Spotlight - NextGen Home, Las Vegas, Nevada

    SciTech Connect

    2009-01-01

    Building America Builders Challenge fact sheet on the NextGen demo home built in Las Vegas. The home has a Home Energy Rating System (HERS) index score of 44 with R-40 spray foam attic insulation, R-40 insulated concrete walls, and a 4-kW DC solar laminate.

  13. Energy Conservation for the Home Builder: A Course for Residential Builders. Course Outline and Instructional Materials.

    ERIC Educational Resources Information Center

    Koenigshofer, Daniel R.

    Background information, handouts and related instructional materials comprise this manual for conducting a course on energy conservation for home builders. Information presented in the five- and ten-hour course is intended to help residential contractors make appropriate and cost-effective decisions in constructing energy-efficient dwellings.…

  14. Builders Challenge High Performance Builder Spotlight: Artistic Homes, Albuquerque, New Mexico

    SciTech Connect

    2009-12-22

    Building America Builders Challenge fact sheet on Artistic Homes of Albuquerque, New Mexico. Standard features of their homes include advanced framed 2x6 24-inch on center walls, R-21 blown insulation in the walls, and high-efficiency windows.

  15. Strain Rate by Geodetic Observations Associated with Seismic Events in the SIRGAS-CON Network Region.

    NASA Astrophysics Data System (ADS)

    Marotta, G. S.; Franca, G.; Galera Monico, J. F.; Fuck, R. A.

    2014-12-01

    This research investigates surface strains related to seismic events and their relationship with pre- and post-seismic events in the South American, Antarctic, Nazca, Cocos, North American and Caribbean plates, by analyzing the variation of estimated earth coordinates for the period 2000-2014 supplied by a geodetic network called SIRGAS-CON. Based on data provided by the USGS for the same period, and after applying the global congruency test, we selected the events associated with unstable geodetic network points. The resulting strains were estimated based on the finite element method. It was possible to determine the strains, along with the resulting directions, for pre- and post-seismic events, considering each region formed for analysis as a homogeneous solid body. Later, a multi-year solution of the network was estimated and used to estimate the strain rates of the earth surface from the changing directions of the velocity vectors of 332 geodetic points located on the South American plate and surrounding plates. The strain rate was determined and, using the computed Euler vector, it was possible to estimate the convergence and accommodation rates for each plate. The results showed that contraction regions coincide with the locations of most of the high-magnitude seismic events. This suggests that the major movements detected at the surface occur in regions with more heterogeneous geological structures and multiple rupture events; that significant amounts of elastic strain can be accumulated on geological structures away from the plate-boundary faults; and that the behavior of contractions and extensions is similar to what has been found in seismological studies. Despite the association between seismic events and the strain of the geodetic network, some events of high magnitude were excluded because they do not produce surface strain, being located at great depths. It was confirmed that events of greater magnitude produce an increased surface strain rate when compared with other events at similar depths.

  16. Event-triggered H∞ reliable control for offshore structures in network environments

    NASA Astrophysics Data System (ADS)

    Zhang, Bao-Lin; Han, Qing-Long; Zhang, Xian-Ming

    2016-04-01

    This paper investigates the network-based modeling and event-triggered H∞ reliable control for an offshore structure. First, a network-based model of the offshore structure subject to external wave force and actuator faults is presented. Second, an event-triggering mechanism is proposed such that during the control implementation, only requisite sampled-data is transmitted over networks. Third, an event-triggered H∞ reliable control problem for the offshore structure is solved by employing the Lyapunov-Krasovskii functional approach, and the desired controller can be derived. It is shown through simulation results that for possible actuator failures, the networked controller is capable of guaranteeing the stability of the offshore structure. In addition, compared with the H∞ control scheme without network settings, the proposed controller can suppress the vibration of the offshore structure to almost the same level as the H∞ controller, while the former requires less control cost. Furthermore, under the network-based controller, the communication resources can be saved significantly.

  17. Non-linear time series analysis of precipitation events using regional climate networks for Germany

    NASA Astrophysics Data System (ADS)

    Rheinwalt, Aljoscha; Boers, Niklas; Marwan, Norbert; Kurths, Jürgen; Hoffmann, Peter; Gerstengarbe, Friedrich-Wilhelm; Werner, Peter

    2016-02-01

    Synchronous occurrences of heavy rainfall events and the study of their relation in time and space are of large socio-economical relevance, for instance for the agricultural and insurance sectors, but also for the general well-being of the population. In this study, the spatial synchronization structure is analyzed as a regional climate network constructed from precipitation event series. The similarity between event series is determined by the number of synchronous occurrences. We propose a novel standardization of this number that results in synchronization scores which are not biased by the number of events in the respective time series. Additionally, we introduce a new version of the network measure directionality that measures the spatial directionality of weighted links by also taking account of the effects of the spatial embedding of the network. This measure provides an estimate of heavy precipitation isochrones by pointing out directions along which rainfall events synchronize. We propose a climatological interpretation of this measure in terms of propagating fronts or event traces and confirm it for Germany by comparing our results to known atmospheric circulation patterns.
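
    The core step of counting synchronous heavy-rainfall events between two grid cells, prior to the standardization proposed in the paper, can be sketched as follows. The tolerance window and the permutation-based standardization are assumptions for illustration; the paper's exact standardization formula is not reproduced here.

```python
import random

def sync_count(events_a, events_b, tau=2):
    """Number of events in series A that have a partner in B within +/- tau days.
    events_a, events_b: sorted lists of event days (integers)."""
    b = set(events_b)
    return sum(any((day + k) in b for k in range(-tau, tau + 1)) for day in events_a)

def standardized_score(events_a, events_b, n_days=3650, n_perm=500, tau=2, seed=1):
    """Compare the observed count to counts for randomly re-drawn event days,
    so the score is not biased by how many events each series contains."""
    rng = random.Random(seed)
    obs = sync_count(events_a, events_b, tau)
    null = []
    for _ in range(n_perm):
        ra = rng.sample(range(n_days), len(events_a))
        rb = rng.sample(range(n_days), len(events_b))
        null.append(sync_count(sorted(ra), sorted(rb), tau))
    mean = sum(null) / n_perm
    var = sum((x - mean) ** 2 for x in null) / n_perm
    return (obs - mean) / (var ** 0.5 + 1e-12)

if __name__ == "__main__":
    a = [10, 50, 51, 200, 900]
    b = [11, 49, 202, 500]
    print("standardized synchronization score:", round(standardized_score(a, b), 2))
```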

  18. Detection of stick-slip events within the Whillans Ice Stream using an artificial neural network

    NASA Astrophysics Data System (ADS)

    Bernsen, S. P.

    2014-12-01

    Temporal changes in the periodic stick-slip events on the Whillans Ice Stream (WIS) help to understand the hydrosphere-cryosphere coupling in West Antarctica. Previous studies have shown that the periodic behavior has been ongoing for a number of years, but the record of slip events is incomplete. Rayleigh waves from WIS grounding-line events exhibit different patterns than events from the interior of the glacier. An algorithm using a backpropagation neural network is proposed to efficiently extract surface waves that result from stick-slip events. A neural network approach has the advantages of machine learning and simplified mathematics, and it eliminates the need for an analyst to correctly pick first arrivals. Training data have been assembled using 107 events occurring during the 2010 austral summer that were previously identified as stick-slip events at the grounding line as well as in the interior of the WIS. A 0.1 s moving window over 3 s of each of the preprocessed attributes is input into the neural network for automated surface wave detection. Following surface wave detection, a much longer 30-minute sliding window is used to classify surface wave detections as grounding-line, interior, or non-stick-slip events. Similar to automatic detection algorithms for body waves, preprocessed attributes such as the STA/LTA ratio, degree of polarization, variance, and skewness exhibit obvious patterns during the onset of surface waves. The automated event detection could lead to more cost-effective means of data collection in future seismic experiments, especially with an increase in array density in cold weather regions.
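
    One of the preprocessing attributes mentioned, the STA/LTA ratio, can be sketched together with a simple onset check. This is a generic illustration, not the study's backpropagation network: the window lengths, the synthetic trace and the threshold stand-in for the classifier are all assumptions.

```python
import numpy as np

def sta_lta(trace, sta_len=50, lta_len=500):
    """Trailing short-term-average / long-term-average ratio of squared amplitude."""
    energy = trace ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    ratio = np.zeros(len(energy))
    idx = np.arange(lta_len, len(energy))
    sta = (csum[idx + 1] - csum[idx + 1 - sta_len]) / sta_len
    lta = (csum[idx + 1] - csum[idx + 1 - lta_len]) / lta_len
    ratio[idx] = sta / lta
    return ratio

# Synthetic trace: background noise with an emergent "surface wave" packet.
rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 5000)
trace[3000:3400] += 6 * np.sin(np.linspace(0, 40 * np.pi, 400))

ratio = sta_lta(trace)
# A neural network would consume windows of such attributes; here a plain
# threshold marks candidate onsets for illustration only.
candidates = np.flatnonzero(ratio > 4.0)
print("first candidate sample:", int(candidates[0]) if candidates.size else None)
```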

  19. Economic Comparison of On-Board Module Builder Harvest Methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton pickers with on-board module builders (OBMB) eliminate the need for boll buggies, module builders, and the tractors and labor needed to operate this machinery. Additionally, field efficiency may be increased due to less stoppage for unloading and/or waiting to unload. This study estimates the ...

  20. The magnetic network location of explosive events observed in the solar transition region

    NASA Technical Reports Server (NTRS)

    Porter, J. G.; Dere, K. P.

    1991-01-01

    Compact short-lived explosive events have been observed in solar transition region lines with the High-Resolution Telescope and Spectrograph (HRTS) flown by the Naval Research Laboratory on a series of rockets and on Spacelab 2. Data from Spacelab 2 are coaligned with a simultaneous magnetogram and near-simultaneous He I 10,380-A spectroheliogram obtained at the National Solar Observatory at Kitt Peak. The comparison shows that the explosive events occur in the solar magnetic network lanes at the boundaries of supergranular convective cells. However, the events occur away from the larger concentrations of magnetic flux in the network, in contradiction to the observed tendency of the more energetic solar phenomena to be associated with the stronger magnetic fields.

  1. Event-driven model predictive control of sewage pumping stations for sulfide mitigation in sewer networks.

    PubMed

    Liu, Yiqi; Ganigué, Ramon; Sharma, Keshab; Yuan, Zhiguo

    2016-07-01

    Chemicals such as Mg(OH)2 and iron salts are widely dosed to sewage for mitigating sulfide-induced corrosion and odour problems in sewer networks. The chemical dosing rate is usually not automatically controlled but profiled based on experience of operators, often resulting in over- or under-dosing. Even though on-line control algorithms for chemical dosing in single pipes have been developed recently, network-wide control algorithms are currently not available. The key challenge is that a sewer network is typically wide-spread comprising many interconnected sewer pipes and pumping stations, making network-wide sulfide mitigation with a relatively limited number of dosing points challenging. In this paper, we propose and demonstrate an Event-driven Model Predictive Control (EMPC) methodology, which controls the flows of sewage streams containing the dosed chemical to ensure desirable distribution of the dosed chemical throughout the pipe sections of interests. First of all, a network-state model is proposed to predict the chemical concentration in a network. An EMPC algorithm is then designed to coordinate sewage pumping station operations to ensure desirable chemical distribution in the network. The performance of the proposed control methodology is demonstrated by applying the designed algorithm to a real sewer network simulated with the well-established SeweX model using real sewage flow and characteristics data. The EMPC strategy significantly improved the sulfide mitigation performance with the same chemical consumption, compared to the current practice. PMID:27124127

  2. Rare events statistics of random walks on networks: localisation and other dynamical phase transitions

    NASA Astrophysics Data System (ADS)

    De Bacco, Caterina; Guggiola, Alberto; Kühn, Reimer; Paga, Pierre

    2016-05-01

    Rare event statistics for random walks on complex networks are investigated using the large deviation formalism. Within this formalism, rare events are realised as typical events in a suitably deformed path-ensemble, and their statistics can be studied in terms of spectral properties of a deformed Markov transition matrix. We observe two different types of phase transition in such systems: (i) rare events which are singled out for sufficiently large values of the deformation parameter may correspond to localised modes of the deformed transition matrix; (ii) ‘mode-switching transitions’ may occur as the deformation parameter is varied. Details depend on the nature of the observable for which the rare event statistics is studied, as well as on the underlying graph ensemble. In the present paper we report results on rare events statistics for path averages of random walks in Erdős–Rényi and scale free networks. Large deviation rate functions and localisation properties are studied numerically. For observables of the type considered here, we also derive an analytical approximation for the Legendre transform of the large deviation rate function, which is valid in the large connectivity limit. It is found to agree well with simulations.
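
    The spectral recipe behind this kind of large-deviation analysis, deforming (tilting) the transition matrix by an observable and reading off the leading eigenvalue, can be shown in a few lines. The small graph, the degree observable and the exponential deformation below are generic illustrations of the standard tilted-matrix construction, not the specific ensembles or observables studied in the paper.

```python
import numpy as np

# Random-walk transition matrix on a small undirected graph (adjacency A).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
W = A / A.sum(axis=1, keepdims=True)      # W[i, j] = P(j | i)

# Node observable whose path average we study (here: degree of the visited node).
f = A.sum(axis=1)

def scgf(s):
    """Scaled cumulant generating function: log of the leading eigenvalue of
    the tilted matrix W_s[i, j] = W[i, j] * exp(s * f(j))."""
    tilted = W * np.exp(s * f)[None, :]
    return np.log(np.max(np.abs(np.linalg.eigvals(tilted))))

# The derivative of scgf at s = 0 recovers the typical path average; its
# Legendre transform gives the large-deviation rate function.
for s in (-0.5, 0.0, 0.5):
    print(f"lambda({s:+.1f}) = {scgf(s):.4f}")
```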

  3. OSCAR experiment high-density network data report: Event 4 - April 21-23, 1981

    SciTech Connect

    Dana, M.T.; Easter, R.C.; Thorp, J.M.

    1984-12-01

    The OSCAR (Oxidation and Scavenging Characteristics of April Rains) experiment, conducted during April 1981, was a cooperative field investigation of wet removal in cyclonic storm systems. The high-density component of OSCAR was located in northeast Indiana and included sequential precipitation chemistry measurements on a 100 by 100 km network, as well as airborne air chemistry and cloud chemistry measurements, surface air chemistry measurements, and supporting meteorological measurements. Four separate storm events were studied during the experiment. This report summarizes data taken by Pacific Northwest Laboratory (PNL) during the fourth storm event, April 21-23. The report contains the high-density network precipitation chemistry data, air and cloud chemistry data from the two PNL aircraft, and meteorological data for the event, including standard National Weather Service products and radar and rawinsonde data from the event. 3 references, 80 figures, 11 tables.

  4. A Climate Network Based Index to Distinguish Sub- and Supercritical ENSO Events

    NASA Astrophysics Data System (ADS)

    Feng, Qingyi; Dijkstra, Henk

    2015-04-01

    The Bjerknes stability (BJ) index has frequently been used to measure the stability of the Pacific climate state with respect to the occurrence of El Niño-Southern Oscillation (ENSO) events. Although it has been recently criticized for not always reflecting the heat budget accurately, the BJ index nicely distinguishes the effects of different feedbacks on the growth of the ENSO mode of variability. Its main disadvantage is, however, that it has been determined from reanalysis products but not from available observations. This work proposes a similar stability index which is easier to evaluate. Tools of complex network theory are used to reconstruct a climate network from available sea surface temperature data. The new stability index Sd is derived from one of the topological properties (connectedness) of this network. By using output from the Cane-Zebiak model, we demonstrate that Sd provides similar information as the BJ index and can monitor whether an ENSO event is sub- or supercritical. By considering observed temperature data, we show that the 1972 and 1982 events were subcritical (excited by stochastic noise) while the 1997 and 2009 events were supercritical (sustained oscillation).

  5. Sequence-of-events-driven automation of the deep space network

    NASA Technical Reports Server (NTRS)

    Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.

    1996-01-01

    In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.

  6. Seismic network detection probability assessment using waveforms and accounting to event association logic

    NASA Astrophysics Data System (ADS)

    Pinsky, Vladimir; Shapira, Avi

    2016-05-01

    The geographical area where a seismic event of magnitude M ≥ Mt is detected by a seismic station network, for a defined probability, is derived from a station probability of detection estimated as a function of epicentral distance. The latter is determined from both the bulletin data and the waveforms recorded by the station during the occurrence of the event, with and without band-pass filtering. For simulating the real detection process, the waveforms are processed using the conventional Carl Johnson detection and association algorithm. An attempt is presented to account for the association time criterion in addition to the conventional approach adopted by the known PMC method.
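
    The station-level ingredient, a detection probability as a function of epicentral distance, can be sketched by fitting a logistic curve to detect/miss outcomes. The synthetic outcomes, the logistic form and the parameter values below are assumptions for illustration, not the procedure used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit    # pip install scipy

def p_detect(dist_km, d50, k):
    """Logistic detection probability: 0.5 at distance d50, steepness k."""
    return 1.0 / (1.0 + np.exp(k * (dist_km - d50)))

# Synthetic detect/miss outcomes for one station and one magnitude band.
rng = np.random.default_rng(1)
dist = rng.uniform(10, 600, 300)
detected = (rng.random(300) < p_detect(dist, 350.0, 0.02)).astype(float)

params, _ = curve_fit(p_detect, dist, detected, p0=[300.0, 0.01])
print("fitted d50 = %.0f km, k = %.3f" % tuple(params))
```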

  7. The Knowledge-Integrated Network Biomarkers Discovery for Major Adverse Cardiac Events

    PubMed Central

    Jin, Guangxu; Zhou, Xiaobo; Wang, Honghui; Zhao, Hong; Cui, Kemi; Zhang, Xiang-Sun; Chen, Luonan; Hazen, Stanley L.; Li, King; Wong, Stephen T. C.

    2010-01-01

    The mass spectrometry (MS) technology in clinical proteomics is very promising for the discovery of new biomarkers for disease management. To overcome the obstacle of data noise in MS analysis, we proposed a new approach of knowledge-integrated biomarker discovery using data from Major Adverse Cardiac Events (MACE) patients. We first built a cardiovascular-related network based on protein information coming from protein annotations in Uniprot, protein–protein interaction (PPI), and signal transduction databases. Distinct from previous machine learning methods in MS data processing, we then used statistical methods to discover biomarkers in the cardiovascular-related network. Through the tradeoff between known protein information and data noise in the mass spectrometry data, we could finally identify high-confidence biomarkers. Most importantly, aided by the protein–protein interaction network, that is, the cardiovascular-related network, we proposed a new type of biomarker, the network biomarker, composed of a set of proteins and the interactions among them. The candidate network biomarkers can classify the two groups of patients more accurately than current single ones that do not take biological molecular interactions into consideration. PMID:18665624

  8. Secret Forwarding of Events over Distributed Publish/Subscribe Overlay Network

    PubMed Central

    Kim, Beom Heyn

    2016-01-01

    Publish/subscribe is a communication paradigm where loosely-coupled clients communicate in an asynchronous fashion. Publish/subscribe supports the flexible development of large-scale, event-driven and ubiquitous systems. Publish/subscribe is prevalent in a number of application domains such as social networking, distributed business processes and real-time mission-critical systems. Many publish/subscribe applications are sensitive to message loss and violation of privacy. To overcome such issues, we propose a novel method that uses secret sharing and replication techniques to reliably and confidentially deliver decryption keys along with encrypted publications, even in the presence of several Byzantine brokers across publish/subscribe overlay networks. We also propose a framework for dynamically and strategically allocating broker replicas based on flexibly definable criteria for reliability and performance. Moreover, a thorough evaluation is done through a case study on social networks using the real trace of interactions among Facebook users. PMID:27367610
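
    The "secret sharing plus replication" idea can be illustrated with a textbook Shamir scheme over a prime field: the publisher splits a decryption key into shares, one per broker replica, and any threshold-sized subset of honest brokers suffices to reconstruct it. The prime, the (3, 5) threshold and the toy key below are assumptions; this is not the paper's protocol implementation.

```python
import random

PRIME = 2 ** 127 - 1   # Mersenne prime large enough for a toy key

def make_shares(secret, n_shares=5, threshold=3, seed=42):
    """Shamir (threshold, n) sharing of an integer secret modulo PRIME."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from >= threshold shares.
    Requires Python 3.8+ for modular inverse via pow(den, -1, PRIME)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 0xDEADBEEFCAFEBABE                          # toy decryption key
shares = make_shares(key)                         # one share per broker replica
print(reconstruct(shares[:3]) == key)             # any 3 of the 5 brokers suffice
```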

  9. Secret Forwarding of Events over Distributed Publish/Subscribe Overlay Network.

    PubMed

    Yoon, Young; Kim, Beom Heyn

    2016-01-01

    Publish/subscribe is a communication paradigm where loosely-coupled clients communicate in an asynchronous fashion. Publish/subscribe supports the flexible development of large-scale, event-driven and ubiquitous systems. Publish/subscribe is prevalent in a number of application domains such as social networking, distributed business processes and real-time mission-critical systems. Many publish/subscribe applications are sensitive to message loss and violation of privacy. To overcome such issues, we propose a novel method that uses secret sharing and replication techniques to reliably and confidentially deliver decryption keys along with encrypted publications, even in the presence of several Byzantine brokers across publish/subscribe overlay networks. We also propose a framework for dynamically and strategically allocating broker replicas based on flexibly definable criteria for reliability and performance. Moreover, a thorough evaluation is done through a case study on social networks using the real trace of interactions among Facebook users. PMID:27367610

  10. Identifying positions from affiliation networks: Preserving the duality of people and events

    PubMed Central

    Field, Sam; Frank, Kenneth A.; Schiller, Kathryn; Riegle-Crumb, Catherine; Muller, Chandra

    2010-01-01

    Frank’s [Frank, K.A., 1995. Identifying cohesive subgroups. Social Networks 17, 27–56] clustering technique for one-mode social network data is adapted to identify positions in affiliation networks by drawing on recent extensions of p* models to two-mode data. The algorithm is applied to the classic Deep South data on southern women and the social events in which they participated with results comparable to other algorithms. Monte Carlo simulations are used to generate sampling distributions to test for the presence of clustering in new data sets and to evaluate the performance of the algorithm. The algorithm and simulation results are then applied to high school students’ transcripts from one school from the Adolescent Health and Academic Achievement (AHAA) extension of the National Longitudinal Study of Adolescent Health. PMID:20354579

  11. Virtualization of Event Sources in Wireless Sensor Networks for the Internet of Things

    PubMed Central

    Martínez, Néstor Lucas; Martínez, José-Fernán; Díaz, Vicente Hernández

    2014-01-01

    Wireless Sensor Networks (WSNs) are generally used to collect information from the environment. The gathered data are delivered mainly to sinks or gateways that become the endpoints where applications can retrieve and process such data. However, applications would also expect from a WSN an event-driven operational model, so that they can be notified whenever specific environmental changes occur instead of continuously analyzing the data provided periodically. In either operational model, WSNs represent a collection of interconnected objects, as outlined by the Internet of Things. Additionally, in order to fulfill the Internet of Things principles, Wireless Sensor Networks must have a virtual representation that allows indirect access to their resources, a model that should also include the virtualization of event sources in a WSN. Thus, in this paper a model for a virtual representation of event sources in a WSN is proposed. They are modeled as internet resources that are accessible by any internet application, following an Internet of Things approach. The model has been tested in a real implementation where a WSN has been deployed in an open neighborhood environment. Different event sources have been identified in the proposed scenario, and they have been represented following the proposed model. PMID:25470489

  12. Virtualization of event sources in wireless sensor networks for the internet of things.

    PubMed

    Lucas Martínez, Néstor; Martínez, José-Fernán; Hernández Díaz, Vicente

    2014-01-01

    Wireless Sensor Networks (WSNs) are generally used to collect information from the environment. The gathered data are delivered mainly to sinks or gateways that become the endpoints where applications can retrieve and process such data. However, applications would also expect from a WSN an event-driven operational model, so that they can be notified whenever specific environmental changes occur instead of continuously analyzing the data provided periodically. In either operational model, WSNs represent a collection of interconnected objects, as outlined by the Internet of Things. Additionally, in order to fulfill the Internet of Things principles, Wireless Sensor Networks must have a virtual representation that allows indirect access to their resources, a model that should also include the virtualization of event sources in a WSN. Thus, in this paper a model for a virtual representation of event sources in a WSN is proposed. They are modeled as internet resources that are accessible by any internet application, following an Internet of Things approach. The model has been tested in a real implementation where a WSN has been deployed in an open neighborhood environment. Different event sources have been identified in the proposed scenario, and they have been represented following the proposed model. PMID:25470489

  13. 45. VIEW OF BRONZE BUILDERS PLATE LOCATED ON NORTH SIDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    45. VIEW OF BRONZE BUILDERS PLATE LOCATED ON NORTH SIDE AT OUTERMOST END OF WESTERN APPROACH WALL - Tomlinson Bridge, Spanning Quinnipiac River at Forbes Street (U.S. Route 1), New Haven, New Haven County, CT

  14. 8. DETAIL OF BUILDER'S PLATE, PROCLAIMING THE INVENTOR OF THIS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF BUILDER'S PLATE, PROCLAIMING THE INVENTOR OF THIS BRIDGE TYPE, WILLIAM SCHERZER. - Pennsylvania Railroad, "Eight-track" Bascule Bridge, Spanning Sanitary & Ship Canal, west of Western Avenue, Chicago, Cook County, IL

  15. Detail of builder's plate at northeast end. Waterville Bridge, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of builder's plate at northeast end. - Waterville Bridge, Spanning Swatara Creek at Appalachian Trail (moved from Little Pine Creek at State Route 44, Waterville, Lycoming County), Green Point, Lebanon County, PA

  16. Discrete-event simulation of a wide-area health care network.

    PubMed Central

    McDaniel, J G

    1995-01-01

    OBJECTIVE: Predict the behavior and estimate the telecommunication cost of a wide-area message store-and-forward network for health care providers that uses the telephone system. DESIGN: A tool with which to perform large-scale discrete-event simulations was developed. Network models for star and mesh topologies were constructed to analyze the differences in performances and telecommunication costs. The distribution of nodes in the network models approximates the distribution of physicians, hospitals, medical labs, and insurers in the Province of Saskatchewan, Canada. Modeling parameters were based on measurements taken from a prototype telephone network and a survey conducted at two medical clinics. Simulation studies were conducted for both topologies. RESULTS: For either topology, the telecommunication cost of a network in Saskatchewan is projected to be less than $100 (Canadian) per month per node. The estimated telecommunication cost of the star topology is approximately half that of the mesh. Simulations predict that a mean end-to-end message delivery time of two hours or less is achievable at this cost. A doubling of the data volume results in an increase of less than 50% in the mean end-to-end message transfer time. CONCLUSION: The simulation models provided an estimate of network performance and telecommunication cost in a specific Canadian province. At the expected operating point, network performance appeared to be relatively insensitive to increases in data volume. Similar results might be anticipated in other rural states and provinces in North America where a telephone-based network is desired. PMID:7583646
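
    A store-and-forward network of this kind is typically simulated with a future-event list. The sketch below is a hedged, minimal discrete-event loop using a priority queue: the clinic count, call set-up times and service distributions are invented, and it models only message hand-offs to a single hub line (star topology), not the study's full Saskatchewan model.

```python
import heapq
import random

def simulate_star(n_clinics=20, n_messages=200, seed=3):
    """Minimal discrete-event simulation: clinics dial a central hub over phone
    lines. Times are in minutes; all distributions are illustrative assumptions."""
    rng = random.Random(seed)
    fel = []                                    # future-event list: (time, seq, kind, created_at)
    seq = 0
    hub_busy_until = 0.0
    delays = []

    for _ in range(n_messages):                 # schedule message-creation events
        t = rng.uniform(0, 8 * 60)              # created during an 8-hour day
        heapq.heappush(fel, (t, seq, "created", t))
        seq += 1

    while fel:
        t, _, kind, created_at = heapq.heappop(fel)
        if kind == "created":
            start = max(t, hub_busy_until)      # wait if the hub line is busy
            service = rng.expovariate(1 / 2.0)  # ~2 minutes per call on average
            hub_busy_until = start + service
            heapq.heappush(fel, (hub_busy_until, seq, "delivered", created_at))
            seq += 1
        else:                                   # "delivered": record end-to-end delay
            delays.append(t - created_at)

    return sum(delays) / len(delays)

print("mean end-to-end delay (min):", round(simulate_star(), 1))
```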

  17. A Tool for Modelling the Impact of Triggered Landslide Events on Road Networks

    NASA Astrophysics Data System (ADS)

    Taylor, F. E.; Santangelo, M.; Marchesini, I.; Malamud, B. D.; Guzzetti, F.

    2014-12-01

    In the minutes to weeks after a landslide trigger such as an earthquake or heavy rain, tens to thousands of landslides may occur across a region, resulting in simultaneous blockages across the road network, which can impact recovery efforts. In this paper, we show the development, application, and confrontation with observed data of a model to semi-stochastically simulate triggered landslide events and their impact on road network topologies. In this model, "synthetic" triggered landslide event inventories are created by randomly selecting landslide sizes and shapes from already established statistical distributions. The landslides are then semi-randomly distributed over the region's road network, where they are more or less likely to land based on a landslide susceptibility map. The number, size and network impact of the road blockages is then calculated. This process is repeated in a Monte Carlo type simulation to assess a range of scenarios. Due to the generally applicable statistical distributions used to create the synthetic triggered landslide event inventories and the relatively minimal data requirements to run the model, the model is theoretically applicable to many regions of the world where triggered landslide events occur. Current work focuses on applying the model to two regions: (i) the Collazzone basin (79 km2) in Central Italy, where 422 landslides were triggered by rapid snowmelt in January 1997, and (ii) the Oat Mountain quadrangle (155 km2) in California, USA, where 1,350 landslides were triggered by the Northridge Earthquake (M = 6.7) in January 1994. When appropriate adjustments are made to susceptibility in the immediate vicinity of the roads, model results match observations reasonably well. In Collazzone (length of road = 153 km, landslide density = 5.2 landslides km-2), the median number of road blockages over 100 model runs was 5 (±2.5 s.d.), compared to the observed number of 5. In Northridge (length of road = 780 km, landslide density = 8
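
    The semi-stochastic simulation loop can be paraphrased in a short Monte Carlo sketch: sample landslide areas from a heavy-tailed distribution, drop them on grid cells in proportion to susceptibility, and count how many land on road cells. Everything below (the grid, susceptibility values, distribution parameters and road mask) is an invented stand-in for the model's real inputs.

```python
import random

random.seed(7)

def sample_area(alpha=2.4, a_min=100.0):
    """Power-law tail for landslide area (m^2); a crude stand-in for the
    statistical area distributions used to build synthetic inventories."""
    u = random.random()
    return a_min * (1 - u) ** (-1 / (alpha - 1))

def run_once(n_landslides, susceptibility, road_cells, cell_area=1e4):
    """Distribute landslides over grid cells in proportion to susceptibility
    and count blockages where a landslide hits a road cell."""
    cells = list(susceptibility)
    weights = [susceptibility[c] for c in cells]
    blockages = 0
    for _ in range(n_landslides):
        cell = random.choices(cells, weights=weights, k=1)[0]
        area = sample_area()
        # a landslide "blocks" a road cell if it covers a noticeable fraction of it
        if cell in road_cells and area > 0.02 * cell_area:
            blockages += 1
    return blockages

# Toy 10x10 susceptibility grid and a straight road along row 5.
susceptibility = {(i, j): random.uniform(0.1, 1.0) for i in range(10) for j in range(10)}
road_cells = {(5, j) for j in range(10)}

runs = sorted(run_once(400, susceptibility, road_cells) for _ in range(100))
print("median blockages over 100 runs:", runs[50])
```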

  18. Using the relational event model (REM) to investigate the temporal dynamics of animal social networks

    PubMed Central

    Tranmer, Mark; Marcum, Christopher Steven; Morton, F. Blake; Croft, Darren P.; de Kort, Selvino R.

    2015-01-01

    Social dynamics are of fundamental importance in animal societies. Studies on nonhuman animal social systems often aggregate social interaction event data into a single network within a particular time frame. Analysis of the resulting network can provide a useful insight into the overall extent of interaction. However, through aggregation, information is lost about the order in which interactions occurred, and hence the sequences of actions over time. Many research hypotheses relate directly to the sequence of actions, such as the recency or rate of action, rather than to their overall volume or presence. Here, we demonstrate how the temporal structure of social interaction sequences can be quantified from disaggregated event data using the relational event model (REM). We first outline the REM, explaining why it is different from other models for longitudinal data, and how it can be used to model sequences of events unfolding in a network. We then discuss a case study on the European jackdaw, Corvus monedula, in which temporal patterns of persistence and reciprocity of action are of interest, and present and discuss the results of a REM analysis of these data. One of the strengths of a REM analysis is its ability to take into account different ways in which data are collected. Having explained how to take into account the way in which the data were collected for the jackdaw study, we briefly discuss the application of the model to other studies. We provide details of how the models may be fitted in the R statistical software environment and outline some recent extensions to the REM framework. PMID:26190856

  19. First-break refraction event picking and seismic data trace editing using neural networks

    SciTech Connect

    McCormack, M.D.; Zaucha, D.E.; Dushek, D.W. )

    1993-01-01

    Interactive seismic processing systems for editing noisy seismic traces and picking the first-break refraction events have been developed using a neural network learning algorithm. The authors employ a back propagation neural network (BNN) paradigm modified to improve the convergence rate of the BNN. The BNN is interactively "trained" to edit seismic data or pick first breaks by a human processor who judiciously selects and presents to the network examples of trace edits or refraction picks. The network then iteratively adjusts a set of internal weights until it can accurately duplicate the examples provided by the user. After the training session is completed, the BNN system can then process new data sets in a manner that mimics the human processor. Synthetic modeling studies indicated that the BNN uses many of the same subjective criteria that humans employ in editing and picking seismic data sets. Automated trace editing and first-break picking based on the modified BNN paradigm achieve 90 to 98 percent agreement with manual methods for seismic data of moderate to good quality. Productivity increases over manual editing and picking techniques range from 60 percent for two-dimensional (2-D) data sets up to 800 percent for three-dimensional (3-D) data sets. Neural network-based seismic processing can provide consistent and high quality results with substantial improvements in processing efficiency.

  20. The Real Time Display Builder (RTDB)

    NASA Technical Reports Server (NTRS)

    Kindred, Erick D.; Bailey, Samuel A., Jr.

    1989-01-01

    The Real Time Display Builder (RTDB) is a prototype interactive graphics tool that builds logic-driven displays. These displays reflect current system status, implement fault detection algorithms in real time, and incorporate the operational knowledge of experienced flight controllers. RTDB utilizes an object-oriented approach that integrates the display symbols with the underlying operational logic. This approach allows the user to specify the screen layout and the driving logic as the display is being built. RTDB is being developed under UNIX in C utilizing the MASSCOMP graphics environment with appropriate functional separation to ease portability to other graphics environments. RTDB grew from the need to develop customized real-time data-driven Space Shuttle systems displays. One display, using initial functionality of the tool, was operational during the orbit phase of STS-26 Discovery. RTDB is being used to produce subsequent displays for the Real Time Data System project currently under development within the Mission Operations Directorate at NASA/JSC. The features of the tool, its current state of development, and its applications are discussed.

  1. Feedback between Accelerator Physicists and magnet builders

    SciTech Connect

    Peggs, S.

    1995-12-31

    Our task is not to record history but to change it. (K. Marx, paraphrased) How should Accelerator Physicists set magnet error specifications? In a crude social model, they place tolerance limits on undesirable nonlinearities and errors (higher order harmonics, component alignments, etc.). The Magnet Division then goes away for a suitably lengthy period of time, and comes back with a working magnet prototype that is reproduced in industry. A better solution is to set no specifications. Accelerator Physicists begin by evaluating expected values of harmonics, generated by the Magnet Division, before and during prototype construction. Damaging harmonics are traded off against innocuous harmonics as the prototype design evolves, lagging one generation behind the evolution of expected harmonics. Finally, the real harmonics are quickly evaluated during early industrial production, allowing a final round of performance trade-offs, using contingency scenarios prepared earlier. This solution assumes a close relationship and rapid feedback between the Accelerator Physicists and the magnet builders. What follows is one perspective of the way that rapid feedback was used to 'change history' (improve linear and dynamic aperture) at RHIC, to great benefit.

  2. Words Analysis of Online Chinese News Headlines about Trending Events: A Complex Network Perspective

    PubMed Central

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

    Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies who must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headline keywords and word relationships in online Chinese news, using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and considered adjacent relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we develop an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly. PMID:25807376
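
    The headline-to-network pipeline (segment, take words as nodes, make adjacent words edges) can be sketched with English tokens standing in for the Chinese segmentation step. The toy headlines, the whitespace "segmentation" and the networkx dependency are illustrative assumptions, not the authors' toolchain.

```python
from collections import Counter
from itertools import pairwise          # Python 3.10+

import networkx as nx                   # pip install networkx

# Toy headlines standing in for segmented Chinese news titles about one event.
headlines = [
    "bohai bay oil spill cleanup continues",
    "oil spill damages bohai bay fisheries",
    "cleanup crews battle bohai oil spill",
]

edge_counts = Counter()
for h in headlines:
    words = h.split()                   # stand-in for Chinese word segmentation
    for a, b in pairwise(words):        # adjacent words become edges
        edge_counts[tuple(sorted((a, b)))] += 1

G = nx.Graph()
for (a, b), w in edge_counts.items():
    G.add_edge(a, b, weight=w)

# Keywords emerge as high-strength nodes of the resulting word network.
strength = {n: sum(d["weight"] for _, _, d in G.edges(n, data=True)) for n in G}
print(sorted(strength.items(), key=lambda kv: -kv[1])[:5])
```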

  3. Words analysis of online Chinese news headlines about trending events: a complex network perspective.

    PubMed

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

    Because the volume of information available online is growing at breakneck speed, keeping up with meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies who must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headlines' keywords and word relationships in online Chinese news using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and considered adjacent relations as edges to construct word networks both using the whole sample and at the monthly level. Finally, we develop an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly. PMID:25807376

  4. Energy efficient data representation and aggregation with event region detection in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Banerjee, Torsha

    Unlike conventional networks, wireless sensor networks (WSNs) are limited in power, have much smaller memory buffers, and possess relatively slower processing speeds. These characteristics necessitate minimum transfer and storage of information in order to prolong the network lifetime. In this dissertation, we exploit the spatio-temporal nature of sensor data to approximate the current values of the sensors based on readings obtained from the sensor itself and its neighbors. We propose a Tree-based polynomial REGression algorithm (TREG) that addresses the problem of data compression in wireless sensor networks. Instead of aggregated data, a polynomial function (P) is computed by the regression function, TREG. The coefficients of P are then passed to achieve the following goals: (i) The sink can get attribute values in the regions devoid of sensor nodes, and (ii) Readings over any portion of the region can be obtained at one time by querying the root of the tree. As the size of the data packet from each tree node to its parent remains constant, the proposed scheme scales very well with growing network density or increased coverage area. Since physical attributes exhibit a gradual change over time, we propose an iterative scheme, UPDATE_COEFF, which obviates the need to perform the regression function repeatedly and uses approximations based on previous readings. Extensive simulations are performed on real world data to demonstrate the effectiveness of our proposed aggregation algorithm, TREG. Results reveal that for a network density of 0.0025 nodes/m2, a complete binary tree of depth 4 could keep the absolute error below 6%. A data compression ratio of about 0.02 is achieved using our proposed algorithm, which is almost independent of the tree depth. In addition, our proposed updating scheme makes the aggregation process faster while maintaining the desired error bounds. We also propose a Polynomial-based scheme that addresses the problem of Event Region
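
    A hedged sketch of the regression step behind a TREG-like scheme, under the assumption that a tree node fits a low-order polynomial P(x, y) to the (location, reading) pairs reported by its children and forwards only the coefficients, so packet size stays constant; the coordinates and readings below are invented.

```python
# Sketch: fit P(x, y) = a + b*x + c*y + d*x*y at a tree node and answer a query
# anywhere in the region from the coefficients alone. Data are hypothetical.
import numpy as np

def fit_polynomial(xs, ys, values):
    """Least-squares fit of the coefficients (a, b, c, d)."""
    A = np.column_stack([np.ones_like(xs), xs, ys, xs * ys])
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    return coeffs

def evaluate(coeffs, x, y):
    """Approximate the attribute value at (x, y), even where no node exists."""
    return float(np.array([1.0, x, y, x * y]) @ coeffs)

xs = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 10.0])    # child sensor x-coordinates
ys = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 18.0])     # child sensor y-coordinates
temps = np.array([21.0, 21.4, 22.1, 22.9, 23.8, 21.9])

coeffs = fit_polynomial(xs, ys, temps)
print("approximate reading at (25, 12):", round(evaluate(coeffs, 25.0, 12.0), 2))
```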

  5. Social network, presence of cardiovascular events and mortality in hypertensive patients.

    PubMed

    Menéndez-Villalva, C; Gamarra-Mondelo, M T; Alonso-Fachado, A; Naveira-Castelo, A; Montes-Martínez, A

    2015-07-01

    The aim of this study was to ascertain the relationship between social network and the appearance of cardiovascular events (CVEs) and mortality in patients with arterial hypertension (AHT). This is a cohort study of 236 patients with a 9-year follow-up. Measurements included age, sex, blood pressure (BP), diabetes, hypercholesterolemia, marital status, social network, social support, stage of family life cycle (FLC), mortality and CVEs. Patients with a low social network registered higher global mortality (hazard ratio (HR) 2.6 (95% confidence interval (CI) 1.3; 5.5)), as did the oldest patients (HR 5.6 (1.9; 16.8)), men (HR 3.5 (95% CI 1.3; 9.3)) and subjects in the last FLC stages (HR 4.3 (95% CI 1.3; 14.1)). Patients with low social support registered higher cardiovascular mortality (HR 2.6 (95% CI 1.1; 6.1)), as did the oldest patients (HR 12.4 (95% CI 2.8; 55.2)) and those with diabetes (HR 3.00 (95% CI 1.2; 7.6)). Patients with a low social network registered more CVEs (HR 2.1 (95% CI 1.1; 4.1)) than patients with an adequate network, as did the oldest patients (HR 3.1 (95% CI 1.4; 6.9)), subjects who presented with a higher grade of severity of AHT (HR 2.7 (1.3; 5.5)) and those in the last FLC stages (HR 2.5 (95% CI 1.0; 6.2)). A low social network is associated with mortality and the appearance of CVEs in patients with AHT. Low functional social support is associated with the appearance of cardiovascular mortality. PMID:25500900

  6. The Waveform Correlation Event Detection System project, Phase II: Testing with the IDC primary network

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Moore, S.G.

    1998-04-01

    Further improvements to the Waveform Correlation Event Detection System (WCEDS) developed by Sandia Laboratory have made it possible to test the system on the accepted Comprehensive Test Ban Treaty (CTBT) seismic monitoring network. For our test interval we selected a 24-hour period from December 1996, and chose to use the Reviewed Event Bulletin (REB) produced by the Prototype International Data Center (PIDC) as ground truth for evaluating the results. The network is heterogeneous, consisting of array and three-component sites, and as a result requires more flexible waveform processing algorithms than were available in the first version of the system. For simplicity and superior performance, we opted to use the spatial coherency algorithm of Wagner and Owens (1996) for both types of sites. Preliminary tests indicated that the existing version of WCEDS, which ignored directional information, could not achieve satisfactory detection or location performance for many of the smaller events in the REB, particularly those in the south Pacific where the network coverage is unusually sparse. To achieve an acceptable level of performance, we made modifications to include directional consistency checks for the correlations, making the regions of high correlation much less ambiguous. These checks require the production of continuous azimuth and slowness streams for each station, which is accomplished by means of FK processing for the arrays and power polarization processing for the three-component sites. In addition, we added the capability to use multiple frequency-banded data streams for each site to increase sensitivity to phases whose frequency content changes as a function of distance.
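
    A hedged sketch of the kind of directional consistency check described above: a station's correlation contributes to a candidate grid point only if its measured backazimuth (from FK or polarization processing) agrees with the geometric backazimuth to that point. The tolerance and coordinates below are invented for illustration.

```python
# Sketch: accept a station's contribution to a grid point only when the measured
# backazimuth is consistent with the station-to-grid-point bearing.
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees) from point 1 towards point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def consistent(measured_az, station, grid_point, tolerance=20.0):
    """True if the measured backazimuth points towards the candidate source."""
    expected = bearing(station[0], station[1], grid_point[0], grid_point[1])
    diff = abs(measured_az - expected) % 360.0
    return min(diff, 360.0 - diff) <= tolerance

station = (-23.7, 133.9)      # hypothetical array location (lat, lon)
candidate = (-6.1, 155.0)     # candidate grid point in the south Pacific
print(consistent(measured_az=60.0, station=station, grid_point=candidate))
```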

  7. A statistical framework for evaluating neural networks to predict recurrent events in breast cancer

    NASA Astrophysics Data System (ADS)

    Gorunescu, Florin; Gorunescu, Marina; El-Darzi, Elia; Gorunescu, Smaranda

    2010-07-01

    Breast cancer is the second leading cause of cancer deaths in women today. Sometimes, breast cancer can return after primary treatment. A medical diagnosis of recurrent cancer is often a more challenging task than the initial one. In this paper, we investigate the potential contribution of neural networks (NNs) to support health professionals in diagnosing such events. The NN algorithms are tested and applied to two different datasets. An extensive statistical analysis has been performed to verify our experiments. The results show that a simple network structure for both the multi-layer perceptron and radial basis function can produce equally good results, not all attributes are needed to train these algorithms and, finally, the classification performances of all algorithms are statistically robust. Moreover, we have shown that the best performing algorithm will strongly depend on the features of the datasets, and hence, there is not necessarily a single best classifier.
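
    A minimal sketch of this kind of experiment using scikit-learn's multilayer perceptron on synthetic data that stands in for the recurrence datasets (the actual data and the radial basis function models are not reproduced here); the deliberately small hidden layer echoes the finding that simple network structures can perform well.

```python
# Sketch: cross-validated MLP on synthetic data standing in for patient records.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=400, n_features=12, n_informative=6,
                           random_state=0)          # hypothetical recurrence data

# A deliberately simple network structure (one small hidden layer).
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                    random_state=0))

scores = cross_val_score(model, X, y, cv=5)
print("mean CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```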

  8. The Next Generation of NASA Night Sky Network: A Searchable Nationwide Database of Astronomy Events

    NASA Astrophysics Data System (ADS)

    Ames, Z.; Berendsen, M.; White, V.

    2010-08-01

    With support from NASA, the Astronomical Society of the Pacific (ASP) first developed the Night Sky Network (NSN) in 2004. The NSN was created in response to research conducted by the Institute for Learning Innovation (ILI) to determine what type of support amateur astronomers could use to increase the efficiency and extent of their educational outreach programs. Since its creation, the NSN has grown to include an online searchable database of toolkit resources, Presentation Skills Videos covering topics such as working with kids and how to answer difficult questions, and a searchable nationwide calendar of astronomy events that supports club organization. The features of the NSN have allowed the ASP to create a template that amateur science organizations might use to create a similar support network for their members and the public.

  9. HOS network-based classification of power quality events via regression algorithms

    NASA Astrophysics Data System (ADS)

    Palomares Salas, José Carlos; González de la Rosa, Juan José; Sierra Fernández, José María; Pérez, Agustín Agüera

    2015-12-01

    This work compares seven regression algorithms implemented in artificial neural networks (ANNs) supported by 14 power-quality features, which are based on higher-order statistics. Combining time and frequency domain estimators to deal with non-stationary measurement sequences, the final goal of the system is implementation in the future smart grid to guarantee compatibility among all connected equipment. The principal results are based on spectral kurtosis measurements, which easily adapt to the impulsive nature of power quality events. These results verify that the proposed technique is capable of offering interesting results for power quality (PQ) disturbance classification. The best results are obtained using radial basis networks, generalized regression, and multilayer perceptron, mainly due to the non-linear nature of the data.
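
    An illustrative sketch of a spectral-kurtosis-style feature of the kind used as input to the regression networks above: kurtosis of the short-time Fourier transform magnitude across time frames highlights impulsive disturbances. The signal and its parameters are synthetic, not taken from the paper.

```python
# Sketch: spectral kurtosis of a synthetic mains signal with a short impulsive
# disturbance. Sampling rate and STFT parameters are assumed for illustration.
import numpy as np
from scipy.signal import stft
from scipy.stats import kurtosis

fs = 3200                                         # samples per second (assumed)
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t)               # 50 Hz mains component
signal[1600:1640] += 2.0 * np.random.randn(40)    # short impulsive disturbance

freqs, times, Z = stft(signal, fs=fs, nperseg=256)
spectral_kurtosis = kurtosis(np.abs(Z), axis=1)   # one value per frequency bin

print("max spectral kurtosis: %.2f at %.0f Hz"
      % (spectral_kurtosis.max(), freqs[np.argmax(spectral_kurtosis)]))
```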

  10. How events determine spreading patterns: information transmission via internal and external influences on social networks

    NASA Astrophysics Data System (ADS)

    Liu, Chuang; Zhan, Xiu-Xiu; Zhang, Zi-Ke; Sun, Gui-Quan; Hui, Pak Ming

    2015-11-01

    Recently, information transmission models motivated by classical epidemic propagation have been applied to a wide range of social systems; these models generally assume that information transmits mainly among individuals via peer-to-peer interactions on social networks. In this paper, we consider one more channel through which users get information: out-of-social-network influence. Empirical analyses of the diffusion of eight typical events on a very large micro-blogging system, Sina Weibo, show that the external influence has a significant impact on information spreading along with social activities. In addition, we propose a theoretical model to interpret the spreading process via both internal and external channels, considering three essential properties: (i) memory effect; (ii) role of spreaders; and (iii) non-redundancy of contacts. Experimental and mathematical results indicate that the information indeed spreads much quicker and more broadly through the mutual effects of the internal and external influences. More importantly, the present model reveals that the event characteristics largely determine the essential spreading patterns once the network structure is established. The results may shed some light on the in-depth understanding of the underlying dynamics of information transmission on real social networks.
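
    A toy simulation of the two transmission channels discussed above, namely internal spreading along social-network edges plus a constant external (out-of-network) exposure rate; the network and probabilities are invented, and the paper's memory effect, spreader roles, and contact non-redundancy are not modelled here.

```python
# Toy two-channel spreading simulation: peer-to-peer plus external influence.
# Network and rates are invented; this is not the paper's full model.
import random
import networkx as nx

random.seed(1)
G = nx.erdos_renyi_graph(n=500, p=0.02, seed=1)   # stand-in for the follower network
p_internal = 0.05                                 # per-contact transmission probability
p_external = 0.002                                # per-step out-of-network exposure

informed = {0}                                    # seed user
for step in range(30):
    newly_informed = set()
    for node in G.nodes:
        if node in informed:
            continue
        via_network = any(nb in informed and random.random() < p_internal
                          for nb in G.neighbors(node))
        via_external = random.random() < p_external
        if via_network or via_external:
            newly_informed.add(node)
    informed |= newly_informed

print(f"informed after 30 steps: {len(informed)} of {G.number_of_nodes()}")
```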

  11. Adaptive Routing Protocol with Energy Efficiency and Event Clustering for Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Tran Quang, Vinh; Miyoshi, Takumi

    Wireless sensor networks (WSNs) are a promising approach for a variety of applications. Routing for WSNs is very challenging because a protocol should be simple, scalable, energy-efficient, and robust enough to deal with a very large number of nodes, and also self-configurable in response to node failures and dynamic changes of the network topology. Recently, many researchers have focused on developing hierarchical protocols for WSNs. However, most protocols in the literature do not scale well to large sensor networks and are difficult to apply in real applications. In this paper, we propose a novel adaptive routing protocol for WSNs called ARPEES. The main design features of the proposed method are energy efficiency, dynamic event clustering, and multi-hop relaying that considers the trade-off between the residual energy of relay nodes and their distance to the base station. With a distributed, low-overhead approach, we spread the energy consumption required for aggregating and relaying data across different sensor nodes to prolong the lifetime of the whole network. In this method, we consider energy and distance as the parameters of the proposed function used to select relay nodes, and finally select the optimal path among cluster heads, relay nodes and the base station. The simulation results show that our routing protocol achieves better performance than previous routing protocols.
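
    A hedged sketch of the relay-selection idea described above, with candidate relay nodes scored on residual energy versus distance to the base station; the weighting function below is an illustrative guess, not the exact ARPEES formula.

```python
# Sketch: score candidate relay nodes on residual energy vs. distance to the sink.
# The scoring function and node data are invented for illustration.
import math

BASE_STATION = (0.0, 0.0)

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def relay_score(node, alpha=0.6):
    """Higher residual energy and shorter distance to the sink score better."""
    energy_term = node["energy"] / node["initial_energy"]
    dist_term = 1.0 / (1.0 + distance(node["pos"], BASE_STATION))
    return alpha * energy_term + (1 - alpha) * dist_term

candidates = [
    {"id": "n1", "pos": (40.0, 35.0), "energy": 1.6, "initial_energy": 2.0},
    {"id": "n2", "pos": (25.0, 20.0), "energy": 0.7, "initial_energy": 2.0},
    {"id": "n3", "pos": (30.0, 28.0), "energy": 1.9, "initial_energy": 2.0},
]

best = max(candidates, key=relay_score)
print("selected relay:", best["id"])
```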

  12. Characterization of computer network events through simultaneous feature selection and clustering of intrusion alerts

    NASA Astrophysics Data System (ADS)

    Chen, Siyue; Leung, Henry; Dondo, Maxwell

    2014-05-01

    As computer network security threats increase, many organizations implement multiple Network Intrusion Detection Systems (NIDS) to maximize the likelihood of intrusion detection and provide a comprehensive understanding of intrusion activities. However, NIDS trigger a massive number of alerts on a daily basis. This can be overwhelming for computer network security analysts since it is a slow and tedious process to manually analyse each alert produced. Thus, automated and intelligent clustering of alerts is important to reveal the structural correlation of events by grouping alerts with common features. As the nature of computer network attacks, and therefore alerts, is not known in advance, unsupervised alert clustering is a promising approach to achieve this goal. We propose a joint optimization technique for feature selection and clustering to aggregate similar alerts and to reduce the number of alerts that analysts have to handle individually. More precisely, each identified feature is assigned a binary value, which reflects the feature's saliency. This value is treated as a hidden variable and incorporated into a likelihood function for clustering. Since computing the optimal solution of the likelihood function directly is analytically intractable, we use the Expectation-Maximisation (EM) algorithm to iteratively update the hidden variable and use it to maximize the expected likelihood. Our empirical results, using a labelled Defense Advanced Research Projects Agency (DARPA) 2000 reference dataset, show that the proposed method gives better results than the EM clustering without feature selection in terms of the clustering accuracy.
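
    A simplified sketch of the clustering stage only: plain EM fitting of a Gaussian mixture to alert feature vectors. The paper's joint estimation of a per-feature saliency variable is omitted here, and synthetic points stand in for NIDS alerts.

```python
# Sketch: EM (Gaussian mixture) clustering of synthetic alert feature vectors.
# Feature-saliency estimation from the paper is not implemented here.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two synthetic "attack" clusters plus scattered background alerts.
alerts = np.vstack([
    rng.normal(loc=[0, 0, 0], scale=0.5, size=(100, 3)),
    rng.normal(loc=[4, 4, 1], scale=0.5, size=(100, 3)),
    rng.uniform(low=-2, high=6, size=(30, 3)),
])

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(alerts)
print("alerts per cluster:", np.bincount(labels))
```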

  13. Discrete event command and control for networked teams with multiple missions

    NASA Astrophysics Data System (ADS)

    Lewis, Frank L.; Hudas, Greg R.; Pang, Chee Khiang; Middleton, Matthew B.; McMurrough, Christopher

    2009-05-01

    During mission execution in military applications, the TRADOC Pamphlet 525-66 Battle Command and Battle Space Awareness capabilities prescribe expectations that networked teams will perform in a reliable manner under changing mission requirements, varying resource availability and reliability, and resource faults. In this paper, a Command and Control (C2) structure is presented that allows for computer-aided execution of the networked team decision-making process, control of force resources, shared resource dispatching, and adaptability to change based on battlefield conditions. A mathematically justified networked computing environment is provided called the Discrete Event Control (DEC) Framework. DEC has the ability to provide the logical connectivity among all team participants including mission planners, field commanders, war-fighters, and robotic platforms. The proposed data management tools are developed and demonstrated on a simulation study and an implementation on a distributed wireless sensor network. The results show that the tasks of multiple missions are correctly sequenced in real-time, and that shared resources are suitably assigned to competing tasks under dynamically changing conditions without conflicts and bottlenecks.

  14. Digital Learning Network Education Events for the Desert Research and Technology Studies

    NASA Technical Reports Server (NTRS)

    Paul, Heather L.; Guillory, Erika R.

    2007-01-01

    NASA's Digital Learning Network (DLN) reaches out to thousands of students each year through video conferencing and webcasting. As part of NASA's Strategic Plan to reach the next generation of space explorers, the DLN develops and delivers educational programs that reinforce principles in the areas of science, technology, engineering and mathematics. The DLN has created a series of live education videoconferences connecting the Desert Research and Technology Studies (RATS) field test to students across the United States. The programs are also extended to students around the world via live webcasting. The primary focus of the events is the Vision for Space Exploration. During the programs, Desert RATS engineers and scientists inform and inspire students about the importance of exploration and share the importance of the field test as it correlates with plans to return to the Moon and explore Mars. This paper describes the events that took place in September 2006.

  15. Cluster synchronization of complex networks via event-triggered strategy under stochastic sampling

    NASA Astrophysics Data System (ADS)

    Hu, Aihua; Cao, Jinde; Hu, Manfeng; Guo, Liuxiao

    2015-09-01

    This paper is concerned with the issue of mean square cluster synchronization of non-identical nodes connected by a directed network. Suppose that the nodes possess nonlinear dynamics and split into several clusters, then an event-triggered control scheme is proposed for synchronization based on the information from stochastic sampling. Meanwhile, an equilibrium is considered to be the synchronization state or the virtual leader for each cluster, which can apply pinning control to the following nodes. Assume that a spanning tree exists in the subgraph consisting of the nodes belonging to the same cluster and the corresponding virtual leader, and the instants for updating controllers are determined by the given event-triggered strategy, then some sufficient conditions for cluster synchronization are presented according to the Lyapunov stability theory and linear matrix inequality technique. Finally, a specific numerical example is shown to demonstrate the effectiveness of the theoretical results.

  16. Event-train restoration via backpropagation neural networks. Interim report, January-July 1989

    SciTech Connect

    Raeth, P.G.

    1989-12-01

    This project is investigating backpropagation neural networks for specific applications in passive electronic warfare involving restoration of deinterleaved event trains to their original broadcast form. This is different from traditional bit-error detection/correction, which relies on prior knowledge of what the original bit stream looked like. In electronic warfare it is unlikely that such prior knowledge will be available. Results of this research can be applied to 3 major problem areas: (1) pulse-train restoration, (2) communications signal compression, and (3) data compression.

  17. An event-based neural network architecture with an asynchronous programmable synaptic memory.

    PubMed

    Moradi, Saber; Indiveri, Giacomo

    2014-02-01

    We present a hybrid analog/digital very large scale integration (VLSI) implementation of a spiking neural network with programmable synaptic weights. The synaptic weight values are stored in an asynchronous Static Random Access Memory (SRAM) module, which is interfaced to a fast current-mode event-driven DAC for producing synaptic currents with the appropriate amplitude values. These currents are further integrated by current-mode integrator synapses to produce biophysically realistic temporal dynamics. The synapse output currents are then integrated by compact and efficient integrate and fire silicon neuron circuits with spike-frequency adaptation and adjustable refractory period and spike-reset voltage settings. The fabricated chip comprises a total of 32 × 32 SRAM cells, 4 × 32 synapse circuits and 32 × 1 silicon neurons. It acts as a transceiver, receiving asynchronous events in input, performing neural computation with hybrid analog/digital circuits on the input spikes, and eventually producing digital asynchronous events in output. Input, output, and synaptic weight values are transmitted to/from the chip using a common communication protocol based on the Address Event Representation (AER). Using this representation it is possible to interface the device to a workstation or a micro-controller and explore the effect of different types of Spike-Timing Dependent Plasticity (STDP) learning algorithms for updating the synaptic weights values in the SRAM module. We present experimental results demonstrating the correct operation of all the circuits present on the chip. PMID:24681923

  18. Building Air Monitoring Networks

    ERIC Educational Resources Information Center

    Environmental Science and Technology, 1977

    1977-01-01

    The different components of air monitoring networks, the status of air monitoring in the United States, and the services and activities of the three major American network builders are detailed. International air monitoring networks and alert systems are identified, with emphasis on the Dutch air monitoring network. (BT)

  19. Diet Segregation between Cohabiting Builder and Inquiline Termite Species

    PubMed Central

    Florencio, Daniela Faria; Marins, Alessandra; Rosa, Cassiano Sousa; Cristaldo, Paulo Fellipe; Araújo, Ana Paula Albano; Silva, Ivo Ribeiro; DeSouza, Og

    2013-01-01

    How do termite inquilines manage to cohabit termitaria along with the termite builder species? With this in mind, we analysed one of the several strategies that inquilines could use to circumvent conflicts with their hosts, namely, the use of distinct diets. We inspected overlapping patterns for the diets of several cohabiting Neotropical termite species, as inferred from carbon and nitrogen isotopic signatures of termite individuals. Cohabitant communities from distinct termitaria presented overlapping diet spaces, indicating that they exploited similar diets at the regional scale. When such communities were split into their components, full diet segregation could be observed between builders and inquilines, at regional (environment-wide) and local (termitarium) scales. Additionally, diet segregation among inquilines themselves was also observed in the vast majority of inspected termitaria. Inquiline species distribution among termitaria was not random. Environment-wide diet similarity, coupled with local diet segregation and deterministic inquiline distribution, could point to interactions over feeding resources. However, inquilines and builders not sharing the same termitarium, and thus not subject to potential conflicts, still exhibited distinct diets. Moreover, the areas of the builder’s diet space and that of its inquilines did not correlate negatively. Accordingly, the diet areas of builders which hosted inquilines were on average as large as the areas of builders hosting no inquilines. Such results indicate the possibility that dietary partitioning by these cohabiting termites was not majorly driven by current interactive constraints. Rather, it seems to be a result of traits previously fixed in the evolutionary past of cohabitants. PMID:23805229

  20. How activation, entanglement, and searching a semantic network contribute to event memory.

    PubMed

    Nelson, Douglas L; Kitto, Kirsty; Galea, David; McEvoy, Cathy L; Bruza, Peter D

    2013-08-01

    Free-association norms indicate that words are organized into semantic/associative neighborhoods within a larger network of words and links that bind the net together. We present evidence indicating that memory for a recent word event can depend on implicitly and simultaneously activating related words in its neighborhood. Processing a word during encoding primes its network representation as a function of the density of the links in its neighborhood. Such priming increases recall and recognition and can have long-lasting effects when the word is processed in working memory. Evidence for this phenomenon is reviewed in extralist-cuing, primed free-association, intralist-cuing, and single-item recognition tasks. The findings also show that when a related word is presented in order to cue the recall of a studied word, the cue activates the target in an array of related words that distract and reduce the probability of the target's selection. The activation of the semantic network produces priming benefits during encoding, and search costs during retrieval. In extralist cuing, recall is a negative function of cue-to-distractor strength, and a positive function of neighborhood density, cue-to-target strength, and target-to-cue strength. We show how these four measures derived from the network can be combined and used to predict memory performance. These measures play different roles in different tasks, indicating that the contribution of the semantic network varies with the context provided by the task. Finally, we evaluate spreading-activation and quantum-like entanglement explanations for the priming effects produced by neighborhood density. PMID:23645391

  1. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs of a network simulator to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or to determine modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
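
    A small sketch of the t-way coverage idea: enumerate every 2-way (pairwise) parameter-value combination and count how many of them a set of random tests actually covers. The configuration parameters below are hypothetical, not the simulator inputs used in the study.

```python
# Sketch: measure 2-way (pairwise) coverage of randomly generated test inputs.
import itertools
import random

parameters = {
    "queue_length": [1, 2, 4, 8],
    "routing": ["static", "adaptive"],
    "link_rate": [10, 100, 1000],
    "timeout_ms": [50, 200, 800],
}
names = list(parameters)

# Every 2-way parameter-value combination that t-way testing must cover.
required = {
    ((a, va), (b, vb))
    for a, b in itertools.combinations(names, 2)
    for va in parameters[a]
    for vb in parameters[b]
}

random.seed(0)
tests = [{name: random.choice(vals) for name, vals in parameters.items()}
         for _ in range(20)]

covered = {
    ((a, test[a]), (b, test[b]))
    for test in tests
    for a, b in itertools.combinations(names, 2)
}

print(f"2-way coverage: {len(covered & required)}/{len(required)} combinations")
```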

  2. Source space analysis of event-related dynamic reorganization of brain networks.

    PubMed

    Ioannides, Andreas A; Dimitriadis, Stavros I; Saridis, George A; Voultsidou, Marotesa; Poghosyan, Vahe; Liu, Lichan; Laskaris, Nikolaos A

    2012-01-01

    How the brain works is nowadays synonymous with how different parts of the brain work together and the derivation of mathematical descriptions for the functional connectivity patterns that can be objectively derived from data of different neuroimaging techniques. In most cases static networks are studied, often relying on resting state recordings. Here, we present a quantitative study of dynamic reconfiguration of connectivity for event-related experiments. Our motivation is the development of a methodology that can be used for personalized monitoring of brain activity. In line with this motivation, we use data with visual stimuli from a typical subject who participated in different experiments that were previously analyzed with traditional methods. The earlier studies identified well-defined changes in specific brain areas at specific latencies related to attention, properties of stimuli, and task demands. Using a recently introduced methodology, we track the event-related changes in network organization, at source space level, thus providing a more global and complete view of the stages of processing associated with the regional changes in activity. The results suggest the time evolving modularity as an additional brain code that is accessible with noninvasive means and hence available for personalized monitoring and clinical applications. PMID:23097678

  3. Source Space Analysis of Event-Related Dynamic Reorganization of Brain Networks

    PubMed Central

    Ioannides, Andreas A.; Dimitriadis, Stavros I.; Saridis, George A.; Voultsidou, Marotesa; Poghosyan, Vahe; Liu, Lichan; Laskaris, Nikolaos A.

    2012-01-01

    How the brain works is nowadays synonymous with how different parts of the brain work together and the derivation of mathematical descriptions for the functional connectivity patterns that can be objectively derived from data of different neuroimaging techniques. In most cases static networks are studied, often relying on resting state recordings. Here, we present a quantitative study of dynamic reconfiguration of connectivity for event-related experiments. Our motivation is the development of a methodology that can be used for personalized monitoring of brain activity. In line with this motivation, we use data with visual stimuli from a typical subject who participated in different experiments that were previously analyzed with traditional methods. The earlier studies identified well-defined changes in specific brain areas at specific latencies related to attention, properties of stimuli, and task demands. Using a recently introduced methodology, we track the event-related changes in network organization, at source space level, thus providing a more global and complete view of the stages of processing associated with the regional changes in activity. The results suggest the time evolving modularity as an additional brain code that is accessible with noninvasive means and hence available for personalized monitoring and clinical applications. PMID:23097678

  4. Statistical Patterns of Triggered Landslide Events and their Application to Road Networks

    NASA Astrophysics Data System (ADS)

    Taylor, Faith E.; Malamud, Bruce D.; Santangelo, Michele; Marchesini, Ivan; Guzzetti, Fausto

    2015-04-01

    In the minutes to weeks after a landslide trigger such as an earthquake or heavy rainfall, as part of a triggered landslide event, one individual to tens of thousands of landslides may occur across a region. If in the region, one or more roads become blocked by landslides, this can cause extensive detours and delay rescue and recovery operations. In this paper, we show the development, application and confrontation with real data of a model to simulate triggered landslide events and their impacts upon road networks. This is done by creating a 'synthetic' triggered landslide event inventory by randomly sampling landslide areas and shapes from already established statistical distributions. These landslides are then semi-randomly dropped across a given study region, conditioned by that region's landslide susceptibility. The resulting synthetic triggered landslide event inventory is overlaid with the region's road network map and the number, size, location and network impact of road blockages and landslides near roads calculated. This process is repeated hundreds of times in a Monte Carlo type simulation. The statistical distributions and approaches used in the model are thought to be generally applicable for low-mobility triggered landslides in many medium to high-topography regions throughout the world. The only local data required to run the model are a road network map, a landslide susceptibility map, a map of the study area boundary and a digital elevation model. Coupled with an Open Source modelling approach (in GRASS-GIS), this model may be applied to many regions where triggered landslide events are an issue. We present model results and confrontation with observed data for two study regions where the model has been applied: Collazzone (Central Italy) where rapid snowmelt triggered 413 landslides in January 1997 and Oat Mountain (Northridge, USA), where the Northridge Earthquake triggered 1,356 landslides in January 1994. We find that when the landslide
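
    A toy Monte Carlo run in the spirit of the model described above: landslide areas are sampled from a heavy-tailed distribution, landslides are dropped on a grid weighted by susceptibility, and intersections with road cells are counted. The grid, susceptibility map, and distribution parameters are all invented for illustration.

```python
# Sketch: synthetic triggered-landslide events and road blockages on a toy grid.
import numpy as np

rng = np.random.default_rng(42)
n_rows, n_cols, cell_size = 100, 100, 10.0           # 1 km x 1 km grid, 10 m cells

susceptibility = rng.random((n_rows, n_cols))        # stand-in susceptibility map
susceptibility /= susceptibility.sum()
road_cells = {(50, c) for c in range(n_cols)}        # one straight road

def simulate_event(n_landslides=400):
    """One synthetic triggered-landslide event; returns number of road blockages."""
    areas = rng.lognormal(mean=5.0, sigma=1.2, size=n_landslides)   # m^2 (assumed)
    cells = rng.choice(n_rows * n_cols, size=n_landslides, p=susceptibility.ravel())
    blocked = 0
    for area, idx in zip(areas, cells):
        r, c = divmod(int(idx), n_cols)
        radius = int(np.sqrt(area / np.pi) / cell_size)             # footprint radius in cells
        footprint = {(r + dr, c + dc)
                     for dr in range(-radius, radius + 1)
                     for dc in range(-radius, radius + 1)}
        if footprint & road_cells:
            blocked += 1
    return blocked

blockages = [simulate_event() for _ in range(200)]
print("mean road blockages per event:", np.mean(blockages))
```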

  5. EVENT84 user's manual: a computer code for analyzing explosion-induced gas-dynamic transients in flow networks

    SciTech Connect

    Martin, R.A.; Wilson, T.L.

    1984-12-01

    This manual supports the computer code EVENT84, which can predict explosion-induced gas-dynamic transients in flow networks. The code can model transients in any arbitrarily designated network of building rooms and ventilation systems. A lumped-parameter formulation is used. EVENT84 was designed to provide a safety analysis tool for the nuclear, chemical, and mining industries. It is particularly suitable for calculating the detailed effects of explosions in the far field using a parametric representation of the explosive event. The code input and a sample problem that illustrates its capabilities are provided.

  6. Impacts of the January 2014 extreme rainfall event on transportation network in the Alps Maritimes (France)

    NASA Astrophysics Data System (ADS)

    Voumard, Jeremie; Penna, Ivanna; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    Road networks in mountain areas are highly inter-dependent systems, and hillslope processes such as landslides are main drivers of infrastructure damage and transportation disruptions. Besides the structural damages, economic losses are also related to road and surrounding slope maintenance, as well as to the disruption of transportation of goods, inaccessibility of tourist resorts, etc. On 16-17 January 2014, an intense rainfall event was recorded in the Alpes Maritimes, in the southern part of France. According to meteorological data, it was the largest such event since the 1970s. This rainfall triggered numerous landslides (rockfalls, earth flows and debris flows), mostly on January 17th. No casualties were registered due to hillslope processes, but several houses were damaged, some populations living in the Var valley along the RM 2205 road were isolated, and several roads were partially or totally blocked. 1.5 km upstream of the village of Saint-Sauveur-sur-Tinée, 150 m3 of rock detached from the slope and blocked the road, after which temporary traffic interruptions due to road works lasted around one week. In the Menton area, where hillslopes are highly urbanized, the volume of rock involved in slope failures was so large that materials removed to re-establish traffic had to be placed in transitory storage sites. The average landslide volume was estimated at around 100 m3 (some single events reached around 400 m3). Most of the landslides occurred on slopes cut during road and house construction. Several trucks were needed to clear the material, causing traffic jams. The aim of this study is to document the impact on transportation networks caused by this rainfall event. Damages and consequences for traffic were documented during a field visit, obtained from secondary sources, and with the aid of a drone for inaccessible areas.

  7. Group Sex Events and HIV/STI Risk in an Urban Network

    PubMed Central

    Friedman, Samuel R.; Bolyard, Melissa; Khan, Maria; Maslow, Carey; Sandoval, Milagros; Mateu-Gelabert, Pedro; Krauss, Beatrice; Aral, Sevgi O.

    2012-01-01

    Objectives To describe: a. the prevalence and individual and network characteristics of group sex events (GSE) and GSE attendees; and b. HIV/STI discordance among respondents who said they went to a GSE together. Methods and Design In a sociometric network study of risk partners (defined as sexual partners, persons with whom respondents attended a GSE, or drug-injection partners) in Brooklyn, NY, we recruited a high-risk sample of 465 adults. Respondents reported on GSE attendance, the characteristics of GSEs, and their own and others’ behaviors at GSEs. Sera and urines were collected and STI prevalence was assayed. Results Of the 465 participants, 36% had attended a GSE in the last year, 26% had sex during the most recent of these GSEs, and 13% had unprotected sex there. Certain subgroups (hard drug users, men who have sex with men, women who have sex with women, and sex workers) were more likely to attend and more likely to engage in risk behaviors at these events. Among 90 GSE dyads in which at least one partner named the other as someone with whom they attended a GSE in the previous three months, STI/HIV discordance was common (HSV-2: 45% of dyads, HIV: 12% of dyads, Chlamydia: 21% of dyads). Many GSEs had 10 or more participants, and multiple partnerships at GSEs were common. High attendance rates at GSEs among members of large networks may increase community vulnerability to STI/HIV, particularly since network data show that almost all members of a large sociometric risk network either had sex with a GSE attendee or had sex with someone who had sex with a GSE attendee. Conclusions Self-reported GSE attendance and participation was common among this high-risk sample. STI/HIV discordance among GSE attendees was high, highlighting the potential transmission risk associated with GSEs. Research on sexual behaviors should incorporate measures of GSE behaviors as standard research protocol. Interventions should be developed to reduce transmission at GSEs. PMID

  8. Digital Learning Network Education Events of NASA's Extreme Environments Mission Operations

    NASA Technical Reports Server (NTRS)

    Paul, Heather; Guillory, Erika

    2007-01-01

    NASA's Digital Learning Network (DLN) reaches out to thousands of students each year through video conferencing and web casting. The DLN has created a series of live education videoconferences connecting NASA's Extreme Environment Mission Operations (NEEMO) team to students across the United States. The programs are also extended to students around the world via live web casting. The primary focus of the events is the vision for space exploration. During the programs, NEEMO crewmembers, including NASA astronauts, engineers and scientists, inform and inspire students about the importance of exploration and share the impact of the project as it correlates with plans to return to the Moon and explore the planet Mars. These events highlight interactivity. Students talk live with the aquanauts in Aquarius, the National Oceanic and Atmospheric Administration's underwater laboratory. With this program, NASA continues the Agency's tradition of investing in the nation's education programs. It is directly tied to the Agency's major education goal of attracting and retaining students in science, technology, and engineering disciplines. Before connecting with the aquanauts, the students conduct experiments of their own, designed to coincide with mission objectives. This paper describes the events that took place in September 2006.

  9. A polar cap absorption event observed using the Southern Hemisphere SuperDARN radar network.

    NASA Astrophysics Data System (ADS)

    Breed, A.; Morris, R.; Parkinson, M.; Duldig, M.; Dyson, P.

    A large X5-class solar flare and coronal mass ejection were observed emanating from the Sun on July 14, 2000. Approximately 10 minutes later a large cosmic ray ground level enhancement was observed using neutron monitors located at Mawson station (70.5°S CGM), Antarctica. Large increases in proton flux were also observed by satellites during this time. This marked the start of a large polar cap absorption event, with cosmic noise absorption peaking at 30 dB as measured by a 30 MHz riometer located at Casey station (80.4°S CGM), Antarctica. The spatial evolution of this event and its subsequent recovery were studied using the Southern Hemisphere SuperDARN radar network, including the relatively low latitude observations provided by the Tasman International Geospace Environment Radar (TIGER) located on Bruny Island (54.6°S CGM), Tasmania. When the bulk of the CME arrived at the Earth two days later, it triggered an intense geomagnetic storm. This paper presents observations of this dramatic sequence of events.

  10. EEG-based event detection using optimized echo state networks with leaky integrator neurons.

    PubMed

    Ayyagari, Sudhanshu S D P; Jones, Richard D; Weddell, Stephen J

    2014-01-01

    This study investigates the classification ability of linear and nonlinear classifiers on biological signals using the electroencephalogram (EEG) and examines the impact of architectural changes within the classifier in order to enhance the classification. Consequently, artificial events were used to validate a prototype EEG-based microsleep detection system based around an echo state network (ESN) and a linear discriminant analysis (LDA) classifier. The artificial events comprised infrequent 2-s long bursts of 15 Hz sinusoids superimposed on prerecorded 16-channel EEG data, which provided a means of determining and optimizing the accuracy of the overall classifier on 'gold standard' events. The performance of this system was tested on different signal-to-noise amplitude ratios (SNRs) ranging from 16 down to 0.03. Results from several feature selection/reduction and pattern classification modules indicated that training the classifier using a leaky-integrator neuron ESN structure yielded the highest classification accuracy. For datasets with a low SNR of 0.3, training the leaky-neuron ESN using only those features which directly correspond to the underlying event resulted in a phi correlation of 0.92, compared with 0.37 when principal component analysis (PCA) was employed. On the same datasets, other classifiers such as LDA and simple ESNs using PCA performed weakly, with correlations of 0.05 and 0 respectively. These results suggest that ESNs with leaky neuron architectures have superior pattern recognition properties. This, in turn, may reflect their superior ability to exploit differences in state dynamics and, hence, provide superior temporal characteristics in learning. PMID:25571328
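
    A minimal leaky-integrator echo state network sketch on a toy burst-detection task, using the reservoir update x(t+1) = (1 - a)·x(t) + a·tanh(W_in u(t) + W x(t)) and a ridge readout; the sizes, leak rate, and task below are invented and unrelated to the EEG data above.

```python
# Sketch: leaky-integrator ESN reservoir plus ridge readout on a toy task of
# flagging 15 Hz bursts in noise. Hyperparameters and data are invented.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, leak = 1, 200, 0.3

W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))       # scale spectral radius to 0.9

def run_reservoir(u):
    """Collect reservoir states for a 1-D input sequence u."""
    x = np.zeros(n_res)
    states = []
    for value in u:
        pre = W_in @ np.array([value]) + W @ x
        x = (1 - leak) * x + leak * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

t = np.arange(0, 8.0, 0.01)
target = ((t % 2.0) < 0.4).astype(float)              # periodic 'event' label
u = 0.5 * rng.standard_normal(len(t)) + target * np.sin(2 * np.pi * 15 * t)

X = run_reservoir(u)
W_out = np.linalg.solve(X.T @ X + 1e-2 * np.eye(n_res), X.T @ target)  # ridge readout
print("training correlation:", round(float(np.corrcoef(X @ W_out, target)[0, 1]), 3))
```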

  11. Muon physics and neural network event classifier for the Sudbury Neutrino Observatory

    NASA Astrophysics Data System (ADS)

    Chon, Myung Chol

    1998-12-01

    The Sudbury Neutrino Observatory (SNO) has been designed principally to study solar neutrinos and other sources of neutrinos such as supernova neutrinos and atmospheric neutrinos. The SNO heavy water Cerenkov detector will be able to observe all three flavors of neutrinos and allow us to determine the probability of neutrino flavor oscillation. It is hoped that SNO will provide answers to the questions posed by the solar neutrino problem and the atmospheric neutrino anomaly. In order for the experiment to be successful, it is important to fully understand muon interactions. First, muons may produce an important source of background for solar neutrino detection. Secondly, the detection of high-energy atmospheric neutrinos depends on detection of muons produced by the neutrino interaction either inside the detector or in the material surrounding the detector. The processes induced by stopping muons and muon-nucleus interactions are of great importance in a water Cerenkov detector as they produce secondary particles. Muon capture and muon decay processes have been studied in detail. The routines describing these processes have been implemented in the SNOMAN code to study the detector response. A model to describe muon-nucleus deep inelastic scattering is proposed. In particular, attempts are made to parameterize the secondary hadron multiplicity due to deep inelastic scattering. In addition, the hadron transport code has been added to SNOMAN for the simulation of secondary hadron transport and subsequent Cerenkov photon production. Full Monte Carlo simulation of muon transport down to the SNO detector depth has been performed to understand the kinematic properties of cosmic-ray muons entering the SNO detector. Based on the results of the simulations, a simplified method to generate muon flux deep underground has been developed. The usage of pattern recognition techniques with Artificial Neural Networks has been investigated for the event-type classification

  12. Complex networks identify spatial patterns of extreme rainfall events of the South American Monsoon System

    NASA Astrophysics Data System (ADS)

    Boers, Niklas; Bookhagen, Bodo; Marwan, Norbert; Kurths, Jürgen; Marengo, José

    2013-08-01

    We investigate the spatial characteristics of extreme rainfall synchronicity of the South American Monsoon System (SAMS) by means of Complex Networks (CN). By introducing a new combination of CN measures and interpreting it in a climatic context, we investigate climatic linkages and classify the spatial characteristics of extreme rainfall synchronicity. Although our approach is based on only one variable (rainfall), it reveals the most important features of the SAMS, such as the main moisture pathways, areas with frequent development of Mesoscale Convective Systems (MCS), and the major convergence zones. In addition, our results reveal substantial differences between the spatial structures of rainfall synchronicity above the 90th and above the 95th percentiles. Most notably, events above the 95th percentile contribute more strongly to MCS development in the La Plata Basin.

  13. 78 FR 42122 - Bridge Builder Trust and Olive Street Investment Advisers, LLC; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-15

    ... COMMISSION Bridge Builder Trust and Olive Street Investment Advisers, LLC; Notice of Application July 9, 2013... requirements. APPLICANTS: Bridge Builder Trust (the ``Trust'') and Olive Street Investment Advisers (the... currently intends to rely on the requested order as a Fund is the Bridge Builder Bond Fund. For purposes...

  14. Building America Top Innovations 2012: Reduced Call-Backs with High-Performance Production Builders

    SciTech Connect

    none,

    2013-01-01

    This Building America Top Innovations profile describes ways Building America teams have helped builders cut call-backs. A Harvard University study found that builders who worked with Building America had a 50% drop in call-backs. One builder reported a 50-fold reduction in the incidence of pipe freezing, a 50% reduction in drywall cracking, and a 60% decline in call-backs.

  15. Builder 1 & C: Naval Training Command Rate Training Manual. Revised 1973.

    ERIC Educational Resources Information Center

    Naval Training Command, Pensacola, FL.

    The training manual is designed to help Navy personnel meet the occupational qualifications for advancement to Builder First Class and Chief Builder. The introductory chapter provides information to aid personnel in their preparation for advancement and outlines the scope of the Builder rating and the types of billets to which he can be assigned.…

  16. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    PubMed Central

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, individual sensors are usually placed only at a fixed point selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, so it is desirable to reduce the number of commutations of the control signals from both security and economic points of view. Therefore, in order to address these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597
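
    An illustrative send-on-delta (deadband) loop of the kind event-based climate control builds on: the actuator command is recomputed and transmitted only when the measured temperature drifts past a threshold. The crude thermal model and the parameters are invented, not taken from the paper.

```python
# Sketch: event-triggered (send-on-delta) heater control for a slow process.
# Process model, setpoint, and deadband are invented for illustration.
import random

random.seed(0)
setpoint, deadband = 24.0, 0.5        # degrees C
temperature, heater_on = 20.0, False
transmissions = 0
last_sent = None

for minute in range(180):
    # Crude thermal model: heater warms, ambient losses cool, plus sensor noise.
    temperature += (0.08 if heater_on else -0.05) + random.gauss(0, 0.02)

    # Event condition: transmit only when the reading has moved enough to matter.
    if last_sent is None or abs(temperature - last_sent) > deadband:
        heater_on = temperature < setpoint
        last_sent = temperature
        transmissions += 1

print(f"actuator updates in 3 h: {transmissions} (vs. 180 for periodic control)")
```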

  17. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network.

    PubMed

    Hao, Xiaoqing; An, Haizhong; Zhang, Lijia; Li, Huajiao; Wei, Guannan

    2015-01-01

    To study the sentiment diffusion of online public opinions about hot events, we collected people's posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiments: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed sentiment mode complex network of online public opinions (SMCOP) with modes as nodes and the conversion relation in chronological order between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys power law. Most posts' sentiments are weakly positive and neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups with ppppp and ooooo as the core mode, respectively. Few modes have larger betweenness centrality values and most modes convert to each other with these higher betweenness centrality modes as mediums. Therefore, the relevant person or institutes can take measures to lead people's sentiments regarding online hot events according to the sentiment diffusion mechanism. PMID:26462230
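
    A sketch of the mode construction described above, assuming each post has already been assigned a sentiment symbol (P, p, o, n, N): consecutive posts are coarse-grained into five-symbol modes, and chronological mode-to-mode transitions become weighted directed edges. The sentiment sequence below is invented.

```python
# Sketch: coarse-grain post sentiments into modes and build a transition network.
import networkx as nx

posts = list("pppoopPppooonnpppooPppppo")       # stand-in sentiment symbols

modes = ["".join(posts[i:i + 5]) for i in range(0, len(posts) - 4, 5)]

G = nx.DiGraph()
for src, dst in zip(modes, modes[1:]):          # chronological conversions -> edges
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += 1
    else:
        G.add_edge(src, dst, weight=1)

strength = dict(G.out_degree(weight="weight"))
print("modes:", modes)
print("highest-strength mode:", max(strength, key=strength.get))
print("betweenness centrality:", nx.betweenness_centrality(G))
```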

  18. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network

    PubMed Central

    Hao, Xiaoqing; An, Haizhong; Zhang, Lijia; Li, Huajiao; Wei, Guannan

    2015-01-01

    To study the sentiment diffusion of online public opinions about hot events, we collected people’s posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiments: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed sentiment mode complex network of online public opinions (SMCOP) with modes as nodes and the conversion relation in chronological order between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys power law. Most posts’ sentiments are weakly positive and neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups with ppppp and ooooo as the core mode, respectively. Few modes have larger betweenness centrality values and most modes convert to each other with these higher betweenness centrality modes as mediums. Therefore, the relevant person or institutes can take measures to lead people’s sentiments regarding online hot events according to the sentiment diffusion mechanism. PMID:26462230

  19. 5. DETAIL OF BUILDER'S PLATE, WHICH READS '1898, THE SANITARY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. DETAIL OF BUILDER'S PLATE, WHICH READS '1898, THE SANITARY DISTRICT OF CHICAGO, BOARD OF TRUSTEES, WILLIAM BOLDENWECK, JOSEPH C. BRADEN, ZINA R. CARTER, BERNARD A. ECKART, ALEXANDER J. JONES, THOMAS KELLY, JAMES P. MALLETTE, THOMAS SMYTHE, FRANK WINTER; ISHAM RANDOLPH, CHIEF ENGINEER.' - Santa Fe Railroad, Sanitary & Ship Canal Bridge, Spanning Sanitary & Ship Canal east of Harlem Avenue, Chicago, Cook County, IL

  20. 4. DETAIL OF BUILDER'S PLATE WHICH READS 'BUILT 1906 BY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. DETAIL OF BUILDER'S PLATE WHICH READS 'BUILT 1906 BY THE CHICAGO AND ALTON RY. CO.; G.H. KIMBALL CHIEF ENGINEER; W.M. HUGHES CONSULTING ENG'R; PAGE & SHNABLE PATENTEES' - Chicago & Alton Railroad Bridge, Spanning South Branch of Chicago River, Chicago, Cook County, IL

  1. New Whole-House Solutions Case Study: Pine Mountain Builders

    SciTech Connect

    none,

    2013-02-01

    Pine Mountain Builders achieved HERS scores as low as 59 and electric bills as low as $50/month with extensive air sealing (blower door tests = 1.0 to 1.8 ACH 50), R-3 XPS sheathing instead of OSB, and higher efficiency heat pumps.

  2. 10. DETAIL OF BUILDER'S PLATE AT NORTH PORTAL. PLATE READS: ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. DETAIL OF BUILDER'S PLATE AT NORTH PORTAL. PLATE READS: 1889, BUILT BY THE BERLIN IRON BRIDGE CO. EAST BERLIN CONN. DOUGLAS & JARVIS PAT. APT. 16, 1878, AP'L 17, 1885. A.P. FORESMAN, WM. S. STARR, T.J. STREBEIGH, COMMISSIONERS. - Pine Creek Bridge, River Road spanning Pine Creek, Jersey Shore, Lycoming County, PA

  3. Polyol and Amino Acid-Based Biosurfactants, Builders, and Hydrogels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This chapter reviews different detergent materials which have been synthesized from natural agricultural commodities. Background information, which gives reasons why the use of biobased materials may be advantageous, is presented. Detergent builders from L-aspartic acid, citric acid and D-sorbitol...

  4. E-Classical Fairy Tales: Multimedia Builder as a Tool

    ERIC Educational Resources Information Center

    Eteokleous, Nikleia; Ktoridou, Despo; Tsolakidis, Symeon

    2011-01-01

    The study examines pre-service teachers' experiences in delivering a traditional-classical fairy tale using the Multimedia Builder software, in other words an e-fairy tale. A case study approach was employed, collecting qualitative data through classroom observations and focus groups. The results focus on pre-service teachers' reactions, opinions,…

  5. 2. DETAIL OF BUILDER'S PLATE: 'SUPERSTRUCTURE BUILT BY STROBEL STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. DETAIL OF BUILDER'S PLATE: 'SUPERSTRUCTURE BUILT BY STROBEL STEEL CONSTRUCTION CO., CHICAGO, ILL., 1913, SUBSTRUCTURE BUILT BY FITZSIMONS & CONNELL D&D CO., CHICAGO, ILL.' - Chicago River Bascule Bridge, Grand Avenue, Spanning North Branch Chicago River at Grand Avenue, Chicago, Cook County, IL

  6. Marketing and promoting solar water heaters to home builders

    SciTech Connect

    Keller, C.; Ghent, P.

    1999-12-06

    This is the final report of a four-task project to develop a marketing plan designed for businesses interested in marketing solar water heaters in the new home industry. This report outlines suggested marketing communication materials and other promotional tools focused on selling products to the new home builder. Information relevant to promoting products to the new home buyer is also included.

  7. Children's E-Book Technology: Devices, Books, and Book Builder

    ERIC Educational Resources Information Center

    Shiratuddin, Norshuhada; Landoni, Monica

    2003-01-01

    This article describes a study of children's electronic books (e-books) technology. In particular, the focus is on devices used to access children's e-books, current available e-books and an e-book builder specifically for children. Three small case studies were conducted: two to evaluate how children accept the devices and one to test the ease of…

  8. SDMProjectBuilder: SWAT Setup for Nutrient Fate and Transport

    EPA Science Inventory

    This tutorial reviews some of the screens, icons, and basic functions of the SDMProjectBuilder (SDMPB) and explains how one uses SDMPB output to populate the Soil and Water Assessment Tool (SWAT) input files for nutrient fate and transport modeling in the Salt River Basin. It dem...

  9. 4. DETAIL OF BUILDER'S PLATES: '1901, CARTER H. HARRISON, COMMISSIONER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. DETAIL OF BUILDER'S PLATES: '1901, CARTER H. HARRISON, COMMISSIONER OF PUBLIC WORKS, MAYOR F.W. BLOCKI, JOHN ERICSON, CITY ENGINEER'; 'FITZSIMONS AND CONNELL CO. SUBSTRUCTURE'; 'AMERICAN BRIDGE COMPANY, LASSIG PLANT, CONTRACTOR FOR SUPERSTRUCTURE' - Chicago River Bascule Bridge, West Cortland Street, Spanning North Branch of Chicago River at West Cortland Street, Chicago, Cook County, IL

  10. Collaborative-Comparison Learning for Complex Event Detection Using Distributed Hierarchical Graph Neuron (DHGN) Approach in Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Muhamad Amin, Anang Hudaya; Khan, Asad I.

    Research trends in existing event detection schemes using Wireless Sensor Network (WSN) have mainly focused on routing and localisation of nodes for optimum coordination when retrieving sensory information. Efforts have also been put in place to create schemes that are able to provide learning mechanisms for event detection using classification or clustering approaches. These schemes entail substantial communication and computational overheads owing to the event-oblivious nature of data transmissions. In this paper, we present an event detection scheme that has the ability to distribute detection processes over the resource-constrained wireless sensor nodes and is suitable for events with spatio-temporal characteristics. We adopt a pattern recognition algorithm known as Distributed Hierarchical Graph Neuron (DHGN) with collaborative-comparison learning for detecting critical events in WSN. The scheme demonstrates good accuracy for binary classification and offers low complexity and high scalability in terms of its processing requirements.

  11. FormBuilder/FBGraphics programmer's reference manual

    SciTech Connect

    Goetsch, J.A.

    1994-09-01

    A primary concern in most modern software applications is the development of an attractive, "friendly" Graphical User Interface (GUI). Increasingly, that concern is being met through the use of the OSF/Motif widget set. While this software toolset is extremely powerful, the vast knowledge and attention to detail required by the programmer tend to be nearly unmanageable. This translates into an extended learning curve and a large investment of time and effort before the programmer reaches a desirable level of productivity. Even then, developing anything but the most basic GUI often proves to be a tedious and costly undertaking. FormBuilder is an application programmer's interface (API) that provides the programmer with a high-level interface to a subset of the "X" Window System and the OSF/Motif widget set. Through the use of the FormBuilder data types and procedure calls, the GUI programmer is afforded several distinct advantages over coding directly at the Motif, Xt, and Xlib layers. Among these advantages are a substantially reduced learning curve; more readable, maintainable, and modifiable code; smaller, more efficient binaries; and reduced compile/link/debug time during development. Working in concert with the FormBuilder library is the FBGraphics library, a 2-dimensional graphics library that allows the programmer to perform graphical operations within certain FormBuilder "windows". The FBGraphics library is based on the Xlib drawing routines, and much like FormBuilder, its purpose is to provide the programmer with a simpler, more productive mechanism for producing the desired graphical output on the screen.

  12. Mining induced seismic event on an inactive fault in view of local surface and in mine underground networks

    NASA Astrophysics Data System (ADS)

    Rudzinski, Lukasz; Lizurek, Grzegorz; Plesiewicz, Beata

    2014-05-01

    On 19 March 2013, a tremor shook the surface of the town of Polkowice, where the "Rudna" mine is located. This ML=4.2 event was the third most powerful seismic event recorded in the Legnica-Głogów Copper District (LGCD). Citizens of the area reported that the tremors felt were stronger and lasted longer than any others felt in the last couple of years. The event was studied using two different networks: the underground network of the "Rudna" mine and a surface local network run by IGF PAS (the LUMINEOS network). The first is composed of 32 vertical seismometers at the mining level (except for 5 sensors placed in elevator shafts), with sensor depths varying from 300 down to 1000 meters below the surface. The seismometers used in this network are vertical short-period Willmore MkII and MkIII sensors, with a frequency band from 1 Hz to 100 Hz. At the beginning of 2013, the local surface network of the Institute of Geophysics, Polish Academy of Sciences (IGF PAS), with the acronym LUMINEOS, was installed under an agreement with KGHM SA and "Rudna" mine officials. At the moment of the 19 March 2013 event, this network was composed of 4 short-period, one-second triaxial seismometers LE-3D/1s manufactured by Lenartz Electronics. Analysis of spectral parameters of the records from the in-mine seismic system and the surface LUMINEOS network, along with the broadband station KSP record, was carried out. The location of the event was close to the Rudna Główna fault zone, and the nodal plane orientations determined with two different approaches were almost parallel to the strike of the fault. The mechanism solutions were also obtained in the form of full moment tensor inversion from P-wave amplitude pulses of the underground records and waveform inversion of the surface network seismograms. Final results of the seismic analysis, along with a macroseismic survey and the observed effects in the destroyed part of the mining panel, indicate that the mechanism of the event was thrust faulting on an inactive tectonic fault. The results confirm that the fault zones

  13. Real-time Monitoring Network to Characterize Anthropogenic and Natural Events Affecting the Hudson River, NY

    NASA Astrophysics Data System (ADS)

    Islam, M. S.; Bonner, J. S.; Fuller, C.; Kirkey, W.; Ojo, T.

    2011-12-01

    The Hudson River watershed spans 34,700 km2 predominantly in New York State, including agricultural, wilderness, and urban areas. The Hudson River supports many activities including shipping, supplies water for municipal, commercial, and agricultural uses, and is an important recreational resource. As the population increases within this watershed, so does the anthropogenic impact on this natural system. To address the impacts of anthropogenic and natural activities on this ecosystem, the River and Estuary Observatory Network (REON) is being developed through a joint venture between the Beacon Institute, Clarkson University, General Electric Inc. and IBM Inc. to monitor New York's Hudson and Mohawk Rivers in real-time. REON uses four sensor platform types with multiple nodes within the network to capture environmentally relevant episodic events. Sensor platform types include: 1) fixed robotic vertical profiler (FRVP); 2) mobile robotic undulating platform (MRUP); 3) fixed acoustic Doppler current profiler (FADCP) and 4) Autonomous Underwater Vehicle (AUV). The FRVP periodically generates a vertical profile with respect to water temperature, salinity, dissolved oxygen, particle concentration and size distribution, and fluorescence. The MRUP utilizes an undulating tow-body tethered behind a research vessel to measure the same set of water parameters as the FRVP, but does so 'synchronically' over a highly-resolved spatial regime. The fixed ADCP provides continuous water current profiles. The AUV maps four-dimensional (time, latitude, longitude, depth) variation of water quality, water currents and bathymetry along a pre-determined transect route. REON data can be used to identify episodic events, both anthropogenic and natural, that impact the Hudson River. For example, a strong heat signature associated with cooling water discharge from the Indian Point nuclear power plant was detected with the MRUP. The FRVP monitoring platform at Beacon, NY, located in the

  14. Application of Parallel Discrete Event Simulation to the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jefferson, D.; Leek, J.

    2010-09-01

    In this paper we describe how and why we chose parallel discrete event simulation (PDES) as the paradigm for modeling the Space Surveillance Network (SSN) in our modeling framework, TESSA (Testbed Environment for Space Situational Awareness). DES is a simulation paradigm appropriate for systems dominated by discontinuous state changes at times that must be calculated dynamically. It is used primarily for complex man-made systems like telecommunications, vehicular traffic, computer networks, economic models, etc., although it is also useful for natural systems that are not described by equations, such as particle systems, population dynamics, epidemics, and combat models. It is much less well known than simple time-stepped simulation methods, but has the great advantage of being time scale independent, so that one can freely mix processes that operate at time scales over many orders of magnitude with no runtime performance penalty. In simulating the SSN we model in some detail: (a) the orbital dynamics of up to 10^5 objects, (b) their reflective properties, (c) the ground- and space-based sensor systems in the SSN, (d) the recognition of orbiting objects and determination of their orbits, (e) the cueing and scheduling of sensor observations, (f) the 3-d structure of satellites, and (g) the generation of collision debris. TESSA is thus a mixed continuous-discrete model. But because many different types of discrete objects are involved with such a wide variation in time scale (milliseconds for collisions, hours for orbital periods) it is suitably described using discrete events. The PDES paradigm is surprising and unusual. In any instantaneous runtime snapshot some parts may be far ahead in simulation time while others lag behind, yet the required causal relationships are always maintained and synchronized correctly, exactly as if the simulation were executed sequentially. The TESSA simulator is custom-built, conservatively synchronized, and designed to scale to
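
    As a plain illustration of the discrete event simulation paradigm described above (not TESSA code), the sketch below processes events from a time-ordered priority queue, with each handler free to schedule new future events at arbitrary times; all names and the toy "observe" handler are assumptions.

      # Minimal discrete event simulation loop (illustrative only).
      import heapq

      def run_des(initial_events, handlers, t_end):
          """Pop events in time order; handlers may schedule new future events."""
          queue = list(initial_events)          # (time, kind, payload) tuples
          heapq.heapify(queue)
          while queue:
              t, kind, payload = heapq.heappop(queue)
              if t > t_end:
                  break
              for new_event in handlers[kind](t, payload):
                  heapq.heappush(queue, new_event)

      # Toy handler: a sensor that re-schedules its own observation every 10 time units.
      def observe(t, sensor_id):
          print(f"t={t}: sensor {sensor_id} takes an observation")
          return [(t + 10.0, "observe", sensor_id)]

      run_des([(0.0, "observe", 1), (5.0, "observe", 2)], {"observe": observe}, t_end=30.0)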

  15. Simulation and Performance evaluation of ZigBee for wireless sensor networks having multiple events occurring simultaneously at a time

    NASA Astrophysics Data System (ADS)

    Dhama, Nitin; Minal, Kaur, Prabhjot; Kumar, Neelu

    2010-11-01

    ZigBee is an emerging standard for Wireless Sensor Networks (WSNs). It targets short-range, low-data-rate, low-power-consumption, and low-cost applications. According to standard nomenclature, it implements a Low-Rate Wireless Personal Area Network (LR-WPAN). ZigBee defines the upper layers (network and application) of the ISO protocol reference model. For the physical and data link layers, by contrast, it relies on another standard, the well-accepted IEEE 802.15.4, which offers a gross transfer rate of 250 kbps in the 2.4 GHz ISM unlicensed band. ZigBee is designed as a low-cost, low-power, low-data-rate wireless mesh technology intended primarily for event-based applications. However, there are many wireless sensor networks in which information must be sent to the PAN coordinator continuously and simultaneously. Our purpose in this paper is to test ZigBee for such networks, where multiple events take place simultaneously, and to examine the effect of increasing the number of events in a scenario.

  16. Molecular Insights into Reprogramming-Initiation Events Mediated by the OSKM Gene Regulatory Network

    PubMed Central

    Liao, Mei-Chih; Prigione, Alessandro; Jozefczuk, Justyna; Lichtner, Björn; Wolfrum, Katharina; Haltmeier, Manuela; Flöttmann, Max; Schaefer, Martin; Hahn, Alexander; Mrowka, Ralf; Klipp, Edda; Andrade-Navarro, Miguel A.; Adjaye, James

    2011-01-01

    Somatic cells can be reprogrammed to induced pluripotent stem cells by over-expression of OCT4, SOX2, KLF4 and c-MYC (OSKM). With the aim of unveiling the early mechanisms underlying the induction of pluripotency, we have analyzed transcriptional profiles at 24, 48 and 72 hours post-transduction of OSKM into human foreskin fibroblasts. Experiments confirmed that upon viral transduction, the immediate response is innate immunity, which induces free radical generation, oxidative DNA damage, p53 activation, senescence, and apoptosis, ultimately leading to a reduction in the reprogramming efficiency. Conversely, nucleofection of OSKM plasmids does not elicit the same cellular stress, suggesting viral response as an early reprogramming roadblock. Additional initiation events include the activation of surface markers associated with pluripotency and the suppression of epithelial-to-mesenchymal transition. Furthermore, reconstruction of an OSKM interaction network highlights intermediate path nodes as candidates for improvement intervention. Overall, the results suggest three strategies to improve reprogramming efficiency employing: 1) anti-inflammatory modulation of innate immune response, 2) pre-selection of cells expressing pluripotency-associated surface antigens, 3) activation of specific interaction paths that amplify the pluripotency signal. PMID:21909390

  17. Best Practices Case Study: Pine Mountain Builders - Pine Mountain, GA

    SciTech Connect

    2011-09-01

    Case study of Pine Mountain Builders who worked with DOE’s IBACOS team to achieve HERS scores of 59 on 140 homes built around a wetlands in Georgia. The team used taped rigid foam exterior sheathing and spray foam insulation in the walls and on the underside of the attic for a very tight 1.0 to 1.8 ACH 50 building shell.

  18. HOT METAL BRIDGE (NOTE: BUILDERS: JONES AND LAUGHLIN STEEL CA. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT METAL BRIDGE (NOTE: BUILDERS: JONES AND LAUGHLIN STEEL CA. 1890), SOUTH PORTAL. THREE PIN CONNECTED CAMELBACK TRUSS SPANS, ONE SKEWED THROUGH TRUSS SPAN ON NORTH SIDE TRUSS BRIDGE, EAST OF HOT METAL BRIDGE BUILT BY AMERICAN BRIDGE COMPANY CA. 1910. (RIVETED MULTI-SPAN TRUSS). - Jones & Laughlin Steel Corporation, Pittsburgh Works, Morgan Billet Mill Engine, 550 feet north of East Carson Street, opposite South Twenty-seventh Street, Pittsburgh, Allegheny County, PA

  19. CHARMM-GUI Membrane Builder toward realistic biological membrane simulations.

    PubMed

    Wu, Emilia L; Cheng, Xi; Jo, Sunhwan; Rui, Huan; Song, Kevin C; Dávila-Contreras, Eder M; Qi, Yifei; Lee, Jumin; Monje-Galvan, Viviana; Venable, Richard M; Klauda, Jeffery B; Im, Wonpil

    2014-10-15

    CHARMM-GUI Membrane Builder, http://www.charmm-gui.org/input/membrane, is a web-based user interface designed to interactively build all-atom protein/membrane or membrane-only systems for molecular dynamics simulations through an automated optimized process. In this work, we describe the new features and major improvements in Membrane Builder that allow users to robustly build realistic biological membrane systems, including (1) addition of new lipid types, such as phosphoinositides, cardiolipin (CL), sphingolipids, bacterial lipids, and ergosterol, yielding more than 180 lipid types, (2) enhanced building procedure for lipid packing around protein, (3) reliable algorithm to detect lipid tail penetration to ring structures and protein surface, (4) distance-based algorithm for faster initial ion displacement, (5) CHARMM inputs for P21 image transformation, and (6) NAMD equilibration and production inputs. The robustness of these new features is illustrated by building and simulating a membrane model of the polar and septal regions of E. coli membrane, which contains five lipid types: CL lipids with two types of acyl chains and phosphatidylethanolamine lipids with three types of acyl chains. It is our hope that CHARMM-GUI Membrane Builder becomes a useful tool for simulation studies to better understand the structure and dynamics of proteins and lipids in realistic biological membrane environments. PMID:25130509

  20. Adaptive Multi-Path Routing with Guaranteed Target-Delivery Ratio of Critical Events in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Baek, Jang Woon; Nam, Young Jin; Seo, Dae-Wha

    Wireless sensor networks are subject to node and link failures for a variety of reasons. This paper proposes a k-disjoint-path routing algorithm that varies the number of disjoint paths (k) in order to meet a target delivery ratio for critical events and to reduce energy consumption. The proposed algorithm sends packets to the base station through a single path when no critical events occur; under the occurrence of critical events, it sends packets to the base station through k disjoint paths (k > 1), where k is computed from a well-defined fault model. The proposed algorithm detects the occurrence of critical events by monitoring collected data patterns. The simulation results reveal that the proposed algorithm is more resilient to random node failure and patterned failure than other routing algorithms, and it also decreases energy consumption much more than the multi-path and path-repair algorithms.
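
    The sketch below illustrates one way the path-selection step described above could look, assuming the fault model is summarized by a single per-path failure probability so that k can be chosen to meet the target delivery ratio; the function name, the probability values, and the toy grid topology are all assumptions, not details from the paper.

      # Illustrative k-disjoint-path selection: pick k so that at least one of k
      # edge-disjoint paths survives with the target probability.
      import math
      import networkx as nx
      from networkx.algorithms.connectivity import edge_disjoint_paths

      def choose_paths(g, src, sink, critical, p_fail=0.3, target=0.99):
          if not critical:
              k = 1                                # single path when no critical event
          else:
              k = max(1, math.ceil(math.log(1 - target) / math.log(p_fail)))
          paths = list(edge_disjoint_paths(g, src, sink))
          return paths[:k]

      g = nx.grid_2d_graph(4, 4)                   # toy sensor field
      routes = choose_paths(g, (0, 0), (3, 3), critical=True)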

  1. Onset and Offset of Aversive Events Establish Distinct Memories Requiring Fear and Reward Networks

    ERIC Educational Resources Information Center

    Andreatta, Marta; Fendt, Markus; Muhlberger, Andreas; Wieser, Matthias J.; Imobersteg, Stefan; Yarali, Ayse; Gerber, Bertram; Pauli, Paul

    2012-01-01

    Two things are worth remembering about an aversive event: What made it happen? What made it cease? If a stimulus precedes an aversive event, it becomes a signal for threat and will later elicit behavior indicating conditioned fear. However, if the stimulus is presented upon cessation of the aversive event, it elicits behavior indicating…

  2. Spatial interpolation of precipitation in a dense gauge network for monsoon storm events in the southwestern United States

    NASA Astrophysics Data System (ADS)

    Garcia, Matthew; Peters-Lidard, Christa D.; Goodrich, David C.

    2008-05-01

    Inaccuracy in spatially distributed precipitation fields can contribute significantly to the uncertainty of hydrological states and fluxes estimated from land surface models. This paper examines the results of selected interpolation methods for both convective and mixed/stratiform events that occurred during the North American monsoon season over a dense gauge network at the U.S. Department of Agriculture Agricultural Research Service Walnut Gulch Experimental Watershed in the southwestern United States. The spatial coefficient of variation for the precipitation field is employed as an indicator of event morphology, and a gauge clustering factor CF is formulated as a new, scale-independent measure of network organization. We consider that CF < 0 (a more distributed gauge network) will produce interpolation errors by reduced resolution of the precipitation field and that CF > 0 (clustering in the gauge network) will produce errors because of reduced areal representation of the precipitation field. Spatial interpolation is performed using both inverse-distance-weighted (IDW) and multiquadric-biharmonic (MQB) methods. We employ ensembles of randomly selected network subsets for the statistical evaluation of interpolation errors in comparison with the observed precipitation. The magnitude of interpolation errors and differences in accuracy between interpolation methods depend on both the density and the geometrical organization of the gauge network. Generally, MQB methods outperform IDW methods in terms of interpolation accuracy under all conditions, but it is found that the order of the IDW method is important to the results and may, under some conditions, be just as accurate as the MQB method. In almost all results it is demonstrated that the inverse-distance-squared method for spatial interpolation, commonly employed in operational analyses and for engineering assessments, is inferior to the ID-cubed method, which is also more computationally efficient than the MQB
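
    For readers unfamiliar with the inverse-distance-weighted estimator compared above, a minimal sketch follows; the power parameter distinguishes the inverse-distance-squared variant (power 2) from the ID-cubed variant (power 3) discussed in the abstract, and the gauge coordinates and values are made-up illustrations.

      # Minimal inverse-distance-weighted (IDW) interpolation sketch (illustrative).
      import numpy as np

      def idw(xy_gauges, values, xy_target, power=3, eps=1e-12):
          """Estimate precipitation at xy_target from gauge values, weighting each
          gauge by 1 / distance**power (power=2 is ID-squared, power=3 is ID-cubed)."""
          d = np.linalg.norm(xy_gauges - xy_target, axis=1)
          if np.any(d < eps):                      # target coincides with a gauge
              return float(values[np.argmin(d)])
          w = 1.0 / d**power
          return float(np.sum(w * values) / np.sum(w))

      gauges = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
      rain = np.array([12.0, 8.0, 5.0])
      print(idw(gauges, rain, np.array([0.4, 0.4])))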

  3. Prediction of the most extreme rainfall events in the South American Andes: A statistical forecast based on complex networks

    NASA Astrophysics Data System (ADS)

    Boers, Niklas; Bookhagen, Bodo; Barbosa, Henrique; Marwan, Norbert; Kurths, Jürgen; Marengo, Jose

    2015-04-01

    During the monsoon season, the subtropical Andes in South America are exposed to spatially extensive extreme rainfall events that frequently lead to flash floods and landslides with severe socio-economic impacts. Since dynamical weather forecasting has substantial problems with predicting the most extreme events (above the 99th percentile), alternative forecast methods are called for. Based on complex network theory, we developed a general mathematical framework for statistical prediction of extreme events in significantly interrelated time series. The key idea of our approach is to make the internal synchronization structure of extreme events mathematically accessible in terms of the topology of a network which is constructed by measuring the synchronization of extreme events at different locations. The application of our method to high-spatiotemporal-resolution rainfall data (TRMM 3B42) reveals a migration pattern of large convective systems from southeastern South America towards the Argentinean and Bolivian Andes, against the direction of the northwesterly low-level moisture flow from the Amazon Basin. Once these systems reach the Andes, they lead to spatially extensive extreme events up to elevations above 4000 m, leading to substantial risks of associated natural hazards. Based on atmospheric composites, we could identify an intricate interplay of frontal systems approaching from the South, low-level moisture flow from the Amazon Basin to the North, and the Andean orography as the responsible climatic mechanism. These insights allow us to formulate a simple forecast rule predicting 60% (90% during El Niño conditions) of extreme rainfall events at the eastern slopes of the subtropical Andes. The rule can be computed from readily available rainfall and pressure data and is already being tested by local institutions for disaster preparation.
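
    As a simplified illustration of how event synchronization can be turned into network links (a much-reduced stand-in for the measure used in the study), the sketch below counts near-coincident extreme events between pairs of grid cells and connects cells whose coincidence count exceeds a threshold; the tolerance, threshold, and toy event times are assumptions.

      # Simplified event-synchronization network sketch (illustrative only).
      import itertools
      import numpy as np
      import networkx as nx

      def sync_count(t_a, t_b, tau=2):
          """Number of events in series A with a partner in B within tau time steps."""
          return sum(np.any(np.abs(t_b - t) <= tau) for t in t_a)

      def build_sync_network(event_times, tau=2, threshold=3):
          g = nx.Graph()
          g.add_nodes_from(event_times)
          for a, b in itertools.combinations(event_times, 2):
              if sync_count(event_times[a], event_times[b], tau) >= threshold:
                  g.add_edge(a, b)
          return g

      times = {"cell_1": np.array([3, 10, 25, 40]),
               "cell_2": np.array([4, 11, 26, 60]),
               "cell_3": np.array([70, 90])}
      net = build_sync_network(times)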

  4. Development of an event search and download system for analyzing waveform data observed at seafloor seismic network, DONET

    NASA Astrophysics Data System (ADS)

    Takaesu, M.; Horikawa, H.; Sueki, K.; Kamiya, S.; Nakamura, T.; Nakano, M.; Takahashi, N.; Sonoda, A.; Tsuboi, S.

    2014-12-01

    Mega-thrust earthquakes are anticipated to occur in the Nankai Trough in southwest Japan. In the source areas, we installed the seafloor seismic network DONET (Dense Ocean-floor Network System for Earthquakes and Tsunamis) in 2010 in order to monitor seismicity, crustal deformations, and tsunamis. The DONET system consists of 20 stations in total, each equipped with six kinds of sensors: strong-motion and broadband seismometers, quartz and differential pressure gauges, a hydrophone, and a thermometer. The stations are densely distributed with an average spatial interval of 15-20 km and cover the near-coastal areas out to the trench axis. Observed data are transferred to a land station through a fiber-optic cable and then to the JAMSTEC (Japan Agency for Marine-Earth Science and Technology) data management center through a private network in real time. The data are based on WIN32 format in the private network and finally archived in SEED format at the management center to combine waveform data with related metadata. We are developing a web-based application system to easily download DONET seismic waveform data. In this system, users can select 20 Hz broadband (BH type) and 200 Hz strong-motion (EH type) data and download them in SEED. Users can also search for events by time period, magnitude, source area, and depth in a GUI platform. Event data are produced referring to event catalogues from the USGS and JMA (Japan Meteorological Agency). The magnitude thresholds for production are M6 for far-field events and M4 for local events, using the USGS and JMA lists, respectively. Available data lengths depend on magnitudes and epicentral distances. In this presentation, we briefly introduce the DONET stations and then show our developed application system. We make DONET data openly available through the system and want them to be widely recognized so that many users can analyze them. We also discuss plans for further development of the system.

  5. Evaluation of the U.S. Department of Energy Challenge Home Program Certification of Production Builders

    SciTech Connect

    Kerrigan, P.; Loomis, H.

    2014-09-01

    The purpose of this project was to evaluate integrated packages of advanced measures in individual test homes to assess their performance with respect to Building America Program goals, specifically compliance with the DOE Challenge Home Program. BSC consulted on the construction of five test houses by three Cold Climate production builders in three separate US cities. BSC worked with the builders to develop a design package tailored to the cost-related impacts for each builder. Therefore, the resulting design packages do vary from builder to builder. BSC provided support through this research project on the design, construction and performance testing of the five test homes. Overall, the builders have concluded that the energy related upgrades (either through the prescriptive or performance path) represent reasonable upgrades. The builders commented that while not every improvement in specification was cost effective (as in a reasonable payback period), many were improvements that could improve the marketability of the homes and serve to attract more energy efficiency discerning prospective homeowners. However, the builders did express reservations on the associated checklists and added certifications. An increase in administrative time was observed with all builders. The checklists and certifications also inherently increase cost due to: 1. Adding services to the scope of work for various trades, such as HERS Rater, HVAC contractor; 2. Increased material costs related to the checklists, especially the EPA Indoor airPLUS and EPA WaterSense(R) Efficient Hot Water Distribution requirement.

  6. Discrimination Analysis of Earthquakes and Man-Made Events Using ARMA Coefficients Determination by Artificial Neural Networks

    SciTech Connect

    AllamehZadeh, Mostafa

    2011-12-15

    A Quadratic Neural Networks (QNNs) model has been developed for the seismic source classification problem at regional distances, using ARMA coefficient determination by Artificial Neural Networks (ANNs). We have devised a supervised neural system to discriminate between earthquakes and chemical explosions with filter coefficients obtained from windowed P-wave phase spectra (15 s). First, we preprocess the recorded signals to cancel out instrumental and attenuation site effects and obtain a compact representation of the seismic records. Second, we use the QNN system to obtain ARMA coefficients for feature extraction in the discrimination problem. The derived coefficients are then applied to the neural system for training and classification. In this study, we explore the possibility of using single-station three-component (3C) covariance matrix traces from a priori-known explosion sites (learning) for automatically recognizing subsequent explosions from the same site. The results have shown that this feature extraction gives the best classifier for seismic signals and performs significantly better than other classification methods. The events tested include 36 chemical explosions at the Semipalatinsk test site in Kazakhstan and 61 earthquakes (mb = 5.0-6.5) recorded by the Iranian National Seismic Network (INSN). Fully correct (100%) decisions were obtained between site explosions and some non-site events. The above approach to event discrimination is very flexible, as we can combine several 3C stations.

  7. Adaptive and context-aware detection and classification of potential QoS degradation events in biomedical wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Abreu, Carlos; Miranda, Francisco; Mendes, Paulo M.

    2016-06-01

    The use of wireless sensor networks in healthcare has the potential to enhance the services provided to citizens. In particular, they play an important role in the development of state-of-the-art patient monitoring applications. Nevertheless, due to the critical nature of the data conveyed by such patient monitoring applications, they have to fulfil high standards of quality of service in order to obtain the confidence of all players in the healthcare industry. In this context, with respect to the quality of service being provided by the wireless sensor network, this work presents an adaptive and context-aware method to detect and classify performance degradation events. The proposed method has the ability to catch the most significant and damaging variations in the metrics used to quantify the quality of service provided by the network, without overreacting to small and innocuous variations in the metric's value.

  8. Whole-House Approach Benefits Builders, Buyers, and the Environment

    SciTech Connect

    Not Available

    2001-05-01

    This document provides an overview of the U.S. Department of Energy's Building America program. Building America works with the residential building industry to develop and implement innovative building processes and technologies-innovations that save builders and homeowners millions of dollars in construction and energy costs. This industry-led, cost-shared partnership program aims to reduce energy use by 50% and reduce construction time and waste, improve indoor air quality and comfort, encourage a systems engineering approach for design and construction of new homes, and accelerate the development and adoption of high performance in production housing.

  9. What would you do? Managing a metro network during mass crowd events.

    PubMed

    Barr, Andy C; Lau, Raymond C M; Ng, Nelson W H; da Silva, Marco Antônio; Baptista, Marcia; Oliveira, Vinícius Floriano; Barbosa, Maria Beatriz; Batistini, Estela; de Toledo Ramos, Nancy

    2010-03-01

    Major public events, such as sporting events, carnivals and festivals, are common occurrences in urban and city environments. They are characterised by the mass movement of people in relatively small areas, far in excess of normal daily activity. This section reviews how different metro systems across the globe respond to such peaks of activity, ensuring that people are moved swiftly, efficiently and safely. To this end, representatives from four major public metro systems (London, Hong Kong, Rio de Janeiro and São Paulo) describe how their respective metro systems respond to the capacity demands of a major annual event. PMID:20494882

  10. Detecting and mitigating abnormal events in large scale networks: budget constrained placement on smart grids

    SciTech Connect

    Santhi, Nandakishore; Pan, Feng

    2010-10-19

    Several scenarios exist in the modern interconnected world that call for an efficient network interdiction algorithm. Applications are varied, including various monitoring and load-shedding applications on large smart energy grids, computer network security, preventing the spread of Internet worms and malware, policing international smuggling networks, and controlling the spread of diseases. In this paper we consider some natural network optimization questions related to the budget-constrained interdiction problem over general graphs, specifically focusing on the sensor/switch placement problem for large-scale energy grids. Many of these questions turn out to be computationally hard to tackle. We present a particular form of the interdiction question which is practically relevant and which we show to be computationally tractable. A polynomial-time algorithm is presented for solving this problem.

  11. Solar installer training: Home Builders Institute Job Corps

    SciTech Connect

    Hansen, K.; Mann, R.

    1996-10-01

    The instructors describe the solar installation training program operated since 1979 by the Home Builders Institute, the Educational Arm of the National Association of Home Builders for the US Department of Labor, Job Corps in San Diego, CA. The authors are the original instructors and have developed the program since its inception by a co-operative effort between the Solar Energy Industries Association, NAHB and US DOL. Case studies of a few of the 605 students who have gone to work over the years after the training are included. It is one of the most successful programs under the elaborate Student Performance Monitoring Information System used by all Job Corps programs. Job Corps is a federally funded residential job training program for low income persons 16--24 years of age. Discussion details the curriculum and methods used in the program including classroom, shop and community service projects. Solar technologies including all types of hot water heating, swimming pool and spa as well as photovoltaics are included.

  12. Evaluation of the Bridge Builders Program: Students Involved in Multicultural Activities.

    ERIC Educational Resources Information Center

    Petry, John R.; McCree, Herbert L.

    Bridge Builders is a 2-year program intended to develop leadership in high school students. Programmatic goals include enhancing the participants' understanding of other racial and ethnic groups, socioeconomic groups, gender awareness, social responsibility, and the value of community service. Bridge Builders participants confronted community…

  13. Wildlife Scenario Builder and User's Guide (Version 1.0, Beta Test)

    EPA Science Inventory

    Cover image: Wildlife Scenario Builder User's Manual. The Wildlife Scenario Builder (WSB) was developed to improve the quality of wildlif...

  14. DOE Zero Energy Ready Home Case Study: New Town Builders, Denver, Colorado

    SciTech Connect

    none,

    2013-09-01

    All homes in the Stapleton community must be ENERGY STAR certified; New Town Builders has announced that it will build 250–300 new homes over the next 7–10 years, all of which will be Challenge Homes. New Town received a 2013 Housing Innovation Award in the production builder category.

  15. Practices and Processes of Leading High Performance Home Builders in the Upper Midwest

    SciTech Connect

    Von Thoma, E.; Ojczyk, C.

    2012-12-01

    The NorthernSTAR Building America Partnership team proposed this study to gain insight into the business, sales, and construction processes of successful high performance builders. The knowledge gained by understanding the high performance strategies used by individual builders, as well as the process each followed to move from traditional builder to high performance builder, will be beneficial in proposing more in-depth research to yield specific action items to assist the industry at large transform to high performance new home construction. This investigation identified the best practices of three successful high performance builders in the upper Midwest. In-depth field analysis of the performance levels of their homes, their business models, and their strategies for market acceptance were explored. All three builders commonly seek ENERGY STAR certification on their homes and implement strategies that would allow them to meet the requirements for the Building America Builders Challenge program. Their desire for continuous improvement, willingness to seek outside assistance, and ambition to be leaders in their field are common themes. Problem solving to overcome challenges was accepted as part of doing business. It was concluded that crossing the gap from code-based building to high performance based building was a natural evolution for these leading builders.

  16. Complex network analysis of high rainfall events during the northeast monsoon over south peninsular India and Sri Lanka

    NASA Astrophysics Data System (ADS)

    Martin, P.; Malik, N.; Marwan, N.; Kurths, J.

    2012-04-01

    The Indian Summer monsoon (ISM) accounts for a large part of the annual rainfall budget across most of the Indian peninsula; however, the coastal regions along the southeast Indian peninsula, as well as Sri Lanka, receive 50% or more of their annual rainfall budget during the northeast monsoon (NEM), or winter monsoon, during the months from October through December. In this study, we investigate the behavior of the NEM over the last 60 years using complex network theory. The network is constructed according to a method previously developed for the ISM, using event synchronization of extreme rainfall events as a correlation measure to create directed and undirected links between geographical locations, which represent potential pathways of moisture transport. Network measures, such as degree centrality and closeness centrality, are then used to illuminate the dynamics of the NEM rainfall over the relevant regions, and to examine the spatial distribution and temporal evolution of the rainfall. Understanding the circulation of the monsoon cycle as a whole, i.e. the NEM together with the ISM, is vital for the agricultural industry and thus the population of the affected areas.

  17. Building America Case Study: New Town Builders' Power of Zero Energy Center, Denver, Colorado (Brochure)

    SciTech Connect

    Not Available

    2014-10-01

    New Town Builders, a builder of energy efficient homes in Denver, Colorado, offers a zero energy option for all the homes it builds. To attract a wide range of potential homebuyers to its energy efficient homes, New Town Builders created a 'Power of Zero Energy Center' linked to its model home in the Stapleton community of Denver. This case study presents New Town Builders' marketing approach, which is targeted to appeal to homebuyers' emotions rather than overwhelming homebuyers with scientific details about the technology. The exhibits in the Power of Zero Energy Center focus on reduced energy expenses for the homeowner, improved occupant comfort, the reputation of the builder, and the fact that homebuyers need not sacrifice their desired design features to achieve zero net energy in the home. The case study also contains customer and realtor testimonials related to the effectiveness of the Center in influencing homebuyers to purchase a zero energy home.

  18. Practices and Processes of Leading High Performance Home Builders in the Upper Midwest

    SciTech Connect

    Von Thoma, Ed; Ojzcyk, Cindy

    2012-12-01

    The NorthernSTAR Building America Partnership team proposed this study to gain insight into the business, sales, and construction processes of successful high performance builders. The knowledge gained by understanding the high performance strategies used by individual builders, as well as the process each followed to move from traditional builder to high performance builder, will be beneficial in proposing more in-depth research to yield specific action items to assist the industry at large transform to high performance new home construction. This investigation identified the best practices of three successful high performance builders in the upper Midwest. In-depth field analysis of the performance levels of their homes, their business models, and their strategies for market acceptance were explored.

  19. Co-design of H∞ jump observers for event-based measurements over networks

    NASA Astrophysics Data System (ADS)

    Peñarrocha, Ignacio; Dolz, Daniel; Romero, Julio Ariel; Sanchis, Roberto

    2016-01-01

    This work presents a strategy to minimise the network usage and the energy consumption of wireless battery-powered sensors in the observer problem over networks. The sensor nodes implement a periodic send-on-delta approach, sending new measurements when a measurement deviates considerably from the previously sent one. The estimator node implements a jump observer whose gains are computed offline and depend on the combination of available new measurements. We bound the estimator performance as a function of the sending policies and then state the design procedure of the observer under fixed sending thresholds as a semidefinite programming problem. We address this problem first in a deterministic way and then, to reduce conservativeness, in a stochastic one, after obtaining bounds on the probabilities of having new measurements and applying robust optimisation over the possible probabilities using a sum-of-squares decomposition. We relate the network usage to the sending thresholds and propose an iterative procedure for the design of those thresholds, minimising the network usage while guaranteeing a prescribed estimation performance. Simulation results and experimental analysis show the validity of the proposal and the reduction of network resources that can be achieved with the stochastic approach.
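
    The periodic send-on-delta policy mentioned above can be illustrated with a very short sketch: a sample is transmitted to the estimator only when it deviates from the last transmitted value by more than a threshold delta. The threshold, the sample values, and the function name are illustrative assumptions.

      # Illustrative send-on-delta sensor node: transmit only when the new sample
      # deviates from the last transmitted one by more than delta.
      def send_on_delta(samples, delta=0.5):
          sent = []
          last_sent = None
          for k, y in enumerate(samples):
              if last_sent is None or abs(y - last_sent) > delta:
                  sent.append((k, y))       # would be transmitted to the estimator
                  last_sent = y
          return sent

      measurements = [0.0, 0.1, 0.2, 0.9, 1.0, 1.8, 1.9, 1.7]
      print(send_on_delta(measurements))    # only large deviations trigger a send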

  20. Storm Event Variability in Particulate Organic Matter Source, Size, and Carbon and Nitrogen Content Along a Forested Drainage Network

    NASA Astrophysics Data System (ADS)

    Rowland, R. D.; Inamdar, S. P.; Parr, T. B.

    2015-12-01

    Coupled inputs of carbon and nitrogen comprise an important energy and nutrient subsidy for aquatic ecosystems. Large storm events can mobilize substantial amounts of these elements, especially in particulate form. While the role of storms in mobilizing allochthonous particulate organic matter (POM) is well recognized, less is known about the changes in source, particle size, and composition of POM as it is routed through the fluvial network. Questions we addressed include: (a) How does the source, size, and C and N content of suspended POM vary with storm magnitude and intensity? (b) How does POM size and C and N content evolve along the drainage network? (c) How accurate are high-frequency, in-situ sensors in characterizing POM? We conducted this study in a 79 ha, forested catchment in the Piedmont region of Maryland. Event sampling for suspended POM was performed using automated stream water samplers and in-situ, high-frequency sensors (s::can specto::lyser and YSI EXO 2; 30 minute intervals) at 12 and 79 ha drainage locations. Composite storm-event sediment samples were also collected using passive samplers at five catchment drainage scales. Data are available for multiple storms since August 2014. Samples were partitioned into three discrete particle size classes (coarse: 1000-2000 µm, medium: 250-1000 µm, fine: < 250 µm) for organic C and N determination. Suspended sediments and seven soil end members were also analyzed for stable 13C and 15N isotope ratios to characterize the evolution in sediment sources through the drainage network. Contrary to our expectations, preliminary results suggest finer suspended sediments in the upstream portion of the catchment, and that these may contain more POM. Unsurprisingly, the sensors' ability to estimate the coarser particle classes via turbidity is weak compared to the finer class, but this is less pronounced in organic-rich sediments. Distinct patterns in in-situ absorbance spectra may suggest an ability to discern

  1. A replica exchange transition interface sampling method with multiple interface sets for investigating networks of rare events

    NASA Astrophysics Data System (ADS)

    Swenson, David W. H.; Bolhuis, Peter G.

    2014-07-01

    The multiple state transition interface sampling (TIS) framework in principle allows the simulation of a large network of complex rare event transitions, but in practice suffers from convergence problems. To improve convergence, we combine multiple state TIS [J. Rogal and P. G. Bolhuis, J. Chem. Phys. 129, 224107 (2008)] with replica exchange TIS [T. S. van Erp, Phys. Rev. Lett. 98, 268301 (2007)]. In addition, we introduce multiple interface sets, which allow more than one order parameter to be defined for each state. We illustrate the methodology on a model system of multiple independent dimers, each with two states. For reaction networks with up to 64 microstates, we determine the kinetics in the microcanonical ensemble, and discuss the convergence properties of the sampling scheme. For this model, we find that the kinetics depend on the instantaneous composition of the system. We explain this dependence in terms of the system's potential and kinetic energy.

  2. A pair of RNA binding proteins controls networks of splicing events contributing to specialization of neural cell types

    PubMed Central

    Norris, Adam D.; Gao, Shangbang; Norris, Megan L.; Ray, Debashish; Ramani, Arun K.; Fraser, Andrew G.; Morris, Quaid; Hughes, Timothy R.; Zhen, Mei; Calarco, John A.

    2014-01-01

    Alternative splicing is important for the development and function of the nervous system, but little is known about the differences in alternative splicing between distinct types of neurons. Furthermore, the factors that control cell-type-specific splicing and the physiological roles of these alternative isoforms are unclear. By monitoring alternative splicing at single cell resolution in Caenorhabditis elegans, we demonstrate that splicing patterns in different neurons are often distinct and highly regulated. We identify two conserved RNA binding proteins, UNC-75/CELF and EXC-7/Hu/ELAV, which regulate overlapping networks of splicing events in GABAergic and cholinergic neurons. We use the UNC-75 exon network to discover regulators of synaptic transmission and to identify unique roles for isoforms of UNC-64/Syntaxin, a protein required for synaptic vesicle fusion. Our results indicate that combinatorial regulation of alternative splicing in distinct neurons provides a mechanism to specialize metazoan nervous systems. PMID:24910101

  3. Reward and Novelty Enhance Imagination of Future Events in a Motivational-Episodic Network

    PubMed Central

    Bulganin, Lisa; Wittmann, Bianca C.

    2015-01-01

    Thinking about personal future events is a fundamental cognitive process that helps us make choices in daily life. We investigated how the imagination of episodic future events is influenced by implicit motivational factors known to guide decision making. In a two-day functional magnetic resonance imaging (fMRI) study, we controlled learned reward association and stimulus novelty by pre-familiarizing participants with two sets of words in a reward learning task. Words were repeatedly presented and consistently followed by monetary reward or no monetary outcome. One day later, participants imagined personal future events based on previously rewarded, unrewarded and novel words. Reward association enhanced the perceived vividness of the imagined scenes. Reward and novelty-based construction of future events were associated with higher activation of the motivational system (striatum and substantia nigra/ ventral tegmental area) and hippocampus, and functional connectivity between these areas increased during imagination of events based on reward-associated and novel words. These data indicate that implicit past motivational experience contributes to our expectation of what the future holds in store. PMID:26599537

  4. Time-to-event analysis with artificial neural networks: an integrated analytical and rule-based study for breast cancer.

    PubMed

    Lisboa, Paulo J G; Etchells, Terence A; Jarman, Ian H; Hane Aung, M S; Chabaud, Sylvie; Bachelot, Thomas; Perol, David; Gargi, Thérèse; Bourdès, Valérie; Bonnevay, Stéphane; Négrier, Sylvie

    2008-01-01

    This paper presents an analysis of censored survival data for breast cancer-specific mortality and disease-free survival. There are three stages to the process, namely time-to-event modelling, risk stratification by predicted outcome, and model interpretation using rule extraction. Model selection was carried out using the benchmark linear model, Cox regression, but risk staging was derived both with Cox regression and with Partial Logistic Regression Artificial Neural Networks regularised with Automatic Relevance Determination (PLANN-ARD). This analysis compares the two approaches, showing the benefit of using the neural network framework especially for patients at high risk. The neural network model also results in a smooth model of the hazard without the need for limiting assumptions of proportionality. The model predictions were verified using out-of-sample testing, with the mortality model also compared with two other prognostic models called TNG and the NPI rule model. Further verification was carried out by comparing marginal estimates of the predicted and actual cumulative hazards. It was also observed that doctors seem to treat mortality and disease-free models as equivalent, so a further analysis was performed to examine whether this is the case. The analysis was extended with automatic rule generation using Orthogonal Search Rule Extraction (OSRE). This methodology translates analytical risk scores into the language of the clinical domain, enabling direct validation of the operation of the Cox or neural network model. This paper extends the existing OSRE methodology to data sets that include continuous-valued variables. PMID:18304780

  5. Assessing the Influence of an Individual Event in Complex Fault Spreading Network Based on Dynamic Uncertain Causality Graph.

    PubMed

    Dong, Chunling; Zhao, Yue; Zhang, Qin

    2016-08-01

    Identifying the pivotal causes and highly influential spreaders in fault propagation processes is crucial for improving maintenance decision making for complex systems under abnormal and emergency situations. A dynamic uncertain causality graph-based method is introduced in this paper to explicitly model the uncertain causalities among system components, identify fault conditions, locate the fault origins, and predict the spreading tendency by means of probabilistic reasoning. A new algorithm is proposed to assess the impacts of an individual event by investigating the corresponding node's time-variant betweenness centrality and the strength of global causal influence in the fault propagation network. The algorithm does not depend on the whole original and static network but on the real-time spreading behaviors and dynamics, which makes the algorithm specifically targeted and more efficient. Experiments on both simulated networks and real-world systems demonstrate the accuracy, effectiveness, and comprehensibility of the proposed method for the fault management of power grids and other complex networked systems. PMID:27101619
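
    The betweenness-centrality ingredient of the influence measure described above can be illustrated on a toy spreading graph (this is only the centrality step, not the paper's dynamic uncertain causality graph reasoning; the node names are invented).

      # Toy ranking of spreaders by betweenness centrality on currently observed
      # fault-spreading edges (illustrative only).
      import networkx as nx

      spreading_edges = [("breaker_A", "bus_1"), ("bus_1", "line_3"),
                         ("line_3", "bus_2"), ("bus_1", "line_4"), ("line_4", "bus_2")]
      g = nx.DiGraph(spreading_edges)
      ranking = sorted(nx.betweenness_centrality(g).items(),
                       key=lambda kv: kv[1], reverse=True)
      print(ranking[:3])   # nodes most often lying on propagation paths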

  6. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    46 CFR Shipping, vol. 8 (2010-10-01): Standard form of War Risk Builder's Risk Insurance... TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE — War Risk Builder's Risk Insurance § 308.409 Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk...

  7. Aeolian dust event in Korea observed by an EZ Lidar in the frame of global lidar networks.

    NASA Astrophysics Data System (ADS)

    Lolli, Simone

    2010-05-01

    Dust storms and sandstorms regularly devastate Northeast Asia and cause considerable damage to transportation systems and public health; further, these events are considered to be among the important indices for estimating global warming and desertification. Previously, yellow sand events were considered natural phenomena that originate in deserts and arid areas. However, the greater scale and frequency of these events in recent years are considered to be the result of human activities such as overgrazing and over-cultivation. Japan, Korea, China, and Mongolia are directly concerned with preventing and controlling these storms and have been able, to some extent, to provide forecasts and early warnings. In this framework, to improve the accuracy of forecasting, a compact and rugged eye-safe lidar, the EZ LIDAR™, developed jointly by the Laboratoire des Sciences du Climat et l'Environnement (LSCE) (CEA-CNRS) and LEOSPHERE (France) to study and investigate the structural and optical properties of clouds and aerosols, drawing on the strong know-how of CEA and CNRS in the field of air quality measurements and cloud observation and analysis, was deployed in Seoul, Korea in order to detect and study yellow sand events, thanks to its depolarization channel and scan capabilities. The preliminary results of this measurement campaign, shown in this paper, demonstrate that the EZ Lidar, with its ability to operate unattended day and night under all atmospheric conditions, is mature enough to be deployed in a global network to study long-range transport, which is crucial for forecasting models.

  8. Establishing relationships with Artificial Neural Networks between geopotential height patterns and heavy rainfall events

    NASA Astrophysics Data System (ADS)

    Michaelides, Silas; Tymvios, Filippos

    2010-05-01

    Dynamically induced rainfall is strongly connected with synoptic atmospheric circulation patterns at the upper levels. This study investigates the relationship between days of high precipitation volume events in the eastern Mediterranean and the associated geopotential height patterns at 500 hPa. To reduce the number of different patterns and to simplify the statistical processing, the input days were classified into clusters of synoptic cases with similar characteristics, by utilizing Kohonen's Self-Organizing Map (SOM) architecture. Using this architecture, synoptic patterns were grouped into 9, 18, 27 and 36 clusters, which were subsequently used in the analysis. The classification performance was tested by applying the method to extreme rainfall events in the eastern Mediterranean. The relationship between the synoptic upper-air patterns (500 hPa geopotential height) and surface features (heavy rainfall events) was established. The 36-member classification proved to be the most efficient system.
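
    As a rough sketch of the SOM-based clustering described above (not the authors' implementation), the code below trains a small self-organizing map on flattened geopotential-height fields and assigns each day to its best-matching unit; the grid size, learning schedule, and random data are assumptions.

      # Minimal self-organizing map sketch for grouping synoptic patterns (illustrative).
      import numpy as np

      def train_som(fields, rows=3, cols=3, epochs=200, lr0=0.5, sigma0=1.5, seed=0):
          rng = np.random.default_rng(seed)
          n, d = fields.shape
          weights = rng.normal(size=(rows, cols, d))
          grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
          for e in range(epochs):
              lr = lr0 * (1 - e / epochs)                  # decaying learning rate
              sigma = sigma0 * (1 - e / epochs) + 1e-3     # shrinking neighbourhood
              x = fields[rng.integers(n)]
              bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), (rows, cols))
              dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
              h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
              weights += lr * h * (x - weights)
          return weights

      def classify(fields, weights):
          flat = weights.reshape(-1, weights.shape[-1])
          return np.argmin(((fields[:, None, :] - flat[None]) ** 2).sum(axis=2), axis=1)

      fields = np.random.default_rng(1).normal(size=(50, 16))   # 50 days, flattened 4x4 grid
      clusters = classify(fields, train_som(fields))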

  9. Characterization and Analysis of Networked Array of Sensors for Event Detection (CANARY-EDS)

    Energy Science and Technology Software Center (ESTSC)

    2011-05-27

    CANARY-EDS provides probabilistic event detection based on analysis of time-series data from water quality or other sensors. CANARY can also compare patterns against a library of previously seen data to indicate that a certain pattern has reoccurred, suppressing what would otherwise be considered an event. CANARY can be configured to analyze previously recorded data from files or databases, or it can be configured to run in real-time mode directly from a database or through the US EPA EDDIES software.
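
    The idea of flagging events in sensor time series can be illustrated with a deliberately simplified residual-threshold sketch over a moving window; this is only an illustration of the concept, not the CANARY-EDS algorithm, and the window length, threshold, and synthetic data are assumptions.

      # Very simplified time-series event flag (concept illustration only).
      import numpy as np

      def flag_events(series, window=10, n_sigma=3.0):
          series = np.asarray(series, dtype=float)
          flags = np.zeros(len(series), dtype=bool)
          for i in range(window, len(series)):
              hist = series[i - window:i]
              resid = abs(series[i] - hist.mean())
              flags[i] = resid > n_sigma * (hist.std() + 1e-9)
          return flags

      chlorine = np.concatenate([np.random.default_rng(2).normal(1.0, 0.02, 40),
                                 [0.40, 0.38, 0.41]])
      print(np.where(flag_events(chlorine))[0])    # indices flagged as anomalous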

  10. Social Network Changes and Life Events across the Life Span: A Meta-Analysis

    ERIC Educational Resources Information Center

    Wrzus, Cornelia; Hanel, Martha; Wagner, Jenny; Neyer, Franz J.

    2013-01-01

    For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-analysis on age-related social network…

  11. Age differences in the Attention Network Test: Evidence from behavior and event-related potentials.

    PubMed

    Williams, Ryan S; Biel, Anna Lena; Wegier, Pete; Lapp, Leann K; Dyson, Benjamin J; Spaniol, Julia

    2016-02-01

    The Attention Network Test (ANT) is widely used to capture group and individual differences in selective attention. Prior behavioral studies with younger and older adults have yielded mixed findings with respect to age differences in three putative attention networks (alerting, orienting, and executive control). To overcome the limitations of behavioral data, the current study combined behavioral and electrophysiological measures. Twenty-four healthy younger adults (aged 18-29 years) and 24 healthy older adults (aged 60-76 years) completed the ANT while EEG data were recorded. Behaviorally, older adults showed reduced alerting, but did not differ from younger adults in orienting or executive control. Electrophysiological components related to alerting and orienting (P1, N1, and CNV) were similar in both age groups, whereas components related to executive control (N2 and P3) showed age-related differences. Together these results suggest that comparisons of network effects between age groups using behavioral data alone may not offer a complete picture of age differences in selective attention, especially for alerting and executive control networks. PMID:26760449

  12. Event-Related fMRI of Category Learning: Differences in Classification and Feedback Networks

    ERIC Educational Resources Information Center

    Little, Deborah M.; Shin, Silvia S.; Sisco, Shannon M.; Thulborn, Keith R.

    2006-01-01

    Eighteen healthy young adults underwent event-related (ER) functional magnetic resonance imaging (fMRI) of the brain while performing a visual category learning task. The specific category learning task required subjects to extract the rules that guide classification of quasi-random patterns of dots into categories. Following each classification…

  13. DEVELOPMENT, EVALUATION AND APPLICATION OF AN AUTOMATED EVENT PRECIPITATION SAMPLER FOR NETWORK OPERATION

    EPA Science Inventory

    In 1993, the University of Michigan Air Quality Laboratory (UMAQL) designed a new wet-only precipitation collection system that was utilized in the Lake Michigan Loading Study. The collection system was designed to collect discrete mercury and trace element samples on an event b...

  14. Space fabrication demonstration system. [beam builder and induction fastening

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The development effort on the composite beam cap fabricator was completed within cost and close to abbreviated goals. The design and analysis of flight-weight primary and secondary beam builder structures proceeded satisfactorily but remains curtailed until further funding is made available to complete the work. The induction fastening effort remains within cost and schedule constraints. Tests of the LARC prototype induction welder are continuing in an instrumented test stand comprising a Dumore drill press (air-over-oil feed for variable applied loads) and a dynamometer to measure actual welding loads. Continued testing shows that the interface screening must be well impregnated with resin to ensure proper flow when bonding graphite-acrylic lap shear samples. Specimens prepared from 0.030-inch-thick graphite-polyethersulfone are also available for future induction fastening evaluation.

  15. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  16. Systematic observations of long-range transport events and climatological backscatter profiles with the DWD ceilometer network

    NASA Astrophysics Data System (ADS)

    Mattis, Ina; Müller, Gerhard; Wagner, Frank; Hervo, Maxime

    2015-04-01

    The German Meteorological Service (DWD) operates a network of about 60 CHM15K-Nimbus ceilometers for cloud base height observations. These very powerful ceilometers allow for the detection and characterization of aerosol layers. Raw data of all network ceilometers are transferred online to DWD's data analysis center at the Hohenpeißenberg Meteorological Observatory. There, the occurrence of aerosol layers from long-range transport events in the free troposphere is systematically monitored on a daily basis for each station. If possible, the origin of the aerosol layers is determined manually from analysis of the meteorological situation and model output. We use backward trajectories as well as the output of the MACC and DREAM models to decide whether the observed layer originated in the Sahara region, from forest fires in North America, or from another, unknown source. Further, the magnitude of the observed layers is qualitatively estimated taking into account the geometrical layer depth, signal intensity, model output and nearby sun photometer or lidar observations (where available). All observed layers are attributed to one of the categories 'faint', 'weak', 'medium', 'strong', or 'extreme'. We started this kind of analysis in August 2013 and plan to continue this systematic documentation of long-range transport events of aerosol layers to Germany on a long-term basis in the framework of our GAW activities. Most of the observed aerosol layers have been advected from the Sahara region to Germany. In the 15 months between August 2013 and November 2014 we observed on average 46 days with Sahara dust layers per station, but only 16 days with aerosol layers from forest fires. The occurrence of Sahara dust layers varies with latitude. We observed only 28 dusty days in the north, close to the coasts of the North Sea and Baltic Sea. In contrast, in southern Germany, in the Bavarian Pre-Alps and in the Black Forest mountains, we observed up to 59 days with dust. At

  17. Constructing a molecular interaction network for thyroid cancer via large-scale text mining of gene and pathway events

    PubMed Central

    2015-01-01

    Background Biomedical studies need assistance from automated tools and easily accessible data to address the problem of the rapidly accumulating literature. Text-mining tools and curated databases have been developed to address such needs and they can be applied to improve the understanding of molecular pathogenesis of complex diseases like thyroid cancer. Results We have developed a system, PWTEES, which extracts pathway interactions from the literature utilizing an existing event extraction tool (TEES) and pathway named entity recognition (PathNER). We then applied the system on a thyroid cancer corpus and systematically extracted molecular interactions involving either genes or pathways. With the extracted information, we constructed a molecular interaction network taking genes and pathways as nodes. Using curated pathway information and network topological analyses, we highlight key genes and pathways involved in thyroid carcinogenesis. Conclusions Mining events involving genes and pathways from the literature and integrating curated pathway knowledge can help improve the understanding of molecular interactions of complex diseases. The system developed for this study can be applied in studies other than thyroid cancer. The source code is freely available online at https://github.com/chengkun-wu/PWTEES. PMID:26679379

  18. Real-time prediction of acute cardiovascular events using hardware-implemented Bayesian networks.

    PubMed

    Tylman, Wojciech; Waszyrowski, Tomasz; Napieralski, Andrzej; Kamiński, Marek; Trafidło, Tamara; Kulesza, Zbigniew; Kotas, Rafał; Marciniak, Paweł; Tomala, Radosław; Wenerski, Maciej

    2016-02-01

    This paper presents a decision support system that aims to estimate a patient's general condition and detect situations which pose an immediate danger to the patient's health or life. The use of this system might be especially important in places such as accident and emergency departments or admission wards, where a small medical team has to take care of many patients in various general conditions. Particular stress is laid on cardiovascular and pulmonary conditions, including those leading to sudden cardiac arrest. The proposed system is a stand-alone microprocessor-based device that works in conjunction with a standard vital signs monitor, which provides input signals such as temperature, blood pressure, pulse oximetry, ECG, and ICG. The signals are preprocessed and analysed by a set of artificial intelligence algorithms, the core of which is based on Bayesian networks. The paper focuses on the construction and evaluation of the Bayesian network, both its structure and numerical specification. PMID:26456181
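
    As a hedged illustration of the kind of inference such a system performs, the hand-rolled sketch below reduces the problem to a three-node binary Bayesian network (a hidden deterioration risk with two observable vital-sign symptoms) with invented conditional probability tables; the actual network is far larger and clinically specified.

        # Structure: Risk -> Tachycardia, Risk -> LowBP (all binary).
        # The CPT numbers are illustrative, not clinical values.
        p_risk = {True: 0.05, False: 0.95}
        p_tachy_given_risk = {True: 0.85, False: 0.15}   # P(tachycardia | risk)
        p_lowbp_given_risk = {True: 0.70, False: 0.10}   # P(low blood pressure | risk)

        def joint(risk, tachy, lowbp):
            pt = p_tachy_given_risk[risk] if tachy else 1 - p_tachy_given_risk[risk]
            pb = p_lowbp_given_risk[risk] if lowbp else 1 - p_lowbp_given_risk[risk]
            return p_risk[risk] * pt * pb

        def posterior_risk(tachy, lowbp):
            # Exact inference by enumeration over the single hidden variable.
            num = joint(True, tachy, lowbp)
            den = sum(joint(r, tachy, lowbp) for r in (True, False))
            return num / den

        # Monitor reports tachycardia, with and without low blood pressure:
        print("P(risk | HR high, BP low) = %.3f" % posterior_risk(True, True))
        print("P(risk | HR high, BP ok)  = %.3f" % posterior_risk(True, False))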

  19. Evaluation of the U.S. Department of Energy Challenge Home Program Certification of Production Builders

    SciTech Connect

    Kerrigan, P.; Loomis, H.

    2014-09-01

    The purpose of this project was to evaluate integrated packages of advanced measures in individual test homes to assess their performance with respect to Building America program goals, specifically compliance with the DOE Challenge Home Program. BSC consulted on the construction of five test houses by three cold climate production builders in three U.S. cities and worked with the builders to develop a design package tailored to the cost-related impacts for each builder. Also, BSC provided support through performance testing of the five test homes. Overall, the builders have concluded that the energy related upgrades (either through the prescriptive or performance path) represent reasonable upgrades. The builders commented that while not every improvement in specification was cost effective (as in a reasonable payback period), many were improvements that could improve the marketability of the homes and serve to attract more energy efficiency discerning prospective homeowners. However, the builders did express reservations on the associated checklists and added certifications. An increase in administrative time was observed with all builders. The checklists and certifications also inherently increase cost due to: adding services to the scope of work for various trades, such as HERS Rater, HVAC contractor; and increased material costs related to the checklists, especially the EPA Indoor airPLUS and EPA WaterSense® Efficient Hot Water Distribution requirement.

  20. Application of the EVEX resource to event extraction and network construction: Shared Task entry and result analysis

    PubMed Central

    2015-01-01

    Background Modern methods for mining biomolecular interactions from literature typically make predictions based solely on the immediate textual context, in effect a single sentence. No prior work has been published on extending this context to the information automatically gathered from the whole biomedical literature. Thus, our motivation for this study is to explore whether mutually supporting evidence, aggregated across several documents can be utilized to improve the performance of the state-of-the-art event extraction systems. In this paper, we describe our participation in the latest BioNLP Shared Task using the large-scale text mining resource EVEX. We participated in the Genia Event Extraction (GE) and Gene Regulation Network (GRN) tasks with two separate systems. In the GE task, we implemented a re-ranking approach to improve the precision of an existing event extraction system, incorporating features from the EVEX resource. In the GRN task, our system relied solely on the EVEX resource and utilized a rule-based conversion algorithm between the EVEX and GRN formats. Results In the GE task, our re-ranking approach led to a modest performance increase and resulted in the first rank of the official Shared Task results with 50.97% F-score. Additionally, in this paper we explore and evaluate the usage of distributed vector representations for this challenge. In the GRN task, we ranked fifth in the official results with a strict/relaxed SER score of 0.92/0.81 respectively. To try and improve upon these results, we have implemented a novel machine learning based conversion system and benchmarked its performance against the original rule-based system. Conclusions For the GRN task, we were able to produce a gene regulatory network from the EVEX data, warranting the use of such generic large-scale text mining data in network biology settings. A detailed performance and error analysis provides more insight into the relatively low recall rates. In the GE task we

  1. Network Systems Technician.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 17 subjects appropriate for use in a competency list for the occupation of network systems technician, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 17 units are as follows:…

  2. Response control networks are selectively modulated by attention to rare events and memory load regardless of the need for inhibition.

    PubMed

    Wijeakumar, Sobanawartiny; Magnotta, Vincent A; Buss, Aaron T; Ambrose, Joseph P; Wifall, Timothy A; Hazeltine, Eliot; Spencer, John P

    2015-10-15

    Recent evidence has sparked debate about the neural bases of response selection and inhibition. In the current study, we employed two reactive inhibition tasks, the Go/Nogo (GnG) and Simon tasks, to examine questions central to these debates. First, we investigated whether a fronto-cortical-striatal system was sensitive to the need for inhibition per se or the presentation of infrequent stimuli, by manipulating the proportion of trials that do not require inhibition (Go/Compatible trials) relative to trials that require inhibition (Nogo/Incompatible trials). A cortico-subcortical network composed of insula, putamen, and thalamus showed greater activation on salient and infrequent events, regardless of the need for inhibition. Thus, consistent with recent findings, key parts of the fronto-cortical-striatal system are engaged by salient events and do not appear to play a selective role in response inhibition. Second, we examined how the fronto-cortical-striatal system is modulated by working memory demands by varying the number of stimulus-response (SR) mappings. Right inferior parietal lobule showed decreasing activation as the number of SR mappings increased, suggesting that a form of associative memory - rather than working memory - might underlie performance in these tasks. A broad motor planning and control network showed similar trends that were also modulated by the number of motor responses required in each task. Finally, bilateral lingual gyri were more robustly engaged in the Simon task, consistent with the role of this area in shifts of visuo-spatial attention. The current study sheds light on how the fronto-cortical-striatal network is selectively engaged in reactive control tasks and how control is modulated by manipulations of attention and memory load. PMID:26190403

  3. Differential Network Analyses of Alzheimer’s Disease Identify Early Events in Alzheimer’s Disease Pathology

    DOE PAGESBeta

    Xia, Jing; Rocke, David M.; Perry, George; Ray, Monika

    2014-01-01

    In late-onset Alzheimer’s disease (AD), multiple brain regions are not affected simultaneously. Comparing the gene expression of the affected regions to identify the differences in the biological processes perturbed can lead to greater insight into AD pathogenesis and early characteristics. We identified differentially expressed (DE) genes from single cell microarray data of four AD affected brain regions: entorhinal cortex (EC), hippocampus (HIP), posterior cingulate cortex (PCC), and middle temporal gyrus (MTG). We organized the DE genes in the four brain regions into region-specific gene coexpression networks. Differential neighborhood analyses in the coexpression networks were performed to identify genes with low topological overlap (TO) of their direct neighbors. The low TO genes were used to characterize the biological differences between two regions. Our analyses show that increased oxidative stress, along with alterations in lipid metabolism in neurons, may be some of the very early events occurring in AD pathology. Cellular defense mechanisms try to intervene but fail, finally resulting in AD pathology as the disease progresses. Furthermore, disease annotation of the low TO genes in two independent protein interaction networks has resulted in association between cancer, diabetes, renal diseases, and cardiovascular diseases.
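
    A rough sketch of the differential neighborhood idea: for each gene, compare its direct neighbors in two region-specific coexpression networks and flag genes whose neighborhoods barely overlap. The toy networks, gene names and the Jaccard-style overlap used below are illustrative stand-ins for the authors' topological overlap computation.

        import networkx as nx

        # Toy region-specific coexpression networks (e.g. EC vs HIP); edges would
        # normally be derived from correlation of expression profiles.
        ec = nx.Graph([("APP", "SOD1"), ("APP", "MAPT"), ("SOD1", "GPX1"), ("APP", "LPL")])
        hip = nx.Graph([("APP", "MAPT"), ("APP", "PSEN1"), ("MAPT", "PSEN1"), ("LPL", "GPX1")])

        def neighbor_overlap(g1, g2, gene):
            n1 = set(g1.neighbors(gene)) if gene in g1 else set()
            n2 = set(g2.neighbors(gene)) if gene in g2 else set()
            union = n1 | n2
            return len(n1 & n2) / len(union) if union else 0.0

        for gene in sorted(set(ec) | set(hip)):
            score = neighbor_overlap(ec, hip, gene)
            flag = "  <- low overlap, region-specific wiring" if score < 0.5 else ""
            print(f"{gene:6s} overlap = {score:.2f}{flag}")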

  4. Stochastic switching in gene networks can occur by a single-molecule event or many molecular steps.

    PubMed

    Choi, Paul J; Xie, X Sunney; Shakhnovich, Eugene I

    2010-02-12

    Due to regulatory feedback, biological networks can exist stably in multiple states, leading to heterogeneous phenotypes among genetically identical cells. Random fluctuations in protein numbers, tuned by specific molecular mechanisms, have been hypothesized to drive transitions between these different states. We develop a minimal theoretical framework to analyze the limits of switching in terms of simple experimental parameters. Our model identifies and distinguishes between two distinct molecular mechanisms for generating stochastic switches. In one class of switches, the stochasticity of a single-molecule event, a specific and rare molecular reaction, directly controls the macroscopic change in a cell's state. In the second class, no individual molecular event is significant, and stochasticity arises from the propagation of biochemical noise through many molecular pathways and steps. As an example, we explore switches based on protein-DNA binding fluctuations and predict relations between transcription factor kinetics, absolute switching rate, robustness, and efficiency that differentiate between switching by single-molecule events or many molecular steps. Finally, we apply our methods to recent experimental data on switching in Escherichia coli lactose metabolism, providing quantitative interpretations of a single-molecule switching mechanism. PMID:19931280
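
    The simulation sketch below contrasts the two switching classes described in the abstract: a switch triggered by a single rare molecular event has exponentially distributed waiting times (coefficient of variation near 1), whereas a switch requiring many sequential stochastic steps has a much narrower, gamma-like distribution. The rates and step counts are arbitrary illustrative values, not fits to the lactose-metabolism data.

        import numpy as np

        rng = np.random.default_rng(2)
        n_cells = 100000

        # Class 1: a single rare event (e.g. complete repressor dissociation)
        # at rate k triggers the switch -> exponential waiting times.
        k = 1.0 / 50.0                       # per unit time, illustrative
        t_single = rng.exponential(1.0 / k, size=n_cells)

        # Class 2: the switch requires m sequential noisy steps, each at rate m*k so the
        # mean switching time is the same -> Erlang/gamma waiting times, much less variable.
        m = 20
        t_multi = rng.gamma(shape=m, scale=1.0 / (m * k), size=n_cells)

        for name, t in (("single-molecule event", t_single), ("many molecular steps", t_multi)):
            cv = t.std() / t.mean()          # coefficient of variation distinguishes the classes
            print(f"{name:22s} mean={t.mean():6.1f}  CV={cv:.2f}")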

  5. PAnnBuilder: an R package for assembling proteomic annotation data.

    PubMed

    Li, Hong; Ding, Guohui; Xie, Lu; Li, Yixue

    2009-04-15

    PAnnBuilder is an R package to automatically assemble protein annotation information from public resources to provide uniform annotation data for large-scale proteomic studies. Sixteen public databases have been parsed and 54 annotation packages have been constructed based on R environment or SQLite database. These ready-to-use packages cover most frequently needed protein annotation for three model species including human, mouse and rat. Several extended applications such as annotation based on protein sequence similarity are also provided. Sophisticated users can develop their own packages using PAnnBuilder. PAnnBuilder may become an important tool for proteomic research. PMID:19237448

  6. Design of simulation builder software to support the enterprise modeling and simulation task of the AMTEX program

    SciTech Connect

    Nolan, M.; Lamont, A.; Chang, L.

    1995-12-12

    This document describes the implementation of the Simulation Builder developed as part of the Enterprise Modeling and Simulation (EM&S) portion of the Demand Activated Manufacturing Architecture (DAMA) project. The Simulation Builder software allows users to develop simulation models using pre-defined modules from a library. The Simulation Builder provides the machinery to allow the modules to link together and communicate information during the simulation run. This report describes the basic capabilities and structure of the Simulation Builder to assist a user in reviewing and using the code. It also describes the basic steps to follow when developing modules to take advantage of the capabilities provided by the Simulation Builder. The Simulation Builder software is written in C++. The discussion in this report assumes a sound understanding of the C++ language. Although this report describes the steps to follow when using the Simulation Builder, it is not intended to be a tutorial for a user unfamiliar with C++.

  7. Uncertainty of precipitation estimates in convective events by the Meteorological Service of Catalonia radar network

    NASA Astrophysics Data System (ADS)

    Trapero, Laura; Bech, Joan; Rigo, Tomeu; Pineda, Nicolau; Forcadell, David

    In order to quantify the uncertainty of the radar-derived surface point quantitative precipitation estimates (QPE) from a regional radar network, a comparison has been made with a network of rain gauges. Three C-band Doppler radars and 161 telemetered gauges have been used. Both networks cover the area of Catalonia (NE Spain). Hourly accumulations integrated into daily amounts are studied. For each radar, three different precipitation products are obtained: short range, long range, and short-range corrected radar QPE. The corrected product is generated by the Hydrometeorological Integrated Forecasting Tool (EHIMI), a software package designed to correct radar observations in real time for use in hydrometeorological applications. Among other features, EHIMI includes a topographical beam blockage correction procedure. The first part of the analysis examines the bias found in the radar. The three radars generally underestimate precipitation, an effect that increases with range from the radar and with beam blockage, which is examined in detail in this study. Moreover, corrected QPEs systematically improve the bias (by 2 dB) and the RMSF for high blockages (50-70%). The second part of the analysis illustrates the temporal evolution of the daily mean bias. Finally, the uncertainty of each rain gauge has been compared to each rainfall radar product. The geographic distribution of daily bias is consistent, with slight under-estimation at short range and substantial under-estimation at long range, especially in the north of Catalonia, which is an area with important beam blockage (> 40%). These results contribute to improved knowledge of the spatial distribution of the QPE error, benefiting a number of applications including verification of high-resolution NWP precipitation forecasts and use of advanced hydrometeorological models.
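
    A minimal sketch of the radar-gauge comparison metrics mentioned above, computing a mean bias in dB and a root-mean-square factor from paired daily accumulations; the paired values are invented and the exact metric definitions used in the study may differ.

        import numpy as np

        # Paired daily accumulations (mm) at collocated gauge/radar points; illustrative values.
        gauge = np.array([12.0, 5.5, 30.2, 8.1, 22.4, 3.3])
        radar = np.array([9.1, 4.0, 18.7, 7.5, 15.0, 2.1])   # typical radar underestimation

        ratio_db = 10.0 * np.log10(radar / gauge)             # per-pair bias in dB
        bias_db = ratio_db.mean()                             # mean bias (negative = underestimation)
        rmsf_db = np.sqrt((ratio_db ** 2).mean())             # root-mean-square factor, in dB

        print(f"mean bias = {bias_db:+.2f} dB, RMSF = {rmsf_db:.2f} dB")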

  8. Using neural networks as an event trigger in elementary particle physics experiments

    SciTech Connect

    Neis, E.; Starr, F.W.; Handler, T.; Gabriel, T.; Glover, C.; Saini, S.

    1994-02-01

    Elementary particle physics experiments often have to deal with high data rates. In order to avoid having to write out all of the data, online processors known as triggers are used to cull out the uninteresting data. These triggers are based on some particular aspect of the physics being examined, and these aspects are often equivalent to simple pattern-recognition problems. The reliability of artificial neural networks (ANNs) in pattern-recognition problems in many fields has been well demonstrated. We present here the results of a study on the feasibility of using ANNs as an online trigger for high energy physics experiments.
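
    A hedged sketch of the idea: train a small feed-forward network on labelled example events and keep only events whose output exceeds a cut. The three summary features, the labels and the scikit-learn model are invented for illustration and stand in for whatever online implementation a real trigger would require.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(3)

        # Synthetic training sample: 3 summary features per event
        # (e.g. energy sums, multiplicities); label 1 = "interesting", 0 = background.
        n = 5000
        background = rng.normal(0.0, 1.0, size=(n, 3))
        signal = rng.normal(1.5, 1.0, size=(n, 3))
        X = np.vstack([background, signal])
        y = np.array([0] * n + [1] * n)

        net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
        net.fit(X, y)

        # Trigger decision on a batch of new events: keep those above the cut.
        new_events = rng.normal(0.5, 1.2, size=(10, 3))
        scores = net.predict_proba(new_events)[:, 1]
        keep = scores > 0.9                     # trigger threshold tuned for background rejection
        print("trigger accepts events:", np.where(keep)[0].tolist())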

  9. Automatic detection of epileptiform events in EEG by a three-stage procedure based on artificial neural networks.

    PubMed

    Acir, Nurettin; Oztura, Ibrahim; Kuntalp, Mehmet; Baklan, Bariş; Güzeliş, Cüneyt

    2005-01-01

    This paper introduces a three-stage procedure based on artificial neural networks for the automatic detection of epileptiform events (EVs) in a multichannel electroencephalogram (EEG) signal. In the first stage, two discrete perceptrons fed by six features are used to classify EEG peaks into three subgroups: 1) definite epileptiform transients (ETs); 2) definite non-ETs; and 3) possible ETs and possible non-ETs. The pre-classification done in the first stage not only reduces the computation time but also increases the overall detection performance of the procedure. In the second stage, the peaks falling into the third group are aimed to be separated from each other by a nonlinear artificial neural network that would function as a postclassifier whose input is a vector of 41 consecutive sample values obtained from each peak. Different networks, i.e., a backpropagation multilayer perceptron and two radial basis function networks trained by a hybrid method and a support vector method, respectively, are constructed as the postclassifier and then compared in terms of their classification performances. In the third stage, multichannel information is integrated into the system for contributing to the process of identifying an EV by the electroencephalographers (EEGers). After the integration of multichannel information, the overall performance of the system is determined with respect to EVs. Visual evaluation, by two EEGers, of 19 channel EEG records of 10 epileptic patients showed that the best performance is obtained with a radial basis support vector machine providing an average sensitivity of 89.1%, an average selectivity of 85.9%, and a false detection rate (per hour) of 7.5. PMID:15651562
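
    The staged decision logic can be sketched as follows: a cheap linear pre-classifier routes only ambiguous peaks to a more expensive nonlinear classifier, and a final multichannel check declares an epileptiform event. The synthetic features, thresholds and scikit-learn models below are assumptions for illustration; the original system uses discrete perceptrons and a radial basis support vector machine trained on expert-labelled EEG peaks.

        import numpy as np
        from sklearn.linear_model import Perceptron
        from sklearn.svm import SVC

        rng = np.random.default_rng(4)

        # Synthetic peak data: stage 1 sees 6 features, stage 2 sees 41 samples around the peak.
        n = 2000
        X6 = rng.normal(size=(n, 6))
        X41 = rng.normal(size=(n, 41))
        y = (X6[:, 0] + 0.5 * X41[:, 0] + 0.3 * rng.standard_normal(n) > 0).astype(int)

        stage1 = Perceptron().fit(X6, y)                          # cheap linear pre-classifier
        stage2 = SVC(kernel="rbf", probability=True).fit(X41, y)  # nonlinear post-classifier

        def classify_peak(x6, x41):
            margin = stage1.decision_function(x6.reshape(1, -1))[0]
            if margin > 1.0:   return 1        # definite epileptiform transient
            if margin < -1.0:  return 0        # definite non-transient
            # ambiguous peaks only: fall through to the expensive classifier
            return int(stage2.predict_proba(x41.reshape(1, -1))[0, 1] > 0.5)

        def multichannel_event(per_channel_decisions, min_channels=2):
            # Stage 3: require agreement across channels before flagging an event.
            return sum(per_channel_decisions) >= min_channels

        decisions = [classify_peak(X6[i], X41[i]) for i in range(19)]   # one peak per channel
        print("epileptiform event:", multichannel_event(decisions))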

  10. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  11. Multistate Model Builder (MSMB): a flexible editor for compact biochemical models

    PubMed Central

    2014-01-01

    Background Building models of molecular regulatory networks is challenging not just because of the intrinsic difficulty of describing complex biological processes. Writing a model is a creative effort that calls for more flexibility and interactive support than offered by many of today’s biochemical model editors. Our model editor MSMB — Multistate Model Builder — supports multistate models created using different modeling styles. Results MSMB provides two separate advances on existing network model editors. (1) A simple but powerful syntax is used to describe multistate species. This reduces the number of reactions needed to represent certain molecular systems, thereby reducing the complexity of model creation. (2) Extensive feedback is given during all stages of the model creation process on the existing state of the model. Users may activate error notifications of varying stringency on the fly, and use these messages as a guide toward a consistent, syntactically correct model. MSMB default values and behavior during model manipulation (e.g., when renaming or deleting an element) can be adapted to suit the modeler, thus supporting creativity rather than interfering with it. MSMB’s internal model representation allows saving a model with errors and inconsistencies (e.g., an undefined function argument; a syntactically malformed reaction). A consistent model can be exported to SBML or COPASI formats. We show the effectiveness of MSMB’s multistate syntax through models of the cell cycle and mRNA transcription. Conclusions Using multistate reactions reduces the number of reactions needed to encode many biochemical network models. This reduces the cognitive load for a given model, thereby making it easier for modelers to build more complex models. The many interactive editing support features provided by MSMB make it easier for modelers to create syntactically valid models, thus speeding model creation. Complete information and the installation package can be

  12. Small-World Synchronized Computing Networks for Scalable Parallel Discrete-Event Simulations

    NASA Astrophysics Data System (ADS)

    Guclu, Hasan; Korniss, Gyorgy; Toroczkai, Zoltan; Novotny, Mark A.

    We study the scalability of parallel discrete-event simulations for arbitrary short-range interacting systems with asynchronous dynamics. When the synchronization topology mimics that of the short-range interacting underlying system, the virtual time horizon (corresponding to the progress of the processing elements) exhibits Kardar-Parisi-Zhang-like kinetic roughening. Although the virtual times, on average, progress at a nonzero rate, their statistical spread diverges with the number of processing elements, hindering efficient data collection. We show that when the synchronization topology is extended to include quenched random communication links between the processing elements, they make a close-to-uniform progress with a nonzero rate, without global synchronization. We discuss in detail a coarse-grained description for the small-world synchronized virtual time horizon and compare the findings to those obtained by simulating the simulations based on the exact algorithmic rules.
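
    A small simulation sketch of the conservative update rule underlying these results: each processing element advances its local virtual time only when it is not ahead of its synchronization neighbours, and the width of the virtual time horizon is compared with and without quenched random small-world links. The lattice size, step count and unit-mean exponential increments are illustrative choices.

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate(n_pe=1000, steps=2000, random_links=False):
            tau = np.zeros(n_pe)                                   # local virtual times
            partner = rng.integers(n_pe, size=n_pe)                # quenched random link per PE
            for _ in range(steps):
                ok = (tau <= np.roll(tau, 1)) & (tau <= np.roll(tau, -1))  # ring neighbours
                if random_links:
                    ok &= tau <= tau[partner]                      # extra small-world check
                tau = tau + ok * rng.exponential(1.0, size=n_pe)   # only local minima advance
            return tau

        for label, rl in (("ring only", False), ("ring + random links", True)):
            tau = simulate(random_links=rl)
            width = tau.std()                          # spread of the virtual time horizon
            print(f"{label:20s} mean progress = {tau.mean():7.1f}, width = {width:6.1f}")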

  13. A twenty-first century California observing network for monitoring extreme weather events

    USGS Publications Warehouse

    White, A.B.; Anderson, M.L.; Dettinger, M.D.; Ralph, F.M.; Hinojosa, A.; Cayan, D.R.; Hartman, R.K.; Reynolds, D.W.; Johnson, L.E.; Schneider, T.L.; Cifelli, R.; Toth, Z.; Gutman, S.I.; King, C.W.; Gehrke, F.; Johnston, P.E.; Walls, C.; Mann, Dorte; Gottas, D.J.; Coleman, T.

    2013-01-01

    During Northern Hemisphere winters, the West Coast of North America is battered by extratropical storms. The impact of these storms is of paramount concern to California, where aging water supply and flood protection infrastructures are challenged by increased standards for urban flood protection, an unusually variable weather regime, and projections of climate change. Additionally, there are inherent conflicts between releasing water to provide flood protection and storing water to meet requirements for water supply, water quality, hydropower generation, water temperature and flow for at-risk species, and recreation. In order to improve reservoir management and meet the increasing demands on water, improved forecasts of precipitation, especially during extreme events, are required. Here we describe how California is addressing its most important and costliest environmental issue – water management – in part by installing a state-of-the-art observing system to better track the area’s most severe wintertime storms.

  14. Best Practices Case Study: Devoted Builders, LLC, Mediterranean Villas, Pasco, WA

    SciTech Connect

    2010-12-01

    Devoted Builders of Kennewick, WA worked with Building America's BIRA team to achieve the 50% Federal tax credit level energy savings on 81 homes at its Mediterranean Villas community in eastern Washington.

  15. Building with passive solar: an application guide for the southern homeowner and builder

    SciTech Connect

    1981-03-01

    This instructional material was prepared for training workshops for builders and home designers. It includes: fundamental definitions and equations, climate and site studies, building components, passive systems and techniques, and design tools. (MHR)

  16. 77 FR 28411 - Adrenalina, Affinity Technology Group, Inc., Braintech, Inc., Builders Transport, Incorporated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... From the Federal Register Online via the Government Publishing Office SECURITIES AND EXCHANGE COMMISSION Adrenalina, Affinity Technology Group, Inc., Braintech, Inc., Builders Transport, Incorporated... Adrenalina because it has not filed any periodic reports since the period ended September 30, 2008....

  17. CarbBuilder: Software for building molecular models of complex oligo- and polysaccharide structures.

    PubMed

    Kuttel, Michelle M; Ståhle, Jonas; Widmalm, Göran

    2016-08-15

    CarbBuilder is a portable software tool for producing three-dimensional molecular models of carbohydrates from the simple text specification of a primary structure. CarbBuilder can generate a wide variety of carbohydrate structures, ranging from monosaccharides to large, branched polysaccharides. Version 2.0 of the software, described in this article, supports monosaccharides of both mammalian and bacterial origin and a range of substituents for derivatization of individual sugar residues. This improved version has a sophisticated building algorithm to explore the range of possible conformations for a specified carbohydrate molecule. Illustrative examples of models of complex polysaccharides produced by CarbBuilder demonstrate the capabilities of the software. CarbBuilder is freely available under the Artistic License 2.0 from https://people.cs.uct.ac.za/~mkuttel/Downloads.html. © 2016 Wiley Periodicals, Inc. PMID:27317625

  18. DOE Zero Energy Ready Home Case Study: Preferred Builders, Old Greenwich, Connecticut

    SciTech Connect

    none,

    2013-04-01

    The first Challenge Home built in New England features cool-roof shingles, HERS 20–42, and walls densely packed with blown fiberglass. This house won a 2013 Housing Innovation Award in the custom builder category.

  19. Building America Top Innovations 2012: DOE Challenge Home (Formerly Builders Challenge)

    SciTech Connect

    none,

    2013-01-01

    This Building America Top Innovations profile describes DOE’s Builders Challenge. Hundreds of leading-edge builders across the country signed on to the Challenge and more than 14,000 homes earned the label, saving homeowners over $10 million a year in utility bills. DOE’s new program, the DOE Challenge Home, increases the rigor of the guidelines including requiring homes to be Zero Net-Energy Ready.

  20. Discrete-event-dynamic-system-based approaches for scheduling transmissions in multihop packet radio networks

    NASA Astrophysics Data System (ADS)

    Cassandras, Christos G.

    In the classic transmission scheduling problem, the nodes of a packet radio network (PRN) broadcast fixed-length packets over a common resource (the channel). Packet transmissions are subject to interference constraints; for example, if a node is transmitting a packet, then all adjacent (neighboring) nodes must refrain from transmission. One then adopts a slotted time model where every slot is allocated to a set of nodes which can simultaneously transmit without conflict. Thus, a node generally belongs to one or more of these sets (called transmission sets). Our approach is based on viewing the transmission scheduling problem as a single-server multiclass polling problem with simultaneous resource possession. Here, a class corresponds to a transmission set. The server corresponds to a channel operating with deterministic service times: a service time is equal to one time slot required for transmitting a packet. The scheduling problem is then equivalent to assigning the server (equivalently, each time slot) to a particular transmission set. The simultaneous resource possession feature arises because the server is assigned to a transmission set, i.e. it can simultaneously provide service to packets from all nodes which belong to that set. The construction of the transmission sets is dependent upon the topology and connectivity of the PRN and is equivalent to a graph partitioning problem. For our purposes, we assume M transmission sets have been specified. Finally, we allow for overlapping transmission sets, i.e. a node can belong to two or more different transmission sets.
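
    A toy sketch of the slot-assignment view: given a fixed family of conflict-free transmission sets, each slot is granted to the set currently serving the largest backlog. The example network, transmission sets and arrival rate are made up, and the discrete-event-dynamic-system machinery of the paper is not reproduced.

        import random

        random.seed(6)

        # M pre-computed transmission sets (groups of nodes that can transmit together).
        transmission_sets = [("A", "C", "E"), ("B", "D"), ("A", "D"), ("C", "F")]
        queues = {n: 0 for n in "ABCDEF"}              # packets waiting at each node

        for slot in range(20):
            # Packet arrivals (Bernoulli per node per slot, illustrative rate).
            for n in queues:
                queues[n] += random.random() < 0.4
            # Assign the slot to the transmission set with the most backlogged nodes.
            best = max(transmission_sets, key=lambda s: sum(min(queues[n], 1) for n in s))
            served = [n for n in best if queues[n] > 0]
            for n in served:
                queues[n] -= 1                         # one packet transmitted per node per slot
            print(f"slot {slot:2d}: set {best} served {served}")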

  1. Modeling the Energy Performance of Event-Driven Wireless Sensor Network by Using Static Sink and Mobile Sink

    PubMed Central

    Chen, Jiehui; Salim, Mariam B.; Matsumoto, Mitsuji

    2010-01-01

    Wireless Sensor Networks (WSNs) designed for mission-critical applications suffer from limited sensing capacities, particularly fast energy depletion. Mobile sinks can be used to balance the energy consumption in WSNs, but the frequent location updates of the mobile sinks can lead to data collisions and rapid energy consumption for some specific sensors. This paper explores an optimal barrier-coverage-based sensor deployment for event-driven WSNs, where a dual-sink model was designed to evaluate the energy performance not only of the static sensors but also of the Static Sink (SS) and Mobile Sinks (MSs) simultaneously, based on parameters such as the sensor transmission range r and the velocity of the mobile sink v. Moreover, an MS mobility model was developed to enable the SS and MSs to collaborate effectively while achieving spatiotemporal energy-performance efficiency, using the cumulative density function (cdf), a Poisson process and an M/G/1 queue. The simulation results clearly demonstrated the improved energy performance of the whole network; our eDSA algorithm is more efficient than the static-sink model, reducing energy consumption by approximately half. Moreover, we demonstrate that our results are robust to realistic sensing models and validate the correctness of our results through extensive simulations. PMID:22163503
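
    As a rough illustration of why a mobile sink helps, the sketch below uses a standard first-order radio energy model (transmission energy grows with the square of the sender-to-sink distance) to compare a fixed corner sink with a sink moving along a central track; the geometry, radio constants and trajectory are assumptions and do not reproduce the paper's dual-sink model or the eDSA algorithm.

        import numpy as np

        rng = np.random.default_rng(7)

        # 200 sensors scattered in a 100 m x 100 m field.
        sensors = rng.uniform(0, 100, size=(200, 2))

        # First-order radio model: E = E_elec*k + eps_amp*k*d^2 per k-bit packet.
        E_ELEC, EPS_AMP, K = 50e-9, 100e-12, 4000     # J/bit, J/bit/m^2, bits

        def tx_energy(d):
            return E_ELEC * K + EPS_AMP * K * d ** 2

        static_sink = np.array([0.0, 0.0])            # fixed sink at a corner
        track = np.array([[x, 50.0] for x in np.linspace(0, 100, 21)])  # mobile sink path

        d_static = np.linalg.norm(sensors - static_sink, axis=1)
        d_mobile = np.array([np.linalg.norm(track - s, axis=1).min() for s in sensors])

        e_static = tx_energy(d_static).mean()
        e_mobile = tx_energy(d_mobile).mean()
        print(f"mean per-packet energy: static sink {e_static*1e6:.1f} uJ, "
              f"mobile sink {e_mobile*1e6:.1f} uJ")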

  2. Structuring the Future: Anticipated Life Events, Peer Networks, and Adolescent Sexual Behavior

    PubMed Central

    Soller, Brian; Haynie, Dana L.

    2013-01-01

    While prior research has established associations between individual expectations of future events and risk behavior among adolescents, the potential effects of peers’ future perceptions on risk-taking have been overlooked. We extend prior research by testing whether peers’ anticipation of college completion is associated with adolescent sexual risk-taking. We also examine whether adolescents’ perceptions of the negative consequences of pregnancy and idealized romantic relationship scripts mediate the association between peers’ anticipation of college completion and sexual risk-taking. Results from multivariate regression models with data from the National Longitudinal Study of Adolescent Health (Add Health) indicate peers’ anticipation of college completion is negatively associated with a composite measure of sexual risk-taking and positively associated with the odds of abstaining from sexual intercourse and only engaging in intercourse with a romantic partner (compared to having intercourse with a non-romantic partner). In addition, perceptions of the negative consequences of pregnancy and sexualized relationship scripts appear to mediate a large portion of the association between peers’ anticipation of future success and sexual risk-taking and the likelihood of abstaining (but not engaging in romantic-only intercourse). Results from our study underscore the importance of peers in shaping adolescent sexual behavior. PMID:24223438

  3. MCPB.py: A Python Based Metal Center Parameter Builder.

    PubMed

    Li, Pengfei; Merz, Kenneth M

    2016-04-25

    MCPB.py, a Python-based metal center parameter builder, has been developed to build force fields for the simulation of metal complexes employing the bonded model approach. It has an optimized code structure, with far fewer required steps than the previously developed MCPB program. It supports various AMBER force fields and more than 80 metal ions. A series of parametrization schemes to derive force constants and charge parameters are available within the program. We give two examples (one metalloprotein example and one organometallic compound example), indicating the program's ability to build reliable force fields for different metal ion-containing complexes. The original version was released with AmberTools15. It is provided via the GNU General Public License v3.0 (GNU_GPL_v3) agreement and is free to download and distribute. MCPB.py provides a bridge between quantum mechanical calculations and molecular dynamics simulation software packages, thereby enabling the modeling of metal ion centers. It offers an entry into simulating metal ions in a number of situations by providing an efficient way for researchers to handle the vagaries and difficulties associated with metal ion modeling. PMID:26913476

  4. Librarians as Knowledge Builders: Strategic Partnering for Service and Advocacy

    SciTech Connect

    Kreitz, P

    2003-12-15

    In their article on the challenges facing the postmodern library, authors Elteto and Frank warn that the "relevancy of academic libraries are at stake as a result of dramatic budget reductions and ongoing changes in the use of libraries." Recognizing the fiscal crisis facing libraries, many leaders in the profession are calling for libraries to strengthen their core roles in supporting campus research, teaching, and learning and to become more proactive and effective communicators of the critical role the library plays in supporting institutional goals. Responding to this difficult period facing academia and interested in highlighting the creative ways academic libraries around the country are responding, ACRL President Tyrone Cannon has chosen "Partnerships and Connections: the Learning Community as Knowledge Builders" as the theme for his presidential year. His intention is to foster opportunities for libraries to "play a key role in developing, defining and enhancing learning communities central to campus life." Focusing our efforts on supporting the core business of academia will ensure that academic libraries continue to be places of "opportunity, interaction, serendipity and strong collections and remain central to the knowledge building process."

  5. Network analysis of possible anaphylaxis cases reported to the US vaccine adverse event reporting system after H1N1 influenza vaccine.

    PubMed

    Botsis, Taxiarchis; Ball, Robert

    2011-01-01

    The identification of signals from spontaneous reporting systems plays an important role in monitoring the safety of medical products. Network analysis (NA) allows the representation of complex interactions among the key elements of such systems. We developed a network for a subset of the US Vaccine Adverse Event Reporting System (VAERS) by representing the vaccines/adverse events (AEs) and their interconnections as the nodes and the edges, respectively; this subset we focused upon included possible anaphylaxis reports that were submitted for the H1N1 influenza vaccine. Subsequently, we calculated the main metrics that characterize the connectivity of the nodes and applied the island algorithm to identify the densest region in the network and, thus, identify potential safety signals. AEs associated with anaphylaxis formed a dense region in the 'anaphylaxis' network demonstrating the strength of NA techniques for pattern recognition. Additional validation and development of this approach is needed to improve future pharmacovigilance efforts. PMID:21893812
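
    A minimal networkx sketch of the representation described above: vaccines and adverse-event terms become nodes, co-mention in a report becomes a (weighted) edge, and a dense region is then extracted. The reports are fabricated, and the k-core extraction below is only a simple stand-in for the island algorithm used by the authors.

        import networkx as nx
        from itertools import combinations

        # Fabricated VAERS-style reports: each lists a vaccine and its reported AE terms.
        reports = [
            ("H1N1", ["urticaria", "dyspnoea", "hypotension"]),
            ("H1N1", ["urticaria", "dyspnoea", "angioedema"]),
            ("H1N1", ["dyspnoea", "hypotension", "angioedema"]),
            ("H1N1", ["headache"]),
            ("SEASONAL_FLU", ["headache", "fever"]),
        ]

        G = nx.Graph()
        for vaccine, aes in reports:
            for a, b in combinations([vaccine] + aes, 2):   # connect co-reported terms
                w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
                G.add_edge(a, b, weight=w)

        # Connectivity metrics, then the densest region (k-core as a simple proxy).
        print("degree:", dict(G.degree()))
        dense = nx.k_core(G, k=3)
        print("dense anaphylaxis-like region:", sorted(dense.nodes()))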

  6. HIV-1 subtype F1 epidemiological networks among Italian heterosexual males are associated with introduction events from South America.

    PubMed

    Lai, Alessia; Simonetti, Francesco R; Zehender, Gianguglielmo; De Luca, Andrea; Micheli, Valeria; Meraviglia, Paola; Corsi, Paola; Bagnarelli, Patrizia; Almi, Paolo; Zoncada, Alessia; Paolucci, Stefania; Gonnelli, Angela; Colao, Grazia; Tacconi, Danilo; Franzetti, Marco; Ciccozzi, Massimo; Zazzi, Maurizio; Balotta, Claudia

    2012-01-01

    About 40% of the Italian HIV-1 epidemic due to non-B variants is sustained by F1 clade, which circulates at high prevalence in South America and Eastern Europe. Aim of this study was to define clade F1 origin, population dynamics and epidemiological networks through phylogenetic approaches. We analyzed pol sequences of 343 patients carrying F1 subtype stored in the ARCA database from 1998 to 2009. Citizenship of patients was as follows: 72.6% Italians, 9.3% South Americans and 7.3% Rumanians. Heterosexuals, Homo-bisexuals, Intravenous Drug Users accounted for 58.1%, 24.0% and 8.8% of patients, respectively. Phylogenetic analysis indicated that 70% of sequences clustered in 27 transmission networks. Two distinct groups were identified; the first clade, encompassing 56 sequences, included all Rumanian patients. The second group involved the remaining clusters and included 10 South American Homo-bisexuals in 9 distinct clusters. Heterosexual modality of infection was significantly associated with the probability to be detected in transmission networks. Heterosexuals were prevalent either among Italians (67.2%) or Rumanians (50%); by contrast, Homo-bisexuals accounted for 71.4% of South Americans. Among patients with resistant strains the proportion of clustering sequences was 57.1%, involving 14 clusters (51.8%). Resistance in clusters tended to be higher in South Americans (28.6%) compared to Italian (17.7%) and Rumanian patients (14.3%). A striking proportion of epidemiological networks could be identified in heterosexuals carrying F1 subtype residing in Italy. Italian Heterosexual males predominated within epidemiological clusters while foreign patients were mainly Heterosexual Rumanians, both males and females, and South American Homo-bisexuals. Tree topology suggested that F1 variant from South America gave rise to the Italian F1 epidemic through multiple introduction events. The contact tracing also revealed an unexpected burden of resistance in epidemiological

  7. Data scanner and event builder for the SMVD of the KEK B-factory

    NASA Astrophysics Data System (ADS)

    Tanka, Manobu; Ikeda, Hirokazu; Fujita, Youichi; Ozaki, Hitoshi; Fukunaga, Chikara

    1994-02-01

    We are designing a readout system for a silicon microstrip vertex detector (SMVD) to be used in the KEK B-factory experiment. In order to obtain optimum numbers for the buffer size and the processing time we have constructed a simplified model using Verilog hardware description language (HDL). We estimated the live-time fraction of the SMVD readout system for various configurations.
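
    The sort of live-time estimate described here can be sketched with a plain discrete-event queue instead of a Verilog HDL model: triggers arrive as a Poisson process, events occupy a finite derandomizing buffer served by a single readout, and any trigger arriving with the buffer full is lost. The trigger rate, processing time and buffer depths are illustrative numbers, not the SMVD design values.

        from collections import deque
        import random

        random.seed(8)

        def live_time_fraction(trigger_rate_hz, process_time_s, buffer_depth, n_triggers=200000):
            t = 0.0
            completions = deque()      # completion times of events still in the buffer (sorted)
            last_completion = 0.0      # when the single readout processor frees up
            accepted = 0
            for _ in range(n_triggers):
                t += random.expovariate(trigger_rate_hz)      # Poisson trigger arrivals
                while completions and completions[0] <= t:
                    completions.popleft()                      # finished events leave the buffer
                if len(completions) < buffer_depth:
                    accepted += 1
                    last_completion = max(t, last_completion) + process_time_s
                    completions.append(last_completion)
                # else: buffer full -> trigger lost (dead time)
            return accepted / n_triggers

        for depth in (1, 2, 4, 8):
            f = live_time_fraction(trigger_rate_hz=10e3, process_time_s=80e-6, buffer_depth=depth)
            print(f"buffer depth {depth}: live-time fraction = {f:.3f}")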

  8. Striatal development involves a switch in gene expression networks, followed by a myelination event: implications for neuropsychiatric disease

    PubMed Central

    Novak, Gabriela; Fan, Theresa; O’Dowd, Brian F.; George, Susan R.

    2012-01-01

    Because abnormal development of striatal neurons is thought to be part of the pathology underlying major psychiatric illnesses, we studied the expression pattern of genes involved in striatal development and of genes comprising key striatal-specific pathways, during an active striatal maturation period, the first two postnatal weeks in rat. This period parallels human striatal development during the second trimester, when prenatal stress is thought to lead to increased risk for neuropsychiatric disorders. In order to identify genes involved in this developmental process, we used subtractive hybridization, followed by quantitative real-time PCR, which allowed us to characterize the developmental expression of over 60 genes, many not previously known to play a role in neuromaturation. Of these, 12 were novel transcripts which did not match known genes but which showed strict developmental expression and may play a role in striatal neurodevelopment. We show that during the first two postnatal weeks in rat, an early gene expression network, still lacking key striatal-specific signaling pathways, is downregulated and replaced by a mature gene expression network containing key striatal-specific genes, including the dopamine D1 and D2 receptors, conferring on these neurons their functional identity. Therefore, before this developmental switch, striatal neurons lack many of their key phenotypic characteristics. This maturation process is followed by a striking rise in the expression of myelination genes, indicating a striatal-specific myelination event. Such a strictly controlled developmental program has the potential to be a point of susceptibility to disruption by external factors. Indeed, this period is known to be a susceptibility period in both humans and rats. PMID:23184870

  9. Transitioning to High Performance Homes: Successes and Lessons Learned From Seven Builders

    SciTech Connect

    Widder, Sarah H.; Kora, Angela R.; Baechler, Michael C.; Fonorow, Ken; Jenkins, David W.; Stroer, Dennis

    2013-03-01

    As homebuyers are becoming increasingly concerned about rising energy costs and the impact of fossil fuels as a major source of greenhouse gases, the returning new home market is beginning to demand energy-efficient and comfortable high-performance homes. In response to this, some innovative builders are gaining market share because they are able to market their homes’ comfort, better indoor air quality, and aesthetics, in addition to energy efficiency. The success and marketability of these high-performance homes is creating a builder demand for house plans and information about how to design, build, and sell their own low-energy homes. To help make these and other builders more successful in the transition to high-performance construction techniques, Pacific Northwest National Laboratory (PNNL) partnered with seven interested builders in the hot-humid and mixed-humid climates to provide technical and design assistance through two building science firms, Florida Home Energy and Resources Organization (FL HERO) and Calcs-Plus, and a designer that offers a line of stock plans designed specifically for energy efficiency, called Energy Smart Home Plans (ESHP). This report summarizes the findings of research on cost-effective high-performance whole-house solutions, focusing on real-world implementation and challenges and identifying effective solutions. The ensuing sections provide project background, profile each of the builders who participated in the program, describe their houses’ construction characteristics and the key challenges the builders encountered during the construction and transaction process, and present primary lessons learned to be applied to future projects. As a result of this technical assistance, 17 homes have been built featuring climate-appropriate efficient envelopes, ducts in conditioned space, and correctly sized and controlled heating, ventilation, and air-conditioning systems. In addition, most builders intend to integrate high

  10. Genome-wide analysis of alternative splicing events in Hordeum vulgare: Highlighting retention of intron-based splicing and its possible function through network analysis.

    PubMed

    Panahi, Bahman; Mohammadi, Seyed Abolghasem; Ebrahimi Khaksefidi, Reyhaneh; Fallah Mehrabadi, Jalil; Ebrahimie, Esmaeil

    2015-11-30

    In this study, using homology mapping of assembled expressed sequence tags against the genomic data, we identified alternative splicing events in barley. Results demonstrated that intron retention is frequently associated with specific abiotic stresses. Network analysis resulted in the discovery of specific sub-networks between miRNAs and transcription factors in genes with a high number of alternative splicing events, such as cross-talk between SPL2, SPL10 and SPL11 regulated by the miR156 and miR157 families. To confirm the alternative splicing events, the elongation factor protein (MLOC_3412) was selected, followed by experimental verification of the predicted splice variants by semi-quantitative reverse transcription PCR (qRT-PCR). Our novel integrative approach opens a new avenue for functional annotation of alternative splicing through regulatory-based network discovery. PMID:26454178

  11. Automated 3D Damaged Cavity Model Builder for Lower Surface Acreage Tile on Orbiter

    NASA Technical Reports Server (NTRS)

    Belknap, Shannon; Zhang, Michael

    2013-01-01

    The 3D Automated Thermal Tool for Damaged Acreage Tile Math Model builder was developed to quickly and accurately perform 3D thermal analyses on damaged lower surface acreage tiles and the structures beneath the damaged locations on a Space Shuttle Orbiter. The 3D model builder created both TRASYS geometric math models (GMMs) and SINDA thermal math models (TMMs) to simulate an idealized damaged cavity in the damaged tile(s). The GMMs are processed in TRASYS to generate radiation conductors between the surfaces in the cavity. The radiation conductors are inserted into the TMMs, which are processed in SINDA to generate temperature histories for all of the nodes on each layer of the TMM. The invention allows a thermal analyst to quickly and accurately create a 3D model of a damaged lower surface tile on the orbiter. The 3D model builder can generate a GMM and the corresponding TMM in one or two minutes, with the damaged cavity included in the tile material. A separate program creates a configuration file, which would take a couple of minutes to edit. This configuration file is read by the model builder program to determine the location of the damage, the correct tile type, tile thickness, structure thickness, and SIP thickness of the damage, so that the model builder program can build an accurate model at the specified location. Once the models are built, they are processed by TRASYS and SINDA.

  12. eHeroes and Swiff: EC-Funded FP7 networks for modelling and observation of space weather events

    NASA Astrophysics Data System (ADS)

    Lapenta, Giovanni

    2013-04-01

    We report on two EC-FP7-funded projects: Swiff and eHeroes. Swiff (swiff.eu) is a modelling effort that aims at producing an integrated space weather modelling and forecasting network. We are unifying fluid and kinetic models into a single approach to track space weather events from their solar origin to their impact on the Earth environment. eHeroes (eheroes.eu) is an observational and modelling effort that collects and processes data to produce new data services and new models to track and predict space weather. eHeroes focuses on the impact of space weather on space exploration, including the effects on spacecraft and on the exploration of the Moon and Mars. We will report on the results obtained in these projects, highlighting their relevance to space weather and its impact on the Earth and on space exploration. The research leading to these results has received funding from the European Commission's Seventh Framework Programme (FP7/2007-2013) under the grant agreement SWIFF (project n° 263340, www.swiff.eu).

  13. Computer (PC/Network) Coordinator.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 22 subjects appropriate for use in a competency list for the occupation of computer (PC/network) coordinator, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 22 units are as…

  14. Widespread prevalence of cryptic Symbiodinium D in the key Caribbean reef builder, Orbicella annularis

    NASA Astrophysics Data System (ADS)

    Kennedy, Emma V.; Foster, Nicola L.; Mumby, Peter J.; Stevens, Jamie R.

    2015-06-01

    Symbiodinium D, a relatively rare clade of algal endosymbiont with a global distribution, has attracted interest as some of its sub-cladal types induce increased thermal tolerance and associated trade-offs, including reduced growth rate in its coral hosts. Members of Symbiodinium D are increasingly reported to comprise low-abundance `cryptic' (<10 %) proportions of mixed coral endosymbiont communities, with unknown ecological implications. Real-time PCR (RT-PCR) targeted to specific types is sufficiently sensitive to detect these background symbiont levels. In this study, RT-PCR was employed to screen 552 colonies of the key Caribbean reef builder Orbicella annularis sampled across a 5.4 million km2 range for the presence of cryptic Symbiodinium `D1' (i.e., the principal Caribbean ITS2 variants, D1 and D1-4). All but one out of 33 populations analysed were shown to host low abundances of Symbiodinium D1, with an average of >30 % of corals per site found to harbour the symbiont. When the same samples were analysed using the conventional screening technique, denaturing gradient gel electrophoresis, Symbiodinium D1 was only detected in 12 populations and appeared to be hosted by <12 % of colonies where present (in agreement with other reported low prevalence/absences in O. annularis). Cryptic Symbiodinium D1 showed a mainly uniform distribution across the wider Caribbean region, although significantly more Mesoamerican Barrier Reef corals hosted cryptic Symbiodinium D1 than might be expected by chance, possibly as a consequence of intense warming in the region in 1998. Widespread prevalence of thermally tolerant Symbiodinium in O. annularis may potentially reflect a capacity for the coral to temporarily respond to warming events through symbiont shuffling. However, association with reduced coral calcification means that the ubiquitous nature of Symbiodinium D1 in O. annularis populations is unlikely to prevent long-term declines in reef health, at a time when

  15. Self inflicted death following inhalation and ingestion of Builders Polyurethane expandable foam.

    PubMed

    Morgan, D R; Musa, M

    2010-11-01

    Builders Polyurethane (PU) expandable foam is a product used to fill voids and provide insulation in the building industry. It is easily available from DIY and hardware stores. Other uses include pest control. While curing it can produce fumes that are toxic to humans or that can induce asthma, and there are reports of polyurethane foam being combustible unless a fire retardant is incorporated. Death due to a can exploding when heated has occurred. A literature review revealed one definite case of attempted suicide, one possible attempt by ingestion of Builders PU expandable foam, and one accidental non-fatal injection of such foam into the lower urinary tract. There is one report of accidental non-fatal inhalation of foam. To our knowledge this is the first case of fatal inhalation and ingestion of Builders Polyurethane expandable foam. PMID:21056881

  16. The Train Builder data acquisition system for the European-XFEL

    NASA Astrophysics Data System (ADS)

    Coughlan, J.; Day, C.; Galagedera, S.; Halsall, R.

    2011-11-01

    The Train Builder is an Advanced Telecommunications Computing Architecture (ATCA) based custom data acquisition system designed to provide a common readout system for the large 2D Mega-pixel detectors presently under construction for the European-XFEL facility in Hamburg. Each detector outputs 10 GBytes/sec of raw data over multiple 10 Gbps SFP+ optical links. The Train Builder DAQ system will merge detector link image fragments from up to 512 X-ray pulses in each pulse train and send the complete detector ``movies'' of images to a farm of PCs. The image building will be carried out using FPGAs with analogue crosspoint switches operating in a barrel-shift mode. The Train Builder data links will operate with 10G UDP- and TCP/IP-based protocols implemented in FPGA logic.
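
    For illustration only, the merging step described above can be pictured with the following minimal software sketch, in which fragments arrive as (train_id, pulse_id, link_id, data) tuples and a train is released once every pulse has a fragment from every link; the link count is a placeholder, and the real Train Builder performs this assembly in FPGA logic with crosspoint switches, not in Python.

      # Hypothetical software model of train building: fragments from all
      # detector links are merged per pulse train before the complete
      # "movie" of images is shipped to the PC farm.
      from collections import defaultdict

      N_LINKS = 16               # assumed number of detector links (placeholder)
      PULSES_PER_TRAIN = 512     # X-ray pulses per train, as in the abstract

      class TrainBuilder:
          def __init__(self, n_links=N_LINKS):
              self.n_links = n_links
              # pending[train_id][pulse_id][link_id] -> fragment bytes
              self.pending = defaultdict(lambda: defaultdict(dict))

          def add_fragment(self, train_id, pulse_id, link_id, fragment):
              """Store one fragment; return the assembled train when complete."""
              self.pending[train_id][pulse_id][link_id] = fragment
              train = self.pending[train_id]
              complete = (len(train) == PULSES_PER_TRAIN and
                          all(len(links) == self.n_links
                              for links in train.values()))
              return self.pending.pop(train_id) if complete else None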

  17. Neuroticism, social network, stressful life events: association with mood disorders, depressive symptoms and suicidal ideation in a community sample of women.

    PubMed

    Mandelli, Laura; Nearchou, Finiki A; Vaiopoulos, Chrysostomos; Stefanis, Costas N; Vitoratou, Silia; Serretti, Alessandro; Stefanis, Nicholas C

    2015-03-30

    According to the stress-diathesis hypothesis, depression and suicidal behavior may be precipitated by psychosocial stressors in vulnerable individuals. However, risk factors for mental health are often gender-specific. In the present study, we evaluated common risk factors for female depression in association with depressive symptoms and suicidal ideation in a community sample of women. The sample was composed of 415 women evaluated for mood disorders (MDs), depressive symptoms and suicidal ideation by structured interviews and the Beck Depression Inventory II (BDI-II). All women also filled in the Eysenck personality questionnaire to evaluate neuroticism and were interviewed about social contact frequency and stressful life events (SLEs). In the whole sample, 19% of the women satisfied criteria for an MD and suicidal ideation was reported by 12% of the women. Though stressful life events, especially personal and interpersonal problems, and a poor social network were associated with all the outcome variables (mood disorder, depressive symptomatology and suicidal ideation), neuroticism survived all multivariate analyses. Social network, together with neuroticism, also showed a strong association with depressive severity, independently of current depressive state. Though we were unable to compare women and men, the data obtained from the present study suggest that in women neurotic traits are strongly related to depression and suicidal ideation, and potentially mediate the reporting of stressful life events and an impaired social network. Independently of a current diagnosis of depression, an impaired social network increases depressive symptoms in women. PMID:25677396

  18. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Policy, Form MA-283. 308.409 Section 308.409 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF... of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk Builder's Risk Insurance Policy, Form MA-283 may be obtained from MARAD's underwriting agent or MARAD....

  19. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Policy, Form MA-283. 308.409 Section 308.409 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF... of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk Builder's Risk Insurance Policy, Form MA-283 may be obtained from the American War Risk Agency or MARAD....

  20. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Policy, Form MA-283. 308.409 Section 308.409 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF... of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk Builder's Risk Insurance Policy, Form MA-283 may be obtained from the American War Risk Agency or MARAD....

  1. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Policy, Form MA-283. 308.409 Section 308.409 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF... of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk Builder's Risk Insurance Policy, Form MA-283 may be obtained from the American War Risk Agency or MARAD....

  2. DOE Zero Energy Ready Home Case Study: New Town Builders — The ArtiZEN Plan, Denver, CO

    SciTech Connect

    none,

    2014-09-01

    The Grand Winner in the Production Builder category of the 2014 Housing Innovation Awards, this builder plans to convert all of its product lines to DOE Zero Energy Ready Home construction by the end of 2015. This home achieves HERS 38 without photovoltaics (PV) and HERS -3 with 8.0 kW of PV.

  3. Impaired target detection in schizophrenia and the ventral attentional network: Findings from a joint event-related potential–functional MRI analysis

    PubMed Central

    Wynn, Jonathan K.; Jimenez, Amy M.; Roach, Brian J.; Korb, Alexander; Lee, Junghee; Horan, William P.; Ford, Judith M.; Green, Michael F.

    2015-01-01

    Schizophrenia patients have abnormal neural responses to salient, infrequent events. We integrated event-related potentials (ERP) and fMRI to examine the contributions of the ventral (salience) and dorsal (sustained) attention networks to this dysfunctional neural activation. Twenty-one schizophrenia patients and 22 healthy controls were assessed in separate sessions with ERP and fMRI during a visual oddball task. Visual P100, N100, and P300 ERP waveforms and fMRI activation were assessed. A joint independent components analysis (jICA) of the ERP and fMRI data was conducted. Patients exhibited reduced P300, but not P100 or N100, amplitudes to targets and reduced fMRI neural activation in both dorsal and ventral attentional networks compared with controls. However, the jICA revealed that the P300 was linked specifically to activation in the ventral (salience) network, including anterior cingulate, anterior insula, and temporal parietal junction, with patients exhibiting significantly lower activation. The P100 and N100 were linked to activation in the dorsal (sustained) network, with no group differences in level of activation. This joint analysis approach revealed the nature of target detection deficits that were not discernible by either imaging methodology alone, highlighting the utility of a multimodal fMRI and ERP approach to understanding attentional network deficits in schizophrenia. PMID:26448909

  4. Impaired target detection in schizophrenia and the ventral attentional network: Findings from a joint event-related potential-functional MRI analysis.

    PubMed

    Wynn, Jonathan K; Jimenez, Amy M; Roach, Brian J; Korb, Alexander; Lee, Junghee; Horan, William P; Ford, Judith M; Green, Michael F

    2015-01-01

    Schizophrenia patients have abnormal neural responses to salient, infrequent events. We integrated event-related potentials (ERP) and fMRI to examine the contributions of the ventral (salience) and dorsal (sustained) attention networks to this dysfunctional neural activation. Twenty-one schizophrenia patients and 22 healthy controls were assessed in separate sessions with ERP and fMRI during a visual oddball task. Visual P100, N100, and P300 ERP waveforms and fMRI activation were assessed. A joint independent components analysis (jICA) of the ERP and fMRI data was conducted. Patients exhibited reduced P300, but not P100 or N100, amplitudes to targets and reduced fMRI neural activation in both dorsal and ventral attentional networks compared with controls. However, the jICA revealed that the P300 was linked specifically to activation in the ventral (salience) network, including anterior cingulate, anterior insula, and temporal parietal junction, with patients exhibiting significantly lower activation. The P100 and N100 were linked to activation in the dorsal (sustained) network, with no group differences in level of activation. This joint analysis approach revealed the nature of target detection deficits that were not discernible by either imaging methodology alone, highlighting the utility of a multimodal fMRI and ERP approach to understanding attentional network deficits in schizophrenia. PMID:26448909

  5. New Whole-House Solutions Case Study: New Town Builders' Power of Zero Energy Center - Denver, Colorado

    SciTech Connect

    2014-10-01

    New Town Builders, a builder of energy efficient homes in Denver, Colorado, offers a zero energy option for all the homes it builds. To attract a wide range of potential homebuyers to its energy efficient homes, New Town Builders created a "Power of Zero Energy Center" linked to its model home in the Stapleton community. This case study presents New Town Builders' marketing approach, which is targeted to appeal to homebuyers' emotions rather than overwhelming homebuyers with scientific details about the technology. The exhibits in the Power of Zero Energy Center focus on reduced energy expenses for the homeowner, improved occupant comfort, the reputation of the builder, and the fact that homebuyers do not have to sacrifice desired design features to achieve zero net energy in the home. This case study also contains customer and realtor testimonials related to the effectiveness of the Center in influencing homebuyers to purchase a zero energy home.

  6. Regional seismic event identification and improved locations with small arrays and networks. Final report, 7 May 1993-30 September 1995

    SciTech Connect

    Vernon, F.L.; Minster, J.B.; Orcutt, J.A.

    1995-09-20

    This final report contains a summary of our work on the use of seismic networks and arrays to improve locations and identify small seismic events. We have developed techniques to migrate 3-component array records of local, regional and teleseismic wavetrains to directly image buried two- and three-dimensional heterogeneities (e.g. layer irregularities, volumetric heterogeneities) in the vicinity of the array. We have developed a technique to empirically characterize local and regional seismic coda by binning and stacking network recordings of dense aftershock sequences. The principal motivation for this work was to look for robust coda phases dependent on source depth. We have extended our ripple-fired event discriminant (based on the time-independence of coda produced by ripple firing) by looking for an independence of the coda from the recording direction (also indicative of ripple firing).
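
    As a purely illustrative aside, the binning-and-stacking step mentioned above can be sketched as follows, assuming equal-length, time-aligned coda windows and simple epicentral-distance bins; the actual processing in the report is more elaborate.

      import numpy as np

      def stack_coda(traces, distances, bin_edges):
          """Bin coda windows by epicentral distance and stack within each bin.

          traces    : (n_records, n_samples) array of time-aligned coda windows
          distances : (n_records,) epicentral distances in km
          bin_edges : 1-D array of distance bin edges in km
          """
          traces = np.asarray(traces, dtype=float)
          bins = np.digitize(distances, bin_edges)
          stacks = {}
          for b in np.unique(bins):
              members = traces[bins == b]
              # normalize each record so that no single event dominates the stack
              members = members / np.maximum(
                  np.abs(members).max(axis=1, keepdims=True), 1e-12)
              stacks[int(b)] = members.mean(axis=0)
          return stacks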

  7. Enacting Social Justice to Teach Social Justice: The Pedagogy of Bridge Builders

    ERIC Educational Resources Information Center

    Eifler, Karen E.; Kerssen-Griep, Jeff; Thacker, Peter

    2008-01-01

    This article describes a particular endeavor, the Bridge Builders Academic Mentoring Program (BAMP), a partnership between a school of education in a Catholic university in the Northwest and a community-based rites of passage program for adolescent African American males. The partnership exemplifies tenets of Catholic social teaching, in that it…

  8. BioBuilder as a database development and functional annotation platform for proteins

    PubMed Central

    Navarro, J Daniel; Talreja, Naveen; Peri, Suraj; Vrushabendra, BM; Rashmi, BP; Padma, N; Surendranath, Vineeth; Jonnalagadda, Chandra Kiran; Kousthub, PS; Deshpande, Nandan; Shanker, K; Pandey, Akhilesh

    2004-01-01

    Background The explosion in biological information creates the need for databases that are easy to develop, easy to maintain and can be easily manipulated by annotators, who are most likely to be biologists. However, deployment of scalable and extensible databases is not an easy task and generally requires substantial expertise in database development. Results BioBuilder is a Zope-based software tool that was developed to facilitate intuitive creation of protein databases. Protein data can be entered and annotated through web forms, along with the flexibility to add customized annotation features to protein entries. A built-in review system permits a global team of scientists to coordinate their annotation efforts. We have already used BioBuilder to develop the Human Protein Reference Database, a comprehensive annotated repository of the human proteome. The data can be exported in the extensible markup language (XML) format, which is rapidly becoming the standard format for data exchange. Conclusions As the proteomic data for several organisms begins to accumulate, BioBuilder will prove to be an invaluable platform for functional annotation and development of customizable protein-centric databases. BioBuilder is open source and is available under the terms of the LGPL. PMID:15099404

  9. Initial Behavior Outcomes for the PeaceBuilders Universal School-Based Violence Prevention Program.

    ERIC Educational Resources Information Center

    Flannery, Daniel J.; And Others

    2003-01-01

    Assigned elementary schools to either immediate postbaseline intervention (PBI) with PeaceBuilders, a school-based violence prevention program, or to intervention 1 year later (PBD). Found significant gains in social competence for kindergarten through second-graders in Year 1, in peace-building behavior in Grades K to 5, and reduced aggression in…

  10. DOE Zero Energy Ready Home Case Study: BPC Green Builders, New Fairfield, Connecticut

    SciTech Connect

    none,

    2013-09-01

    This LEED Platinum home was built on the site of a 60-year-old bungalow that was demolished. It boasts views of Candlewood Lake, a great deal of daylight, and projected annual energy savings of almost $3,000. This home was awarded a 2013 Housing Innovation Award in the custom builder category.

  11. Improving Water Management: Applying ModelBuilder to site water impoundments using AEM survey data

    SciTech Connect

    Sams, J.I.; Lipinski, B.A.; Harbert, W.P.; Ackman, T.E.

    2007-01-01

    ArcGIS ModelBuilder was used to create a GIS-based decision support model that incorporated digital elevation data and helicopter-borne electromagnetic geophysical survey results to screen potential sites for disposal impoundments for water produced by coal bed natural gas operations.

  12. 77 FR 2310 - Notice of Submission of Proposed Information Collection to OMB; Builder's Certification/Guarantee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ... Certification/Guarantee and New Construction Subterranean Termite Soil Treatment Record AGENCY: Office of the... provide a record of any soil treatment methods used to prevent subterranean termite infestation. The...: Builder's Certification/Guarantee and New Construction Subterranean Termite Soil Treatment Record....

  13. 13 CFR 120.391 - What is the Builders Loan Program?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What is the Builders Loan Program? 120.391 Section 120.391 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS...)(9) of the Act, SBA may make or guarantee loans to finance small general contractors to construct...

  14. Three Adapted Science Skill Builders for Junior and Senior High School Orthopaedically Handicapped Students.

    ERIC Educational Resources Information Center

    Cardullias, Peter J.; And Others

    The study was designed to determine how standard science skill builder activities can be modified or adapted for use by orthopedically handicapped students. Nine secondary level science experiments were selected for initial review and from these, three were selected for adaptation--use of the microscope, use of graduated cylinders, and use of the…

  15. Builder 3 & 2. Naval Education and Training Command Rate Training Manual and Nonresident Career Course. Revised.

    ERIC Educational Resources Information Center

    Countryman, Gene L.

    This Rate Training Manual (Textbook) and Nonresident Career Course form a correspondence, self-study package to provide information related to tasks assigned to Builders Third and Second Class. Focus is on constructing, maintaining, and repairing wooden, concrete, and masonry structures, concrete pavement, and waterfront and underwater structures;…

  16. HVAC Design Strategy for a Hot-Humid Production Builder, Houston, Texas (Fact Sheet)

    SciTech Connect

    Not Available

    2014-03-01

    BSC worked directly with the David Weekley Homes - Houston division to redesign three floor plans in order to locate the HVAC system in conditioned space. The purpose of this project is to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses, in preparation for the upcoming code changes in 2015. The builder wishes to develop an upgrade package that will allow for a seamless transition to the new code mandate. The following research questions were addressed by this research project: 1. What is the most cost-effective, best-performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one- and two-story single family detached residences? 2. What is a cost-effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post-construction actual costs? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic into conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.

  17. Heating, Ventilation, and Air Conditioning Design Strategy for a Hot-Humid Production Builder

    SciTech Connect

    Kerrigan, P.

    2014-03-01

    BSC worked directly with the David Weekley Homes - Houston division to redesign three floor plans in order to locate the HVAC system in conditioned space. The purpose of this project is to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses, in preparation for the upcoming code changes in 2015. The builder wishes to develop an upgrade package that will allow for a seamless transition to the new code mandate. The following research questions were addressed by this research project: 1. What is the most cost-effective, best-performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one- and two-story single family detached residences? 2. What is a cost-effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post-construction actual costs? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic into conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.

  18. Clinical Correlates of Surveillance Events Detected by National Healthcare Safety Network Pneumonia and Lower Respiratory Infection Definitions-Pennsylvania, 2011-2012.

    PubMed

    See, Isaac; Chang, Julia; Gualandi, Nicole; Buser, Genevieve L; Rohrbach, Pamela; Smeltz, Debra A; Bellush, Mary Jo; Coffin, Susan E; Gould, Jane M; Hess, Debra; Hennessey, Patricia; Hubbard, Sydney; Kiernan, Andrea; O'Donnell, Judith; Pegues, David A; Miller, Jeffrey R; Magill, Shelley S

    2016-07-01

    OBJECTIVE: To determine the clinical diagnoses associated with the National Healthcare Safety Network (NHSN) pneumonia (PNEU) or lower respiratory infection (LRI) surveillance events. DESIGN: Retrospective chart review. SETTING: A convenience sample of 8 acute-care hospitals in Pennsylvania. PATIENTS: All patients hospitalized during 2011-2012. METHODS: Medical records were reviewed from a random sample of patients reported to the NHSN to have PNEU or LRI, excluding adults with ventilator-associated PNEU. Documented clinical diagnoses corresponding temporally to the PNEU and LRI events were recorded. RESULTS: We reviewed 250 (30%) of 838 eligible PNEU and LRI events reported to the NHSN; 29 reported events (12%) fulfilled neither PNEU nor LRI case criteria. Differences interpreting radiology reports accounted for most misclassifications. Of 81 PNEU events in adults not on mechanical ventilation, 84% had clinician-diagnosed pneumonia; of these, 25% were attributed to aspiration. Of 43 adult LRI, 88% were in mechanically ventilated patients and 35% had no corresponding clinical diagnosis (infectious or noninfectious) documented at the time of LRI. Of 36 pediatric PNEU events, 72% were ventilator associated, and 70% corresponded to a clinical pneumonia diagnosis. Of 61 pediatric LRI patients, 84% were mechanically ventilated and 21% had no corresponding clinical diagnosis documented. CONCLUSIONS: In adults not on mechanical ventilation and in children, most NHSN-defined PNEU events corresponded with compatible clinical conditions documented in the medical record. In contrast, NHSN LRI events often did not. As a result, substantial modifications to the LRI definitions were implemented in 2015. Infect Control Hosp Epidemiol 2016;37:818-824. PMID:27072043

  19. Performance evaluation of the retrieval of a two hours rainfall event through microwave tomography applied to a network of radio-base stations

    NASA Astrophysics Data System (ADS)

    Facheris, L.; Cuccoli, F.; Baldini, L.

    2012-04-01

    Critical precipitation events over the Italian territory have often been characterized by high intensity and very fast development, frequently over small catchment areas. The detection of this kind of phenomenon is a major issue that poses remarkable problems which cannot be tackled completely with 'standard' instrumentation alone (even when available), such as weather radars or raingauges. Indeed, the rainfall sampling modalities of these instruments may jeopardize attempts to provide a sufficiently fast risk alert: - the point-like, time-integrated sampling of raingauges can completely or partially miss local rainfall cores of high intensity developing in the neighborhood. Moreover, raingauges provide cumulated rainfall measurements intrinsically affected by a time delay. - In the case of weather radars, several factors may limit the advantages brought by range resolution and instantaneous sampling: precipitation might be sampled at an excessive height due to the distance of the radar site and/or the orography surrounding the valleys/catchments where the aforementioned kind of event is more likely to form; distance may limit the resolution in the cross-range direction; beam screening due to orography causes a loss of power that is interpreted in the farther range bins as a reduced precipitation intensity. In this context, a positive role in flagging the criticality of a precipitation event can be played by signal attenuation measurements made along microwave links, as available through the infrastructure of a mobile communications network. Three features of such networks are of interest: 1) the communications among radio-base stations occur where point-to-point electromagnetic visibility is guaranteed, namely along valleys or between the tops/flanks of hills or mountains; 2) the extension of these links (a few kilometres) is perfectly compatible with the detection of severe but localized precipitation events; 3) measurements can be made on a
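
    For context, the single-link retrieval underlying such measurements is usually based on the power-law relation between specific attenuation and rain rate, k = a R^b; the sketch below uses placeholder coefficients, since a and b depend on frequency and polarization (ITU-R P.838 style tables), and it does not show the tomographic combination of many links discussed in the abstract.

      def rain_rate_from_attenuation(path_attenuation_db, path_length_km,
                                     a=0.12, b=1.05):
          """Invert the k-R power law k = a * R**b for one microwave link.

          path_attenuation_db : rain-induced attenuation over the link (dB)
          path_length_km      : link length (km)
          a, b                : placeholder coefficients; real values depend on
                                the link frequency and polarization.
          Returns the path-averaged rain rate in mm/h.
          """
          specific_attenuation = path_attenuation_db / path_length_km   # dB/km
          return (specific_attenuation / a) ** (1.0 / b)

      # Example: 6 dB of excess attenuation on a 3 km link
      print(rain_rate_from_attenuation(6.0, 3.0))   # path-averaged rain rate, mm/h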

  20. News CPD Event: Teaching day gives new perspectives Workshop: IOP network develops its ideas Conference: Conference offers much to teachers Event: Physics is made easy in Liverpool Communication: IOSTE debates the complexities of STE Conference: Teaching event excites in Exeter Meeting Invitation: Wales physics meeting invites bookings CPD Event: Science teachers get hands on with development Research: Conference highlights liquid crystal research in teaching Education: Teachers give positive feedback Science Fair: Science fair brings physics to students Teaching: Conference explores trends in teaching Forthcoming events

    NASA Astrophysics Data System (ADS)

    2010-09-01

    CPD Event: Teaching day gives new perspectives Workshop: IOP network develops its ideas Conference: Conference offers much to teachers Event: Physics is made easy in Liverpool Communication: IOSTE debates the complexities of STE Conference: Teaching event excites in Exeter Meeting Invitation: Wales physics meeting invites bookings CPD Event: Science teachers get hands on with development Research: Conference highlights liquid crystal research in teaching Education: Teachers give positive feedback Science Fair: Science fair brings physics to students Teaching: Conference explores trends in teaching Forthcoming events

  1. Complex Networks Approach for Analyzing the Correlation of Traditional Chinese Medicine Syndrome Evolvement and Cardiovascular Events in Patients with Stable Coronary Heart Disease

    PubMed Central

    Gao, Zhuye; Li, Siwei; Jiao, Yang; Zhou, Xuezhong; Fu, Changgeng; Shi, Dazhuo; Chen, Keji

    2015-01-01

    This is a multicenter prospective cohort study to analyze the correlation of traditional Chinese medicine (TCM) syndrome evolvement and cardiovascular events in patients with stable coronary heart disease (CHD). The impact of syndrome evolvement on cardiovascular events during the 6-month and 12-month follow-up was analyzed using complex networks approach. Results of verification using Chi-square test showed that the occurrence of cardiovascular events was positively correlated with syndrome evolvement when it evolved from toxic syndrome to Qi deficiency, blood stasis, or sustained toxic syndrome, when it evolved from Qi deficiency to blood stasis, toxic syndrome, or sustained Qi deficiency, and when it evolved from blood stasis to Qi deficiency. Blood stasis, Qi deficiency, and toxic syndrome are important syndrome factors for stable CHD. There are positive correlations between cardiovascular events and syndrome evolution from toxic syndrome to Qi deficiency or blood stasis, from Qi deficiency to blood stasis, or toxic syndrome and from blood stasis to Qi deficiency. These results indicate that stable CHD patients with pathogenesis of toxin consuming Qi, toxin leading to blood stasis, and mutual transformation of Qi deficiency and blood stasis are prone to recurrent cardiovascular events. PMID:25821500

  2. The isotopic aspects of the calcification of the reef builder gastropod Dendropoma petraeum - can vermetids serve as a paleoceanographic proxy?

    NASA Astrophysics Data System (ADS)

    Sisma-Ventura, G.; Shemesh, A.

    2009-04-01

    The vermetid reef builder Dendropoma petraeum - a sessile, irregularly coiled gastropod that develops dense aggregations on abrasion platform edges and grows at mean sea level - can serve as an archive of environmental conditions such as sea surface temperature and salinity, provided that it deposits its calcitic skeleton in isotopic equilibrium. The large distribution of vermetid reefs in subtropical waters and across the Mediterranean allows their use as paleo-markers for paleo-climate reconstruction in areas that are devoid of corals. We studied the isotopic composition of vermetids retrieved from the coast of the Levantine Basin of the Mediterranean Sea. The δ18O of the calcitic shell of living vermetids indicates that skeletal deposition occurs under isotopic equilibrium and faithfully records the temperature and surface water δ18O during spring and summer. High-resolution δ18O and δ13C records obtained from several cores were used to reconstruct variations in the Levantine Basin sea surface temperature, hydrology and productivity during the past 500 years. The correlation with global climatic events is discussed.
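
    For context, carbonate palaeothermometry of this kind rests on a quadratic calcite-water δ18O calibration; the sketch below uses illustrative coefficients close to the classic Epstein-type equation, since the calibration actually applicable to vermetids (and any vital-effect or VSMOW-to-VPDB corrections) is not given in the abstract.

      def calcite_temperature_c(d18o_calcite_vpdb, d18o_water):
          """Estimate calcification temperature (deg C) from calcite and water d18O.

          Quadratic form T = a + b*(dc - dw) + c*(dc - dw)**2; the coefficients
          below are placeholders for one commonly cited calcite calibration and
          must be replaced by an archive-specific calibration in real work.
          """
          a, b, c = 16.0, -4.14, 0.13
          d = d18o_calcite_vpdb - d18o_water
          return a + b * d + c * d * d

      # Example: shell d18O = 0.5 permil, seawater d18O = 1.5 permil  ->  ~20 deg C
      print(calcite_temperature_c(0.5, 1.5))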

  3. Experimental evidence of the synergistic effects of warming and invasive algae on a temperate reef-builder coral

    PubMed Central

    Kersting, Diego K; Cebrian, Emma; Casado, Clara; Teixidó, Núria; Garrabou, Joaquim; Linares, Cristina

    2015-01-01

    In the current global climate change scenario, stressors overlap in space and time, and knowledge on the effects of their interaction is highly needed to understand and predict the response and resilience of organisms. Corals, among many other benthic organisms, are affected by an increasing number of global change-related stressors including warming and invasive species. In this study, the cumulative effects between warming and invasive algae were experimentally assessed on the temperate reef-builder coral Cladocora caespitosa. We first investigated the potential local adaptation to thermal stress in two distant populations subjected to contrasting thermal and necrosis histories. No significant differences were found between populations. Colonies from both populations suffered no necrosis after long-term exposure to temperatures up to 29 °C. Second, we tested the effects of the interaction of both warming and the presence of invasive algae. The combined exposure triggered critical synergistic effects on photosynthetic efficiency and tissue necrosis. At the end of the experiment, over 90% of the colonies subjected to warming and invasive algae showed signs of necrosis. The results are of particular concern when considering the predicted increase of extreme climatic events and the spread of invasive species in the Mediterranean and other seas in the future. PMID:26692424

  4. Experimental evidence of the synergistic effects of warming and invasive algae on a temperate reef-builder coral.

    PubMed

    Kersting, Diego K; Cebrian, Emma; Casado, Clara; Teixidó, Núria; Garrabou, Joaquim; Linares, Cristina

    2015-01-01

    In the current global climate change scenario, stressors overlap in space and time, and knowledge on the effects of their interaction is highly needed to understand and predict the response and resilience of organisms. Corals, among many other benthic organisms, are affected by an increasing number of global change-related stressors including warming and invasive species. In this study, the cumulative effects between warming and invasive algae were experimentally assessed on the temperate reef-builder coral Cladocora caespitosa. We first investigated the potential local adaptation to thermal stress in two distant populations subjected to contrasting thermal and necrosis histories. No significant differences were found between populations. Colonies from both populations suffered no necrosis after long-term exposure to temperatures up to 29 °C. Second, we tested the effects of the interaction of both warming and the presence of invasive algae. The combined exposure triggered critical synergistic effects on photosynthetic efficiency and tissue necrosis. At the end of the experiment, over 90% of the colonies subjected to warming and invasive algae showed signs of necrosis. The results are of particular concern when considering the predicted increase of extreme climatic events and the spread of invasive species in the Mediterranean and other seas in the future. PMID:26692424

  5. Effect of densifying the GNSS GBAS network on monitoring the troposphere zenith total delay and precipitable water vapour content during severe weather events

    NASA Astrophysics Data System (ADS)

    Kapłon, Jan; Stankunavicius, Gintautas

    2016-04-01

    Dense ground-based augmentation networks can provide important information for monitoring the state of the neutral atmosphere. The GNSS&METEO research group at Wroclaw University of Environmental and Life Sciences (WUELS) operates a self-developed near real-time service estimating troposphere parameters from GNSS data for the area of Poland. The service has been operational since December 2012, and its results, calculated from data of the ASG-EUPOS GBAS network (120 stations), support the EGVAP (http://egvap.dmi.dk) project. At first the zenith troposphere delays (ZTD) were calculated at hourly intervals, but in September 2015 the service was upgraded to include the SmartNet GBAS network (Leica Geosystems Polska - 150 stations). The upgrade also included: shortening the result interval to 30 minutes, an upgrade from Bernese GPS Software v. 5.0 to Bernese GNSS Software v. 5.2, and estimation of the ZTD and its horizontal gradients. Processing now includes 270 stations. The densification of the network from a mean inter-station distance of 70 km to 40 km created the opportunity to investigate its impact on the resolution of the estimated ZTD and integrated water vapour content (IWV) fields during high-intensity weather events. The increased density of ZTD measurements allows better definition of the meso-scale features within different synoptic systems (e.g. frontal waves, meso-scale convective systems, squall lines etc). These meso-scale structures are, as a rule, short-lived but fast-developing and hardly predictable by numerical models. Even so, such limited-size systems can produce very hazardous phenomena - like widespread squalls and thunderstorms, tornadoes, heavy rains, snowfalls, hail etc. - because of the prevalence of Cb clouds with a high concentration of IWV. The study deals with two meteorological events: 2015-09-01, with devastating squalls and rainfall bringing a 2M Euro loss of property in northern Poland, and 2015-10-12, with a very active front bringing
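
    As background to the IWV fields mentioned above, the standard step from a zenith wet delay to integrated water vapour is the Bevis-type conversion; the sketch below assumes the zenith hydrostatic delay has already been removed and the weighted mean temperature Tm is available (in practice from surface temperature or a numerical model), and the refractivity constants are commonly used values rather than those of any particular analysis centre.

      def zwd_to_iwv(zwd_m, tm_k):
          """Convert zenith wet delay (m) to integrated water vapour (kg/m^2)."""
          k2_prime = 0.221     # K / Pa   (commonly used refractivity constant)
          k3 = 3.739e3         # K^2 / Pa
          rv = 461.5           # J / (kg K), specific gas constant of water vapour
          rho_w = 1000.0       # kg / m^3, density of liquid water
          # dimensionless conversion factor, roughly 0.15
          pi_factor = 1.0e6 / (rho_w * rv * (k3 / tm_k + k2_prime))
          # precipitable water in mm is numerically equal to IWV in kg/m^2
          return pi_factor * zwd_m * 1000.0

      # Example: ZWD = 0.15 m, Tm = 270 K  ->  about 23 kg/m^2
      print(zwd_to_iwv(0.15, 270.0))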

  6. Networks.

    ERIC Educational Resources Information Center

    Maughan, George R.; Petitto, Karen R.; McLaughlin, Don

    2001-01-01

    Describes the connectivity features and options of modern campus communication and information system networks, including signal transmission (wire-based and wireless), signal switching, convergence of networks, and network assessment variables, to enable campus leaders to make sound future-oriented decisions. (EV)

  7. Heating, Ventilation, and Air Conditioning Design Strategy for a Hot-Humid Production Builder

    SciTech Connect

    Kerrigan, P.

    2014-03-01

    Building Science Corporation (BSC) worked directly with the David Weekley Homes - Houston division to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses in preparation for the upcoming code changes in 2015. This research project addressed the following questions: 1. What is the most cost effective, best performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one and two story single family detached residences? 2. What is a cost effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post construction actual cost?

  8. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations.

    PubMed

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page ( http://www.helsinki.fi/psychology/groups/visualcognition/ ). PMID:25595311

  9. The GlycanBuilder and GlycoWorkbench glycoinformatics tools: updates and new developments.

    PubMed

    Damerell, David; Ceroni, Alessio; Maass, Kai; Ranzinger, Rene; Dell, Anne; Haslam, Stuart M

    2012-11-01

    During the EUROCarbDB project our group developed the GlycanBuilder and GlycoWorkbench glycoinformatics tools. This short communication summarizes the capabilities of these two tools and updates which have been made since the original publications in 2007 and 2008. GlycanBuilder is a tool that allows for the fast and intuitive drawing of glycan structures; this tool can be used standalone, embedded in web pages and can also be integrated into other programs. GlycoWorkbench has been designed to semi-automatically annotate glycomics data. This tool can be used to annotate mass spectrometry (MS) and MS/MS spectra of free oligosaccharides, N and O-linked glycans, GAGs (glycosaminoglycans) and glycolipids, as well as MS spectra of glycoproteins. PMID:23109548

  10. Measurement of sediment loads during flash flood events: 14 years of results from a six stream monitoring network on the southern Colorado Plateau

    NASA Astrophysics Data System (ADS)

    Griffiths, R. E.; Topping, D. J.

    2015-12-01

    In arid and semi-arid environments, short-duration, high-intensity rainfall events—flash floods—are the primary driver of sediment transport in ephemeral streams. The spatial and temporal variability of these rainfall events results in episodic and irregular stream flow and resultant sediment transport. As a result of limited flow durations, measuring discharge and collecting suspended-sediment samples on ephemeral streams in arid regions is difficult and time-consuming. Because of these limitations, few sediment-monitoring programs on ephemeral streams have been developed; some examples of sediment-monitoring gages and gaging networks constructed on arid ephemeral streams include Walnut Gulch, United States, Nahal Yael, Israel, and the Luni River Basin, India. The difficulty in making measurements of discharge and suspended-sediment concentration on arid ephemeral streams has led many researchers to use methods such as regional sediment-yield equations, sediment-rating curves, and peak discharge to total-sediment load relations. These methods can provide a cost-effective estimation of sediment yield from ungaged tributaries. However, these approaches are limited by, among other factors, time averaging, hysteresis, and differences in local and regional geology, rainfall, and vegetation. A monitoring network was established in 2000 on six ephemeral tributaries of the Colorado River in lower Glen and upper Marble canyons. Results from this monitoring network show that annual suspended-sediment loads for individual streams can vary by 5 orders of magnitude while the annual suspended-sediment load for the entire network may vary by 2 orders of magnitude, that suspended-sediment loads during an individual flood event do not typically correlate with discharge, and that local geology has a strong control on the sediment yield of a drainage basin. Comparing our results to previous estimates of sediment load from these drainages found that previous, indirect, methods

  11. Event-based distributed set-membership filtering for a class of time-varying non-linear systems over sensor networks with saturation effects

    NASA Astrophysics Data System (ADS)

    Wei, Guoliang; Liu, Shuai; Wang, Licheng; Wang, Yongxiong

    2016-07-01

    In this paper, based on an event-triggered mechanism, the problem of distributed set-membership filtering is considered for a class of time-varying non-linear systems over sensor networks subject to saturation effects. Different from the traditional periodic sampled-data approach, the filter is updated only when a predefined event is satisfied, where the event is defined according to the measurement output. For each node, the proposed novel event-triggered mechanism can reduce unnecessary information transmission between sensors and filters. The purpose of the addressed problem is to design a series of distributed set-membership filters such that, for all admissible unknown-but-bounded noises, non-linearities and sensor saturation, the set of all possible states can be determined. The desired filter parameters are obtained by solving a recursive linear matrix inequality that can be computed using the available MATLAB toolbox. Finally, a simulation example is exploited to show the effectiveness of the proposed design approach.
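
    A minimal sketch of the kind of measurement-based trigger described above: the sensor transmits a new measurement to its filter only when the deviation from the last transmitted value exceeds a threshold. The quadratic form and threshold below are placeholders; in the paper both the trigger and the filter gains follow from the recursive LMI design.

      import numpy as np

      class EventTrigger:
          """Transmit y_k only if (y_k - y_last)' * Omega * (y_k - y_last) > delta."""

          def __init__(self, omega, delta):
              self.omega = np.asarray(omega, dtype=float)
              self.delta = float(delta)
              self.y_last = None

          def should_transmit(self, y_k):
              y_k = np.asarray(y_k, dtype=float)
              if self.y_last is None:
                  self.y_last = y_k
                  return True
              e = y_k - self.y_last
              if e @ self.omega @ e > self.delta:
                  self.y_last = y_k      # update the held measurement
                  return True
              return False               # the filter keeps using y_last

      # Example: 2-D measurement, identity weighting, threshold 0.01
      trigger = EventTrigger(np.eye(2), 0.01)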

  12. DOE Zero Energy Ready Home Case Study: BPC Green Builders — Trolle Residence, Danbury, CT

    SciTech Connect

    none,

    2014-09-01

    The builder of this 1,650-ft2 cabin won a Custom Home honor in the 2014 Housing Innovation Awards. The home meets Passive House standards with 5.5 in. of foil-faced polyisocyanurate foam boards lining the outside walls, R-55 of rigid EPS foam under the slab, R-86 of blown cellulose in the attic, triple-pane windows, and a single ductless heat pump to heat and cool the entire home.

  13. LipidBuilder: A Framework To Build Realistic Models for Biological Membranes.

    PubMed

    Bovigny, Christophe; Tamò, Giorgio; Lemmin, Thomas; Maïno, Nicolas; Dal Peraro, Matteo

    2015-12-28

    The physical and chemical characterization of biological membranes is of fundamental importance for understanding the functional role of lipid bilayers in shaping cells and organelles, steering vesicle trafficking and promoting membrane-protein signaling. Molecular dynamics simulations stand as a powerful tool to probe the properties of membranes at atomistic level. However, the biological membrane is highly complex, and closely mimicking its physiological constitution in silico is not a straightforward task. Here, we present LipidBuilder, a framework for creating and storing models of biologically relevant phospholipid species with acyl tails of heterogeneous composition. LipidBuilder also enables the assembly of these database-stored lipids into realistic bilayers featuring asymmetric distribution on layer leaflets and concentration of given membrane constituents as defined, for example, by lipidomics experiments. The ability of LipidBuilder to assemble robust membrane models was validated by simulating membranes of homogeneous lipid composition for which experimental data are available. Furthermore, taking advantage of the extensive lipid headgroup repertoire, we assembled models of membranes of heterogeneous nature as naturally found in viral (phage PRD1), bacterial (Salmonella enterica; Laurinavicius, S.; Kakela, R.; Somerharju, P.; Bamford, D. H.; Virology 2004, 322, 328-336) and plant (Chlorella kessleri; Rezanka, T.; Podojil, M.; J. Chromatogr. 1989, 463, 397-408) organisms. These realistic membrane models were built using a near-exact lipid composition revealed from analytical chemistry experiments. We suggest LipidBuilder as a useful tool to model biological membranes of near-biological complexity, and as a robust complement to the current efforts to characterize the biophysical properties of biological membranes using molecular simulation. PMID:26606666

  14. Development of a beam builder for automatic fabrication of large composite space structures

    NASA Technical Reports Server (NTRS)

    Bodle, J. G.

    1979-01-01

    The composite material beam builder, which will produce triangular beams from pre-consolidated graphite/glass/thermoplastic composite material through automated mechanical processes, is presented; these processes include side member storage, feed and positioning, ultrasonic welding, and beam cutoff. Each process lends itself to modular subsystem development. Initial development is concentrated on the key processes of roll forming and ultrasonically welding composite thermoplastic materials. The construction and test of an experimental roll forming machine and ultrasonic welding process control techniques are described.

  15. Validation of bioelectrical impedance analysis to hydrostatic weighing in male body builders.

    PubMed

    Volpe, Stella Lucia; Melanson, Edward L; Kline, Gregory

    2010-03-01

    The purpose of this study was to compare bioelectrical impedance analysis (BIA) to hydrostatic weighing (HW) in male weight lifters and body builders. Twenty-two male body builders and weight lifters, 23 +/- 3 years of age (mean +/- SD), were studied to determine the efficacy of BIA versus HW in this population. Subjects were measured on two separate occasions, 6 weeks apart, for test-retest reliability purposes. Participants recorded 3-day dietary intakes and average work-out times and regimens between the two testing periods. Subjects were, on average, 75 +/- 8 kg in body weight and 175 +/- 7 cm tall. Validation results were as follows: constant error for HW-BIA = 0.128 +/- 3.7%, r for HW versus BIA = -0.294. The standard error of the estimate for BIA = 2.32% and the total error for BIA = 3.6%. Percent body fat was 7.8 +/- 1% from BIA and 8.5 +/- 2% from HW (P > 0.05). Subjects consumed 3,217 +/- 1,027 kcals; 1,848 +/- 768 kcals from carbohydrates; 604 +/- 300 kcals from protein; and 783 +/- 369 kcals from fat. Although work-outs differed among subjects, within-subject training did not vary. These results suggest that measurement of percent body fat in male body builders and weight trainers is equally accurate using BIA or HW. PMID:19219400
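
    For reference, the validation statistics quoted above (constant error, standard error of the estimate, total error) are conventionally computed as sketched below from paired per-subject percent body fat values; this is a generic illustration, not the authors' analysis code.

      import numpy as np

      def validation_stats(criterion, predicted):
          """CE, SEE and TE for a method-comparison study (e.g., HW vs. BIA)."""
          criterion = np.asarray(criterion, dtype=float)   # reference method (%BF)
          predicted = np.asarray(predicted, dtype=float)   # method under test (%BF)
          diff = criterion - predicted
          constant_error = diff.mean()
          # SEE: scatter of the criterion about its regression on the predictor
          slope, intercept = np.polyfit(predicted, criterion, 1)
          residuals = criterion - (slope * predicted + intercept)
          see = residuals.std(ddof=2)
          total_error = np.sqrt(np.mean(diff ** 2))
          return constant_error, see, total_error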

  16. Solving a real-world problem using an evolving heuristically driven schedule builder.

    PubMed

    Hart, E; Ross, P; Nelson, J

    1998-01-01

    This work addresses the real-life scheduling problem of a Scottish company that must produce daily schedules for the catching and transportation of large numbers of live chickens. The problem is complex and highly constrained. We show that it can be successfully solved by dividing it into two subproblems and solving each with a separate genetic algorithm (GA). We address the problem of whether this produces locally optimal solutions and how to overcome this. We extend the traditional approach of evolving a "permutation + schedule builder" by concentrating on evolving the schedule builder itself. This results in a unique schedule builder being built for each daily scheduling problem, each individually tailored to deal with the particular features of that problem. The result is a robust, fast, and flexible system that can cope with most of the circumstances imaginable at the factory. We also compare the performance of the GA approach to several other evolutionary methods and show that population-based methods are superior to both hill-climbing and simulated annealing in the quality of solutions produced. Population-based methods also have the distinct advantage of producing multiple, equally fit solutions, which is of particular importance when considering the practical aspects of the problem. PMID:10021741
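
    For orientation, the classical "permutation + schedule builder" decoding that the paper takes as its starting point can be sketched as follows: a GA individual is a permutation of jobs, and a fixed greedy builder turns it into a schedule whose makespan serves as the fitness. The evolved-schedule-builder extension described in the abstract is not shown, and all names and parameters are illustrative.

      import random

      def greedy_schedule_builder(permutation, durations, n_crews):
          """Assign jobs, in permutation order, to the earliest-free crew."""
          crew_free = [0.0] * n_crews
          schedule = []
          for job in permutation:
              crew = min(range(n_crews), key=lambda c: crew_free[c])
              start = crew_free[crew]
              crew_free[crew] = start + durations[job]
              schedule.append((job, crew, start))
          return schedule, max(crew_free)        # makespan = fitness (minimise)

      def evolve(durations, n_crews, pop_size=30, generations=200):
          n_jobs = len(durations)
          pop = [random.sample(range(n_jobs), n_jobs) for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=lambda p: greedy_schedule_builder(p, durations, n_crews)[1])
              parents = pop[:pop_size // 2]
              children = []
              for p in parents:
                  child = p[:]
                  i, j = random.sample(range(n_jobs), 2)   # swap mutation
                  child[i], child[j] = child[j], child[i]
                  children.append(child)
              pop = parents + children
          return pop[0]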

  17. Time-related patient data retrieval for the case studies from the pharmacogenomics research network

    PubMed Central

    Zhu, Qian; Tao, Cui; Ding, Ying; Chute, Christopher G.

    2012-01-01

    There are many question-based data elements from the pharmacogenomics research network (PGRN) studies. Many data elements contain temporal information. Semantically representing these elements so that they are machine processable is a challenging problem for the following reasons: (1) the designers of these studies usually do not have knowledge of computer modeling and query languages, so the original data elements are usually represented in spreadsheets in human languages; and (2) the time aspects in these data elements can be too complex to be represented faithfully in a machine-understandable way. In this paper, we introduce our efforts on representing these data elements using semantic web technologies. We have developed an ontology, CNTRO, for representing clinical events and their temporal relations in the web ontology language (OWL). Here we use CNTRO to represent the time aspects in the data elements. We have evaluated 720 time-related data elements from PGRN studies. We adapted and extended the knowledge representation requirements of EliXR-TIME to categorize our data elements. A CNTRO-based SPARQL query builder has been developed to let users customize their own SPARQL queries for each knowledge representation requirement. The SPARQL query builder has been evaluated with a simulated EHR triple store to ensure its functionality. PMID:23076712
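
    As an illustration of what a generated temporal query might look like, the rdflib-based sketch below retrieves events of a given type occurring after a reference time point; the cntro: prefix, class and property names are hypothetical stand-ins, not the actual CNTRO vocabulary or the PGRN data model, and the triple-store file name is assumed.

      from rdflib import Graph

      # Hypothetical temporal query; the cntro: terms are illustrative placeholders.
      QUERY = """
      PREFIX cntro: <http://example.org/cntro#>
      SELECT ?event ?time
      WHERE {
          ?event a cntro:ClinicalEvent ;
                 cntro:hasEventType "medication_start" ;
                 cntro:hasTime ?time .
          ?time cntro:after cntro:studyEnrollmentTime .
      }
      """

      g = Graph()
      g.parse("simulated_ehr_triples.ttl", format="turtle")   # assumed test file
      for row in g.query(QUERY):
          print(row.event, row.time)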

  18. Modulation of a Fronto-Parietal Network in Event-Based Prospective Memory: An rTMS Study

    ERIC Educational Resources Information Center

    Bisiacchi, P. S.; Cona, G.; Schiff, S.; Basso, D.

    2011-01-01

    Event-based prospective memory (PM) is a multi-component process that requires remembering the delayed execution of an intended action in response to a pre-specified PM cue, while being actively engaged in an ongoing task. Some neuroimaging studies have suggested that both prefrontal and parietal areas are involved in the maintenance and…

  19. News Competition: Physics Olympiad hits Thailand Report: Institute carries out survey into maths in physics at university Event: A day for everyone teaching physics Conference: Welsh conference celebrates birthday Schools: Researchers in Residence scheme set to close Teachers: A day for new physics teachers Social: Network combines fun and physics Forthcoming events

    NASA Astrophysics Data System (ADS)

    2011-09-01

    Competition: Physics Olympiad hits Thailand Report: Institute carries out survey into maths in physics at university Event: A day for everyone teaching physics Conference: Welsh conference celebrates birthday Schools: Researchers in Residence scheme set to close Teachers: A day for new physics teachers Social: Network combines fun and physics Forthcoming events

  20. Large ice-wedge networks and tundra gley horizons in Northern France Upper Pleistocene loess: evidences of extreme cold events and cyclic millennial changes

    NASA Astrophysics Data System (ADS)

    Antoine, Pierre; Moine, Olivier; Guerin, Gilles

    2015-04-01

    Northern France loess-palaeosol sequences from the last interglacial-glacial cycle (Eemian-Weichselian) have been intensely studied during the last 20 years (about 100 individual sequences). Despite thickness variations of the different stratigraphic units, the sequences from the last interglacial-glacial cycle exhibit a particularly constant pedosedimentary pattern, including well-identified pedological and periglacial marker horizons that can be followed north- and eastward into Belgium and Germany. Within this system, new field investigations and luminescence (OSL) datings have revealed at least four generations of large ice-wedge networks (10-14 m) preserved by loess deposits between ca. 50 and 20 ka. The best and most systematically preserved network is presently dated at about 31-32 ka according to the OSL ages from its loess infilling. This main ice-wedge cast horizon systematically occurs at the boundary between Middle Pleniglacial brown soil complexes and the base of the Upper Pleniglacial typical loess cover. Consequently, it represents a major stratigraphic marker for correlations in Western Europe. According to recent OSL dating results, the first thick typical loess unit of the Upper Pleniglacial, covering the main ice-wedge cast horizon, was deposited shortly after the GIS-5 interstadial and could be contemporaneous with the H3 event in deep-sea cores. In addition, it is shown that all the large ice-wedge casts developed from the surface of a tundra gley horizon (0.3 to 0.5 m in thickness). As it has been previously demonstrated that tundra gley layers mainly formed during short interstadial events (malacology, sedimentology), a model linking tundra gley horizons and ice-wedge networks to DO stadial-interstadial cycles during the last glacial is proposed.

  1. Networking.

    ERIC Educational Resources Information Center

    Duvall, Betty

    Networking is an information giving and receiving system, a support system, and a means whereby women can get ahead in careers--either in new jobs or in current positions. Networking information can create many opportunities: women can talk about how other women handle situations and tasks, and previously established contacts can be used in…

  2. Improving short-term forecasting during ramp events by means of Regime-Switching Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Gallego, C.; Costa, A.; Cuerva, A.

    2010-09-01

    Since wind energy can currently neither be scheduled nor stored at large scale, wind power forecasting has been useful to minimize the impact of wind fluctuations. In particular, short-term forecasting (characterised by prediction horizons from minutes to a few days) is currently required by energy producers (in a daily electricity market context) and by TSOs (in order to keep the stability/balance of an electrical system). Within the short-term range, time-series-based models (i.e., statistical models) have shown better performance than NWP models for horizons up to a few hours. These models try to learn and replicate the dynamics shown by the time series of a certain variable. When considering the power output of wind farms, ramp events are usually observed, characterized by a large positive (ramp-up) or negative (ramp-down) gradient in the time series during relatively short periods (a few hours). Ramp events may have many different causes, generally involving several spatial scales, from the large scale (fronts, low-pressure systems) down to the local scale (wind turbine shut-down due to high wind speed, yaw misalignment due to fast changes of wind direction). Hence, the output power may show unexpected dynamics during ramp events depending on the underlying processes; consequently, traditional statistical models considering only one dynamic for the whole power time series may be inappropriate. This work proposes a Regime-Switching (RS) model based on Artificial Neural Networks (ANNs). The RS-ANN model comprises as many ANNs as dynamics considered (called regimes); a particular ANN is selected to predict the output power depending on the current regime. The current regime is updated on-line based on a gradient criterion applied to the past two values of the output power. Three regimes are established with respect to ramp events: ramp-up, ramp-down and no-ramp. In order to assess the skill of the proposed RS-ANN model, a single
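
    As a rough illustration (not the authors' RS-ANN implementation), the sketch below selects one of the three regimes from the gradient of the last two power values and dispatches to a per-regime predictor; the threshold and the stand-in predictors are assumptions.

    ```python
    import numpy as np

    RAMP_THRESHOLD = 0.15  # assumed fraction of rated power per time step (illustrative)

    def select_regime(p_prev, p_curr, rated_power):
        """Choose the regime from the gradient of the last two power values."""
        gradient = (p_curr - p_prev) / rated_power
        if gradient > RAMP_THRESHOLD:
            return "ramp-up"
        if gradient < -RAMP_THRESHOLD:
            return "ramp-down"
        return "no-ramp"

    # One predictor per regime; trained ANNs would go here, but simple stand-ins
    # are used so the sketch runs on its own.
    models = {
        "ramp-up":   lambda history: history[-1] * 1.05,
        "ramp-down": lambda history: history[-1] * 0.95,
        "no-ramp":   lambda history: float(np.mean(history[-3:])),
    }

    def forecast_next(history, rated_power):
        regime = select_regime(history[-2], history[-1], rated_power)
        return regime, models[regime](history)

    print(forecast_next([10.0, 12.0, 18.0], rated_power=20.0))
    ```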

  3. Adaptive Wireless Ad-hoc Sensor Networks for Long-term and Event-oriented Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Bumberger, Jan; Mollenhauer, Hannes; Remmler, Paul; Schaedler, Martin; Schima, Robert; Mollenhauer, Olaf; Hutschenreuther, Tino; Toepfer, Hannes; Dietrich, Peter

    2015-04-01

    Ecosystems are often characterized by their high heterogeneity, complexity and dynamics. Hence, single-point measurements are often not sufficient for their complete representation. The application of wireless sensor networks in terrestrial and aquatic environmental systems offers significant benefits, such as better adaptation to local site conditions, owing to the simple adjustment of the sensor distribution, the sensor types and the sample rate. Another advantage of wireless ad-hoc sensor networks is their self-organizing behavior, resulting in a major reduction in installation and operation costs and time. In addition, individual point measurements with a sensor are significantly improved by measuring continuously at several points. In this work, a concept and its realization for long-term ecosystem research are presented for the field monitoring of micrometeorology and soil parameters related to the interaction of biotic and abiotic processes. These long-term analyses are part of the Global Change Experimental Facility (GCEF), a large field-based experimental platform to assess the effects of climate change on ecosystem functions and processes under different land-use scenarios. Owing to the adaptive behavior of the network, a mobile version was also developed to overcome the limitations of temporally and spatially fixed measurements for the detection and recording of highly dynamic or time-limited processes. First results of different field campaigns are given to present the potential and limitations of this application in environmental science, especially for monitoring the interaction of biotic and abiotic processes, soil-atmosphere interactions and the validation of remote sensing data.

  4. Adaptive Wireless Ad-hoc Sensor Networks for Long-term and Event-oriented Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Bumberger, Jan; Mollenhauer, Hannes; Remmler, Paul; Chirila, Andrei Marian; Mollenhauer, Olaf; Hutschenreuther, Tino; Toepfer, Hannes; Dietrich, Peter

    2016-04-01

    Ecosystems are often characterized by their high heterogeneity, complexity and dynamics. Hence, single-point measurements are often not sufficient for their complete representation. The application of wireless sensor networks in terrestrial and aquatic environmental systems offers significant benefits, such as better adaptation to local site conditions, owing to the simple adjustment of the sensor distribution, the sensor types and the sample rate. Another advantage of wireless ad-hoc sensor networks is their self-organizing behavior, resulting in a major reduction in installation and operation costs and time. In addition, individual point measurements with a sensor are significantly improved by measuring continuously at several points. In this work, a concept and its realization for long-term ecosystem research are presented for the field monitoring of micrometeorology and soil parameters related to the interaction of biotic and abiotic processes. These long-term analyses are part of the Global Change Experimental Facility (GCEF), a large field-based experimental platform to assess the effects of climate change on ecosystem functions and processes under different land-use scenarios. Owing to the adaptive behavior of the network, a mobile version was also developed to overcome the limitations of temporally and spatially fixed measurements for the detection and recording of highly dynamic or time-limited processes. First results of different field campaigns are given to present the potential and limitations of this application in environmental science, especially for monitoring the interaction of biotic and abiotic processes, soil-atmosphere interactions and the validation of remote sensing data.

  5. Advanced Decentralized Water/Energy Network Design for Sustainable Infrastructure presentation at the 2012 National Association of Home Builders (NAHB) International Builders' Show

    EPA Science Inventory

    A university/industry panel will report on the progress and findings of a five-year project funded by the US Environmental Protection Agency. The project's end product will be a Web-based, 3D computer-simulated residential environment with a decision support system to assist...

  6. Palaeoseismic events recorded in Archaean gold-quartz vein networks, Val d'Or, Abitibi, Quebec, Canada

    NASA Astrophysics Data System (ADS)

    Boullier, Anne-Marie; Robert, François

    1992-02-01

    Archaean gold-quartz vein deposits are commonly hosted in high-angle reverse shear zones and are interpreted to have formed in a regime of horizontal compression and high fluid pressure environment. This paper presents the results of a combined structural and fluid inclusion study on three gold-quartz vein deposits of the Val d'Or area (Abitibi, Quebec) consisting of subhorizontal extensional veins and E-W steeply dipping shear veins. Crack-seal structures, tourmaline fibres, stretched quartz crystals and open-space filling textures indicate that the subhorizontal veins formed by hydraulic fracturing under supralithostatic fluid pressure. CO2-rich and H2O + NaCl fluid inclusions, interpreted as two coexisting immiscible fluids, occur typically in microcracks of different orientations interpreted to have formed in the σ1-σ2 plane. Horizontal CO2-rich fluid inclusion planes are contemporaneous with the opening of these veins (σ3 vertical). Vertical H2O + NaCl fluid inclusion planes, as well as some microstructures, such as deformed minerals, indicate that the same extensional veins have experienced episodic vertical shortening (σ3 horizontal) alternating with the opening events. Deformation and slip/opening also occurred in shear veins in which preferred orientation of fluid inclusion planes is not clear, except that the H2O + NaCl fluid inclusion planes tend to be oriented at high angles to the slip direction. The successive opening and collapse events in subhorizontal extensional veins are correlated with deformation and slip/opening events in shear veins, respectively, and are attributed to cyclic fluid pressure fluctuations in the system. They are thus consistent with the fault-valve model: sudden drop in fluid pressure from supralithostatic to lower values induces fluid unmixing and occurs immediately post-failure following seismic rupturing along the shear zone. Sealing of the shear veins allows the fluid pressure to build up again and the

  7. Multiple Horizontal Gene Transfer Events and Domain Fusions Have Created Novel Regulatory and Metabolic Networks in the Oomycete Genome

    PubMed Central

    Morris, Paul Francis; Schlosser, Laura Rose; Onasch, Katherine Diane; Wittenschlaeger, Tom; Austin, Ryan; Provart, Nicholas

    2009-01-01

    Complex enzymes with multiple catalytic activities are hypothesized to have evolved from more primitive precursors. Global analysis of the Phytophthora sojae genome using conservative criteria for evaluation of complex proteins identified 273 novel multifunctional proteins that were also conserved in P. ramorum. Each of these proteins contains combinations of protein motifs that are not present in bacterial, plant, animal, or fungal genomes. A subset of these proteins were also identified in the two diatom genomes, but the majority of these proteins formed after the split between diatoms and oomycetes. Documentation of multiple cases of domain fusions that are common to both oomycete and diatom genomes lends additional support for the hypothesis that oomycetes and diatoms are monophyletic. Bifunctional proteins that catalyze two steps in a metabolic pathway can be used to infer the interaction of orthologous proteins that exist as separate entities in other genomes. We postulated that the novel multifunctional proteins of oomycetes could function as potential Rosetta Stones to identify interacting proteins of conserved metabolic and regulatory networks in other eukaryotic genomes. However, ortholog analysis of each domain within our set of 273 multifunctional proteins against 39 sequenced bacterial and eukaryotic genomes identified only 18 candidate Rosetta Stone proteins. Thus, the majority of multifunctional proteins are not Rosetta Stones, but they may nonetheless be useful in identifying novel metabolic and regulatory networks in oomycetes. Phylogenetic analysis of all the enzymes in three pathways with one or more novel multifunctional proteins was conducted to determine the probable origins of individual enzymes. These analyses revealed multiple examples of horizontal transfer from both bacterial genomes and the photosynthetic endosymbiont in the ancestral genome of Stramenopiles. The complexity of the phylogenetic origins of these metabolic pathways and

  8. Automated builder and database of protein/membrane complexes for molecular dynamics simulations.

    PubMed

    Jo, Sunhwan; Kim, Taehoon; Im, Wonpil

    2007-01-01

    Molecular dynamics simulations of membrane proteins have provided deeper insights into their functions and interactions with surrounding environments at the atomic level. However, compared to solvation of globular proteins, building a realistic protein/membrane complex is still challenging and requires considerable experience with simulation software. Membrane Builder in the CHARMM-GUI website (http://www.charmm-gui.org) helps users to build such a complex system using a web browser with a graphical user interface. Through a generalized and automated building process including system size determination as well as generation of the lipid bilayer, pore water, bulk water, and ions, a realistic membrane system with virtually any kind and shape of membrane protein can be generated in 5 minutes to 2 hours depending on the system size. Default values that were elaborated and tested extensively are given in each step to provide reasonable options and starting points for both non-expert and expert users. The efficacy of Membrane Builder is illustrated by its applications to 12 transmembrane and 3 interfacial membrane proteins, whose fully equilibrated systems with three different types of lipid molecules (DMPC, DPPC, and POPC) and two types of system shapes (rectangular and hexagonal) are freely available on the CHARMM-GUI website. One of the most significant advantages of using the web environment is that, if a problem is found, users can go back and re-generate the whole system before quitting the browser. Therefore, Membrane Builder provides an intuitive and easy way to build and simulate biologically important membrane systems. PMID:17849009

  9. Lipid profile of body builders with and without self-administration of anabolic steroids.

    PubMed

    Fröhlich, J; Kullmer, T; Urhausen, A; Bergmann, R; Kindermann, W

    1989-01-01

    Twenty-four top-level body builders [13 anabolic steroid users (A); 11 non-users (N)] and 11 performance-matched controls (C) were examined to determine the effect on lipids, lipoproteins and apolipoproteins of many years of body building with and without simultaneous intake of anabolic steroids and testosterone. After an overnight fast, triglycerides (TG), total cholesterol (TOTC), high density lipoprotein cholesterol (HDLC), low density lipoprotein cholesterol (LDLC), the HDLC subfractions HDL2C and HDL3C, as well as apolipoprotein A-I (Apo A-I), apolipoprotein A-II (Apo A-II) and apolipoprotein B (Apo B) were determined. Both A and N, compared to C, showed significantly lower HDLC and higher LDLC concentrations, with the differences between A and C clearly pronounced. In a subgroup of 6 body builders taking anabolic steroids at the time of the study, HDLC, HDL2C, HDL3C, Apo A-I and Apo A-II were all significantly lower and LDLC was significantly higher than in a second subgroup of 7 body builders who had discontinued their intake of anabolic steroids at least 4 weeks prior to the study. In some single cases HDLC was barely detectable (2-7 mg.dl-1). The TG and TOTC remained unchanged. The present findings suggest that many years of body building among top-level athletes have no beneficial effect on lipoproteins and apolipoproteins. Simultaneous use of anabolic steroids results in part in extreme alterations in lipoproteins and apolipoproteins, representing an atherogenic profile. After discontinuing the use of anabolic steroids, the changes in lipid metabolism appear to be reversible. PMID:2583156

  10. New Whole-House Solutions Case Study: A Production Builder's Passive House - Denver, Colorado

    SciTech Connect

    2015-05-01

    Brookfield Home’s first project is in a community called Midtown in Denver, Colorado, in which the builder took on the challenge of increased energy efficiency by creating a Passive House (PH)-certified model home. Brookfield worked with the U.S. Department of Energy’s Building America research team IBACOS to create the home, evaluate advanced building technologies, and use the home as a marketing tool for potential homebuyers. Brookfield also worked with KGA studio architects to create a new floor plan that would be constructed to the PH standard as an upgrade option.

  11. Building America Solution Center Shows Builders How to Save Materials Costs While Saving Energy

    SciTech Connect

    Gilbride, Theresa L.

    2015-06-15

    This short article was prepared for the U.S. Department of Energy's Building America Update newsletter. The article identifies energy and cost-saving benefits of using advanced framing techniques in new construction identified by research teams working with the DOE's Building America program. The article also provides links to guides in the Building America Solution Center that give how-to instructions for builders who want to implement advanced framing construction. The newsletter is issued monthly and can be accessed at http://energy.gov/eere/buildings/building-america-update-newsletter

  12. Regional transportation network blocked by snowdrifts: assessment of risk reduction strategies by the example of the wind event of February 2015 in the Canton of Vaud, Switzerland

    NASA Astrophysics Data System (ADS)

    Voumard, Jérémie; Jaboyedoff, Michel; Derron, Marc-Henri

    2016-04-01

    where maintaining accessibility is crucial. We then analyze the road network to highlight the vulnerability of roads to snowdrifts using topographic and meteorological indicators. We also assess the cost/benefit ratio of different measures limiting snowdrifts. Finally, we discuss strategies to reduce the risk of this winter meteorological event.

  13. Analysis of Modern Techniques for Nuclear-test Yield Determination of NTS Events Using Data From the Leo Brady Seismic Network

    NASA Astrophysics Data System (ADS)

    Schramm, K. A.; Bilek, S. L.; Abbott, R. E.

    2007-12-01

    Nuclear test detection is a challenging but important task for treaty verification. Many techniques have been developed to discriminate between an explosion and an earthquake and, if an explosion is detected, to determine its yield. Sandia National Laboratories (SNL) has maintained the Leo Brady Seismic Network (LBSN) since 1960 to record nuclear tests at the Nevada Test Site (NTS), providing a unique data set for yield determination. The LBSN comprises five permanent stations surrounding the NTS at regional distances, and data (in digital form post-1983) exist for almost all tests. Modern seismic data processing techniques can be applied to these data to better determine the seismic yield. Using mb(Lg), we found that, when compared to published yields, our estimates were low for events over 100 kilotons (kt) and near the published values for events under 40 kt. We are currently measuring seismic-phase amplitudes, examining body- and surface-wave spectra and using seismic waveform modeling techniques to determine the seismic yield of NTS explosions using the waveforms from the LBSN.
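
    The yield estimates above rest on an empirical magnitude-yield relation of the general form mb(Lg) = a + b·log10(Y). The sketch below simply inverts a relation of that form; the coefficients are placeholders chosen for illustration, not the calibrated values used in this study.

    ```python
    # Generic empirical relation mb(Lg) = A + B * log10(Y), inverted for yield Y (kt).
    # A and B are placeholders for illustration; in practice they are calibrated
    # against events with announced yields.
    A, B = 4.0, 0.75  # assumed values, not from the study

    def yield_from_mblg(mb_lg):
        return 10 ** ((mb_lg - A) / B)

    for mb in (5.0, 5.5, 6.0):
        print(f"mb(Lg) = {mb:.1f} -> estimated yield ~ {yield_from_mblg(mb):.0f} kt")
    ```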

  14. University Builders.

    ERIC Educational Resources Information Center

    Pearce, Martin

    This publication explores a diverse collection of new university buildings. Ranging from the design of vast new campuses, such as that by Wilford and Stirling at Temasek, Singapore, through to the relatively modest yet strategically important, such as the intervention by Allies and Morrison at Southampton, this book examines the new higher…

  15. Career Builders

    ERIC Educational Resources Information Center

    Weinstein, Margery

    2012-01-01

    While a main goal for corporate trainers traditionally has been to train employees to reach organizational goals, many trainers may find their roles expanding. With companies cutting back on staffing and consolidating multiple job roles into single positions, career development has taken on a much larger significance. At forward-thinking…

  16. Energy Builders.

    ERIC Educational Resources Information Center

    Instructor, 1982

    1982-01-01

    Due to increasing energy demands and decreasing supplies, it is important for teachers to provide students with a solid foundation for energy decision making. Activities are presented which offer hands-on experiences with four sources of energy: wind, water, sun, and fossil fuels. (JN)

  17. BioPartsBuilder: a synthetic biology tool for combinatorial assembly of biological parts

    PubMed Central

    Yang, Kun; Stracquadanio, Giovanni; Luo, Jingchuan; Boeke, Jef D.; Bader, Joel S.

    2016-01-01

    Summary: Combinatorial assembly of DNA elements is an efficient method for building large-scale synthetic pathways from standardized, reusable components. These methods are particularly useful because they enable assembly of multiple DNA fragments in one reaction, at the cost of requiring that each fragment satisfies design constraints. We developed BioPartsBuilder as a biologist-friendly web tool to design biological parts that are compatible with DNA combinatorial assembly methods, such as Golden Gate and related methods. It retrieves biological sequences, enforces compliance with assembly design standards and provides a fabrication plan for each fragment. Availability and implementation: BioPartsBuilder is accessible at http://public.biopartsbuilder.org and an Amazon Web Services image is available from the AWS Market Place (AMI ID: ami-508acf38). Source code is released under the MIT license, and available for download at https://github.com/baderzone/biopartsbuilder. Contact: joel.bader@jhu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26568632

  18. News Report: The career paths of physics graduates Education: Network day to hold workshops for teaching ideas Experiments: PhysHOME brings innovators together Meeting: Physics Education Networks collaborate at WCPE Workshop: World experts in physics education meet Training: Something for everyone at SPEED 2012 Conference: Sun, cocktails and physics create a buzz at WCPE Students: The physics paralympian 2012 Forthcoming events

    NASA Astrophysics Data System (ADS)

    2012-09-01

    Report: The career paths of physics graduates Education: Network day to hold workshops for teaching ideas Experiments: PhysHOME brings innovators together Meeting: Physics Education Networks collaborate at WCPE Workshop: World experts in physics education meet Training: Something for everyone at SPEED 2012 Conference: Sun, cocktails and physics create a buzz at WCPE Students: The physics paralympian 2012 Forthcoming events

  19. Car Builder: Design, Construct and Test Your Own Cars. School Version with Lesson Plans. [CD-ROM].

    ERIC Educational Resources Information Center

    Highsmith, Joni Bitman

    Car Builder is a scientific CD-ROM-based simulation program that lets students design, construct, modify, test, and compare their own cars. Students can design sedans, four-wheel-drive vehicles, vans, sport cars, and hot rods. They may select for aerodynamics, power, and racing ability, or economic and fuel efficiency. It is a program that teaches…

  20. Integrating Commercially-Available Educational Software into a Learning Environment with the QuiltSpace Builder Tool.

    ERIC Educational Resources Information Center

    Williamson, Mary

    The QuiltSpace Builder enables Microsoft's multimedia encyclopedia "Encarta" to "fit" into an established home or institutional learning environment so that "Encarta" can be used for productive research. Early observations of the difficulties encountered by Encarta users, coupled with a survey of presently available commercial software products,…

  1. Energy conservation manual for builders in the Mid-Columbia Basin area

    SciTech Connect

    Mazzucchi, R.P.; Nieves, L.A.; Hopp, W.J.

    1981-03-01

    Results of a comprehensive cost-effectiveness evaluation of energy conservation measures currently available for use in typical residential buildings are presented. Section 2 discusses construction techniques for energy-efficient buildings and presents estimates of the cost of incorporating the conservation measures in the prototype building, the resultant annual energy savings, and the value of those annual energy savings based upon typical regional fuel prices. In Section 3 this information is summarized to prioritize conservation investments according to their economic effectiveness and offer general recommendations to home builders. Appendix A contains detailed information pertaining to the energy consumption calculations. Appendix B presents the methodology, assumptions, and results of a detailed cash flow analysis of each of the conservation items for which sufficient performance and cost data are currently available. (MCW)

  2. Strategies for fitting nonlinear ecological models in R, AD Model Builder, and BUGS

    USGS Publications Warehouse

    Bolker, Benjamin M.; Gardner, Beth; Maunder, Mark; Berg, Casper W.; Brooks, Mollie; Comita, Liza; Crone, Elizabeth; Cubaynes, Sarah; Davies, Trevor; de Valpine, Perry; Ford, Jessica; Gimenez, Olivier; Kéry, Marc; Kim, Eun Jung; Lennert-Cody, Cleridy; Magunsson, Arni; Martell, Steve; Nash, John; Nielson, Anders; Regentz, Jim; Skaug, Hans; Zipkin, Elise

    2013-01-01

    1. Ecologists often use nonlinear fitting techniques to estimate the parameters of complex ecological models, with attendant frustration. This paper compares three open-source model fitting tools and discusses general strategies for defining and fitting models. 2. R is convenient and (relatively) easy to learn, AD Model Builder is fast and robust but comes with a steep learning curve, while BUGS provides the greatest flexibility at the price of speed. 3. Our model-fitting suggestions range from general cultural advice (where possible, use the tools and models that are most common in your subfield) to specific suggestions about how to change the mathematical description of models to make them more amenable to parameter estimation. 4. A companion web site (https://groups.nceas.ucsb.edu/nonlinear-modeling/projects) presents detailed examples of application of the three tools to a variety of typical ecological estimation problems; each example links both to a detailed project report and to full source code and data.
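
    For readers who want a concrete feel for the kind of nonlinear fitting problem compared across the three tools, here is an analogous sketch in Python with scipy (deliberately not R, AD Model Builder or BUGS): fitting a logistic growth curve to noisy simulated data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Logistic growth model N(t) = K / (1 + exp(-r * (t - t0))).
    def logistic(t, K, r, t0):
        return K / (1.0 + np.exp(-r * (t - t0)))

    # Simulate noisy observations from known parameters.
    t = np.linspace(0, 20, 30)
    rng = np.random.default_rng(0)
    data = logistic(t, K=100.0, r=0.6, t0=10.0) + rng.normal(0.0, 3.0, t.size)

    # Nonlinear least-squares fit with starting values.
    params, cov = curve_fit(logistic, t, data, p0=[80.0, 0.5, 8.0])
    print("Estimated K, r, t0:", params)
    ```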

  3. Advance Liquid Metal Reactor Discrete Dynamic Event Tree/Bayesian Network Analysis and Incident Management Guidelines (Risk Management for Sodium Fast Reactors)

    SciTech Connect

    Denman, Matthew R.; Groth, Katrina M.; Cardoni, Jeffrey N.; Wheeler, Timothy A.

    2015-04-01

    Accident management is an important component of maintaining risk at acceptable levels for all complex systems, such as nuclear power plants. With the introduction of self-correcting, or inherently safe, reactor designs, the focus has shifted from management by operators to allowing the system's design to manage the accident. Inherently and passively safe designs are laudable, but nonetheless extreme boundary conditions can interfere with the design attributes which facilitate inherent safety, thus resulting in unanticipated and undesirable end states. This report examines an inherently safe and small sodium fast reactor experiencing a beyond-design-basis seismic event with the intent of exploring two issues: (1) can human intervention either improve or worsen the potential end states, and (2) can a Bayesian network be constructed to infer the state of the reactor to inform (1). ACKNOWLEDGEMENTS The authors would like to acknowledge the U.S. Department of Energy's Office of Nuclear Energy for funding this research through Work Package SR-14SN100303 under the Advanced Reactor Concepts program. The authors also acknowledge the PRA teams at Argonne National Laboratory, Oak Ridge National Laboratory, and Idaho National Laboratory for their continued contributions to the advanced reactor PRA mission area.
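
    As a toy sketch of the second question only, the snippet below infers an unobserved reactor state from a single observed indicator using Bayes' rule; the states, the indicator and every probability are invented placeholders, and a full dynamic event tree / Bayesian network analysis is far richer than this.

    ```python
    # All numbers below are made-up placeholders, not values from the report.
    priors = {"nominal": 0.90, "degraded": 0.09, "failed": 0.01}            # P(state)
    likelihood_alarm = {"nominal": 0.02, "degraded": 0.40, "failed": 0.95}  # P(alarm | state)

    def posterior_given_alarm(priors, likelihood):
        """Bayes' rule: P(state | alarm) is proportional to P(alarm | state) * P(state)."""
        unnormalised = {s: priors[s] * likelihood[s] for s in priors}
        z = sum(unnormalised.values())
        return {s: p / z for s, p in unnormalised.items()}

    print(posterior_given_alarm(priors, likelihood_alarm))
    ```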

  4. Single and combinatorial chromatin coupling events underlies the function of transcript factor krüppel-like factor 11 in the regulation of gene networks

    PubMed Central

    2014-01-01

    Background Krüppel-like factors (KLFs) are a group of master regulators of gene expression conserved from flies to human. However, scant information is available on either the mechanisms or functional impact of the coupling of KLF proteins to chromatin remodeling machines, a deterministic step in transcriptional regulation. Results and discussion In the current study, we use genome-wide analyses of chromatin immunoprecipitation (ChIP-on-Chip) and Affymetrix-based expression profiling to gain insight into how KLF11, a human transcription factor involved in tumor suppression and metabolic diseases, works by coupling to three co-factor groups: the Sin3-histone deacetylase system, WD40-domain containing proteins, and the HP1-histone methyltransferase system. Our results reveal that KLF11 regulates distinct gene networks involved in metabolism and growth by using single or combinatorial coupling events. Conclusion This study, the first of its type for any KLF protein, reveals that interactions with multiple chromatin systems are required for the full gene regulatory function of these proteins. PMID:24885560

  5. City of Austin: Green habitat learning project. A green builder model home project

    SciTech Connect

    1995-12-01

    The purpose of the Year 14 UCETF project was to design and construct a residential structure that could serve as a demonstration facility, training site, and testing and monitoring laboratory for issues related to the implementation of sustainable building practices and materials. The Model Home Project builds on the previous and existing efforts, partially funded by the UCETF, of the City of Austin Green Builder Program to incorporate sustainable building practices into mainstream building activities. The Green Builder Program uses the term "green" as a synonym for sustainability. In the research and analysis that was completed for our earlier reports in Years 12 and 13, we characterized specific elements that we associate with sustainability and, thus, green building. In general, we refer to a modified life cycle assessment to ascertain if "green" building options reflect similar positive cyclical patterns found in nature (i.e. recyclability, recycled content, renewable resources, etc.). We additionally consider economic, human health and synergistic ecological impacts associated with our building choices and characterize the best choices as "green." Our ultimate goal is to identify and use those "green" materials and processes that provide well for us now and do not compromise similar benefits for future generations. The original partnership developed for this project shifted during the year from a project stressing advanced (many prototypical) "green" building materials and techniques in a research and demonstration context, to off-the-shelf but underutilized "green" materials in the practical social context of using "green" technologies for low income housing. That project, discussed in this report, is called the Green Habitat Learning Project.

  6. Amphetamine use and its associated factors in body builders: a study from Tehran, Iran

    PubMed Central

    Narenjiha, Hooman; Tayyebi, Behnoosh; Ghassabian, Akhgar; Ahmadi, Gelareh; Assari, Shervin

    2012-01-01

    Introduction Epidemiological studies on all types of illicit drug use among athletes are essential for both the sport community and drug control achievements. Here, we investigated the prevalence and associated factors of amphetamine use in body builders in Tehran, Iran, 2007. Material and methods This study is a secondary analysis of a substance use survey done in 103 randomly selected gymnasia in Tehran (capital city of Iran). The survey was conducted from November 2007 to January 2008 and included 843 randomly selected bodybuilders (aged 40 years or less). By interviews via questionnaires the following data were obtained: age, job, marital status, education level, housing status, average monthly family income, number of family members, gymnasium area (m2), number of trainers, number of gymnasium members, initiation time (months), weekly duration of the sporting activity (h), monthly cost of the sporting activity, purpose of participating in sporting activity, and history of anabolic steroid and amphetamine use. Results One hundred twenty (13.3%) body builders reported a history of amphetamine use. According to the results of regression analysis, being married (risk ratio – RR = 0.540), and participating in body building to enhance self-esteem (RR = 0.423) or to enhance sport performance (RR = 0.545) had protective effects on amphetamine use. However, having university qualifications (RR = 1.843), using anabolic steroids (RR = 1.803) and participating in sport to maintain fitness (RR = 2.472) were linked to increased risk of amphetamine use. Conclusions Well-educated bodybuilders were more likely to use amphetamines, and why this is so needs to be discovered. If further studies show that they are not aware of the dangers associated with amphetamine use, providing them with information should be considered. PMID:22662012

  7. Cross-correlation analysis of 2012-2014 seismic events in Central-Northern Italy: insights from the geochemical monitoring network of Tuscany

    NASA Astrophysics Data System (ADS)

    Pierotti, Lisa; Facca, Gianluca; Gherardi, Fabrizio

    2015-04-01

    Since late 2002, a geochemical monitoring network has been operating in Tuscany, Central Italy, to collect data and possibly identify geochemical anomalies that characteristically occur before regionally significant (i.e. with magnitude > 3) seismic events. The network currently consists of 6 stations located in areas already investigated in detail for their geological setting, hydrogeological and geochemical background and boundary conditions. All these stations are equipped for remote, continuous monitoring of selected physicochemical parameters (temperature, pH, redox potential, electrical conductivity) and dissolved concentrations of CO2 and CH4. Additional information is obtained through discrete in situ monitoring. Field surveys are periodically performed to guarantee maintenance and performance control of the sensors of the automatic stations, and to collect water samples for the determination of the chemical and stable isotope composition of all the springs investigated for seismic precursors. Continuous geochemical signals are numerically processed to remove outliers, monitoring errors and aseismic effects from seasonal and climatic fluctuations. The elaboration of smoothed, long-term time series (more than 200,000 data points available today for each station) allows for a relatively accurate definition of geochemical background values. Geochemical values outside the two-sigma relative standard deviation domain are inspected as possible indicators of physicochemical changes related to regional seismic activity. Starting in November 2011, four stations of the Tuscany network, located in two separate mountainous areas of the Northern Apennines separating Tuscany from the Emilia-Romagna region (Equi Terme and Gallicano) and Tuscany from the Emilia-Romagna and Umbria regions (Vicchio and Caprese Michelangelo), started to register anomalous values of pH and CO2 partial pressure (PCO2). Cross-correlation analysis indicates an apparent relationship between the most important seismic

  8. Seafloor spreading event in western Gulf of Aden during the November 2010 - March 2011 period captured by regional seismic networks: Evidence for diking events and interactions with a nascent transform zone

    NASA Astrophysics Data System (ADS)

    Abdulhakim, Ahmed; Cécile, Doubre; Sylvie, Leroy; Kassim, Mohamed; Derek, Keir; Abayazid, Ahmadine; Julie, Perrot; Laurence, Audin; Jérome, Vergne; Alexandre, Nercessian; Eric, Jacques; Khaled, Khanbari; Jamal, Sholan; Frédérique, Rolandone; Ismael, Alganad

    2016-02-01

    In November 2010, intense seismic activity, including 29 events with a magnitude above 5.0, started in the western part of the Gulf of Aden, where the structure of the oceanic spreading ridge is characterized by a series of N115°-trending slow-spreading segments set within an EW-trending rift. Using signals recorded by permanent and temporary networks in Djibouti and Yemen, we located 1122 earthquakes, with a magnitude ranging from 2.1 to 5.6 from 01 November 2010 to 31 March 2011. By looking in detail at the space-time distribution of the overall seismicity, and both the frequency and the moment tensor of large earthquakes, we reexamine the chronology of this episode. In addition, we also interpret the origin of the activity using high-resolution bathymetric data, as well as from observations of sea-floor cable damage caused by high temperatures and lava flows. The analysis allows us to identify distinct active areas. Firstly, we interpret that this episode is mainly related to a diking event along a specific ridge segment, located at E044°. In light of previous diking episodes in nearby subaerial rift segments, for which field constraints and both seismic and geodetic data exist, we interpret the space-time evolution of the seismicity of the first few days. Migration of earthquakes suggests initial magma ascent below the segment center. This is followed by a southeastward dike propagation below the rift immediately followed by a northwestward dike propagation below the rift ending below the northern ridge wall. The cumulative seismic moment associated with this sequence reaches 9.1 × 10^17 Nm, and taking into account a very low seismic versus geodetic moment, we estimate a horizontal opening of ~0.58 to 2.9 m. The seismic activity that followed occurred through several bursts of earthquakes aligned along the segment axis, which are interpreted as short dike intrusions implying fast replenishment of the crustal magma reservoir feeding the dikes. Over the whole

  9. Seafloor spreading event in western Gulf of Aden during the November 2010-March 2011 period captured by regional seismic networks: evidence for diking events and interactions with a nascent transform zone

    NASA Astrophysics Data System (ADS)

    Ahmed, Abdulhakim; Doubre, Cécile; Leroy, Sylvie; Kassim, Mohamed; Keir, Derek; Abayazid, Ahmadine; Julie, Perrot; Laurence, Audin; Vergne, Jérome; Alexandre, Nercessian; Jacques, Eric; Khanbari, Khaled; Sholan, Jamal; Rolandone, Frédérique; Al-Ganad, Ismael

    2016-05-01

    In November 2010, intense seismic activity, including 29 events with a magnitude above 5.0, started in the western part of the Gulf of Aden, where the structure of the oceanic spreading ridge is characterized by a series of N115°-trending slow-spreading segments set within an EW-trending rift. Using signals recorded by permanent and temporary networks in Djibouti and Yemen, we located 1122 earthquakes, with a magnitude ranging from 2.1 to 5.6 from 2010 November 1 to 2011 March 31. By looking in detail at the space-time distribution of the overall seismicity, and both the frequency and the moment tensor of large earthquakes, we re-examine the chronology of this episode. In addition, we also interpret the origin of the activity using high-resolution bathymetric data, as well as from observations of seafloor cable damage caused by high temperatures and lava flows. The analysis allows us to identify distinct active areas. First, we interpret that this episode is mainly related to a diking event along a specific ridge segment, located at E044°. In light of previous diking episodes in nearby subaerial rift segments, for which field constraints and both seismic and geodetic data exist, we interpret the space-time evolution of the seismicity of the first few days. Migration of earthquakes suggests initial magma ascent below the segment centre. This is followed by a southeastward dike propagation below the rift immediately followed by a northwestward dike propagation below the rift ending below the northern ridge wall. The cumulative seismic moment associated with this sequence reaches 9.1 × 10^17 Nm, and taking into account a very low seismic versus geodetic moment, we estimate a horizontal opening of ~0.58-2.9 m. The seismic activity that followed occurred through several bursts of earthquakes aligned along the segment axis, which are interpreted as short dike intrusions implying fast replenishment of the crustal magma reservoir feeding the dikes. Over the whole period

  10. Building America Best Practices Series: Volume 3; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in Cold and Very Cold Climates

    SciTech Connect

    Not Available

    2005-08-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the cold and very cold climates. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team-from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  11. Building America Best Practices Series: Volume 5; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Marine Climate

    SciTech Connect

    Baechler, M. C.; Taylor, Z. T.; Bartlett, R.; Gilbride, T.; Hefty, M.; Steward, H.; Love, P. M.; Palmer, J. A.

    2006-10-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the Marine climate region. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team--from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  12. Building America Best Practices Series: Volume 4; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Mixed-Humid Climate

    SciTech Connect

    Baechler, M. C.; Taylor, Z. T.; Bartlett, R.; Gilbride, T.; Hefty, M.; Steward, H.; Love, P. M.; Palmer, J. A.

    2005-09-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the mixed-humid climate region. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team, from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  13. Seismicity Along the Endeavour Segment of the Juan de Fuca Ridge: Automated Event Locations for an Ocean-Bottom Seismometer Network

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Wilcock, W. S.; Hooft, E. E.; Toomey, D. R.; McGill, P. R.

    2004-12-01

    From 2003-2006, the W.M. Keck Foundation supported the operation of a network of eight ocean-bottom seismometers (OBSs) that were deployed with a remotely operated vehicle along the central portion of the Endeavour Segment of the Juan de Fuca mid-ocean ridge as part of a multidisciplinary prototype NEPTUNE experiment. Data from 2003-2004 were initially analyzed during a research apprenticeship class at the University of Washington's Friday Harbor Laboratories. Eight student analysts located ~13,000 earthquakes along the Endeavour Segment. Analysis of data from 2004-2005 has to date been limited to locating ~6,000 earthquakes associated with a swarm in February-March 2005 near the northern end of the Endeavour Segment. The remaining data includes several significant swarms and it is anticipated that tens of thousands of earthquakes still need to be located. In order to efficiently obtain a complete catalog of high-quality locations for the 3-year experiment, we are developing an automatic method for earthquake location. We first apply a 5-Hz high-pass filter and identify triggers when the ratio of the root-mean square (RMS) amplitudes in short- and long- term windows exceeds a specified threshold. We search for events that are characterized by triggers within a short time interval on the majority of stations and use the signal spectra to eliminate events that are the result of 20-Hz Fin and Blue whale vocalizations. An autoregressive technique is applied to a short time window centered on the trigger time to pick P-wave times on each station's vertical channel. We locate the earthquake with these picks and either attempt to repick or eliminate arrivals with unacceptable residuals. Preliminary S-wave picks are then made on the horizontal channels by applying a 5-12 Hz bandpass filter, identifying the peak RMS amplitude for a short running window, and making a pick at the time the RMS amplitude rises above 50% of this value. The picks are refined using the
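
    A minimal sketch of the trigger stage described above, i.e. a 5 Hz high-pass filter followed by a short-term/long-term RMS ratio test; the window lengths, filter order and threshold are illustrative assumptions rather than the values used by the authors.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def sta_lta_triggers(trace, fs, sta_win=1.0, lta_win=30.0, threshold=3.0):
        """Return sample indices where the short/long-term RMS ratio exceeds a threshold."""
        b, a = butter(4, 5.0 / (fs / 2.0), btype="highpass")   # 5 Hz high-pass
        x = filtfilt(b, a, trace) ** 2                         # squared filtered trace
        sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
        sta = np.convolve(x, np.ones(sta_n) / sta_n, mode="same")
        lta = np.convolve(x, np.ones(lta_n) / lta_n, mode="same")
        ratio = np.sqrt(sta / np.maximum(lta, 1e-20))
        return np.where(ratio > threshold)[0]

    # Synthetic test: background noise with one 8 Hz burst two-thirds of the way in.
    fs = 100.0
    t = np.arange(0, 60, 1 / fs)
    trace = np.random.randn(t.size) * 0.1
    trace[4000:4200] += 2.0 * np.sin(2 * np.pi * 8.0 * t[4000:4200])
    print("triggered samples:", len(sta_lta_triggers(trace, fs)))
    ```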

  14. Seismicity Along the Endeavour Segment of the Juan de Fuca Ridge: Automated Event Locations for an Ocean-Bottom Seismometer Network

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Wilcock, W. S.; Hooft, E. E.; Toomey, D. R.; McGill, P. R.

    2007-12-01

    From 2003-2006, the W.M. Keck Foundation supported the operation of a network of eight ocean-bottom seismometers (OBSs) that were deployed with a remotely operated vehicle along the central portion of the Endeavour Segment of the Juan de Fuca mid-ocean ridge as part of a multidisciplinary prototype NEPTUNE experiment. Data from 2003-2004 were initially analyzed during a research apprenticeship class at the University of Washington's Friday Harbor Laboratories. Eight student analysts located ~13,000 earthquakes along the Endeavour Segment. Analysis of data from 2004-2005 has to date been limited to locating ~6,000 earthquakes associated with a swarm in February-March 2005 near the northern end of the Endeavour Segment. The remaining data includes several significant swarms and it is anticipated that tens of thousands of earthquakes still need to be located. In order to efficiently obtain a complete catalog of high-quality locations for the 3-year experiment, we are developing an automatic method for earthquake location. We first apply a 5-Hz high-pass filter and identify triggers when the ratio of the root-mean square (RMS) amplitudes in short- and long- term windows exceeds a specified threshold. We search for events that are characterized by triggers within a short time interval on the majority of stations and use the signal spectra to eliminate events that are the result of 20-Hz Fin and Blue whale vocalizations. An autoregressive technique is applied to a short time window centered on the trigger time to pick P-wave times on each station's vertical channel. We locate the earthquake with these picks and either attempt to repick or eliminate arrivals with unacceptable residuals. Preliminary S-wave picks are then made on the horizontal channels by applying a 5-12 Hz bandpass filter, identifying the peak RMS amplitude for a short running window, and making a pick at the time the RMS amplitude rises above 50% of this value. The picks are refined using the
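
    Complementing the trigger sketch after the previous record, the snippet below sketches the preliminary S-wave picking step described above: band-pass the horizontal channel at 5-12 Hz, track the RMS amplitude in a short running window and pick where it first reaches 50% of its peak; the window length and filter order are assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def preliminary_s_pick(horizontal, fs, window=0.5):
        """Return a rough S pick time (s): first sample where running RMS >= 50% of its peak."""
        nyq = fs / 2.0
        b, a = butter(4, [5.0 / nyq, 12.0 / nyq], btype="bandpass")  # 5-12 Hz band-pass
        x = filtfilt(b, a, horizontal) ** 2
        n = int(window * fs)
        rms = np.sqrt(np.convolve(x, np.ones(n) / n, mode="same"))
        return np.argmax(rms >= 0.5 * rms.max()) / fs

    # Synthetic test: noise with an 8 Hz arrival starting at 15 s.
    fs = 100.0
    t = np.arange(0, 30, 1 / fs)
    horizontal = np.random.randn(t.size) * 0.05
    horizontal[1500:1800] += np.sin(2 * np.pi * 8.0 * t[1500:1800])
    print(f"preliminary S pick at ~{preliminary_s_pick(horizontal, fs):.2f} s")
    ```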

  15. The effects of high-frequency oscillations in hippocampal electrical activities on the classification of epileptiform events using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Chiu, Alan W. L.; Jahromi, Shokrollah S.; Khosravani, Houman; Carlen, Peter L.; Bardakjian, Berj L.

    2006-03-01

    The existence of hippocampal high-frequency electrical activities (greater than 100 Hz) during the progression of seizure episodes in both human and animal experimental models of epilepsy has been well documented (Bragin A, Engel J, Wilson C L, Fried I and Buzsáki G 1999 Hippocampus 9 137-42; Khosravani H, Pinnegar C R, Mitchell J R, Bardakjian B L, Federico P and Carlen P L 2005 Epilepsia 46 1-10). However, this information has not been studied between successive seizure episodes or utilized in the application of seizure classification. In this study, we examine the dynamical changes of an in vitro low Mg2+ rat hippocampal slice model of epilepsy at different frequency bands using wavelet transforms and artificial neural networks. By dividing the time-frequency spectrum of each seizure-like event (SLE) into frequency bins, we can analyze their burst-to-burst variations within individual SLEs as well as between successive SLE episodes. Wavelet energy and wavelet entropy are estimated for intracellular and extracellular electrical recordings using sufficiently high sampling rates (10 kHz). We demonstrate that the activities of high-frequency oscillations in the 100-400 Hz range increase as the slice approaches SLE onsets and in later episodes of SLEs. Utilizing the time-dependent relationship between different frequency bands, we can achieve frequency-dependent state classification. We demonstrate that activities in the frequency range 100-400 Hz are critical for the accurate classification of the different states of electrographic seizure-like episodes (containing interictal, preictal and ictal states) in brain slices undergoing recurrent spontaneous SLEs. While preictal activities can be classified with an average accuracy of 77.4 ± 6.7% utilizing the frequency spectrum in the range 0-400 Hz, we can also achieve a similar level of accuracy by using a nonlinear relationship between 100-400 Hz and <4 Hz frequency bands only.
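
    As a minimal sketch of band-wise wavelet energy and wavelet entropy of the kind used above (not the authors' pipeline), the snippet below decomposes a signal with PyWavelets and reports the energy per decomposition level and the Shannon entropy of the relative energies; the wavelet family and decomposition depth are assumptions.

    ```python
    import numpy as np
    import pywt

    def wavelet_energy_entropy(signal, wavelet="db4", level=5):
        """Energy per decomposition level and Shannon entropy of the relative energies."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        rel = energies / energies.sum()
        entropy = -np.sum(rel * np.log(rel + 1e-12))
        return energies, entropy

    # Synthetic test at 10 kHz (the sampling rate quoted above): a slow oscillation
    # plus a ~250 Hz component standing in for high-frequency activity.
    fs = 10_000
    t = np.arange(0, 1, 1 / fs)
    sig = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 250 * t)
    energies, entropy = wavelet_energy_entropy(sig)
    print("per-level energies:", np.round(energies, 1))
    print("wavelet entropy:", round(float(entropy), 3))
    ```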

  16. Building America Best Practices Series: Builders Challenge Guide to 40% Whole-House Energy Savings in the Marine Climate (Volume 11)

    SciTech Connect

    Pacific Northwest National Laboratory

    2010-09-01

    With the measures described in this guide, builders in the marine climate can build homes that have whole-house energy savings of 40% over the Building America benchmark with no added overall costs for consumers.

  17. Building America Best Practices Series: Volume 3; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Cold and Very Cold Climates

    SciTech Connect

    2005-08-01

    The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the cold and very cold climates.

  18. Building America Best Practices Series: Volume 4; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Mixed-Humid Climate

    SciTech Connect

    2005-09-01

    This guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the mixed-humid climate region.

  19. Staged Event Architecture

    Energy Science and Technology Software Center (ESTSC)

    2005-05-30

    Sea is a framework for a Staged Event Architecture, designed around non-blocking asynchronous communication facilities that are decoupled from the threading model chosen by any given application. Components for IP networking and in-memory communication are provided. The Sea Java library encapsulates these concepts. Sea is used to easily build efficient and flexible low-level network clients and servers, and in particular as a basic communication substrate for Peer-to-Peer applications.
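
    Sea itself is a Java library; purely as an analogous sketch, the snippet below illustrates the staged-event idea in Python with asyncio queues: each stage consumes events from its inbox and forwards results to the next stage without blocking, independent of any particular threading model.

    ```python
    import asyncio

    async def stage(name, inbox, outbox, work):
        """A stage: read events from inbox, process them, forward results to outbox."""
        while True:
            item = await inbox.get()
            if item is None:                 # shutdown sentinel
                if outbox is not None:
                    await outbox.put(None)
                break
            result = work(item)
            if outbox is not None:
                await outbox.put(result)
            else:
                print(f"{name}: {result}")

    async def main():
        q1, q2 = asyncio.Queue(), asyncio.Queue()
        tasks = [
            asyncio.create_task(stage("parse", q1, q2, lambda x: x.upper())),
            asyncio.create_task(stage("emit", q2, None, lambda x: f"handled {x}")),
        ]
        for msg in ("connect", "data", "close"):
            await q1.put(msg)
        await q1.put(None)
        await asyncio.gather(*tasks)

    asyncio.run(main())
    ```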

  20. The design and implementation of signal decomposition system of CL multi-wavelet transform based on DSP builder

    NASA Astrophysics Data System (ADS)

    Huang, Yan; Wang, Zhihui

    2015-12-01

    With the development of FPGAs, DSP Builder is widely applied to the design of system-level algorithms. The CL multi-wavelet algorithm is more advanced and effective than scalar wavelets for signal decomposition. Thus, a CL multi-wavelet system based on DSP Builder is designed for the first time in this paper. The system mainly contains three parts: a pre-filtering subsystem, a one-level decomposition subsystem and a two-level decomposition subsystem. It can be converted into the hardware description language VHDL by the Signal Compiler block, which can be used in Quartus II. Analysis of the energy indicator shows that this system outperforms the Daubechies wavelet in signal decomposition. Furthermore, it has proved to be suitable for the implementation of signal fusion based on SoPC hardware, and it will become a solid foundation in this new field.

  1. electronic Ligand Builder and Optimisation Workbench (eLBOW): A tool for ligand coordinate and restraint generation

    SciTech Connect

    Moriarty, Nigel; Grosse-Kunstleve, Ralf; Adams, Paul

    2009-07-01

    The electronic Ligand Builder and Optimisation Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed to be a flexible procedure that uses simple and fast quantum-chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow for the attainment of a number of diverse goals, including geometry optimisation and generation of restraints.

  2. electronic Ligand Builder and Optimization Workbench (eLBOW): a tool for ligand coordinate and restraint generation

    PubMed Central

    Moriarty, Nigel W.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.

    2009-01-01

    The electronic Ligand Builder and Optimization Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed to be a flexible procedure that uses simple and fast quantum-chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow the attainment of a number of diverse goals including geometry optimization and generation of restraints. PMID:19770504

  3. Quantifying the spatio-temporal pattern of the ground impact of space weather events using dynamical networks formed from the SuperMAG database of ground based magnetometer stations.

    NASA Astrophysics Data System (ADS)

    Dods, Joe; Chapman, Sandra; Gjerloev, Jesper

    2016-04-01

    Quantitative understanding of the full spatial-temporal pattern of space weather is important in order to estimate the ground impact. Geomagnetic indices such as AE track the peak of a geomagnetic storm or substorm, but cannot capture the full spatial-temporal pattern. Observations by the ~100 ground-based magnetometers in the northern hemisphere have the potential to capture the detailed evolution of a given space weather event. We present the first analysis of the full available set of ground-based magnetometer observations of substorms using dynamical networks. SuperMAG offers a database containing ground-station magnetometer data at a cadence of 1 min from hundreds of stations situated across the globe. We use these data to form dynamical networks which capture spatial dynamics on timescales from the fast reconfiguration seen in the aurora to that of the substorm cycle. Windowed linear cross-correlation between pairs of magnetometer time series, along with a threshold, is used to determine which stations are correlated and hence connected in the network. Variations in ground conductivity and differences in the response functions of magnetometers at individual stations are overcome by normalizing to long-term averages of the cross-correlation. These results are tested against surrogate data in which phases have been randomised. The network is then a collection of connected points (ground stations); the structure of the network and its variation as a function of time quantify the detailed dynamical processes of the substorm. The network properties can be captured quantitatively in time-dependent dimensionless network parameters and we will discuss their behaviour for examples of 'typical' substorms and storms. The network parameters provide a detailed benchmark to compare data with models of substorm dynamics, and can provide new insights on the similarities and differences between substorms and how they correlate with external driving and the internal state of the
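
    A bare-bones sketch of the network construction described above: pairwise correlation of station time series within one window, thresholded into an adjacency matrix. The threshold and the synthetic data are assumptions, and the normalization against long-term averages and the surrogate-data test are omitted.

    ```python
    import numpy as np

    def correlation_network(window_data, threshold=0.7):
        """window_data: array of shape (n_stations, n_samples) for one time window."""
        corr = np.corrcoef(window_data)           # pairwise Pearson correlation
        adjacency = np.abs(corr) >= threshold     # connected if |correlation| passes threshold
        np.fill_diagonal(adjacency, False)        # no self-connections
        return adjacency

    # Synthetic window: five "stations" sharing a common disturbance plus local noise.
    rng = np.random.default_rng(1)
    common = rng.standard_normal(600)
    stations = 0.6 * common + 0.4 * rng.standard_normal((5, 600))
    adj = correlation_network(stations)
    print("number of connections:", int(adj.sum()) // 2)
    ```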

  4. New Whole-House Solutions Case Study: HVAC Design Strategy for a Hot-Humid Production Builder, Houston, Texas

    SciTech Connect

    2014-03-01

    Building Science Corporation (BSC) worked directly with the David Weekley Homes - Houston division to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses in preparation for the upcoming code changes in 2015. The following research questions were addressed by this research project: 1. What is the most cost-effective, best-performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one- and two-story single-family detached residences? 2. What is a cost-effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole-house cost estimates compared to confirmed post-construction actual costs? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic to conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.

  5. A comprehensive insight into the combined effects of Fenton's reagent and skeleton builders on sludge deep dewatering performance.

    PubMed

    Liu, Huan; Yang, Jiakuan; Zhu, Nairuo; Zhang, Hao; Li, Ye; He, Shu; Yang, Changzhu; Yao, Hong

    2013-08-15

    Conditioning sewage sludge with Fenton's reagent and skeleton builders has been proved to be an effective means of achieving deep dewatering. This work aimed to give a comprehensive insight into the mechanism involved. The results show that a significant synergistic effect existed between Fenton's reagent and skeleton builders. With the optimum dosage, the water content of the dewatered sludge cake could be reduced to 49.5±0.5%. Furthermore, raw sludge existed in the form of zoogloea and its floc surface was plate-like. After Fenton oxidation, part of the extracellular polymeric substances (EPS) was destroyed and the amounts of protein and polysaccharide dissolved in the filtrate increased. Meanwhile, sludge flocs broke into smaller ones. After adding skeleton builders, the constantly changing environment promoted senescence and death of microorganisms. A large area of plate-like structure disappeared and was replaced by holes. Irregular non-living matter inlaid or pierced microbial cells, promoting the conversion of bound water to free water as well as further reduction of the sludge particle size. Additionally, these irregular substances could form a rigid porous structure under high pressure, which could transmit the stresses to the internal parts of the sludge and provide outflow channels for free water. Consequently, the conditioned sludge was suitable for high-pressure deep dewatering. PMID:23721731

  6. Technological process supervising using vision systems cooperating with the LabVIEW vision builder

    NASA Astrophysics Data System (ADS)

    Hryniewicz, P.; Banaś, W.; Gwiazda, A.; Foit, K.; Sękala, A.; Kost, G.

    2015-11-01

    One of the most important tasks in the production process is to supervise its proper functioning. Lack of the required supervision over the production process can lead to incorrect manufacturing of the final element, to production line downtime and hence to financial losses. The worst outcome is damage to the equipment involved in the manufacturing process. Engineers supervising the correctness of the production flow use a wide range of sensors to support the monitoring of a manufactured element. Vision systems are one such family of sensors. In recent years, thanks to the accelerated development of electronics as well as easier access to electronic products and attractive prices, they have become a cheap and universal type of sensor. These sensors detect practically all objects, regardless of their shape or even their state of matter; the only difficulties arise with transparent or mirrored objects, or objects viewed from the wrong angle. By integrating the vision system with LabVIEW Vision and the LabVIEW Vision Builder it is possible to determine not only the position of a given element but also its orientation relative to any point in the analyzed space. The paper presents an example of automated inspection of the manufacturing process in a production workcell using a vision supervising system. The aim of the work is to elaborate a vision system that could integrate different applications and devices used in different production systems to control the manufacturing process.
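    As a rough sketch of determining a part's position and orientation from a camera image, the fragment below uses OpenCV (version 4 return signatures assumed) rather than LabVIEW Vision Builder; the threshold value and the single-bright-object assumption are illustrative only.

    ```python
    # Sketch: locate the largest bright blob and report its centre and rotation.
    import cv2
    import numpy as np

    def locate_part(gray_image):
        """Return ((cx, cy), angle_deg) of the largest bright blob, or None."""
        _, mask = cv2.threshold(gray_image, 128, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        (cx, cy), (w, h), angle = cv2.minAreaRect(largest)   # centre, size, rotation
        return (cx, cy), angle

    # Toy usage: a synthetic image containing one filled rectangle
    img = np.zeros((200, 200), dtype=np.uint8)
    cv2.rectangle(img, (60, 80), (140, 120), 255, -1)
    print(locate_part(img))
    ```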

  7. Automated Geometric Model Builder Using Range Image Sensor Data: Final Acquisition

    SciTech Connect

    Diegert, C.; Sackos, J.

    1999-02-01

    This report documents a data collection where we recorded redundant range image data from multiple views of a simple scene, and recorded accurate survey measurements of the same scene. Collecting these data was a focus of the research project Automated Geometric Model Builder Using Range Image Sensor Data (96-0384), supported by Sandia's Laboratory-Directed Research and Development (LDRD) Program during fiscal years 1996, 1997, and 1998. The data described here are available from the authors on CDROM, or electronically over the Internet. Included in this data distribution are Computer-Aided Design (CAD) models we constructed from the survey measurements. The CAD models are compatible with the SolidWorks 98 Plus system, the modern Computer-Aided Design software system that is central to Sandia's DeskTop Engineering Project (DTEP). Integration of our measurements (as built) with the constructive geometry process of the CAD system (as designed) delivers on a vision of the research project. This report on our final data collection will also serve as a final report on the project.

  8. Building energy analysis of Electrical Engineering Building from DesignBuilder tool: calibration and simulations

    NASA Astrophysics Data System (ADS)

    Cárdenas, J.; Osma, G.; Caicedo, C.; Torres, A.; Sánchez, S.; Ordóñez, G.

    2016-07-01

    This research presents an energy analysis of the Electrical Engineering Building, located on the campus of the Industrial University of Santander in Bucaramanga, Colombia. This building is a green pilot for analysing energy-saving strategies such as solar pipes, green roofs, daylighting, and automation, among others. The energy analysis was performed by means of the DesignBuilder software using a virtual model of the building. Several variables were analysed, such as air temperature, relative humidity, air velocity, daylighting, and energy consumption. According to two criteria, thermal load and energy consumption, critical areas were defined. The calibration and validation process of the virtual model was carried out, obtaining errors below 5% in comparison with measured values. The simulations show that the average indoor temperature in the critical areas of the building was 27°C, whilst relative humidity reached values near 70% over the year. The most critical discomfort conditions were found in the area with the greatest concentration of people, which has an average annual temperature of 30°C. Solar pipes can increase daylight levels by 33% in the areas located on the upper floors of the building. In the case of the green roofs, the simulated results show that these reduce the internal heat gains through the roof by nearly 31%, as well as decreasing the energy consumption related to air conditioning by 5% for some areas on the fourth and fifth floors. The estimated energy consumption of the building was 69 283 kWh per year.

  9. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    SciTech Connect

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang; Lo, Chaomei; Gorton, Ian; Liu, Yan

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
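    The sample-run-regress loop that a ROM builder automates can be sketched in a few lines. The fragment below uses a stand-in "simulator", uniform random sampling and a plain polynomial least-squares fit, with held-out points for the accuracy check; REVEAL's actual sampling and regression options and its HPC integration are not represented.

    ```python
    # Sketch: build a cheap surrogate (ROM) of an expensive simulation and
    # quantify its accuracy on held-out samples.
    import numpy as np

    def expensive_simulator(x):                  # placeholder for an HPC simulation
        return np.sin(3 * x[0]) + 0.5 * x[1] ** 2

    def features(X):                             # simple polynomial feature map
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1*x2, x1**3])

    rng = np.random.default_rng(1)
    X_train = rng.uniform(-1, 1, size=(200, 2))  # sampled input space
    y_train = np.array([expensive_simulator(x) for x in X_train])
    coef, *_ = np.linalg.lstsq(features(X_train), y_train, rcond=None)

    X_test = rng.uniform(-1, 1, size=(50, 2))
    y_test = np.array([expensive_simulator(x) for x in X_test])
    rmse = np.sqrt(np.mean((features(X_test) @ coef - y_test) ** 2))
    print(f"ROM RMSE on held-out points: {rmse:.3f}")
    ```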

  10. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data are captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule bases can be used in any knowledge-based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black-box simulations by intelligent computer-aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
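    The rule-generation step can be pictured with a small stand-in: expert-labelled data are fed to an inducer and the learned branches are printed as rules. The SDB's actual induction algorithm is not specified above, so the sketch below substitutes a scikit-learn decision tree and hypothetical sensor features.

    ```python
    # Sketch: induce human-readable rules from expert-classified examples.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(2)
    # Hypothetical sensor readings (temperature, pressure) labelled by an expert
    X = rng.normal(size=(300, 2))
    labels = np.where(X[:, 0] + 0.5 * X[:, 1] > 0.3, "fault", "nominal")

    tree = DecisionTreeClassifier(max_depth=3).fit(X, labels)
    print(export_text(tree, feature_names=["temperature", "pressure"]))  # rule base
    ```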

  11. Closed Crawl Space Performance: Proof of Concept in the Production Builder Marketplace

    SciTech Connect

    Malkin-Weber, Melissa; Dastur, Cyrus; Mauceri, Maria; Hannas, Benjamin

    2008-10-30

    characteristics with regard to infiltration and duct leakage. Researchers settled on two closed crawl space designs, one with insulation located in the framed floor structure above the crawl space and one with insulation on the crawl space perimeter wall, as the designs with the most widespread potential for application. Researchers based this assessment not only on the field performance, but also on input from residential builders, pest control professionals, code officials, installers, and building scientists active in the region. The key findings from the field demonstration were that (1) closed crawl spaces stay substantially drier than traditional wall-vented crawl spaces during humid climate conditions, and (2) the houses built on the closed crawl space foundations saved, on average, 15% or more on annual energy used for space heating and cooling. A comparison of the actual energy performance of the homes versus the performance predicted by a popular HERS software application showed that the software was unable to predict the demonstrated savings, in some cases predicting an energy penalty. Findings from the 2005 project were summarized in a publication titled Closed Crawl Spaces: An Introduction to Design, Construction and Performance. Since its release, the publication has received widespread use by builders, homeowners, installers and code officials concerned about crawl space construction. The findings were also used to create major revisions to the NC Residential Code, which were adopted in 2004 and immediately began to reduce the regulatory barriers to widespread commercialization of the technology in NC, particularly in new residential construction. Full project details are located at www.crawlspaces.org.

  12. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.
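    The preprocessing chain in this record (time-frequency distribution, its binary representation, then the magnitude of its 2-D FFT) can be sketched directly. The fragment below assumes an illustrative sampling rate and a simple median threshold, and stops before the SONN classification stage.

    ```python
    # Sketch: shift-invariant features for a seismic waveform, following the
    # pipeline described above (spectrogram -> binary map -> |2-D FFT|).
    import numpy as np
    from scipy.signal import spectrogram

    def shift_invariant_features(waveform, fs):
        f, t, Sxx = spectrogram(waveform, fs=fs, nperseg=128)
        binary = (Sxx > np.median(Sxx)).astype(float)        # binary time-frequency map
        return np.abs(np.fft.fftshift(np.fft.fft2(binary)))  # magnitude of the 2-D FFT

    rng = np.random.default_rng(3)
    fs = 100.0                                               # Hz, illustrative
    event = rng.standard_normal(3000) * np.exp(-np.linspace(0, 5, 3000))
    features = shift_invariant_features(event, fs)
    print(features.shape)   # would next be flattened and fed to a SONN
    ```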

  13. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.

  14. docBUILDER - Building Your Useful Metadata for Earth Science Data and Services.

    NASA Astrophysics Data System (ADS)

    Weir, H. M.; Pollack, J.; Olsen, L. M.; Major, G. R.

    2005-12-01

    The docBUILDER tool, created by NASA's Global Change Master Directory (GCMD), assists the scientific community in efficiently creating quality data and services metadata. Metadata authors are asked to complete five required fields to ensure enough information is provided for users to discover the data and related services they seek. After the metadata record is submitted to the GCMD, it is reviewed for semantic and syntactic consistency. Currently, two versions are available - a Web-based tool accessible with most browsers (docBUILDERweb) and a stand-alone desktop application (docBUILDERsolo). The Web version is available through the GCMD website, at http://gcmd.nasa.gov/User/authoring.html. This version has been updated and now offers: personalized templates to ease entering similar information for multiple data sets/services; automatic population of Data Center/Service Provider URLs based on the selected center/provider; three-color support to indicate required, recommended, and optional fields; an editable text window containing the XML record, to allow for quick editing; and improved overall performance and presentation. The docBUILDERsolo version offers the ability to create metadata records on a computer wherever you are. Except for installation and the occasional update of keywords, data/service providers are not required to have an Internet connection. This freedom will allow users with portable computers (Windows, Mac, and Linux) to create records in field campaigns, whether in Antarctica or the Australian Outback. This version also offers a spell-checker, in addition to all of the features found in the Web version.

  15. Section Builder: A finite element tool for analysis and design of composite beam cross-sections

    NASA Astrophysics Data System (ADS)

    Chakravarty, Uttam Kumar

    SectionBuilder is an innovative finite-element-based tool developed for the analysis and design of composite beam cross-sections. The tool can handle cross-sections with parametric shapes and arbitrary configurations. It can also handle arbitrary lay-ups for predefined beam cross-section geometries in a consistent manner. The material properties for each layer of the cross-section can be defined on the basis of the design requirements. The tool is capable of dealing with multi-cell composite cross-sections with arbitrary lay-ups. It also has the benefit of handling variations in the thickness of the skin and D-spars for beams such as rotor blades. A typical cross-section is considered as a collection of interconnected walls. Walls with arbitrary lay-ups based on predefined geometries and material properties are generated first. Complex composite beam cross-sections are developed by connecting the walls using various types of connectors. These connectors are compatible with the walls, i.e., the thickness of the layers of the walls must match that of the connectors at the place of connection. Cross-sections are often reinforced by core material to construct realistic rotor blade cross-sections, and the tool has the ability to integrate core materials into the cross-sections. A mapped mesh is used for meshing parametric shapes, walls and the various connectors, whereas a free mesh is used for meshing the core materials. A new algorithm based on the Delaunay refinement algorithm was developed for creating the best possible free mesh for core materials. After meshing the cross-section, the tool determines the sectional properties using finite element analysis. It computes sectional properties including the stiffness matrix, compliance matrix, mass matrix, and principal axes. A visualization environment is integrated with the tool for visualizing the stress and strain distributions over the cross-section.
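    The purely geometric part of computing sectional properties from a meshed cross-section can be sketched as a sum over triangles. The fragment below is a coarse stand-in (homogeneous material, centroid-lumped second moments, a hypothetical two-triangle mesh) and does not represent SectionBuilder's composite stiffness, compliance and mass-matrix computations.

    ```python
    # Sketch: area, centroid and (approximate) second moments of a triangulated section.
    import numpy as np

    def tri_area(p):
        return 0.5 * abs((p[1, 0] - p[0, 0]) * (p[2, 1] - p[0, 1])
                         - (p[2, 0] - p[0, 0]) * (p[1, 1] - p[0, 1]))

    def section_properties(nodes, triangles):
        """nodes: (N, 2) coordinates; triangles: (M, 3) node indices."""
        A, Sx, Sy, Ixx, Iyy = 0.0, 0.0, 0.0, 0.0, 0.0
        for tri in triangles:
            p = nodes[tri]
            a = tri_area(p)
            c = p.mean(axis=0)          # triangle centroid
            A += a
            Sx += a * c[1]
            Sy += a * c[0]
            Ixx += a * c[1] ** 2        # coarse: each triangle lumped at its centroid
            Iyy += a * c[0] ** 2
        cx, cy = Sy / A, Sx / A
        return A, (cx, cy), Ixx - A * cy ** 2, Iyy - A * cx ** 2

    # Toy mesh: a unit square split into two triangles
    nodes = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
    tris = np.array([[0, 1, 2], [0, 2, 3]])
    print(section_properties(nodes, tris))
    ```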

  16. Event Perception

    PubMed Central

    Radvansky, Gabriel; Zacks, Jeffrey M.

    2012-01-01

    Events are central elements of human experience. Formally, they can be individuated in terms of the entities that compose them, the features of those entities, and the relations amongst entities. Psychologically, representations of events capture their spatiotemporal location, the people and objects involved, and the relations between these elements. Here, we present an account of the nature of psychological representations of events and how they are constructed and updated. Event representations are like images in that they are isomorphic to the situations they represent. However, they are like models or language in that they are constructed of components rather than being holistic. Also, they are partial representations that leave out some elements and abstract others. Representations of individual events are informed by schematic knowledge about general classes of events. Event representations are constructed in a process that segments continuous activity into discrete events. The construction of a series of event representations forms a basis for predicting the future, planning for that future, and imagining alternatives. PMID:23082236

  17. Skeletal muscle hypertrophy and structure and function of skeletal muscle fibres in male body builders

    PubMed Central

    D'Antona, Giuseppe; Lanfranconi, Francesca; Pellegrino, Maria Antonietta; Brocca, Lorenza; Adami, Raffaella; Rossi, Rosetta; Moro, Giorgio; Miotti, Danilo; Canepari, Monica; Bottinelli, Roberto

    2006-01-01

    Needle biopsy samples were taken from the vastus lateralis muscle (VL) of five male body builders (BB, age 27.4 ± 0.93 years; mean ± s.e.m.), who had been performing hypertrophic heavy resistance exercise (HHRE) for at least 2 years, and from five active but untrained male control subjects (CTRL, age 29.9 ± 2.01 years). The following determinations were performed: anatomical cross-sectional area and volume of the quadriceps and VL muscles in vivo by magnetic resonance imaging (MRI); myosin heavy chain (MHC) isoform distribution of the whole biopsy samples by SDS-PAGE; cross-sectional area (CSA), force (Po), specific force (Po/CSA) and maximum shortening velocity (Vo) of a large population (n = 524) of single skinned muscle fibres classified on the basis of MHC isoform composition by SDS-PAGE; actin sliding velocity (Vf) on pure myosin isoforms by in vitro motility assays. In BB a preferential hypertrophy of fast and especially type 2X fibres was observed. The very large hypertrophy of VL in vivo could not be fully accounted for by single muscle fibre hypertrophy. CSA of VL in vivo was, in fact, 54% larger in BB than in CTRL, whereas mean fibre area was only 14% larger in BB than in CTRL. MHC isoform distribution was shifted towards 2X fibres in BB. Po/CSA was significantly lower in type 1 fibres from BB than in type 1 fibres from CTRL, whereas both type 2A and type 2X fibres were significantly stronger in BB than in CTRL. Vo of type 1 fibres and Vf of myosin 1 were significantly lower in BB than in CTRL, whereas no difference was observed among fast fibres and myosin 2A. The findings indicate that skeletal muscle of BB was markedly adapted to HHRE through extreme hypertrophy, a shift towards the stronger and more powerful fibre types and an increase in specific force of muscle fibres. Such adaptations could not be fully accounted for by well-known mechanisms of muscle plasticity, i.e. by the hypertrophy of single muscle fibre (quantitative mechanism) and by a

  18. 76 FR 19466 - Masco Builder Cabinet Group Including On-Site Leased Workers From Reserves Network, Reliable...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-07

    ... in the Federal Register on December 11, 2009 (74 FR 65797). The notice was amended on December 22..., 2011 (76 FR 2145). The notice was amended again February 24, 2011 to include on-site leased workers... 10, 2011 (76 FR 13226-13227). At the request of the State agency, the Department reviewed...

  19. Operation of a digital seismic network on Mount St. Helens volcano and observations of long period seismic events that originate under the volcano

    SciTech Connect

    Fehler, M.; Chouet, B.

    1982-09-01

    A 9-station digital seismic array was operated on Mount St. Helens volcano in Washington State during 1981. One of the stations was placed inside the crater of the volcano, six were located on the flanks of the volcano within two km of the crater, and two were approximately ten km from the crater. Four of the instruments recorded three components of motion and the remaining five recorded only the vertical component. A one-day experiment was carried out during which the crater-monitoring seismometer was complemented by the addition of two ink-recording instruments. During the one-day experiment six observers recorded the times of rockfalls, felt-earthquake occurrences, and changes in steam emissions from the dome in the crater. Using information obtained during the one-day experiment, seismic events recorded by the digital instruments were classified as earthquakes, rockfalls, helicopter noise, and a type of event that is unique to volcanoes, which is called long-period. Waveforms of these long-period events have a duration of up to 30 seconds and a spectrum that is peaked at approximately 2 Hz. The frequency at which the peak in the spectrum occurs is nearly the same at all stations, which means that the unique waveform of long-period events is due to a source effect, not a path effect. The peak frequency is fairly insensitive to the amplitude of the signal, which means that the size of the source region is constant, independent of the signal amplitude. Long-period events were not felt and were accompanied by no visible changes inside the crater, which leads to the conclusion that they are some sort of seismic disturbance generated inside the volcano.

  20. Motor Readiness Increases Brain Connectivity Between Default-Mode Network and Motor Cortex: Impact on Sampling Resting Periods from fMRI Event-Related Studies.

    PubMed

    Bazán, Paulo Rodrigo; Biazoli, Claudinei Eduardo; Sato, João Ricardo; Amaro, Edson

    2015-12-01

    The default-mode network (DMN) has been implicated in many conditions. One particular function relates to its role in motor preparation. However, the possibly complex relationship between DMN activity and motor preparation has not been fully explored. Dynamic interactions between default mode and motor networks may compromise the ability to evaluate intrinsic connectivity using resting period data extracted from task-based experiments. In this study, we investigated alterations in connectivity between the DMN and the motor network that are associated with motor readiness during the intervals between motor task trials. fMRI data from 20 normal subjects were acquired under three conditions: pure resting state; resting state interleaved with brief, cued right-hand movements at constant intervals (lower readiness); and resting state interleaved with the same movements at unpredictable intervals (higher readiness). The functional connectivity between regions of motor and DMNs was assessed separately for movement periods and intertask intervals. We found a negative relationship between the DMN and the left sensorimotor cortex during the task periods for both motor conditions. Furthermore, during the intertask intervals of the unpredictable condition, the DMN showed a positive relationship with right sensorimotor cortex and a negative relation with the left sensorimotor cortex. These findings indicate a specific modulation on motor processing according to the state of motor readiness. Therefore, connectivity studies using task-based fMRI to probe DMN should consider the influence of motor system modulation when interpreting the results. PMID:26414865

  1. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or another classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
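    As a rough illustration of scoring events by low probability with a user-adjustable alert threshold, the fragment below fits a single Gaussian to historical values and flags new events whose negative log-probability exceeds a quantile of the historical scores. The feature, the distribution and the threshold are all assumptions for illustration, not the patented detectors.

    ```python
    # Sketch: anomalousness as negative log-probability under a fitted density,
    # with a tunable threshold that regulates the false-alert rate.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    history = rng.normal(loc=300.0, scale=40.0, size=5000)   # e.g. bytes per flow
    mu, sigma = history.mean(), history.std()

    def anomaly_score(x):
        return -norm.logpdf(x, loc=mu, scale=sigma)          # low probability -> high score

    threshold = np.quantile(anomaly_score(history), 0.999)   # tune to bound false alerts
    for event in [310.0, 295.0, 900.0]:
        flag = "ALERT" if anomaly_score(event) > threshold else "ok"
        print(f"{event:7.1f}  score={anomaly_score(event):6.2f}  {flag}")
    ```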

  2. Extensin network formation in Vitis vinifera callus cells is an essential and causal event in rapid and H2O2-induced reduction in primary cell wall hydration

    PubMed Central

    2011-01-01

    Background Extensin deposition is considered important for the correct assembly and biophysical properties of primary cell walls, with consequences for plant resistance to pathogens, tissue morphology, cell adhesion and extension growth. However, evidence for a direct and causal role of extensin network formation in changes to cell wall properties has been lacking. Results Hydrogen peroxide treatment of grapevine (Vitis vinifera cv. Touriga) callus cell walls was seen to induce a marked reduction in their hydration and thickness. An analysis of matrix proteins demonstrated that this occurs with the insolubilisation of an abundant protein, GvP1, which displays a primary structure and post-translational modifications typical of dicotyledon extensins. The hydration of callus cell walls free from saline-soluble proteins did not change in response to H2O2, but fully regained this capacity after addition of extensin-rich saline extracts. To assay the specific contribution of GvP1 cross-linking and other wall matrix proteins to the reduction in hydration, GvP1 levels in cell walls were manipulated in vitro by binding selected fractions of extracellular proteins, and their effect on wall hydration during H2O2 incubation was assayed. Conclusions This approach allowed us to conclude that a peroxidase-mediated formation of a covalently linked network of GvP1 is essential and causal in the reduction of grapevine callus wall hydration in response to H2O2. Importantly, this approach also indicated that the extensin network's effect on hydration was only partially irreversible and remained sensitive to changes in matrix charge. We discuss this mechanism and the importance of these changes to primary wall properties in the light of extensin distribution in dicotyledons. PMID:21672244

  3. ChemSkill Builder 2000, Version 6.1 [CD-ROM] (by James D. Spain and Harold J. Peters)

    NASA Astrophysics Data System (ADS)

    Keeney-Kennicutt, Reviewed By Wendy L.

    2000-07-01

    One of the major challenges for faculty teaching general chemistry is how to encourage students to practice solving problems. We know that for students to develop chemical intuition and problem-solving skills, they must "get their hands dirty" as they decipher and unravel problems inherent to our discipline. One tool that I've used since its release in 1996 is the ChemSkill Builder, an electronic homework package. The latest version, ChemSkill Builder (CSB) 2000, version 6.1, is an excellent, effective integration of teaching and testing most quantitative and conceptual learning objectives in an interactive way. It is inexpensive and easy to use for both students and faculty. The CSB 2000 package of personalized problem sets, specifically designed to complement most general chemistry courses, is a program on CD-ROM for PC Windows users (3.1, 95, or 98), with more than 1500 questions and a 3 1/2-in. record-management disk. There is a separate grade-management disk for the instructor. It has 24 gradable chapters, each with 5 or 6 sections, plus two new chapters that are not graded: Polymer Chemistry and an Appendix of Chemical Skills. Each section begins with a short review of the topic and many have interactive explanations. If students miss an answer, they are given a second chance for 70% credit. If they still miss, the worked-out solution is presented in detail. Students can work each section as many times as they wish to improve their scores. Periodically, the students download their data directly into a PC set up by the instructor. The data can be easily converted into an ASCII file and merged with a spreadsheet. The use of CD-ROM solves the sporadic problems associated with previous versions on 3 1/2-in. disks: software glitches, failed disks, and system incompatibilities. The quality and number of graphics and interactive exercises are much improved in this latest version. I particularly enjoyed the interactive explanations of significant figures and

  4. Rapid surveillance for health events following a mass meningococcal B vaccine program in a university setting: A Canadian Immunization Research Network study.

    PubMed

    Langley, J M; MacDougall, D M; Halperin, B A; Swain, A; Halperin, S A; Top, K A; McNeil, S A; MacKinnon-Cameron, D; Marty, K; De Serres, G; Dubé, E; Bettinger, J A

    2016-07-25

    An outbreak of Neisseria meningitidis serotype B infection occurred at a small residential university; public health announced an organizational vaccination program with the 4-component Meningococcal B (4CMenB) vaccine (Bexsero(TM), Novartis/GlaxoSmithKline Inc.) several days later. Since there were limited published data on the reactogenicity of 4CMenB in persons over 17 years of age, this study sought to conduct rapid surveillance of health events in vaccinees and controls using an online survey. Vaccine uptake was 84.7% for dose 1 (2967/3500) and 70% (2456/3500) for dose 2; the survey response rates were 33.0% (987/2967) and 18.7% (459/2456) in dose 1 and dose 2 recipients respectively, and 12% in unvaccinated individuals (63/533). Most students were 20-29 years of age (vaccinees, 64.0%; controls, 74.0%). A new health problem or worsening of an existing health problem was reported by 30.0% and 30.3% of vaccine recipients after doses 1 and 2 respectively, and by 15.9% of controls. These health problems interfered with the ability to perform normal activities in most vaccinees reporting these events (74.7% post dose 1; 62.6% post dose 2), and in 60% of controls. The health problems led to a health care provider visit (including emergency room) in 12.8% and 14.4% of vaccinees post doses 1 and 2, respectively, and in 40% of controls. The most common reactions in vaccinees were injection site reactions (20.6% post dose 1, 16.1% post dose 2) and non-specific systemic complaints (22.6% post dose 1, 17.6% post dose 2). No hospitalizations were reported. An online surveillance program during an emergency meningococcal B vaccine program was successfully implemented, and detected higher rates of health events in vaccinees compared to controls, and high rates of both vaccinees and controls seeking medical attention. The types of adverse events reported by young adult vaccinees were consistent with those previously reported. PMID:27302338

  5. Building America Best Practices Series Volume 11. Builders Challenge Guide to 40% Whole-House Energy Savings in the Marine Climate

    SciTech Connect

    Baechler, Michael C.; Gilbride, Theresa L.; Hefty, Marye G.; Cole, Pamala C.; Williamson, Jennifer L.; Love, Pat M.

    2010-09-01

    This best practices guide is the eleventh in a series of guides for builders produced by the U.S. Department of Energy’s Building America Program. This guide book is a resource to help builders design and construct homes that are among the most energy-efficient available, while addressing issues such as building durability, indoor air quality, and occupant health, safety, and comfort. With the measures described in this guide, builders in the marine climate (portions of Washington, Oregon, and California) can achieve homes that have whole house energy savings of 40% over the Building America benchmark (a home built to mid-1990s building practices roughly equivalent to the 1993 Model Energy Code) with no added overall costs for consumers. These best practices are based on the results of research and demonstration projects conducted by Building America’s research teams. The guide includes information for managers, designers, marketers, site supervisors, and subcontractors, as well as case studies of builders who are successfully building homes that cut energy use by 40% in the marine climate. This document is available on the web at www.buildingamerica.gov. This report was originally cleared 06-29-2010. This version is Rev 1, cleared in Nov 2010. The only change is that the reference to the Energy Star Windows criteria shown on pg 8.25 was updated to match the criteria - Version 5.0, 04/07/2009, effective 01/04/2010.

  6. Quantitative Microbial Risk Assessment Tutorial – SDMProjectBuilder: Import Local Data Files to Identify and Modify Contamination Sources and Input Parameters

    EPA Science Inventory

    Twelve example local data support files are automatically downloaded when the SDMProjectBuilder is installed on a computer. They allow the user to modify values to parameters that impact the release, migration, fate, and transport of microbes within a watershed, and control delin...

  7. Building America Best Practices Series: Volume 1; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Hot and Humid Climate

    SciTech Connect

    2004-12-01

    This Building America Best Practices guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the hot and humid climate.

  8. Building America Best Practices Series: Volume 5; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Marine Climate

    SciTech Connect

    2006-10-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in the Marine climate region.

  9. Building America Best Practices Series, Volume 9: Builders Challenge Guide to 40% Whole-House Energy Savings in the Hot-Dry and Mixed-Dry Climates

    SciTech Connect

    Baechler, Michael C.; Gilbride, Theresa L.; Hefty, Marye G.; Williamson, Jennifer L.; Ruiz, Kathleen A.; Bartlett, Rosemarie; Love, Pat M.

    2009-10-23

    This best practices guide is the ninth in a series of guides for builders produced by the U.S. Department of Energy’s Building America Program. This guide book is a resource to help builders design and construct homes that are among the most energy-efficient available, while addressing issues such as building durability, indoor air quality, and occupant health, safety, and comfort. With the measures described in this guide, builders in the hot-dry and mixed-dry climates can achieve homes that have whole house energy savings of 40% over the Building America benchmark (a home built to mid-1990s building practices roughly equivalent to the 1993 Model Energy Code) with no added overall costs for consumers. These best practices are based on the results of research and demonstration projects conducted by Building America’s research teams. The guide includes information for managers, designers, marketers, site supervisors, and subcontractors, as well as case studies of builders who are successfully building homes that cut energy use by 40% in the hot-dry and mixed-dry climates.

  10. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast into-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a work flow with minimized IO), and on-line data processing control and analytics services. The NAIADS project is leveraging the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.
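    The event-builder idea at the heart of NAIADS, assembling one complete multi-sensor event from fragments that arrive independently, can be sketched without the CLARA/ZeroMQ machinery. The fragment below is an in-memory stand-in with invented source names and payloads.

    ```python
    # Sketch: merge records from several sensor streams into complete events,
    # keyed by a shared event identifier.
    from collections import defaultdict

    SOURCES = {"sciamachy_l1", "modis_l2", "ecmwf"}          # illustrative source names

    def build_events(records):
        """records: iterable of (source, event_id, payload) tuples.
        Yields a complete event once every source has contributed."""
        pending = defaultdict(dict)
        for source, event_id, payload in records:
            pending[event_id][source] = payload
            if set(pending[event_id]) == SOURCES:            # all fragments present
                yield event_id, pending.pop(event_id)

    stream = [
        ("modis_l2", 42, {"cloud_fraction": 0.3}),
        ("sciamachy_l1", 42, {"radiance": [1.0, 2.0]}),
        ("ecmwf", 42, {"t2m": 281.4}),
    ]
    for eid, event in build_events(stream):
        print(eid, sorted(event))
    ```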

  11. Time-lapse seismic tomography using the data of microseismic monitoring network and analysis of mine-induced events, seismic tomography results and technological data in Pyhäsalmi mine, Finland

    NASA Astrophysics Data System (ADS)

    Nevalainen, Jouni; Kozlovskaya, Elena

    2016-04-01

    We present results of seismic travel-time tomography applied to microseismic data from the Pyhäsalmi mine, Finland. Data on microseismic events in the mine have been recorded since 2002, when the passive microseismic monitoring network was installed in the mine. Since then, over 130000 microseismic events have been observed. The first target of our study was to test whether the passive microseismic monitoring data can be used for travel-time tomography. In this data set the source-receiver geometry is based on the non-even distribution of natural and mine-induced events inside and in the vicinity of the mine and hence is a non-ideal one for travel-time tomography. The tomographic inversion procedure was tested with synthetic data and the real source-receiver geometry from the Pyhäsalmi mine, and with the real travel-time data of the first arrivals of P-waves from the microseismic events. The results showed that seismic tomography is capable of revealing differences in seismic velocities in the mine area corresponding to different rock types. For example, the velocity contrast between the ore body and the surrounding rock is detectable. The recovered velocity model agrees well with the known geological structures in the mine area. The second target of the study was to apply travel-time tomography to microseismic monitoring data recorded during different time periods in order to track temporal changes in seismic velocities within the mining area as the excavation proceeds. The result shows that such time-lapse travel-time tomography can recover such changes. In order to obtain good ray coverage and good resolution, the time interval for a single tomography round needs to be selected taking into account the number of events and their spatial distribution. The third target was to compare and analyze mine-induced event locations, seismic tomography results and mining technological data (for example, mine excavation plans) in order to understand the influence of mining technology
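    Once ray paths are fixed, travel-time tomography reduces to a linear inverse problem. The sketch below is a heavily simplified straight-ray, damped least-squares illustration with a synthetic grid and random path lengths; it does not reproduce the mine's 3-D workflow or first-arrival picking.

    ```python
    # Sketch: travel-time tomography as t = L s, with L the path length of each
    # ray in each cell and s the slowness, solved by damped least squares.
    import numpy as np

    rng = np.random.default_rng(5)
    n_cells, n_rays = 25, 120
    L = rng.uniform(0.0, 50.0, size=(n_rays, n_cells))        # path length per cell (m)
    s_true = np.full(n_cells, 1 / 5800.0)                     # background slowness (s/m)
    s_true[12] = 1 / 5200.0                                   # one slower cell (anomaly)
    t_obs = L @ s_true + rng.normal(0, 1e-4, n_rays)          # noisy travel times

    lam = 1e-2                                                # damping weight
    s0 = np.full(n_cells, 1 / 5800.0)                         # reference model
    A = np.vstack([L, lam * np.eye(n_cells)])
    b = np.concatenate([t_obs, lam * s0])
    s_est, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("recovered velocity in anomalous cell: %.0f m/s" % (1 / s_est[12]))
    ```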

  12. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.

  13. Building America Best Practices Series Volume 12: Builders Challenge Guide to 40% Whole-House Energy Savings in the Cold and Very Cold Climates

    SciTech Connect

    Baechler, Michael C.; Gilbride, Theresa L.; Hefty, Marye G.; Cole, Pamala C.; Love, Pat M.

    2011-02-01

    This best practices guide is the twelfth in a series of guides for builders produced by PNNL for the U.S. Department of Energy’s Building America program. This guide book is a resource to help builders design and construct homes that are among the most energy-efficient available, while addressing issues such as building durability, indoor air quality, and occupant health, safety, and comfort. With the measures described in this guide, builders in the cold and very cold climates can build homes that have whole-house energy savings of 40% over the Building America benchmark with no added overall costs for consumers. The best practices described in this document are based on the results of research and demonstration projects conducted by Building America’s research teams. Building America brings together the nation’s leading building scientists with over 300 production builders to develop, test, and apply innovative, energy-efficient construction practices. Building America builders have found they can build homes that meet these aggressive energy-efficiency goals at no net increased costs to the homeowners. Currently, Building America homes achieve energy savings 40% greater than the Building America benchmark home (a home built to mid-1990s building practices roughly equivalent to the 1993 Model Energy Code). The recommendations in this document meet or exceed the requirements of the 2009 IECC and 2009 IRC, and those requirements are highlighted in the text. This document will be distributed via the DOE Building America website: www.buildingamerica.gov.

  14. The Misuse of Anabolic-Androgenic Steroids among Iranian Recreational Male Body-Builders and Their Related Psycho-Socio-Demographic factors

    PubMed Central

    ANGOORANI, Hooman; HALABCHI, Farzin

    2015-01-01

    Background: The high prevalence and potential side effects of anabolic-androgenic steroid (AAS) misuse by athletes have made it a major public health concern. Epidemiological studies on the abuse of such drugs are mandatory for developing effective preventive drug control programs in the sports community. This study aimed to investigate the prevalence of AAS abuse and its association with some psycho-socio-demographic factors in Iranian male recreational body-builders. Methods: Between March and October 2011, 906 recreational male body-builders from 103 randomly selected bodybuilding clubs in Tehran, Iran participated in this study. Some psycho-socio-demographic factors including age, job, average family income, family size, sport experience (months), weekly duration of sporting activity (h), purpose of participation in sporting activity, mental health as well as body image (via the General Health Questionnaire and the Multidimensional Body-Self Relations Questionnaire, respectively), and history of AAS use were obtained by interviews using questionnaires. Results: Participants were all recreational male body-builders [mean age (SD): 25.7 (7.1), range 14–56 yr]. Self-reported AAS abuse was registered in 150 body-builders (16.6%). Among the different psycho-socio-demographic factors, only family income and sport experience were inversely associated with AAS abuse. Conclusion: The lifetime prevalence of AAS abuse is relatively high among recreational body-builders based on their self-report. Some psycho-socio-demographic factors including family income and sport experience may influence the prevalence of AAS abuse. PMID:26811817

  15. Charter Schools as Nation Builders: Democracy Prep and Civic Education. Policy Brief 4

    ERIC Educational Resources Information Center

    Lautzenheiser, Daniel; Kelly, Andrew P.

    2013-01-01

    This policy brief is the first in a series of in-depth case studies exploring how top-performing charter schools have incorporated civic learning in their school curriculum and school culture. This paper introduces Democracy Prep, a network of seven public charter schools with a civic mission at its core. Democracy Prep's founder and…

  16. Mind Builders: Multidisciplinary Challenges for Cooperative Team-Building and Competition

    ERIC Educational Resources Information Center

    Fleisher, Paul; Ziegler, Donald

    2006-01-01

    For more than twenty years, the Richmond, Virginia Public Schools' program for gifted students has conducted an interscholastic competition similar to the nationally known competition, Destination Imagination. In the featured contest of this yearly event, teams of five students present solutions to engineering problems that they have worked on for…

  17. Extreme Events

    NASA Astrophysics Data System (ADS)

    Nott, Jonathan

    2006-04-01

    The assessment of risks posed by natural hazards such as floods, droughts, earthquakes, tsunamis or cyclones, is often based on short-term historical records that may not reflect the full range or magnitude of events possible. As human populations grow, especially in hazard-prone areas, methods for accurately assessing natural hazard risks are becoming increasingly important. In Extreme Events Jonathan Nott describes the many methods used to reconstruct such hazards from natural long-term records. He demonstrates how long-term (multi-century to millennial) records are essential in gaining a realistic understanding of the variability of natural hazards, and how short-term historical records can often misrepresent the likely risks associated with natural hazards. This book will form a useful resource for students taking courses covering natural hazards and risk assessment. It will also be valuable for urban planners, policy makers and non-specialists as a guide to understanding and reconstructing long-term records of natural hazards. Explains mechanisms that cause extreme events and discusses their prehistoric records Describes how to reconstruct long-term records of natural hazards in order to make accurate risk assessments Demonstrates that natural hazards can follow cycles over time and do not occur randomly

  18. Network Views

    ERIC Educational Resources Information Center

    Alexander, Louis

    2010-01-01

    The world changed in 2008. The financial crisis brought with it a deepening sense of insecurity, and the desire to be connected to a network increased. Throughout the summer and fall of 2008, events were unfolding with alarming rapidity. The Massachusetts Institute of Technology (MIT) Alumni Association wanted to respond to this change in the…

  19. Responding to the Event Deluge

    NASA Technical Reports Server (NTRS)

    Williams, Roy D.; Barthelmy, Scott D.; Denny, Robert B.; Graham, Matthew J.; Swinbank, John

    2012-01-01

    We present the VOEventNet infrastructure for large-scale rapid follow-up of astronomical events, including selection, annotation, machine intelligence, and coordination of observations. The VOEvent standard is central to this vision, with distributed and replicated services rather than centralized facilities. We also describe some of the event brokers, services, and software that are connected to the network. These technologies will become more important in the coming years, with new event streams from Gaia, LOFAR, LIGO, LSST, and many others.

  20. Building America Best Practices Series: Volume 2; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Hot-Dry and Mixed-Dry Climates

    SciTech Connect

    2005-09-01

    This guidebook is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the hot-dry and mixed-dry climates.

  1. Solar project description for Gill Harrop Builders single-family detached residence, Big Flats, New York

    NASA Astrophysics Data System (ADS)

    1982-04-01

    A house with approximately 1360 square feet of conditioned space heated by a direct gain system with manually operated insulated curtains is discussed. Solar heating is augmented by electric resistance heating, and a wood burning stove may be installed. Sunlight is admitted through both south facing windows and through clerestory collector panels and is absorbed and stored as heat in a concrete floor and wall. Heat is then distributed by natural convection and radiation. Temperature regulation is assisted by earth berms. Three modes of operation are described: collector-to-storage, storage-to-space heating, and passive space cooling, which is accomplished by shading, movable insulation, and ventilation. The instrumentation for the National Solar Data Network is described. The solar energy portion of the construction costs is estimated.

  2. Events diary

    NASA Astrophysics Data System (ADS)

    2000-01-01

    as Imperial College, the Royal Albert Hall, the Royal College of Art, the Natural History and Science Museums and the Royal Geographical Society. Under the heading `Shaping the future together' BA2000 will explore science, engineering and technology in their wider cultural context. Further information about this event on 6 - 12 September may be obtained from Sandra Koura, BA2000 Festival Manager, British Association for the Advancement of Science, 23 Savile Row, London W1X 2NB (tel: 0171 973 3075, e-mail: sandra.koura@britassoc.org.uk ). Details of the creating SPARKS events may be obtained from creating.sparks@britassoc.org.uk or from the website www.britassoc.org.uk . Other events 3 - 7 July, Porto Alegre, Brazil VII Interamerican conference on physics education: The preparation of physicists and physics teachers in contemporary society. Info: IACPE7@if.ufrgs.br or cabbat1.cnea.gov.ar/iacpe/iacpei.htm 27 August - 1 September, Barcelona, Spain GIREP conference: Physics teacher education beyond 2000. Info: www.blues.uab.es/phyteb/index.html

  3. HBP Builder: A Tool to Generate Hyperbranched Polymers and Hyperbranched Multi-Arm Copolymers for Coarse-grained and Fully Atomistic Molecular Simulations

    NASA Astrophysics Data System (ADS)

    Yu, Chunyang; Ma, Li; Li, Shanlong; Tan, Haina; Zhou, Yongfeng; Yan, Deyue

    2016-05-01

    Computer simulation has become a versatile tool that can investigate detailed information from the microscopic scale to the mesoscopic scale. However, the crucial first step of molecular simulation is model building, particularly for hyperbranched polymers (HBPs) and hyperbranched multi-arm copolymers (HBMCs) with complex and varied topological structures. Unlike well-defined polymers, not only is the molecular weight of HBPs/HBMCs polydisperse, but HBPs/HBMCs with the same degree of polymerization (DP) and degree of branching (DB) can also have many possible topological structures, which makes it difficult for users to build models for molecular simulation. In order to build a bridge between model building and molecular simulation of HBPs and HBMCs, we developed HBP Builder, an open-source, C-language HBP/HBMC building toolkit. HBP Builder implements an automated protocol to build various coarse-grained and fully atomistic structures of HBPs/HBMCs according to users' specific requirements. Meanwhile, coarse-grained and fully atomistic output structures can be directly employed in popular simulation packages, including HOOMD, Tinker and Gromacs. Moreover, HBP Builder has an easy-to-use graphical user interface and a modular architecture, making it easy to extend and reuse as part of other programs.
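    To illustrate why a single (DP, DB) pair corresponds to many possible topologies, the fragment below randomly grows an AB2-type tree and evaluates its degree of branching (Fréchet definition). It is a toy stand-in, not HBP Builder's actual generation algorithm or output formats.

    ```python
    # Sketch: random hyperbranched (AB2) topology generation plus DB calculation.
    import random
    from collections import Counter

    def random_ab2_tree(dp, seed=None):
        """Attach each new AB2 monomer to a randomly chosen unreacted B site;
        return a parent-index list describing the resulting tree."""
        rng = random.Random(seed)
        parents = [-1]                        # monomer 0 acts as the focal unit
        open_sites = [0, 0]                   # its two unreacted B sites
        for child in range(1, dp):
            parents.append(open_sites.pop(rng.randrange(len(open_sites))))
            open_sites.extend([child, child]) # the new monomer adds two B sites
        return parents

    def degree_of_branching(parents):
        kids = Counter(p for p in parents if p >= 0)
        n = len(parents)
        dendritic = sum(1 for i in range(n) if kids.get(i, 0) == 2)
        terminal = sum(1 for i in range(n) if kids.get(i, 0) == 0)
        linear = n - dendritic - terminal
        return (dendritic + terminal) / (dendritic + linear + terminal)

    topology = random_ab2_tree(dp=50, seed=7)
    print("DB =", round(degree_of_branching(topology), 2))
    ```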

  4. HBP Builder: A Tool to Generate Hyperbranched Polymers and Hyperbranched Multi-Arm Copolymers for Coarse-grained and Fully Atomistic Molecular Simulations.

    PubMed

    Yu, Chunyang; Ma, Li; Li, Shanlong; Tan, Haina; Zhou, Yongfeng; Yan, Deyue

    2016-01-01

    Computer simulation has become a versatile tool that can investigate detailed information from the microscopic scale to the mesoscopic scale. However, the crucial first step of molecular simulation is model building, particularly for hyperbranched polymers (HBPs) and hyperbranched multi-arm copolymers (HBMCs) with complex and varied topological structures. Unlike well-defined polymers, not only is the molecular weight of HBPs/HBMCs polydisperse, but HBPs/HBMCs with the same degree of polymerization (DP) and degree of branching (DB) can also have many possible topological structures, which makes it difficult for users to build models for molecular simulation. In order to build a bridge between model building and molecular simulation of HBPs and HBMCs, we developed HBP Builder, an open-source, C-language HBP/HBMC building toolkit. HBP Builder implements an automated protocol to build various coarse-grained and fully atomistic structures of HBPs/HBMCs according to users' specific requirements. Meanwhile, coarse-grained and fully atomistic output structures can be directly employed in popular simulation packages, including HOOMD, Tinker and Gromacs. Moreover, HBP Builder has an easy-to-use graphical user interface and a modular architecture, making it easy to extend and reuse as part of other programs. PMID:27188541

  5. HBP Builder: A Tool to Generate Hyperbranched Polymers and Hyperbranched Multi-Arm Copolymers for Coarse-grained and Fully Atomistic Molecular Simulations

    PubMed Central

    Yu, Chunyang; Ma, Li; Li, Shanlong; Tan, Haina; Zhou, Yongfeng; Yan, Deyue

    2016-01-01

    Computer simulation has become a versatile tool for investigating detailed information from the microscopic to the mesoscopic scale. However, the crucial first step of any molecular simulation is model building, which is particularly demanding for hyperbranched polymers (HBPs) and hyperbranched multi-arm copolymers (HBMCs) with complex and varied topological structures. Unlike well-defined polymers, HBPs/HBMCs not only have polydisperse molar masses, but even HBPs/HBMCs with the same degree of polymerization (DP) and degree of branching (DB) can adopt many possible topologies, which makes model building for molecular simulation difficult. To bridge model building and molecular simulation of HBPs and HBMCs, we developed HBP Builder, an open-source HBP/HBMC building toolkit written in C. HBP Builder implements an automated protocol to build various coarse-grained and fully atomistic structures of HBPs/HBMCs according to user-specified requirements. Coarse-grained and fully atomistic output structures can be used directly in popular simulation packages, including HOOMD, Tinker, and Gromacs. Moreover, HBP Builder has an easy-to-use graphical user interface and a modular architecture, making it easy to extend and to reuse as part of other programs. PMID:27188541

  6. News Conference: Bloodhound races into history Competition: School launches weather balloon Course: Update weekends inspire teachers Conference: Finland hosts GIREP conference Astronomy: AstroSchools sets up schools network to share astronomy knowledge Teaching: Delegates praise science events in Wales Resources: ELI goes from strength to strength International: South Sudan teachers receive training Workshop: Delegates experience universality

    NASA Astrophysics Data System (ADS)

    2011-11-01

    Conference: Bloodhound races into history Competition: School launches weather balloon Course: Update weekends inspire teachers Conference: Finland hosts GIREP conference Astronomy: AstroSchools sets up schools network to share astronomy knowledge Teaching: Delegates praise science events in Wales Resources: ELI goes from strength to strength International: South Sudan teachers receive training Workshop: Delegates experience universality

  7. Fast fitting of non-Gaussian state-space models to animal movement data via Template Model Builder.

    PubMed

    Albertsen, Christoffer Moesgaard; Whoriskey, Kim; Yurkowski, David; Nielsen, Anders; Mills, Joanna

    2015-10-01

    State-space models (SSM) are often used for analyzing complex ecological processes that are not observed directly, such as marine animal movement. When outliers are present in the measurements, special care is needed in the analysis to obtain reliable location and process estimates. Here we recommend using the Laplace approximation combined with automatic differentiation (as implemented in the novel R package Template Model Builder; TMB) for the fast fitting of continuous-time multivariate non-Gaussian SSMs. Through Argos satellite tracking data, we demonstrate that the use of continuous-time t-distributed measurement errors for error-prone data is more robust to outliers and improves the location estimation compared to using discretized-time t-distributed errors (implemented with a Gibbs sampler) or using continuous-time Gaussian errors (as with the Kalman filter). Using TMB, we are able to estimate additional parameters compared to previous methods, all without requiring a substantial increase in computational time. The model implementation is made available through the R package argosTrack. PMID:26649381
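
    The analysis above relies on the R packages TMB and argosTrack; purely as an illustration of the underlying idea, the Python sketch below jointly optimizes latent positions under a Gaussian random-walk process model with Student-t measurement errors, which down-weights Argos-style outliers. All parameter values and the one-dimensional setup are made up for illustration.

        # Minimal sketch of robust state-space smoothing with t-distributed measurement
        # errors (the paper uses the R packages TMB and argosTrack; this is only an
        # illustration of the idea, with made-up parameter values).
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import t as student_t

        rng = np.random.default_rng(1)
        n = 200
        truth = np.cumsum(rng.normal(0.0, 0.1, n))          # 1-D random-walk "track"
        obs = truth + student_t.rvs(df=3, scale=0.3, size=n, random_state=rng)  # heavy-tailed errors

        def neg_log_posterior(x, y, sigma_proc=0.1, df=3, scale=0.3):
            # process model: random walk with Gaussian increments
            proc = 0.5 * np.sum(np.diff(x) ** 2) / sigma_proc ** 2
            # measurement model: Student-t errors, robust to Argos-style outliers
            meas = -np.sum(student_t.logpdf(y - x, df=df, scale=scale))
            return proc + meas

        res = minimize(neg_log_posterior, x0=obs.copy(), args=(obs,), method="L-BFGS-B")
        smoothed = res.x
        print("RMSE raw: %.3f  smoothed: %.3f" % (np.sqrt(np.mean((obs - truth) ** 2)),
                                                  np.sqrt(np.mean((smoothed - truth) ** 2))))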

  8. AIRID: an application of the KAS/Prospector expert system builder to airplane identification

    SciTech Connect

    Aldridge, J.P.

    1984-01-01

    The Knowledge Acquisition System/Prospector expert system building tool developed by SRI, International, has been used to construct an expert system to identify aircraft on the basis of observables such as wing shape, engine number/location, fuselage shape, and tail assembly shape. Additional detailed features are allowed to influence the identification as other favorable features. Constraints on the observations imposed by bad weather and distant observations have been included as contexts to the models. Models for Soviet and US fighter aircraft have been included. Inclusion of other types of aircraft such as bombers, transports, and reconnaissance craft is straightforward. Two models permit exploration of the interaction of semantic and taxonomic networks with the models. A full set of text data for fluid communication with the user has been included. The use of demons as triggered output responses to enhance utility to the user has been explored. This paper presents discussion of the ease of building the expert system using this powerful tool and problems encountered in the construction process.
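
    KAS/Prospector-style systems combine evidence through subjective Bayesian odds updating: prior odds are multiplied by a sufficiency factor (LS) when an observable is reported present and by a necessity factor (LN) when it is reported absent. The toy Python sketch below shows only that mechanism; the aircraft classes, observables and factor values are invented and are not taken from AIRID.

        # Toy illustration of Prospector-style odds updating for aircraft identification.
        # Hypothesis names, observables and LS/LN values are all hypothetical.
        def to_odds(p):
            return p / (1.0 - p)

        def to_prob(odds):
            return odds / (1.0 + odds)

        # LS applied when the observable is reported present, LN when reported absent.
        rules = {
            "fighter_A": {"swept_wing": (4.0, 0.5), "twin_engine": (3.0, 0.4)},
            "fighter_B": {"swept_wing": (0.6, 1.5), "twin_engine": (5.0, 0.3)},
        }

        def update(prior, hypothesis, observations):
            odds = to_odds(prior)
            for observable, present in observations.items():
                ls, ln = rules[hypothesis][observable]
                odds *= ls if present else ln
            return to_prob(odds)

        obs = {"swept_wing": True, "twin_engine": False}
        for h in rules:
            print(h, round(update(prior=0.2, hypothesis=h, observations=obs), 3))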

  9. Synchronous changes in the seismicity rate and ocean-bottom hydrostatic pressures along the Nankai trough: A possible slow slip event detected by the Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET)

    NASA Astrophysics Data System (ADS)

    Suzuki, Kensuke; Nakano, Masaru; Takahashi, Narumi; Hori, Takane; Kamiya, Shinichiro; Araki, Eiichiro; Nakata, Ryoko; Kaneda, Yoshiyuki

    2016-06-01

    We detected long-term hydrostatic pressure changes at ocean-bottom stations of the Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET) along the Nankai trough, off southwestern Japan. We detected these changes after removing the contributions of ocean mass variations and sensor drift from the records. In addition, we detected a decrease in the background seismicity rate of a nearby earthquake cluster that was synchronous with the hydrostatic pressure changes. We interpreted these observed hydrostatic pressure changes to reflect vertical deformation of the ocean floor of 3-8 cm, and we consider the cause of the seafloor crustal deformation to be a slow slip event (SSE) beneath the stations. Because the pressure changes were observed at stations with distances less than 20 km to each other, we inferred that the SSE occurred in the shallow part of the sedimentary wedge, such as on a splay fault system. The synchronous observation of an SSE and a seismicity rate change suggests that both were triggered by a change in the regional stress that may be associated with stress accumulation and release processes occurring along the Nankai trough. These data show that continuous and careful monitoring of crustal activities by DONET stations provides an effective way to detect seismic and geodetic signals related to the occurrence of megathrust or other types of large earthquakes.

  10. On the usability of frequency distributions and source attribution of Cs-137 detections encountered in the IMS radio-nuclide network for radionuclide event screening and climate change monitoring

    NASA Astrophysics Data System (ADS)

    Becker, A.; Wotawa, G.; Zähringer, M.

    2009-04-01

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), airborne radioactivity is measured by means of high purity Germanium gamma ray detectors deployed in a global monitoring network. Almost 60 of the scheduled 80 stations have been put in provisional operations by the end of 2008. Each station daily sends the 24 hour samples' spectroscopic data to the Vienna based Provisional Technical Secretariat (PTS) of the CTBT Organization (CTBTO) for review for treaty-relevant nuclides. Cs-137 is one of these relevant isotopes. Its typical minimum detectable concentration is in the order of a few Bq/m3. However, this isotope is also known to occur in atmospheric trace concentrations, due to known non CTBT relevant processes and sources related to, for example, the re-suspension of cesium from historic nuclear tests and/or the Chernobyl reactor disaster, temporarily enhanced by bio-mass burning (Wotawa et al. 2006). Properly attributed cesium detections can be used as a proxy to detect Aeolian dust events (Igarashi et al, 2001) that potentially carry cesium from all aforementioned sources but are also known to play an important role for the radiative forcing in the atmosphere (shadow effect), at the surface (albedo) and the carbon dioxide cycle when interacting with oceanic phytoplankton (Mikami and Shi, 2005). In this context this paper provides a systematic attribution of recent Cs-137 detections in the PTS monitoring network in order to Characterize those stations which are regularly affected by Cs-137 Provide input for procedures that distinguish CTBT relevant detection from other sources (event screening) Explore on the capability of certain stations to use their Cs-137 detections as a proxy to detect aeolian dust events and to flag the belonging filters to be relevant for further investigations in this field (-> EGU-2009 Session CL16/AS4.6/GM10.1: Aeolian dust: initiator, player, and recorder of environmental change). References Igarashi, Y., M

  11. The Loudest Gravitational Wave Events

    NASA Astrophysics Data System (ADS)

    Chen, Hsin-Yu; Holz, Daniel

    2014-03-01

    Compact binary coalescences are likely to be the source of the first gravitational wave (GW) detections. While most Advanced LIGO-Virgo detections are expected to have signal-to-noise ratios (SNR) near the detection threshold, there will be a distribution of events to higher SNR. Assuming the space density of the sources is uniform in the nearby Universe, we derive the universal distribution of SNR in an arbitrary GW network, as well as the SNR distribution of the loudest event. These distributions only depend on the detection threshold and the number of detections; they are independent of the detector network, sensitivity, and the distribution of source variables such as the binary masses and spins. We also derive the SNR distribution for each individual detector within a network as a function of the detector orientation. We find that, in 90% of cases, the loudest event out of the first four Advanced LIGO-Virgo detections should be louder than SNR of 15.8 (for a threshold of 12), increasing to an SNR of 31 for 40 detections. We expect these loudest events to provide the best constraints on their source parameters, and therefore play an important role in extracting astrophysics from GW sources.
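
    The quoted numbers follow from a simple geometric argument: for sources distributed uniformly in volume, SNR falls off as the inverse of distance, so detected events obey P(rho > x) = (rho_th/x)^3 and the loudest of N detections has CDF [1 - (rho_th/x)^3]^N. A few lines of Python, assuming only that scaling, reproduce the values quoted above.

        # Reproduce the quoted loudest-event SNRs from the uniform-in-volume argument:
        # P(rho > x | detected) = (rho_th / x)**3, so the loudest of N detections
        # satisfies [1 - (rho_th/x)**3]**N = 1 - credible_level.
        def loudest_snr(n_events, rho_th=12.0, credible=0.90):
            tail = 1.0 - (1.0 - credible) ** (1.0 / n_events)   # per-event exceedance prob.
            return rho_th / tail ** (1.0 / 3.0)

        print(round(loudest_snr(4), 1))    # ~15.8, as quoted for the first 4 detections
        print(round(loudest_snr(40), 1))   # ~31, as quoted for 40 detections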

  12. 75 FR 43990 - Agency Information Collection Activities; Proposed Collection; Comment Request; Pet Event...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... Collection; Comment Request; Pet Event Tracking Network--State, Federal Cooperation to Prevent Spread of Pet... on the paperwork requirements for the proposed Pet Event Tracking Network (PETNet) cooperative... technology. Pet Event Tracking Network--State, Federal Cooperation to Prevent Spread of Pet Food...

  13. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  14. Initial Results on Soil Moisture in Relation to Timing of Snowpack, Temperature, and Heavy Vs. Moderate Rain Events from a New Soil Monitoring Network in the Southern Rocky Mountains

    NASA Astrophysics Data System (ADS)

    Osenga, E. C.; Schnissel, J.; Katzenberger, J.

    2014-12-01

    The Roaring Fork Valley (RFV) in the Rocky Mountains of Colorado comprises a diversity of ecosystems occurring within a single watershed. From one end of the valley to the other, the landscape rises more than 1500 m in elevation, creating a unique opportunity for comparison of conditions in different habitats of close geographic proximity. Interested in comparing the ecological responses of these different habitats in the context of rising global temperatures, the Aspen Global Change Institute (AGCI) partnered with the City of Aspen, Pitkin County Open Space and Trails, and the Aspen Center for Environmental Studies to install soil monitoring stations at multiple elevations within the watershed. Soil moisture was identified as the primary indicator for monitoring because there is a dearth of local soil moisture data, and soil moisture plays a vital role in plant survival and correlates closely with precipitation and air temperature. Additionally, as precipitation regimes shift in the future, there is a need to better understand the interplay between vegetative water availability during the critical early growing season and timing, areal extent, and depth of snowpack. Two initial soil monitoring stations were installed in undeveloped, montane ecosystems of the Roaring Fork Watershed in 2012. Each station measures air temperature; relative humidity; rainfall; and soil moisture at 5, 20, and 52 cm depths. Two additional soil monitoring stations are being established over the summer of 2014, and additional stations within the Roaring Fork soil moisture network are planned for future years. Early data from the existing sites indicate the importance of timing of snowmelt in maintaining soil moisture through the early dry months of summer and a dissimilarity between the impacts of moderate and heavy rain events on soil moisture at different depths. These data have implications for restoration, management, and planning for local ecosystems and have significance for

  15. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES Beta

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  16. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    SciTech Connect

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  17. A framework for incorporating DTI Atlas Builder registration into tract-based spatial statistics and a simulated comparison to standard TBSS

    NASA Astrophysics Data System (ADS)

    Leming, Matthew; Steiner, Rachel; Styner, Martin

    2016-03-01

    Tract-based spatial statistics (TBSS)6 is a software pipeline widely employed in comparative analysis of the white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different measurements of DTI (fractional anisotropy, FA, axial diffusivity, AD, radial diffusivity, RD, and mean diffusivity, MD). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) by using the advanced registration offered in DTI Atlas Builder7. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences for diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in these data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using a functional analysis of diffusion tensor tract statistics (FADTTS10). Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in other measurements of DTI. Within the simulated changes investigated here, this study suggests that the use of DTI Atlas Builder's registration enhances TBSS group-based studies.
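
    The simulation strategy described here, scaling one or more tensor eigenvalues and recomputing the scalar maps, can be sketched in a few lines of numpy. The eigenvalues and the 10% scaling factor below are arbitrary illustration values, not those used in the study.

        # Minimal sketch of simulating a group-wise DTI change by scaling tensor
        # eigenvalues, then recomputing the standard scalar measures. The 10% factor
        # is an arbitrary illustration, not the value used in the study.
        import numpy as np

        def dti_scalars(evals):
            """evals: (..., 3) eigenvalues sorted descending. Returns FA, AD, RD, MD."""
            l1, l2, l3 = evals[..., 0], evals[..., 1], evals[..., 2]
            md = evals.mean(axis=-1)
            fa = np.sqrt(1.5 * ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
                         / (l1 ** 2 + l2 ** 2 + l3 ** 2))
            return fa, l1, 0.5 * (l2 + l3), md

        evals = np.array([[1.7e-3, 0.4e-3, 0.3e-3]])   # a typical white-matter tensor
        modified = evals.copy()
        modified[..., 0] *= 1.10                       # directly alters AD (and FA, MD)

        for label, e in [("original", evals), ("AD increased", modified)]:
            fa, ad, rd, md = dti_scalars(e)
            print(label, "FA=%.3f AD=%.2e RD=%.2e MD=%.2e" % (fa[0], ad[0], rd[0], md[0]))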

  18. CAISSON: Interconnect Network Simulator

    NASA Technical Reports Server (NTRS)

    Springer, Paul L.

    2006-01-01

    CAISSON is Cray's response to the HPCS initiative: a simulator for modeling the interconnect of a future petaflop computer. It applies parallel discrete event simulation techniques to large-scale network simulation and is built on the WarpIV engine. It runs on hardware ranging from a laptop to an Altix 3000 and can be sized up to 1000 simulated nodes per host node, with good parallel scaling characteristics. The simulator is flexible, supporting multiple injectors, arbitration strategies, queue iterators, and network topologies.

  19. Catalysis in reaction networks.

    PubMed

    Gopalkrishnan, Manoj

    2011-12-01

    We define catalytic networks as chemical reaction networks with an essentially catalytic reaction pathway: one which is "on" in the presence of certain catalysts and "off" in their absence. We show that examples of catalytic networks include synthetic DNA molecular circuits that have been shown to perform signal amplification and molecular logic. Recall that a critical siphon is a subset of the species in a chemical reaction network whose absence is forward invariant and stoichiometrically compatible with a positive point. Our main theorem is that all weakly-reversible networks with critical siphons are catalytic. Consequently, we obtain new proofs for the persistence of atomic event-systems of Adleman et al., and normal networks of Gnacadja. We define autocatalytic networks, and conjecture that a weakly-reversible reaction network has critical siphons if and only if it is autocatalytic. PMID:21503834
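
    To make the siphon definition concrete: a species set Z is a siphon if every reaction that produces a member of Z also consumes a member of Z. The following Python sketch checks that property on a small hypothetical network (not one of the DNA circuits discussed in the paper).

        # Toy check of the siphon property: a species set Z is a siphon if every
        # reaction that produces some species in Z also consumes some species in Z.
        # The example network is hypothetical and only illustrates the definition.
        reactions = [
            ({"A"}, {"B"}),           # A -> B
            ({"B", "C"}, {"A", "C"})  # B + C -> A + C   (C acts catalytically)
        ]

        def is_siphon(z, reactions):
            for reactants, products in reactions:
                if products & z and not (reactants & z):
                    return False
            return True

        print(is_siphon({"A", "B"}, reactions))  # True: A and B are only produced from each other
        print(is_siphon({"C"}, reactions))       # True: C is only produced where C is consumed
        print(is_siphon({"A"}, reactions))       # False: reaction 2 makes A without consuming A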

  20. A discrete event method for wave simulation

    SciTech Connect

    Nutaro, James J

    2006-01-01

    This article describes a discrete event interpretation of the finite difference time domain (FDTD) and digital wave guide network (DWN) wave simulation schemes. The discrete event method is formalized using the discrete event system specification (DEVS). The scheme is shown to have errors that are proportional to the resolution of the spatial grid. A numerical example demonstrates the relative efficiency of the scheme with respect to FDTD and DWN schemes. The potential for the discrete event scheme to reduce numerical dispersion and attenuation errors is discussed.
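
    For reference, the conventional time-stepped scheme that the discrete event formulation reinterprets can be written very compactly. The sketch below is a minimal 1-D FDTD (leapfrog) update for the scalar wave equation with arbitrary illustration parameters; it is not the DEVS implementation described in the article.

        # A minimal conventional 1-D FDTD (leapfrog) update for the scalar wave
        # equation, included as the time-stepped reference scheme that the discrete
        # event (DEVS) formulation reinterprets. Grid size and Courant number are
        # arbitrary illustration values.
        import numpy as np

        nx, nt, courant = 201, 400, 0.9
        u_prev = np.exp(-0.01 * (np.arange(nx) - nx // 2) ** 2)  # Gaussian initial pulse
        u = u_prev.copy()                                        # zero initial velocity

        for _ in range(nt):
            u_next = np.empty_like(u)
            u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                            + courant ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
            u_next[0] = u_next[-1] = 0.0                         # fixed (reflecting) ends
            u_prev, u = u, u_next

        print("peak amplitude after %d steps: %.3f" % (nt, u.max()))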

  1. Assessing Special Events.

    ERIC Educational Resources Information Center

    Neff, Bonita Dostal

    Special events defined as being "newsworthy events" are becoming a way of American life. They are also a means for making a lot of money. Examples of special events that are cited most frequently are often the most minor of events; e.g., the open house, the new business opening day gala, or a celebration of some event in an organization. Little…

  2. Building America Best Practices Series: Volume 2. Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Hot-Dry and Mixed-Dry Climates

    SciTech Connect

    Baechler, M. C.; Taylor, Z. T.; Bartlett, R.; Gilbride, T.; Hefty, M.; Love, P. M.

    2005-09-01

    This best practices guide is part of a series produced by Building America. The guidebook is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the hot-dry and mixed-dry climates. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team—from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  3. Event Segmentation Ability Uniquely Predicts Event Memory

    PubMed Central

    Sargent, Jesse Q.; Zacks, Jeffrey M.; Hambrick, David Z.; Zacks, Rose T.; Kurby, Christopher A.; Bailey, Heather R.; Eisenberg, Michelle L.; Beck, Taylor M.

    2013-01-01

    Memory for everyday events plays a central role in tasks of daily living, autobiographical memory, and planning. Event memory depends in part on segmenting ongoing activity into meaningful units. This study examined the relationship between event segmentation and memory in a lifespan sample to answer the following question: Is the ability to segment activity into meaningful events a unique predictor of subsequent memory, or is the relationship between event perception and memory accounted for by general cognitive abilities? Two hundred and eight adults ranging from 20 to 79 years old segmented movies of everyday events and attempted to remember the events afterwards. They also completed psychometric ability tests and tests measuring script knowledge for everyday events. Event segmentation and script knowledge both explained unique variance in event memory above and beyond the psychometric measures, and did so as strongly in older as in younger adults. These results suggest that event segmentation is a basic cognitive mechanism, important for memory across the lifespan. PMID:23942350

  4. Automated beam builder

    NASA Technical Reports Server (NTRS)

    Muench, W. K.

    1980-01-01

    Requirements for the space fabrication of large space structures are considered with emphasis on the design, development, manufacture, and testing of a machine which automatically produces a basic building block aluminum beam. Particular problems discussed include those associated with beam cap forming; brace storage, dispensing, and transporting; beam component fastening; and beam cut-off. Various critical process tests conducted to develop technology for a machine to produce composite beams are also discussed.

  5. Autonomous Module Builder

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Conventional cotton harvesting requires many seasonal laborers. To reduce labor requirements, equipment manufacturers have recently introduced harvesters with on-board module building capabilities; however, this feature is only available on pickers and these machines are expensive. Conventional mo...

  6. Event-Based Science.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    1992-01-01

    Suggests that an event-based science curriculum can provide the framework for deciding what to retain in an overloaded science curriculum. Provides examples of current events and the science concepts explored related to the event. (MDH)

  7. A wireless time synchronized event control system

    NASA Astrophysics Data System (ADS)

    Klug, Robert; Williams, Jonathan; Scheffel, Peter

    2014-05-01

    McQ has developed a wireless, time-synchronized, event control system to control, monitor, and record events with precise timing over large test sites for applications such as high speed rocket sled payload testing. Events of interest may include firing rocket motors and launch sleds, initiating flares, ejecting bombs, ejecting seats, triggering high speed cameras, measuring sled velocity, and triggering events based on a velocity window or other criteria. The system consists of Event Controllers, a Launch Controller, and a wireless network. The Event Controllers can be easily deployed at areas of interest within the test site and maintain sub-microsecond timing accuracy for monitoring sensors, electronically triggering other equipment and events, and providing timing signals to other test equipment. Recorded data and status information is reported over the wireless network to a server and user interface. Over the wireless network, the user interface configures the system based on a user specified mission plan and provides real time command, control, and monitoring of the devices and data. An overview of the system, its features, performance, and potential uses is presented.

  8. Visualizing gravitational-wave event candidates using the coherent event display

    NASA Astrophysics Data System (ADS)

    Mercer, R. A.; Klimenko, S.

    2008-09-01

    As a worldwide network of gravitational-wave detectors is now operating with unprecedented sensitivity, it is becoming increasingly important to be able to easily visualize gravitational-wave event candidates from various search algorithms using these detector networks. The coherent event display (CED) has been developed with the goal of providing a simple and easy-to-use tool for performing follow-up analyses of burst gravitational-wave event candidates. The CED produces a web page detailing reconstructed parameters, time-frequency maps, reconstructed detector responses, likelihood time-frequency maps and reconstructed parameter skymaps. The CED supports events from all 2, 3, 4 and 5 detector network combinations of the LIGO, GEO600 and Virgo detectors.

  9. Using Bayesian Belief Networks and event trees for volcanic hazard assessment and decision support : reconstruction of past eruptions of La Soufrière volcano, Guadeloupe and retrospective analysis of 1975-77 unrest.

    NASA Astrophysics Data System (ADS)

    Komorowski, Jean-Christophe; Hincks, Thea; Sparks, Steve; Aspinall, Willy; Legendre, Yoann; Boudon, Georges

    2013-04-01

    Since 1992, mild but persistent seismic and fumarolic unrest at La Soufrière de Guadeloupe volcano has prompted renewed concern about hazards, risks and crisis-response planning, and has rejuvenated interest in geological studies. Scientists monitoring active volcanoes frequently have to provide science-based decision support to civil authorities during such periods of unrest. In these circumstances, the Bayesian Belief Network (BBN) offers a formalized evidence analysis tool for making inferences about the state of the volcano from different strands of data, allowing associated uncertainties to be treated in a rational and auditable manner, to the extent warranted by the strength of the evidence. To illustrate the principles of the BBN approach, a retrospective analysis is undertaken of the 1975-77 crisis, providing an inferential assessment of the evolving state of the magmatic system and the probability of subsequent eruption. Conditional dependencies and parameters in the BBN are characterized quantitatively by structured expert elicitation. Revisiting the data available in 1976 suggests that the probability of magmatic intrusion would have been evaluated as high at the time, in accordance with subsequent thinking about the volcanological nature of the episode. The corresponding probability of a magmatic eruption therefore would have been elevated in July and August 1976; however, collective uncertainty about the future course of the crisis was great at the time, even if some individual opinions were certain. From this BBN analysis, while the more likely appraised outcome - based on observational trends as of 31 August 1976 - might have been 'no eruption' (mean probability 0.5; 5-95 percentile range 0.8), an imminent magmatic eruption (or blast) could have had a probability of about 0.4, almost as substantial. Thus, there was no real scientific basis to assert that one scenario was more likely than the other. This retrospective evaluation adds objective probabilistic expression to
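
    The mechanics of such an evidential update can be shown with a deliberately simplified sketch: two binary unrest indicators, conditional independence assumed for brevity (which the full BBN does not require), and entirely hypothetical probabilities in place of the expert-elicited values used in the study.

        # Mechanics of a Bayesian update for the volcanic-unrest example. All
        # probabilities below are hypothetical placeholders, not the elicited values
        # used in the study, and conditional independence is assumed for brevity.
        prior = {"magmatic": 0.3, "hydrothermal_only": 0.7}

        # P(evidence | hypothesis) for two binary indicators
        likelihood = {
            "magmatic":          {"seismicity_up": 0.8, "fumaroles_up": 0.7},
            "hydrothermal_only": {"seismicity_up": 0.4, "fumaroles_up": 0.5},
        }

        observed = ["seismicity_up", "fumaroles_up"]

        unnorm = {}
        for h, p in prior.items():
            for e in observed:
                p *= likelihood[h][e]
            unnorm[h] = p

        total = sum(unnorm.values())
        posterior = {h: p / total for h, p in unnorm.items()}
        print(posterior)   # P(magmatic | evidence) rises from 0.3 to ~0.55 with these numbers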

  10. LHCb Online event processing and filtering

    NASA Astrophysics Data System (ADS)

    Alessio, F.; Barandela, C.; Brarda, L.; Frank, M.; Franek, B.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Köstner, S.; Moine, G.; Neufeld, N.; Somogyi, P.; Stoica, R.; Suman, S.

    2008-07-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking for the LHC startup. Control, configuration and security aspects will also be discussed.

  11. VAFLE: visual analytics of firewall log events

    NASA Astrophysics Data System (ADS)

    Ghoniem, Mohammad; Shurkhovetskyy, Georgiy; Bahey, Ahmed; Otjacques, Benoît.

    2013-12-01

    In this work, we present VAFLE, an interactive network security visualization prototype for the analysis of firewall log events. Keeping it simple yet effective for analysts, we provide multiple coordinated interactive visualizations augmented with clustering capabilities customized to support anomaly detection and cyber situation awareness. We evaluate the usefulness of the prototype in a use case with network traffic datasets from previous VAST Challenges, illustrating its effectiveness at promoting fast and well-informed decisions. We explain how a security analyst may spot suspicious traffic using VAFLE. We further assess its usefulness through a qualitative evaluation involving network security experts, whose feedback is reported and discussed.

  12. Episodes, events, and models.

    PubMed

    Khemlani, Sangeet S; Harrison, Anthony M; Trafton, J Gregory

    2015-01-01

    We describe a novel computational theory of how individuals segment perceptual information into representations of events. The theory is inspired by recent findings in the cognitive science and cognitive neuroscience of event segmentation. In line with recent theories, it holds that online event segmentation is automatic, and that event segmentation yields mental simulations of events. But it posits two novel principles as well: first, discrete episodic markers track perceptual and conceptual changes, and can be retrieved to construct event models. Second, the process of retrieving and reconstructing those episodic markers is constrained and prioritized. We describe a computational implementation of the theory, as well as a robotic extension of the theory that demonstrates the processes of online event segmentation and event model construction. The theory is the first unified computational account of event segmentation and temporal inference. We conclude by demonstrating how neuroimaging data can constrain and inspire the construction of process-level theories of human reasoning. PMID:26578934

  13. Episodes, events, and models

    PubMed Central

    Khemlani, Sangeet S.; Harrison, Anthony M.; Trafton, J. Gregory

    2015-01-01

    We describe a novel computational theory of how individuals segment perceptual information into representations of events. The theory is inspired by recent findings in the cognitive science and cognitive neuroscience of event segmentation. In line with recent theories, it holds that online event segmentation is automatic, and that event segmentation yields mental simulations of events. But it posits two novel principles as well: first, discrete episodic markers track perceptual and conceptual changes, and can be retrieved to construct event models. Second, the process of retrieving and reconstructing those episodic markers is constrained and prioritized. We describe a computational implementation of the theory, as well as a robotic extension of the theory that demonstrates the processes of online event segmentation and event model construction. The theory is the first unified computational account of event segmentation and temporal inference. We conclude by demonstrating how neuroimaging data can constrain and inspire the construction of process-level theories of human reasoning. PMID:26578934

  14. News Education: Physics Education Networks meeting has global scale Competition: Competition seeks the next Brian Cox Experiment: New measurement of neutrino time-of-flight consistent with the speed of light Event: A day for all those who teach physics Conference: Students attend first Anglo-Japanese international science conference Celebration: Will 2015 be the 'Year of Light'? Teachers: Challenging our intuition in spectacular fashion: the fascinating world of quantum physics awaits Research: Science sharpens up sport Learning: Kittinger and Baumgartner: on a mission to the edge of space International: London International Youth Science Forum calls for leading young scientists Competition: Physics paralympian challenge needs inquisitive, analytical, artistic and eloquent pupils Forthcoming events

    NASA Astrophysics Data System (ADS)

    2012-05-01

    Education: Physics Education Networks meeting has global scale Competition: Competition seeks the next Brian Cox Experiment: New measurement of neutrino time-of-flight consistent with the speed of light Event: A day for all those who teach physics Conference: Students attend first Anglo-Japanese international science conference Celebration: Will 2015 be the 'Year of Light'? Teachers: Challenging our intuition in spectacular fashion: the fascinating world of quantum physics awaits Research: Science sharpens up sport Learning: Kittinger and Baumgartner: on a mission to the edge of space International: London International Youth Science Forum calls for leading young scientists Competition: Physics paralympian challenge needs inquisitive, analytical, artistic and eloquent pupils Forthcoming events

  15. The global event system

    SciTech Connect

    Winans, J.

    1994-03-02

    The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done by the use of four new record types. These records are customized and are only supported by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with the vanilla EPICS events and the associated event records. They are very different.

  16. FTA Basic Event & Cut Set Ranking.

    Energy Science and Technology Software Center (ESTSC)

    1999-05-04

    Version 00 IMPORTANCE computes various measures of probabilistic importance of basic events and minimal cut sets to a fault tree or reliability network diagram. The minimal cut sets, the failure rates and the fault duration times (i.e., the repair times) of all basic events contained in the minimal cut sets are supplied as input data. The failure and repair distributions are assumed to be exponential. IMPORTANCE, a quantitative evaluation code, then determines the probability of the top event and computes the importance of minimal cut sets and basic events by a numerical ranking. Two measures are computed. The first describes system behavior at one point in time; the second describes sequences of failures that cause the system to fail in time. All measures are computed assuming statistical independence of basic events. In addition, system unavailability and expected number of system failures are computed by the code.
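
    The kind of computation such a code performs can be sketched as follows: steady-state unavailability q = lambda*tau/(1 + lambda*tau) for each basic event with exponential failure rate lambda and mean repair time tau, a min-cut upper bound for the top event, and a Fussell-Vesely importance ranking. The cut sets and parameter values in the sketch are invented examples, and only one of the several importance measures is shown.

        # Sketch of the kind of calculation IMPORTANCE performs: basic-event
        # unavailabilities from exponential failure/repair parameters, a min-cut upper
        # bound on the top-event probability, and Fussell-Vesely importance ranking.
        # The cut sets, failure rates and repair times below are invented examples.
        from math import prod

        basic = {            # event: (failure rate per hour, mean repair time in hours)
            "P1": (1e-4, 24.0),
            "P2": (1e-4, 24.0),
            "V1": (5e-5, 8.0),
        }
        cut_sets = [{"P1", "P2"}, {"V1"}]

        q = {e: lam * tau / (1.0 + lam * tau) for e, (lam, tau) in basic.items()}

        cs_prob = [prod(q[e] for e in cs) for cs in cut_sets]
        top = 1.0 - prod(1.0 - p for p in cs_prob)          # min-cut upper bound

        fussell_vesely = {
            e: sum(p for cs, p in zip(cut_sets, cs_prob) if e in cs) / top
            for e in basic
        }
        print("top-event probability: %.3e" % top)
        print(sorted(fussell_vesely.items(), key=lambda kv: -kv[1]))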

  17. A framework for incorporating DTI Atlas Builder registration into Tract-Based Spatial Statistics and a simulated comparison to standard TBSS

    PubMed Central

    Leming, Matthew; Steiner, Rachel; Styner, Martin

    2016-01-01

    Tract-based spatial statistics (TBSS)6 is a software pipeline widely employed in comparative analysis of the white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different measurements of DTI (fractional anisotropy, FA, axial diffusivity, AD, radial diffusivity, RD, and mean diffusivity, MD). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) by using the advanced registration offered in DTI Atlas Builder7. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences for diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in these data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using a functional analysis of diffusion tensor tract statistics (FADTTS10). Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in other measurements of DTI. Within the simulated changes investigated here, this study suggests that the use of DTI Atlas Builder’s registration enhances TBSS group-based studies.

  18. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, a Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. The system is structured, controlled and analysed using the Visual Object Net++ package, which is relatively simple and easy to use and produces representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementing the hierarchical discrete-event model as a real-time system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models can be run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved with Petri nets; discrete-event systems are a pragmatic tool for modelling industrial systems. To capture timing, the timed Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively, and simulation of the proposed robotic system with timed Petri nets offers the opportunity to examine its timing behaviour. From measured transport and transmission times, graphs are obtained showing the average time for the transport activity for each parameter set of finished products.
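
    The discrete-event semantics underlying such Petri net models reduces to an enabling check and a firing rule, as the minimal Python sketch below illustrates on a toy two-transition net (not the robotic model analysed in the paper).

        # Minimal marked Petri net (toy example, not the paper's robotic model):
        # a transition is enabled when every input place holds enough tokens, and
        # firing moves tokens from input to output places.
        marking = {"idle": 1, "part_ready": 1, "busy": 0, "done": 0}

        transitions = {
            "start":  ({"idle": 1, "part_ready": 1}, {"busy": 1}),
            "finish": ({"busy": 1},                  {"done": 1, "idle": 1}),
        }

        def enabled(name):
            pre, _ = transitions[name]
            return all(marking[p] >= n for p, n in pre.items())

        def fire(name):
            assert enabled(name), name + " is not enabled"
            pre, post = transitions[name]
            for p, n in pre.items():
                marking[p] -= n
            for p, n in post.items():
                marking[p] += n

        for t in ("start", "finish"):
            fire(t)
        print(marking)   # {'idle': 1, 'part_ready': 0, 'busy': 0, 'done': 1}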

  19. Creating Reality: How TV News Distorts Events.

    ERIC Educational Resources Information Center

    Altheide, David L.

    A three-year research project, including more than one year in a network affiliate station, provided the material for an analysis of current practices in television news programming. Based on the thesis that the organization of news encourages the oversimplification of events, this analysis traces the foundation of the bias called the "news…

  20. NASA Integrated Network COOP

    NASA Technical Reports Server (NTRS)

    Anderson, Michael L.; Wright, Nathaniel; Tai, Wallace

    2012-01-01

    Natural disasters, terrorist attacks, civil unrest, and other events have the potential of disrupting mission-essential operations in any space communications network. NASA's Space Communications and Navigation office (SCaN) is in the process of studying options for integrating the three existing NASA network elements, the Deep Space Network, the Near Earth Network, and the Space Network, into a single integrated network with common services and interfaces. The need to maintain Continuity of Operations (COOP) after a disastrous event has a direct impact on the future network design and operations concepts. The SCaN Integrated Network will provide support to a variety of user missions. The missions have diverse requirements and include anything from earth based platforms to planetary missions and rovers. It is presumed that an integrated network, with common interfaces and processes, provides an inherent advantage to COOP in that multiple elements and networks can provide cross-support in a seamless manner. The results of trade studies support this assumption but also show that centralization as a means of achieving integration can result in single points of failure that must be mitigated. The cost to provide this mitigation can be substantial. In support of this effort, the team evaluated the current approaches to COOP, developed multiple potential approaches to COOP in a future integrated network, evaluated the interdependencies of the various approaches to the various network control and operations options, and did a best value assessment of the options. The paper will describe the trade space, the study methods, and results of the study.

  1. Vaccine Adverse Events

    MedlinePlus

    ... Vaccines, Blood & Biologics Animal & Veterinary Cosmetics Tobacco Products Vaccines, Blood & Biologics Home Vaccines, Blood & Biologics Safety & Availability ( ... Center for Biologics Evaluation & Research Vaccine Adverse Events Vaccine Adverse Events Share Tweet Linkedin Pin it More ...

  2. Quartets and unrooted phylogenetic networks.

    PubMed

    Gambette, Philippe; Berry, Vincent; Paul, Christophe

    2012-08-01

    Phylogenetic networks were introduced to describe evolution in the presence of exchanges of genetic material between coexisting species or individuals. Split networks in particular were introduced as a special kind of abstract network to visualize conflicts between phylogenetic trees which may correspond to such exchanges. More recently, methods were designed to reconstruct explicit phylogenetic networks (whose vertices can be interpreted as biological events) from triplet data. In this article, we link abstract and explicit networks through their combinatorial properties, by introducing the unrooted analog of level-k networks. In particular, we give an equivalence theorem between circular split systems and unrooted level-1 networks. We also show how to adapt to quartets some existing results on triplets, in order to reconstruct unrooted level-k phylogenetic networks. These results give an interesting perspective on the combinatorics of phylogenetic networks and also raise algorithmic and combinatorial questions. PMID:22809417

  3. Networks model of the East Turkistan terrorism

    NASA Astrophysics Data System (ADS)

    Li, Ben-xian; Zhu, Jun-fang; Wang, Shun-guo

    2015-02-01

    The presence of the East Turkistan terrorist network in China can be traced back to the rebellions in the BAREN region of Xinjiang in April 1990. This article studies the East Turkistan networks in China and offers a panoramic view. The events, terrorists and their relationships are described using matrices, and social network analysis is then applied to reveal the network type and its structural characteristics. We also identify the crucial terrorist leader. The results show that the East Turkistan network has large hub nodes and short shortest paths, and that it follows a small-world pattern with a hierarchical structure.

  4. VOEventNet: Event Messaging for Astronomy

    NASA Astrophysics Data System (ADS)

    Drake, Andrew J.; Djorgovski, G.; Graham, M.; Williams, R.; Mahabal, A.; Donalek, C.; Glikman, E.; Bloom, J.; Vastrand, T.; White, R.; Rabinowitz, D.; Baltay, C.

    2006-12-01

    The time domain remains one of the least explored areas in modern astronomy. In the near future, the next generation of large synoptic sky surveys (Pan-STARRS, SkyMapper, LSST) will probe the time-dependent nature of the sky by detecting hundreds of thousands of astronomical transients (variable stars, asteroids, GRBs, lensing events). A global event distribution and follow-up network is required to characterize the nature of these transients. For over a year, the VOEventNet project has been implementing a transient event follow-up network which distributes structured data packets called VOEvents. These packets have been designed to be general enough to contain metadata for transients seen at all wavelengths, yet interpretable by robotic telescope systems (which are already automatically responding with follow-up observations). The VOEventNet project currently has transient event follow-up with the Palomar 60in and 200in telescopes (Caltech), RAPTOR (LANL), PAIRITEL and KAIT (UCB) as well as UK telescopes. VOEventNet transient event streams are publicly available. The subscription, publication and reception of VOEvents are implemented with a number of open-source software clients. The software and details of how to receive streams of events are available from http://www.voeventnet.org. Current event streams include OGLE microlensing events, SDSS Supernovae, GCN GRBs, Raptor and Palomar-Quest optical transients. In the near future, many additional streams of VOEvents will be available, including optical transients from the ESSENCE, Planet and MOA projects, as well as those from the UKIRT and JCMT telescopes. We also expect that transient event alerts will be available from solar, X-ray and radio telescopes.

  5. Dialogue on private events

    PubMed Central

    Palmer, David C.; Eshleman, John; Brandon, Paul; Layng, T. V. Joe; McDonough, Christopher; Michael, Jack; Schoneberger, Ted; Stemmer, Nathan; Weitzman, Ray; Normand, Matthew

    2004-01-01

    In the fall of 2003, the authors corresponded on the topic of private events on the listserv of the Verbal Behavior Special Interest Group. Extracts from that correspondence raised questions about the role of response amplitude in determining units of analysis, whether private events can be investigated directly, and whether covert behavior differs from other behavior except in amplitude. Most participants took a cautious stance, noting not only conceptual pitfalls and empirical difficulties in the study of private events, but doubting the value of interpretive exercises about them. Others argued that despite such obstacles, in domains where experimental analyses cannot be done, interpretation of private events in the light of laboratory principles is the best that science can offer. One participant suggested that the notion that private events can be behavioral in nature be abandoned entirely; as an alternative, the phenomena should be reinterpreted only as physiological events. PMID:22477293

  6. Cluster Analysis for CTBT Seismic Event Monitoring

    SciTech Connect

    Carr, Dorthe B.; Young, Chris J.; Aster, Richard C.; Zhang, Xioabing

    1999-08-03

    Mines at regional distances are expected to be continuing sources of small, ambiguous events which must be correctly identified as part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring process. Many of these events are small enough that they are only seen by one or two stations, so locating them by traditional methods may be impossible or, at best, lead to poorly resolved parameters. To further complicate matters, these events have parametric characteristics (explosive sources, shallow depths) which make them difficult to identify as definite non-nuclear events using traditional discrimination methods. Fortunately, explosions from the same mines tend to have similar waveforms, making it possible to identify an unknown event by comparison with characteristic archived events that have been associated with specific mines. In this study we examine the use of hierarchical cluster methods to identify groups of similar events. These methods produce dendrograms, which are tree-like structures showing the relationships between entities. Hierarchical methods are well-suited to event clustering because they are well documented, easy to implement, computationally cheap enough to run multiple times for a given data set, and because these methods produce results which can be readily interpreted. To aid in determining the proper threshold value for defining event families for a given dendrogram, we use cophenetic correlation (which compares a model of the similarity behavior to actual behavior), variance, and a new metric developed for this study. Clustering methods are compared using archived regional and local distance mining blasts recorded at two sites in the western U.S. with different tectonic and instrumentation characteristics: the three-component broadband DSVS station in Pinedale, Wyoming and the short period New Mexico Tech (NMT) network in central New Mexico. Ground truth for the events comes from the mining industry and local network locations
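
    The workflow described above, building a dissimilarity matrix from waveform cross-correlations, applying hierarchical linkage, cutting the dendrogram into families and checking the cophenetic correlation, can be sketched with scipy on synthetic stand-in waveforms; the signals and threshold below are illustrative only.

        # Sketch of the clustering workflow described above, on synthetic stand-ins
        # for mining-blast waveforms: build a 1 - cross-correlation dissimilarity
        # matrix, apply hierarchical linkage, and check the cophenetic correlation.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster, cophenet
        from scipy.spatial.distance import squareform

        rng = np.random.default_rng(0)
        templates = [np.sin(2 * np.pi * f * np.linspace(0, 1, 500)) for f in (5, 9)]
        waves = np.array([templates[i % 2] + 0.3 * rng.normal(size=500) for i in range(10)])

        def max_norm_xcorr(a, b):
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return np.max(np.correlate(a, b, mode="full")) / len(a)

        n = len(waves)
        dissim = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                dissim[i, j] = dissim[j, i] = 1.0 - max_norm_xcorr(waves[i], waves[j])

        condensed = squareform(dissim)
        tree = linkage(condensed, method="average")
        coph_corr, _ = cophenet(tree, condensed)
        families = fcluster(tree, t=0.5, criterion="distance")
        print("cophenetic correlation: %.2f, families: %s" % (coph_corr, families))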

  7. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  8. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
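
    The core of the detector, band-pass filtering, envelope computation via the analytic signal, and shifting-and-stacking along predicted Rayleigh-wave travel times, can be sketched on synthetic data as follows. A constant group velocity stands in for the travel-time predictions used in the actual processing, and all parameter values are illustrative.

        # Sketch of the envelope shift-and-stack detector described above, on
        # synthetic traces with a constant Rayleigh-wave group velocity (a
        # simplification of the travel-time predictions used in the study).
        import numpy as np
        from scipy.signal import butter, sosfiltfilt, hilbert

        fs, nsec, vgroup = 1.0, 3600, 3.8          # 1 Hz sampling, 1 h, km/s
        t = np.arange(0, nsec, 1.0 / fs)
        station_dist_km = np.array([2000.0, 4500.0, 7000.0])   # epicentral distances
        t0 = 600.0                                              # true origin time (s)

        rng = np.random.default_rng(3)
        traces = []
        for d in station_dist_km:
            tr = rng.normal(0, 1, t.size)
            arrival = t0 + d / vgroup
            tr += 8 * np.exp(-((t - arrival) / 40.0) ** 2) * np.sin(2 * np.pi * 0.02 * t)
            traces.append(tr)

        sos = butter(2, [1 / 70.0, 1 / 35.0], btype="band", fs=fs, output="sos")  # 35-70 s band
        envelopes = [np.abs(hilbert(sosfiltfilt(sos, tr))) for tr in traces]

        origin_times = np.arange(0, 2000, 10.0)                 # trial origin times
        stack = np.zeros_like(origin_times)
        for k, ot in enumerate(origin_times):
            for env, d in zip(envelopes, station_dist_km):
                idx = int((ot + d / vgroup) * fs)
                if idx < env.size:
                    stack[k] += env[idx]

        print("best origin time: %.0f s (true: %.0f s)" % (origin_times[np.argmax(stack)], t0))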

  9. Features, Events, and Processes: Disruptive Events

    SciTech Connect

    J. King

    2004-03-31

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  10. Event-by-Event Fission with FREYA

    SciTech Connect

    Randrup, J; Vogt, R

    2010-11-09

    The recently developed code FREYA (Fission Reaction Event Yield Algorithm) generates large samples of complete fission events, consisting of two receding product nuclei as well as a number of neutrons and photons, all with complete kinematic information. Thus it is possible to calculate arbitrary correlation observables whose behavior may provide unique insight into the fission process. The presentation first discusses the present status of FREYA, which has now been extended up to energies where pre-equilibrium emission becomes significant and one or more neutrons may be emitted prior to fission. Concentrating on {sup 239}Pu(n,f), we discuss the neutron multiplicity correlations, the dependence of the neutron energy spectrum on the neutron multiplicity, and the relationship between the fragment kinetic energy and the number of neutrons and their energies. We also briefly suggest novel fission observables that could be measured with modern detectors.

  11. Features, Events, and Processes: Disruptive Events

    SciTech Connect

    P. Sanchez

    2004-11-08

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either ''Included'' or ''Excluded,'' is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA).

  12. Activating Event Knowledge

    ERIC Educational Resources Information Center

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or…

  13. Events and Constructs

    ERIC Educational Resources Information Center

    Smith, Noel W.

    2007-01-01

    Psychology has largely ignored the distinction between constructs and events and what comprises a scientific construct, yet this distinction is basic to some of the major divisions of thought within the discipline. Several kinds of constructs are identified and compared with events, and improper use of constructs is noted of which the mind…

  14. Committed Sport Event Volunteers

    ERIC Educational Resources Information Center

    Han, Keunsu; Quarterman, Jerome; Strigas, Ethan; Ha, Jaehyun; Lee, Seungbum

    2013-01-01

    The purpose of this study was to investigate the relationships among selected demographic characteristics (income, education and age), motivation and commitment of volunteers at a sporting event. Three-hundred and five questionnaires were collected from volunteers in a marathon event and analyzed using structural equation modeling (SEM). Based on…

  15. Event generator overview

    SciTech Connect

    Pang, Y.

    1997-12-01

    Due to their ability to provide detailed and quantitative predictions, the event generators have become an important part of studying relativistic heavy ion physics and of designing future experiments. In this talk, the author will briefly summarize recent progress in developing event generators for the relativistic heavy ion collisions.

  16. Contrasting Large Solar Events

    NASA Astrophysics Data System (ADS)

    Lanzerotti, Louis J.

    2010-10-01

    After an unusually long solar minimum, solar cycle 24 is slowly beginning. A large coronal mass ejection (CME) from sunspot 1092 occurred on 1 August 2010, with effects reaching Earth on 3 August and 4 August, nearly 38 years to the day after the huge solar event of 4 August 1972. The prior event, which those of us engaged in space research at the time remember well, recorded some of the highest intensities of solar particles and rapid changes of the geomagnetic field measured to date. What can we learn from the comparisons of these two events, other than their essentially coincident dates? One lesson I took away from reading press coverage and Web reports of the August 2010 event is that the scientific community and the press are much more aware than they were nearly 4 decades ago that solar events can wreak havoc on space-based technologies.

  17. The ATLAS Event Service: A new approach to event processing

    NASA Astrophysics Data System (ADS)

    Calafiura, P.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.

    2015-12-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabilities, its architecture and the highly scalable tools and technologies employed in its implementation, and its applications in ATLAS processing on HPCs, commercial cloud resources, volunteer computing, and grid resources. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
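    As a purely illustrative sketch of the fine-grained bookkeeping idea (not the ATLAS Event Service code), the snippet below hands out work as small event ranges rather than whole files and commits each range's output key independently, so a vanishing worker loses at most one in-flight range; the class and field names are hypothetical.

    ```python
    # Hypothetical sketch: event-range dispatching with per-range output commits.
    from dataclasses import dataclass
    from queue import Queue

    @dataclass
    class EventRange:
        file_guid: str
        first_event: int
        last_event: int

    class EventServiceJob:
        def __init__(self):
            self.ranges = Queue()
            self.completed = []

        def load(self, file_guid, n_events, chunk=10):
            # Split the input into small event ranges instead of whole-file tasks.
            for start in range(0, n_events, chunk):
                last = min(start + chunk, n_events) - 1
                self.ranges.put(EventRange(file_guid, start, last))

        def next_range(self):
            return None if self.ranges.empty() else self.ranges.get()

        def commit(self, rng, object_store_key):
            # In the real system the output object would already sit in an object
            # store; here we only record the bookkeeping entry for the range.
            self.completed.append((rng, object_store_key))

    job = EventServiceJob()
    job.load("file-0001", n_events=95)
    while (rng := job.next_range()) is not None:
        job.commit(rng, f"es/out/{rng.file_guid}/{rng.first_event}-{rng.last_event}")
    print(len(job.completed), "event ranges committed")
    ```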

  18. Collaboration in the School Social Network

    ERIC Educational Resources Information Center

    Schultz-Jones, Barbara

    2009-01-01

    Social networks are fundamental to all people. Their social network describes how they are connected to others: close relationships, peripheral relationships, and those relationships that help connect them to other people, events, or things. As information specialists, school librarians develop a multidimensional social network that enables them…

  19. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic, or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with the decisions made when responding to suspicious activity.
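    A minimal sketch of the general LDA idea, under the assumption that each host's network log is tokenized into a "document" of event types, is shown below; it is not the authors' implementation, and the token names and scoring logic are invented for illustration.

    ```python
    # Hedged sketch: fit LDA to baseline "event documents" and flag documents
    # whose token mix is poorly explained by the learned topics.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    baseline_docs = [
        "dns_query http_get http_get smtp_send dns_query",
        "http_get http_get https_post dns_query",
        "smtp_send smtp_send dns_query http_get",
    ]
    new_docs = [
        "https_post https_post ftp_put ftp_put ftp_put dns_query",  # bulk upload
    ]

    vectorizer = CountVectorizer(token_pattern=r"\S+")
    X_base = vectorizer.fit_transform(baseline_docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X_base)

    # Per-token average log-likelihood as a crude anomaly score.
    X_new = vectorizer.transform(new_docs)
    score_base = lda.score(X_base) / X_base.sum()
    score_new = lda.score(X_new) / max(X_new.sum(), 1)
    print("baseline avg log-lik per token:", round(score_base, 2))
    print("new doc avg log-lik per token :", round(score_new, 2))
    # Documents scoring well below the baseline would be handed to an analyst,
    # with the gap serving as a rough, quantifiable risk score.
    ```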

  20. Pure, single phase, high crystalline, chamfered-edge zeolite 4A synthesized from coal fly ash for use as a builder in detergents.

    PubMed

    Hui, K S; Chao, C Y H

    2006-09-01

    Single phase chamfered-edge zeolite 4A samples in pure form with a high crystallinity were synthesized by applying a step change of synthesis temperature during hydrothermal treatment of coal fly ash. The calcium binding capacity of these zeolite 4A samples (prepared from coal fly ash) and the commercial detergent grade zeolite 4A were tested for use as a detergent builder. The results show that these zeolite 4A samples behaved similarly to the commercial one in removing calcium ions during the washing cycle. Moreover, from the leaching tests (evaluation of toxicological safety), the results show that these zeolite 4A samples leached the same elements (Sb, As, Se and Tl) as the commercial one, at concentrations of the same order of magnitude. This shows that the toxicological effect of the coal-fly-ash-converted zeolite 4A was not worse than that of the commercial sample. Finally, economic and environmental aspects of converting coal fly ash to useful products were discussed. PMID:16621273

  1. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to insure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
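    The Chandy-Misra rule referred to above can be illustrated with a deliberately simplified, single-input-channel sketch: a logical process only executes events whose timestamps do not exceed its input-channel clock, and null messages carrying a lookahead keep that clock advancing. The two-process ring and the 1.0 time-unit lookahead below are assumptions for the example, not the paper's experimental setup.

    ```python
    # Simplified conservative (Chandy-Misra style) simulation with null messages.
    import heapq

    LOOKAHEAD = 1.0   # assumed minimum service delay (lookahead)

    class LP:
        def __init__(self, name):
            self.name = name
            self.pending = []            # local future-event list: (time, kind)
            self.channel_clock = 0.0     # latest timestamp seen on the input channel
            self.now = 0.0

        def receive(self, t, kind):
            self.channel_clock = max(self.channel_clock, t)
            if kind != "null":
                heapq.heappush(self.pending, (t, kind))

        def safe_events(self):
            """Pop events that are provably safe to execute (<= channel clock)."""
            out = []
            while self.pending and self.pending[0][0] <= self.channel_clock:
                out.append(heapq.heappop(self.pending))
            return out

    # Two logical processes passing a job around a ring.
    a, b = LP("A"), LP("B")
    a.receive(0.0, "job")                      # seed one job at LP A
    for step in range(6):
        for src, dst in ((a, b), (b, a)):
            for t, kind in src.safe_events():
                src.now = t
                dst.receive(t + LOOKAHEAD, "job")      # forward the job
                print(f"{src.name} executed {kind} at t={t:.1f}")
            # Null message: promise not to send anything earlier than now+LOOKAHEAD.
            dst.receive(src.now + LOOKAHEAD, "null")
    ```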

  2. Unseen GLEs (Ground Level Events)

    NASA Astrophysics Data System (ADS)

    Christian, Eric R.; Boezio, M.; Bravar, Ulisse; Bruno, A.; de Nolfo, Georgia; Martucci, M.; Merge, M.; Mocchiutti, E.; Munini, R.; Ricci, M.; Ryan, James Michael; Stochaj, Steven

    2015-04-01

    Over the last seventy years, solar energetic particle (SEP) ground level events (GLEs) have been observed by ground-based neutron monitors and muon telescopes at a rate of slightly more than one per year. Ground-based detectors only measure secondary particles, and matching their observations with SEP in-situ measurements from spacecraft has been difficult. Now, the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) instrument provides in-situ measurements that also include composition and pitch-angle distribution and bridge the energy between long-term SEP monitors in space (e.g. ACE and GOES) and the ground-based observations. The PAMELA data show that there are a few SEP events (e.g. 23 Jan 2012) where PAMELA sees high-energy (> 1 GeV) particles, yet these are not registered as GLEs. We will present evidence that the anisotropic distribution of these SEPs may miss the global network of neutron monitors.

  3. 47 CFR 25.261 - Procedures for avoidance of in-line interference events for Non Geostationary Satellite Orbit...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... interference events for Non Geostationary Satellite Orbit (NGSO) Satellite Network Operations in the Fixed... avoidance of in-line interference events for Non Geostationary Satellite Orbit (NGSO) Satellite Network... procedures in this section apply to non-Federal-Government NGSO FSS satellite networks operating in...

  4. 47 CFR 25.261 - Procedures for avoidance of in-line interference events for Non Geostationary Satellite Orbit...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... interference events for Non Geostationary Satellite Orbit (NGSO) Satellite Network Operations in the Fixed... avoidance of in-line interference events for Non Geostationary Satellite Orbit (NGSO) Satellite Network... procedures in this section apply to non-Federal-Government NGSO FSS satellite networks operating in...

  5. News Teaching Support: New schools network launched Competition: Observatory throws open doors to a select few Festival: Granada to host 10th Ciencia en Acción Centenary: Science Museum celebrates 100 years Award: Queen's birthday honour for science communicator Teacher Training: Training goes where it's needed Conference: Physics gets creative in Christchurch Conference: Conference is packed with ideas Poster Campaign: Bus passengers learn about universe Forthcoming events

    NASA Astrophysics Data System (ADS)

    2009-09-01

    Teaching Support: New schools network launched Competition: Observatory throws open doors to a select few Festival: Granada to host 10th Ciencia en Acción Centenary: Science Museum celebrates 100 years Award: Queen's birthday honour for science communicator Teacher Training: Training goes where it's needed Conference: Physics gets creative in Christchurch Conference: Conference is packed with ideas Poster Campaign: Bus passengers learn about universe Forthcoming events

  6. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    SciTech Connect

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, termed extreme events. The implemented simulation method is currently confined to simulating steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper demonstrates a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
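    One way to picture the critical-events-corridor idea, as we read it, is to count ordered pairs of line outages that recur across cascades simulated for different initiating events; pairs that repeat often are corridor candidates. The sketch below uses invented cascade sequences and is not the authors' implementation.

    ```python
    # Hedged illustration: find ordered outage pairs that repeat across cascades.
    from collections import Counter
    from itertools import combinations

    cascades = {
        "init_event_1": ["L12", "L7", "L19", "L4"],
        "init_event_2": ["L3", "L7", "L19", "L4"],
        "init_event_3": ["L7", "L19", "L22"],
    }

    pair_counts = Counter()
    for seq in cascades.values():
        # ordered pairs (earlier outage, later outage) within one cascade
        for a, b in combinations(seq, 2):
            pair_counts[(a, b)] += 1

    corridors = [(pair, n) for pair, n in pair_counts.items() if n >= 2]
    for (a, b), n in sorted(corridors, key=lambda x: -x[1]):
        print(f"outage {a} followed by {b} in {n} of {len(cascades)} cascades")
    ```

    Reinforcing the lines that appear in the most frequent pairs would be one concrete way to "break" such a corridor.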

  7. United States National seismograph network

    USGS Publications Warehouse

    Masse, R.P.; Filson, J.R.; Murphy, A.

    1989-01-01

    The USGS National Earthquake Information Center (NEIC) has planned and is developing a broadband digital seismograph network for the United States. The network will consist of approximately 150 seismograph stations distributed across the contiguous 48 states and across Alaska, Hawaii, Puerto Rico and the Virgin Islands. Data transmission will be via two-way satellite telemetry from the network sites to a central recording facility at the NEIC in Golden, Colorado. The design goal for the network is the on-scale recording by at least five well-distributed stations of any seismic event of magnitude 2.5 or greater in all areas of the United States except possibly part of Alaska. All event data from the network will be distributed to the scientific community on compact disc with read-only memory (CD-ROM). © 1989.

  8. FLOOD EVENT MAPPING IMAGES

    EPA Science Inventory

    OSEI flood products (FLD) include multichannel color composite imagery and single-channel grayscale imagery of enlarged river areas or increased sediment flow. Typically, these events are displayed by comparison to imagery taken when flooding was not occurring.

  9. Holter and Event Monitors

    MedlinePlus

    ... Holter and event monitors are similar to an EKG (electrocardiogram). An EKG is a simple test that detects and records ... for diagnosing heart rhythm problems. However, a standard EKG only records the heartbeat for a few seconds. ...

  10. Event shape sorting

    NASA Astrophysics Data System (ADS)

    Kopečná, Renata; Tomášik, Boris

    2016-04-01

    We propose a novel method for sorting events of multiparticle production according to the azimuthal anisotropy of their momentum distribution. Although the method is quite general, we advocate its use in analysis of ultra-relativistic heavy-ion collisions where a large number of hadrons is produced. The advantage of our method is that it can automatically sort out samples of events with histograms that indicate similar distributions of hadrons. It takes into account the whole measured histograms with all orders of anisotropy, instead of a specific observable (e.g., v_2, v_3, q_2). It can be used for more exclusive experimental studies of flow anisotropies which are then more easily compared to theoretical calculations. It may also be useful in the construction of mixed-events background for correlation studies as it allows one to select events with similar momentum distribution.
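    A schematic stand-in for the sorting idea (not the authors' iterative Bayesian algorithm) is to build each event's normalized azimuthal histogram and order events along the first principal component of those histograms, so that neighbouring events have similar shapes; the toy events below carry an invented elliptic modulation.

    ```python
    # Schematic event-shape ordering on toy azimuthal histograms.
    import numpy as np

    rng = np.random.default_rng(0)
    n_events, n_bins = 200, 16
    phi_edges = np.linspace(-np.pi, np.pi, n_bins + 1)

    # Toy events: hadron azimuths with a random elliptic (v2-like) modulation.
    hists = []
    for _ in range(n_events):
        v2, psi = rng.uniform(0.0, 0.2), rng.uniform(-np.pi, np.pi)
        phi = rng.uniform(-np.pi, np.pi, 300)
        keep = rng.uniform(0, 1.3, 300) < 1.0 + 2 * v2 * np.cos(2 * (phi - psi))
        h, _ = np.histogram(phi[keep], bins=phi_edges)
        hists.append(h / h.sum())
    H = np.array(hists)

    # Order events along the first principal component of the histograms.
    centered = H - H.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    order = np.argsort(centered @ vt[0])

    sorted_events = H[order]   # events with similar histogram shapes are adjacent
    print("difference between extreme events:",
          np.round(np.abs(sorted_events[0] - sorted_events[-1]).sum(), 3))
    ```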

  11. "Universe" event at AIMS

    NASA Astrophysics Data System (ADS)

    2008-06-01

    Report of event of 11 May 2008 held at the African Institute of Mathematical Sciences (Muizenberg, Cape), with speakers Michael Griffin (Administrator of NASA), Stephen Hawking (Cambridge), David Gross (Kavli Institute, Santa Barbara) and George Smoot (Berkeley).

  12. CHED Events: New Orleans

    NASA Astrophysics Data System (ADS)

    Wink, Donald J.

    2008-03-01

    These Division of Chemical Education (CHED) Committee meetings and events are planned for the Spring 2008 ACS Meeting in New Orleans. Most will take place in the Hilton Riverside Hotel, 2 Poydras Street; this includes the Sunday evening Reception and Social Event; there will be no CHED Banquet. Exceptions are the Sunday evening Poster Session and the Undergraduate Poster Sessions, which will be in Hall A of the Morial Convention Center.

  13. Spaces of Abstract Events

    NASA Astrophysics Data System (ADS)

    Chajda, Ivan; Länger, Helmut

    2013-06-01

    We generalize the concept of a space of numerical events in such a way that this generalization corresponds to arbitrary orthomodular posets whereas spaces of numerical events correspond to orthomodular posets having a full set of states. Moreover, we show that there is a natural one-to-one correspondence between orthomodular posets and certain posets with sectionally antitone involutions. Finally, we characterize orthomodular lattices among orthomodular posets.

  14. QCD (&) event generators

    SciTech Connect

    Skands, Peter Z.; /Fermilab

    2005-07-01

    Recent developments in QCD phenomenology have spurred on several improved approaches to Monte Carlo event generation, relative to the post-LEP state of the art. In this brief review, the emphasis is placed on approaches for (1) consistently merging fixed-order matrix element calculations with parton shower descriptions of QCD radiation, (2) improving the parton shower algorithms themselves, and (3) improving the description of the underlying event in hadron collisions.

  15. Complex Event Recognition Architecture

    NASA Technical Reports Server (NTRS)

    Fitzgerald, William A.; Firby, R. James

    2009-01-01

    Complex Event Recognition Architecture (CERA) is the name of a computational architecture, and software that implements the architecture, for recognizing complex event patterns that may be spread across multiple streams of input data. One of the main components of CERA is an intuitive event pattern language that simplifies what would otherwise be the complex, difficult tasks of creating logical descriptions of combinations of temporal events and defining rules for combining information from different sources over time. In this language, recognition patterns are defined in simple, declarative statements that combine point events from given input streams with those from other streams, using conjunction, disjunction, and negation. Patterns can be built on one another recursively to describe very rich, temporally extended combinations of events. Thereafter, a run-time matching algorithm in CERA efficiently matches these patterns against input data and signals when patterns are recognized. CERA can be used to monitor complex systems and to signal operators or initiate corrective actions when anomalous conditions are recognized. CERA can be run as a stand-alone monitoring system, or it can be integrated into a larger system to automatically trigger responses to changing environments or problematic situations.
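    The flavour of such a declarative pattern language can be suggested with a tiny combinator sketch: point events from named streams are combined with conjunction, disjunction, and negation, and patterns nest recursively. This is a hypothetical illustration, not the CERA language or its matching algorithm.

    ```python
    # Hypothetical pattern combinators over point events from named streams.
    class Pattern:
        def matches(self, events):          # events: set of (stream, label) points
            raise NotImplementedError

    class Point(Pattern):
        def __init__(self, stream, label):
            self.key = (stream, label)
        def matches(self, events):
            return self.key in events

    class And(Pattern):
        def __init__(self, *parts): self.parts = parts
        def matches(self, events): return all(p.matches(events) for p in self.parts)

    class Or(Pattern):
        def __init__(self, *parts): self.parts = parts
        def matches(self, events): return any(p.matches(events) for p in self.parts)

    class Not(Pattern):
        def __init__(self, part): self.part = part
        def matches(self, events): return not self.part.matches(events)

    # "Anomalous" if pressure spikes on either sensor while the vent is not open.
    anomaly = And(Or(Point("sensorA", "pressure_spike"),
                     Point("sensorB", "pressure_spike")),
                  Not(Point("valves", "vent_open")))

    window = {("sensorB", "pressure_spike"), ("valves", "vent_closed")}
    print(anomaly.matches(window))   # True -> signal the operator
    ```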

  16. Activating Event Knowledge

    PubMed Central

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or typically play a role in. We used short stimulus onset asynchrony priming to demonstrate that (1) event nouns prime people (sale-shopper) and objects (trip-luggage) commonly found at those events; (2) location nouns prime people/animals (hospital-doctor) and objects (barn-hay) commonly found at those locations; and (3) instrument nouns prime things on which those instruments are commonly used (key-door), but not the types of people who tend to use them (hose-gardener). The priming effects are not due to normative word association. On our account, facilitation results from event knowledge relating primes and targets. This has much in common with computational models like LSA or BEAGLE in which one word primes another if they frequently occur in similar contexts. LSA predicts priming for all six experiments, whereas BEAGLE correctly predicted that priming should not occur for the instrument-people relation but should occur for the other five. We conclude that event-based relations are encoded in semantic memory and computed as part of word meaning, and have a strong influence on language comprehension. PMID:19298961

  17. Contribution of Infrasound to IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif Mohamed; Medinskaya, Tatiana; Mialle, Pierrick

    2016-04-01

    Until 2003 two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003, but infrasound detections could not be used by the Global Association (GA) Software due to the unmanageably high number of spurious associations. Offline improvements of the automatic processing took place to reduce the number of false detections to a reasonable level. In February 2010 the infrasound technology was reintroduced to IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which will significantly improve event location. Examples of REB events detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile 2015) and large surface explosions (e.g. Tianjin, China 2015). Quarry blasts and large earthquakes belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. The presence of an infrasound detection associated with an event from a mining area indicates a surface explosion. Satellite imaging and a database of active mines can be used to confirm the origin of such events. This presentation will summarize the contribution of 6 years of infrasound data to IDC bulletins and provide examples of events recorded at the IMS infrasound network. Results of this study may help improve the location of small events with observations at infrasound stations.

  18. Network Cosmology

    PubMed Central

    Krioukov, Dmitri; Kitsak, Maksim; Sinkovits, Robert S.; Rideout, David; Meyer, David; Boguñá, Marián

    2012-01-01

    Prediction and control of the dynamics of complex networks is a central problem in network science. Structural and dynamical similarities of different real networks suggest that some universal laws might accurately describe the dynamics of these networks, albeit the nature and common origin of such laws remain elusive. Here we show that the causal network representing the large-scale structure of spacetime in our accelerating universe is a power-law graph with strong clustering, similar to many complex networks such as the Internet, social, or biological networks. We prove that this structural similarity is a consequence of the asymptotic equivalence between the large-scale growth dynamics of complex networks and causal networks. This equivalence suggests that unexpectedly similar laws govern the dynamics of complex networks and spacetime in the universe, with implications to network science and cosmology. PMID:23162688

  19. Concepts of event-by-event analysis

    SciTech Connect

    Stroebele, H.

    1995-07-15

    The particles observed in the final state of nuclear collisions can be divided into two classes: those which are susceptible to strong interactions and those which are not, like leptons and the photon. The bulk properties of the "matter" in the reaction zone may be read off from the kinematical characteristics of the particles observable in the final state. These characteristics are strongly dependent on the last interaction these particles have undergone. In a densely populated reaction zone, strongly interacting particles will experience many collisions after they have been formed and before they emerge into the asymptotic final state. For the particles which are not sensitive to strong interactions, their formation is also their last interaction. Thus photons and leptons probe the period during which they are produced, whereas hadrons reflect the so-called freeze-out processes, which occur during the late stage in the evolution of the reaction when the population density becomes small and the mean free paths long. The disadvantage of the leptons and photons is their small production cross section; they cannot be used in an analysis of the characteristics of individual collision events, because the number of particles produced per event is too small. The hadrons, on the other hand, stem from the freeze-out period. Information from earlier periods requires multiparticle observables in the most general sense. It is one of the challenges of present-day high-energy nuclear physics to establish and understand global observables which differentiate between mere hadronic scenarios, i.e., superpositions of hadronic interactions, and the formation of a partonic (short-duration) steady state which can be considered a new state of matter, the Quark-Gluon Plasma.

  20. A Search for the Higgs Boson Using Neural Networks in Events with Missing Energy and \\boldit{b}-quark Jets in $p\\bar p$ Collisions at $\\sqrt{s}=1.96$ TeV

    SciTech Connect

    Aaltonen, T.; Adelman, J.; Alvarez Gonzalez, B.; Amerio, S.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Apresyan, A.; Arisawa, T.; /Waseda U. /Dubna, JINR

    2009-11-01

    We report on a search for the standard model Higgs boson produced in association with a W or Z boson in p{bar p} collisions at {radical}s = 1.96 TeV recorded by the CDF II experiment at the Tevatron in a data sample corresponding to an integrated luminosity of 2.1 fb{sup -1}. We consider events which have no identified charged leptons, an imbalance in transverse momentum, and two or three jets where at least one jet is consistent with originating from the decay of a b hadron. We find good agreement between data and predictions. We place 95% confidence level upper limits on the production cross section for several Higgs boson masses ranging from 110 GeV/c{sup 2} to 150 GeV/c{sup 2}. For a mass of 115 GeV/c{sup 2} the observed (expected) limit is 6.9 (5.6) times the standard model prediction.

  1. A space-based radio frequency transient event classifier

    SciTech Connect

    Moore, K.R.; Blain, P.C.; Caffrey, M.P.; Franz, R.C.; Henneke, K.M.; Jones, R.G.

    1996-12-31

    The FORTE (Fast On-Orbit Recording of Transient Events) satellite will record RF transients in space. These transients will be classified onboard the spacecraft with an Event Classifier--specialized hardware that performs signal preprocessing and neural network classification. The authors describe the Event Classifier, future directions, and implications for telecommunications satellites. Telecommunication satellites are susceptible to damage from environmental factors such as deep dielectric charging and surface discharges. The event classifier technology the authors are developing is capable of sensing the surface discharges and could be useful for mitigating their effects. In addition, the techniques they are using for processing weak signals in noisy environments are relevant to telecommunications.
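    The two-stage design described above (signal preprocessing followed by neural-network classification) can be sketched on synthetic waveforms as below; the band-power features, the network size, and the two toy classes are assumptions for illustration and bear no relation to the actual FORTE classifier.

    ```python
    # Hedged sketch: spectral-band features + small neural network classifier.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)
    FS, N = 1000.0, 512          # assumed sample rate (Hz) and record length

    def make_waveform(kind):
        t = np.arange(N) / FS
        noise = 0.5 * rng.standard_normal(N)
        if kind == "impulsive":                   # short broadband burst
            sig = np.exp(-((t - 0.1) / 0.005) ** 2) * rng.standard_normal(N)
        else:                                     # narrowband carrier-like tone
            sig = 0.8 * np.sin(2 * np.pi * 120.0 * t)
        return sig + noise

    def features(x):
        """Preprocessing stage: relative power in eight spectral bands."""
        p = np.abs(np.fft.rfft(x)) ** 2
        e = np.array([band.sum() for band in np.array_split(p, 8)])
        return e / e.sum()

    X = [features(make_waveform(k)) for k in ["impulsive"] * 100 + ["tone"] * 100]
    y = ["impulsive"] * 100 + ["tone"] * 100
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(Xtr, ytr)
    print("held-out accuracy:", clf.score(Xte, yte))
    ```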

  2. Solar extreme events

    NASA Astrophysics Data System (ADS)

    Hudson, Hugh S.

    2015-08-01

    Solar flares and CMEs have a broad range of magnitudes. This review discusses the possibility of “extreme events,” defined as those with magnitudes greater than have been seen in the existing historical record. For most quantitative measures, this direct information does not extend more than a century and a half into the recent past. The magnitude distributions (occurrence frequencies) of solar events (flares/CMEs) typically decrease with the parameter measured or inferred (peak flux, mass, energy, etc.). Flare radiation fluxes tend to follow a power law slightly flatter than S^-2, where S represents a peak flux; solar particle events (SPEs) follow a still flatter power law up to a limiting magnitude, and then appear to roll over to a steeper distribution, which may take an exponential form or follow a broken power law. This inference comes from the terrestrial 14C record and from the depth dependence of various radioisotope proxies in the lunar regolith and in meteorites. Recently, major new observational results have impacted our use of the relatively limited historical record in new ways: the detection of actual events in the 14C tree-ring records, and the systematic observations of flares and “superflares” by the Kepler spacecraft. I discuss how these new findings may affect our understanding of the distribution function expected for extreme solar events.
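    For the power-law portion of such distributions, the slope can be estimated by maximum likelihood: for a differential distribution dN/dS ~ S^-alpha above a threshold S_min, the estimator is alpha = 1 + n / sum(ln(S_i/S_min)). The sketch below applies this to synthetic peak fluxes, not to any observed flare catalog.

    ```python
    # Maximum-likelihood power-law slope fit on synthetic "peak fluxes".
    import numpy as np

    rng = np.random.default_rng(2)
    alpha_true, s_min, n = 1.9, 1.0, 5000

    # Inverse-transform sampling from a pure power law above s_min.
    u = rng.uniform(size=n)
    s = s_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

    alpha_hat = 1.0 + n / np.log(s / s_min).sum()
    err = (alpha_hat - 1.0) / np.sqrt(n)       # approximate standard error
    print(f"fitted slope: {alpha_hat:.3f} +/- {err:.3f} (true {alpha_true})")
    ```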

  3. RETRIEVAL EVENTS EVALUATION

    SciTech Connect

    T. Wilson

    1999-11-12

    The purpose of this analysis is to evaluate impacts to the retrieval concept presented in the Design Analysis ''Retrieval Equipment and Strategy'' (Reference 6), from abnormal events based on Design Basis Events (DBE) and Beyond Design Basis Events (BDBE) as defined in two recent analyses: (1) DBE/Scenario Analysis for Preclosure Repository Subsurface Facilities (Reference 4); and (2) Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository (Reference 5). The objective of this task is to determine what impacts the DBEs and BDBEs have on the equipment developed for retrieval. The analysis lists potential impacts and recommends changes to be analyzed in subsequent design analyses for developed equipment, or recommends where additional equipment may be needed, to allow retrieval to be performed in all DBE or BDBE situations. This analysis supports License Application design and therefore complies with the requirements of the Systems Description Document input criteria comparison as presented in Section 7, Conclusions. In addition, the analysis discusses the impacts associated with not using concrete inverts in the emplacement drifts. The ''Retrieval Equipment and Strategy'' analysis was based on a concrete invert configuration in the emplacement drift. The scope of the analysis, as presented in ''Development Plan for Retrieval Events Evaluation'' (Reference 3), includes evaluation and criteria of the following: Impacts to retrieval from the emplacement drift based on DBE/BDBEs, and changes to the invert configuration for the preclosure period. Impacts to retrieval from the main drifts based on DBE/BDBEs for the preclosure period.

  4. The response of the high-latitude ionosphere to the coronal mass ejection event of April 6, 2000: A practical demonstration of space weather nowcasting with the Super Dual Auroral Radar Network HF radars

    NASA Astrophysics Data System (ADS)

    Ruohoniemi, J. M.; Barnes, R. J.; Greenwald, R. A.; Shepherd, S. G.

    2001-12-01

    The ionosphere at high latitudes is the site of important effects in space weather. These include strong electrical currents that may disrupt power systems through induced currents and density irregularities that can degrade HF and satellite communication links. With the impetus provided by the National Space Weather Program, the radars of the Super Dual Auroral Radar Network have been applied to the real-time specification ("nowcasting") of conditions in the high-latitude ionosphere. A map of the plasma convection in the northern high-latitude ionosphere is continually generated at the Johns Hopkins University Applied Physics Laboratory (JHU/APL) SuperDARN web site using data downloaded in real time from the radars via Internet connections. Other nowcast items include information on the conditions of HF propagation, the spatial extent of auroral effects, and the total cross polar cap potential variation. Time series of various parameters and an animated replay of the last 2 hours of convection patterns are also available for review. By comparing with simultaneous measurements from an upstream satellite, it is possible to infer the effective delay from the detection of changes in the solar wind at the satellite to the arrival of related effects in the high-latitude ionosphere. We discuss the space weather products available from the JHU/APL SuperDARN web site and their uses by simulating a nowcast of the ionosphere on April 6, 2000, during the arrival of a coronal mass ejection (CME)-related shock. The nowcast convection pattern in particular satisfies a critical need for timely, comprehensive information on ionospheric electric fields.

  5. Network Solutions.

    ERIC Educational Resources Information Center

    Vietzke, Robert; And Others

    1996-01-01

    This special section explains the latest developments in networking technologies, profiles school districts benefiting from successful implementations, and reviews new products for building networks. Highlights include ATM (asynchronous transfer mode), cable modems, networking switches, Internet screening software, file servers, network management…

  6. Pharmacogenomics of suicidal events

    PubMed Central

    Brent, David; Melhem, Nadine; Turecki, Gustavo

    2010-01-01

    Pharmacogenomic studies of antidepressant treatment-emergent suicidal events in depressed patients report associations with polymorphisms in genes involved in transcription (CREB1), neuroprotection (BDNF and NTRK2), glutamatergic and noradrenergic neurotransmission (GRIA3, GRIK2 and ADRA2A), the stress and inflammatory responses (FKBP5 and IL28RA), and the synthesis of glycoproteins (PAPLN). Nearly all of the reported events in these studies were modest one-time increases in suicidal ideation. In 3231 unique subjects across six studies, 424 (13.1%) patients showed increases in suicidal ideation, eight (0.25%) attempted suicide and four (0.12%) completed suicide. Systems related to most of these genes have also been implicated in studies of suicidal behavior irrespective of treatment. Future pharmacogenomic studies should target events that are clinically significant, related clinical phenotypes of response and medication side effects, and biological pathways that are involved in these outcomes in order to improve treatment approaches. PMID:20504254

  7. Networking standards

    NASA Technical Reports Server (NTRS)

    Davies, Mark

    1991-01-01

    The enterprise network is currently a multivendor environment consisting of many de facto and proprietary standards. During the 1990s, these networks will evolve towards networks which are based on international standards in both Local Area Network (LAN) and Wide Area Network (WAN) space. Also, you can expect to see the higher-level functions and applications begin the same transition. Additional information is given in viewgraph form.

  8. The Colombia Seismological Network

    NASA Astrophysics Data System (ADS)

    Blanco Chia, J. F.; Poveda, E.; Pedraza, P.

    2013-05-01

    The latest seismological equipment and data processing instrumentation installed at the Colombia Seismological Network (RSNC) are described. System configuration, network operation, and data management are discussed. The data quality and the new seismological products are analyzed. The main purpose of the network is to monitor local seismicity, with special emphasis on seismic activity surrounding the Colombian Pacific and Caribbean oceans, for early warning in case a tsunami is produced by an earthquake. The Colombian territory is located at the northwestern corner of South America, where three tectonic plates converge: the Nazca, the Caribbean, and the South American. The dynamics of these plates, when resulting in earthquakes, is continuously monitored by the network. In 2012 the RSNC registered an average of 67 events per day; of these, a mean of 36 earthquakes per day could be located well. In 2010 the network also registered an average of 67 events per day, but it was only possible to locate a mean of 28 earthquakes daily. This difference is due to the expansion of the network. The network is made up of 84 stations equipped with different kinds of broadband (40 s, 120 s) seismometers, accelerometers, and short-period (1 s) sensors. The signal is transmitted continuously in real time to the Central Recording Center located at Bogotá, using satellite links, telemetry, and the Internet. Moreover, there are some other stations for which the information must be collected in situ. Data are recorded and processed digitally using two different systems, EARTHWORM and SEISAN, which are able to process and share the information between them. The RSNC has designed and implemented a web system to share the seismological data. This innovative system uses tools like JavaScript, Oracle, and programming languages like PHP to allow users to access the seismicity registered by the network almost in real time, as well as to download the waveforms and technical details. The coverage

  9. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairing static, modify the existing ARP, or patch the operating systems of all hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks that does not require any extra constraint such as static IP-MAC pairing or changes to the ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations, based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, thereby rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic to the LAN. Following that, a DES detector is built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets that are yet to be determined as genuine or spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM) and denial-of-service attacks. The scheme is successfully validated in a test bed. PMID:20804980
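    A much-simplified sketch inspired by the approach (not the authors' DES model or probing code) is shown below: new or conflicting IP-MAC bindings trigger a verification probe, and a binding is marked spoofed when the probe answer disagrees; the probe itself is stubbed out and the addresses are invented.

    ```python
    # Simplified ARP spoof detector with active verification probing (stubbed).
    class ArpDetector:
        def __init__(self, send_probe):
            self.bindings = {}            # ip -> mac accepted as genuine
            self.spoofed = {}             # ip -> set of offending macs
            self.send_probe = send_probe  # callable(ip) -> mac answered by real host

        def on_arp(self, ip, mac):
            known = self.bindings.get(ip)
            if known == mac:
                return "genuine"
            # New or conflicting pair: verify with an active probe before trusting it.
            answered = self.send_probe(ip)
            if answered == mac:
                self.bindings[ip] = mac
                return "genuine"
            self.spoofed.setdefault(ip, set()).add(mac)
            return "spoofed"

    # Toy LAN where 10.0.0.5 really owns MAC aa:aa.
    detector = ArpDetector(send_probe=lambda ip: {"10.0.0.5": "aa:aa"}.get(ip))
    print(detector.on_arp("10.0.0.5", "aa:aa"))   # genuine
    print(detector.on_arp("10.0.0.5", "bb:bb"))   # spoofed (MiTM attempt)
    ```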

  10. Semantic Networks and Social Networks

    ERIC Educational Resources Information Center

    Downes, Stephen

    2005-01-01

    Purpose: To illustrate the need for social network metadata within semantic metadata. Design/methodology/approach: Surveys properties of social networks and the semantic web, suggests that social network analysis applies to semantic content, argues that semantic content is more searchable if social network metadata is merged with semantic web…

  11. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    SciTech Connect

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error agrees better with the physical basis: it correctly partitions the error to include model error as a component of variance, and it correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with the better scaling slope ({beta} = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
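    The two classical combination rules named above are easy to state concretely: Fisher's statistic -2*sum(ln p_i) is chi-square distributed with 2k degrees of freedom under H0, and Tippett's test is based on the smallest p-value. The sketch below applies both to placeholder per-phenomenology p-values; it illustrates the combination rules only, not the screening procedure of the report.

    ```python
    # Fisher's and Tippett's p-value combination rules on placeholder inputs.
    import numpy as np
    from scipy.stats import chi2

    def fisher_combined_p(pvals):
        stat = -2.0 * np.sum(np.log(pvals))
        return chi2.sf(stat, df=2 * len(pvals))

    def tippett_combined_p(pvals):
        k = len(pvals)
        return 1.0 - (1.0 - min(pvals)) ** k

    p_ms_mb, p_depth = 0.04, 0.30        # illustrative single-test p-values
    print("Fisher :", round(fisher_combined_p([p_ms_mb, p_depth]), 4))
    print("Tippett:", round(tippett_combined_p([p_ms_mb, p_depth]), 4))
    ```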

  12. CLUSTERING OF RARE EVENTS

    EPA Science Inventory

    The clustering of cases of a rare disease is considered. The number of events observed for each unit is assumed to have a Poisson distribution, the mean of which depends upon the population size and the cluster membership of that unit. Here a cluster consists of those units that ...

  13. Tidal Disruption Events

    NASA Astrophysics Data System (ADS)

    Gezari, Suvi

    2013-12-01

    The majority of supermassive black holes in the Universe lie dormant and starved of fuel. These hidden beasts can be temporarily illuminated when an unlucky star passes close enough to be tidally disrupted and consumed by the black hole. Theorists first proposed in 1975 that tidal disruption events should be an inevitable consequence of supermassive black holes in galaxy nuclei and later argued that the resulting flare of radiation from the accretion of the stellar debris could be a unique signpost for the presence of a dormant black hole in the center of a normal galaxy. It was not until over two decades later that the first convincing tidal disruption event candidates emerged in the X-rays by the ROSAT All-Sky Survey. Since then, over a dozen total candidates have now emerged from searches across the electromagnetic spectrum, including the X-rays, the ultraviolet, and the optical. In the last couple of years, we have also witnessed a paradigm shift with the discovery of relativistic beamed emission associated with tidal disruption events. I review the census of observational candidates to date and discuss the exciting prospects for using large samples of tidal disruption events discovered with the next-generation of ground-based and space-based synoptic surveys to probe accretion disk and/or jet formation and black hole demographics.

  14. Teaching with Current Events

    ERIC Educational Resources Information Center

    Peralta, Andrew

    2005-01-01

    This article describes how a teacher changed all his plans in order to teach about the hurricane. When Hurricane Katrina hit the Gulf Coast, kids naturally became curious and sought answers about an event this big. The author suggests using such tragedies to help them grow as students and as citizens.

  15. Solar Energetic Particle Events: Phenomenology and Prediction

    NASA Astrophysics Data System (ADS)

    Gabriel, S. B.; Patrick, G. J.

    2003-04-01

    Solar energetic particle events can cause major disruptions to the operation of spacecraft in Earth orbit and outside the Earth's magnetosphere, and have to be considered for EVA and other manned activities. They may also have an effect on radiation doses received by crews flying in high-altitude aircraft over the polar regions. The occurrence of these events has been assumed to be random, but there appears to be some solar-cycle dependency, with a higher annual fluence occurring during a 7-year period: 2 years before and 4 years after the year of solar maximum. Little has been done to try to predict these events in real time, with nearly all of the work concentrating on statistical modelling. Currently our understanding of the causes of these events is not good. But what are the prospects for prediction? Can artificial intelligence techniques be used to predict them in the absence of a more complete understanding of the physics involved? The paper examines the phenomenology of the events, briefly reviews the results of neural network prediction techniques, and discusses the conjecture that the underlying physical processes might be related to self-organised criticality and turbulent MHD flows.

  16. Wheelchair type biomedical system with event-recorder function.

    PubMed

    Han, Dong-Kyoon; Kim, Jong-Myoung; Cha, Eun-Jong; Lee, Tae-Soo

    2008-01-01

    The present study concerns a biometric system for a wheelchair that can measure both bio-signals (ECG, electrocardiogram; BCG, ballistocardiogram) and a kinetic signal (acceleration) simultaneously and send the data to a remote medical server. The equipment was developed with the aim of building a system that measures the bio-signals and kinetic signal of a subject who is moving or at rest in a wheelchair and transmits the measured signals to a remote server through a CDMA (Code Division Multiple Access) network. The equipment is composed of a body area network and a remote medical server. The body area network was designed to obtain the bio-signals and kinetic signal simultaneously and, on the occurrence of an event, to transmit the data to the remote medical server through a CDMA network. The remote medical server was designed to display the event data transmitted from the body area network in real time. The performance of the developed system was evaluated through two experiments. First, we measured battery life on the occurrence of events, and second, we tested whether biometric data are transmitted accurately to the remote server on the occurrence of an event. In the first experiment with the developed equipment, events were triggered 16 times and the battery worked stably for around 29 hours. In the second experiment, when an event took place, the corresponding data were transmitted accurately to the remote medical server through the CDMA network. This system is expected to be usable for the healthcare of wheelchair users and applicable to a mobile healthcare system. PMID:19162939

  17. Performance testing open source products for the TMT event service

    NASA Astrophysics Data System (ADS)

    Gillies, K.; Bhate, Yogesh

    2014-07-01

    The TMT software system is distributed, with many components running on many computers. Each component integrates with the overall system using a set of software services. The Event Service is a publish-subscribe message system that allows the distribution of demands and other events. The performance requirements for the Event Service are demanding, with a goal of over 60,000 events per second. This service is critical to the success of the TMT software architecture; therefore, a project was started to survey the open-source and commercial markets for viable software products. A trade study led to the selection of five products for thorough testing using a specially constructed computer/network configuration and test suite. The best-performing product was chosen as the basis of a prototype Event Service implementation. This paper describes the process and performance tests conducted by Persistent Systems that led to the selection of the product for the prototype Event Service.
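    A minimal throughput-harness sketch of the kind of test implied above is shown below; it uses an in-process queue as a stand-in for the product under test, so the number it prints exercises only the harness pattern (publisher, subscriber, events/second accounting), not any real broker, and the payload size and event count are assumptions.

    ```python
    # In-process publish/subscribe throughput harness (stand-in for a real broker).
    import threading, time, queue

    N_EVENTS = 200_000
    PAYLOAD = b"x" * 128                  # assumed small demand-style event

    def publisher(q):
        for i in range(N_EVENTS):
            q.put((i, PAYLOAD))
        q.put(None)                       # sentinel: end of test

    def subscriber(q, counter):
        while True:
            item = q.get()
            if item is None:
                break
            counter[0] += 1

    q = queue.Queue(maxsize=10_000)
    counter = [0]
    t0 = time.perf_counter()
    pub = threading.Thread(target=publisher, args=(q,))
    sub = threading.Thread(target=subscriber, args=(q, counter))
    pub.start(); sub.start(); pub.join(); sub.join()
    elapsed = time.perf_counter() - t0
    print(f"{counter[0]} events in {elapsed:.2f} s -> {counter[0]/elapsed:,.0f} events/s")
    ```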

  18. Event boundaries and anaphoric reference.

    PubMed

    Thompson, Alexis N; Radvansky, Gabriel A

    2016-06-01

    The current study explored the finding that parsing a narrative into separate events impairs anaphor resolution. According to the Event Horizon Model, when a narrative event boundary is encountered, a new event model is created. Information associated with the prior event model is removed from working memory. So long as the event model containing the anaphor referent is currently being processed, this information should still be available when there is no narrative event boundary, even if reading has been disrupted by a working-memory-clearing distractor task. In those cases, readers may reactivate their prior event model, and anaphor resolution would not be affected. Alternatively, comprehension may not be as event oriented as this account suggests. Instead, any disruption of the contents of working memory during comprehension, event related or not, may be sufficient to disrupt anaphor resolution. In this case, reading comprehension would be more strongly guided by other, more basic language processing mechanisms and the event structure of the described events would play a more minor role. In the current experiments, participants were given stories to read in which we included, between the anaphor and its referent, either the presence of a narrative event boundary (Experiment 1) or a narrative event boundary along with a working-memory-clearing distractor task (Experiment 2). The results showed that anaphor resolution was affected by narrative event boundaries but not by a working-memory-clearing distractor task. This is interpreted as being consistent with the Event Horizon Model of event cognition. PMID:26452376

  19. Biological event composition

    PubMed Central

    2012-01-01

    Background In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed to this recent interest considerably. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts from a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically-grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase and it is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation by shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID). Results We achieved competitive results in GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance across abstracts and articles bodies is stable. Coreference resolution results in minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks. Conclusions The results demonstrate the viability of a robust, linguistically-oriented methodology, which clearly distinguishes

  20. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
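    As a toy skeleton of a sequential cascading-outage simulation (not the paper's method, which re-solves a steady-state power flow after each trip), the sketch below uses edge betweenness as a crude proxy for line loading on an invented 6-bus network and trips the most loaded line until no proxy overloads remain.

    ```python
    # Toy sequential cascade simulation with a betweenness proxy for line loading.
    import networkx as nx

    def cascade(g, initiating_line, overload_threshold=0.25):
        g = g.copy()
        g.remove_edge(*initiating_line)
        sequence = [initiating_line]
        while True:
            load = nx.edge_betweenness_centrality(g)     # proxy for line loading
            overloaded = [e for e, x in load.items() if x > overload_threshold]
            if not overloaded:
                break
            tripped = max(overloaded, key=load.get)      # trip the most loaded line
            g.remove_edge(*tripped)
            sequence.append(tripped)
        islands = list(nx.connected_components(g))
        return sequence, islands

    grid = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 1), (2, 5)])
    seq, islands = cascade(grid, (2, 5))
    print("outage sequence:", seq)
    print("resulting islands:", islands)
    ```

    Ranking initiating events by the length of the resulting outage sequence, or by the amount of islanded load, would be one simple way to flag the N-k contingencies most likely to seed a cascade.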