Science.gov

Sample records for event builder networks

  1. Network Performance Testing for the BaBar Event Builder

    SciTech Connect

    Pavel, Tomas J

    1998-11-17

    We present an overview of the design of event building in the BABAR Online, based upon TCP/IP and commodity networking technology. BABAR is a high-rate experiment to study CP violation in asymmetric e⁺e⁻ collisions. In order to validate the event-builder design, an extensive program was undertaken to test the TCP performance delivered by various machine types with both ATM OC-3 and Fast Ethernet networks. The buffering characteristics of several candidate switches were examined and found to be generally adequate for our purposes. We highlight the results of this testing and present some of the more significant findings.
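
A point-to-point TCP throughput trial of the kind this testing program ran can be sketched as follows. The port, chunk size, and transfer volume below are illustrative assumptions, not the experiment's actual parameters, and loopback stands in for the ATM/Fast Ethernet links under test:

```python
# Minimal TCP throughput test: a sink thread accepts one connection and
# drains it while the main thread times a bulk transfer.
import socket
import threading
import time

PORT = 50007               # arbitrary test port (assumption)
CHUNK = 64 * 1024          # 64 KiB per send, a typical fragment-sized buffer
TOTAL = 16 * 1024 * 1024   # move 16 MiB per trial

def sink(ready):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", PORT))
    srv.listen(1)
    ready.set()                      # signal that the sender may connect
    conn, _ = srv.accept()
    while conn.recv(CHUNK):          # drain until the sender closes
        pass
    conn.close()
    srv.close()

ready = threading.Event()
t = threading.Thread(target=sink, args=(ready,))
t.start()
ready.wait()

src = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
src.connect(("127.0.0.1", PORT))
payload = b"x" * CHUNK
start = time.perf_counter()
sent = 0
while sent < TOTAL:
    src.sendall(payload)
    sent += CHUNK
src.close()
t.join()
elapsed = time.perf_counter() - start
print(f"throughput: {sent / elapsed / 1e6:.1f} MB/s")
```

Sweeping CHUNK and the socket buffer sizes over such a harness is what exposes the per-platform TCP behavior the abstract refers to.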

  2. CMS DAQ event builder based on gigabit ethernet

    SciTech Connect

    Pieri, M.; Maron, G.; Brett, A.; Cano, E.; Cittolin, S.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Reino Garrido, R.; Gulmini, M.; Gutleber, J.; Jacobs, C.; Meijers, F.; Meschi, E.; Oh, A.; Orsini, L.; Pollet, L.; Racz, A.; Rosinsky, P.; Sakulin, H.; Schwick, C.; /UC, San Diego /INFN, Legnaro /CERN /UCLA /Santiago de Compostela U. /Lisbon, LIFEP /Fermilab /MIT /Boskovic Inst., Zagreb

    2006-06-01

    The CMS Data Acquisition system is designed to build and filter events originating from approximately 500 data sources from the detector at a maximum Level 1 trigger rate of 100 kHz and with an aggregate throughput of 100 GByte/s. For this purpose different architectures and switch technologies have been evaluated. Events will be built in two stages: the first stage, the FED Builder, will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The next stage, the Readout Builder, will perform the building of full events. The requirement of one Readout Builder is to build events at 12.5 kHz with average size of 16 kBytes from 64 sources. In this paper we present the prospects of a Readout Builder based on TCP/IP over Gigabit Ethernet. Various Readout Builder architectures that we are considering are discussed. The results of throughput measurements and scaling performance are outlined as well as the preliminary estimates of the final performance. All these studies have been carried out at our test-bed farms that are made up of a total of 130 dual Xeon PCs interconnected with Myrinet and Gigabit Ethernet networking and switching technologies.
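
The two-stage building scheme described above can be sketched in miniature: a first stage merges fragments from fixed groups of sources into super-fragments, and a second stage assembles complete events once every group has reported. The data shapes, group size, and function names are illustrative assumptions, not the CMS implementation:

```python
# Toy two-stage event builder: "FED builder" stage groups ~8 sources,
# "readout builder" stage concatenates a full event from the super-fragments.
from collections import defaultdict

N_SOURCES = 16
GROUP = 8  # sources pre-assembled per first-stage group

def fed_builder(fragments):
    """Stage 1: merge per-source fragments into super-fragments per group."""
    supers = defaultdict(dict)  # event_id -> group_id -> bytes
    for event_id, source_id, payload in fragments:
        group_id = source_id // GROUP
        supers[event_id].setdefault(group_id, b"")
        supers[event_id][group_id] += payload
    return supers

def readout_builder(supers):
    """Stage 2: join all super-fragments of an event, once all are present."""
    n_groups = N_SOURCES // GROUP
    events = {}
    for event_id, groups in supers.items():
        if len(groups) == n_groups:  # event is complete
            events[event_id] = b"".join(groups[g] for g in sorted(groups))
    return events

# Two events' worth of 4-byte fragments from every source:
fragments = [(ev, src, bytes([ev]) * 4) for ev in (1, 2) for src in range(N_SOURCES)]
events = readout_builder(fed_builder(fragments))
print(sorted(events), len(events[1]))  # -> [1, 2] 64
```

The real system's difficulty is doing this concurrently at 100 kHz over a switching fabric; the sketch only shows the data-flow structure.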

  3. A new event builder for CMS Run II

    SciTech Connect

    Albertsson, K.; Andre, J-M; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G-L; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-01-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  4. A New Event Builder for CMS Run II

    NASA Astrophysics Data System (ADS)

    Albertsson, K.; Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  5. CMS DAQ event builder based on Gigabit Ethernet

    SciTech Connect

    Bauer, G.; Boyer, V.; Branson, J.; Brett, A.; Cano, E.; Carboni, A.; Ciganek, M.; Cittolin, S.; Erhan, Samim; Gigi, D.; Glege, F.; /CERN /INFN, Legnaro /CERN /CERN /Kyungpook Natl. U. /MIT /UC, San Diego /CERN

    2007-04-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.

  6. A new event builder for CMS Run II

    DOE PAGES

    Albertsson, K.; Andre, J-M; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G-L; Deldicque, C.; Dobson, M.; et al

    2015-01-01

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GB/s to the high-level trigger (HLT) farm. The DAQ system has been redesigned during the LHC shutdown in 2013/14. The new DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbps Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for a reliable transport between custom electronics and commercial computing hardware. A 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder. This paper discusses the software design, protocols, and optimizations for exploiting the hardware capabilities. We present performance measurements from small-scale prototypes and from the full-scale production system.

  7. Event builder and level 3 at the CDF experiment

    SciTech Connect

    G. Gomez-Ceballos, A. Belloni and A. Bolshov

    2003-10-30

    The Event Builder and Level3 systems constitute critical components of the DAQ in the CDF experiment at Fermilab. These systems are responsible for collecting data fragments from the front end electronics, assembling the data into complete event records, reconstructing the events, and forming the final trigger decision. With Tevatron Run IIa in progress, the systems have been running successfully at high throughput rates, with a design that uses a scalable architecture and distributed event processing to meet the requirements. A brief description of the current performance in Run IIa and of possible upgrades for Run IIb is presented.

  8. Performance and system flexibility of the CDF Hardware Event Builder

    SciTech Connect

    Shaw, T.M.; Schurecht, K.; Sinervo, P.

    1991-11-01

    The CDF Hardware Event Builder [1] is a flexible system which is built from a combination of three different 68020-based single width Fastbus modules. The system may contain as few as three boards or as many as fifteen, depending on the specific application. Functionally, the boards receive a command to read out the raw event data from a set of Fastbus based data buffers ("scanners"), reformat data and then write the data to a Level 3 trigger/processing farm which will decide to throw the event away or to write it to tape. The data acquisition system at CDF will utilize two nine board systems which will allow an event rate of up to 35 Hz into the Level 3 trigger. This paper will present detailed performance factors, system and individual board architecture, and possible system configurations.

  9. Performance and system flexibility of the CDF Hardware Event Builder

    SciTech Connect

    Shaw, T.M.; Schurecht, K.; Sinervo, P. (Dept. of Physics)

    1991-11-01

    The CDF Hardware Event Builder [1] is a flexible system which is built from a combination of three different 68020-based single width Fastbus modules. The system may contain as few as three boards or as many as fifteen, depending on the specific application. Functionally, the boards receive a command to read out the raw event data from a set of Fastbus based data buffers ("scanners"), reformat data and then write the data to a Level 3 trigger/processing farm which will decide to throw the event away or to write it to tape. The data acquisition system at CDF will utilize two nine board systems which will allow an event rate of up to 35 Hz into the Level 3 trigger. This paper will present detailed performance factors, system and individual board architecture, and possible system configurations.

  10. CDF DAQ upgrade and CMS DAQ R and D: event builder tests using an ATM switch

    SciTech Connect

    Bauer, G.; Daniels, T.; Kelley, K.

    1996-12-31

    The present data acquisition system of the CDF experiment has to be upgraded for the higher luminosities expected during the Run II (1999+) data-taking period. The core of the system, consisting of a control network based on reflective memories, will remain the same. The network used for data transfers, however, will have to be changed. We have investigated ATM as a possible replacement technology for the current Ultranet switch. We present preliminary results on this new ATM-based event builder system.

  11. Results from a data acquisition system prototype project using a switch-based event builder

    SciTech Connect

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D. ); Bowden, M.; Booth, A. ); Cancelo, G. )

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort.

  12. 76 FR 2145 - Masco Builder Cabinet Group Including On-Site Leased Workers From Reserves Network, Jackson, OH...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-12

    ... Reserves Network, Jackson, OH; Masco Builder Cabinet Group, Waverly, OH; Masco Builder Cabinet Group, Seal Township, OH; Masco Builder Cabinet Group, Seaman, OH; Amended Certification Regarding Eligibility To Apply... Federal Register on December 11, 2009 (74 FR 65798). The Department has received information that...

  13. 76 FR 19466 - Masco Builder Cabinet Group Including On-Site Leased Workers From Reserves Network, Reliable...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-07

    ... Reserves Network, Reliable Staffing, and Third Dimension Waverly, OH; Masco Builder Cabinet Group Including On-Site Leased Workers From Reserves Network, Reliable Staffing, and Third Dimension Seal Township... in the Federal Register on December 11, 2009 (74 FR 65797). The notice was amended on December...

  14. The JigCell model builder: a spreadsheet interface for creating biochemical reaction network models.

    PubMed

    Vass, Marc T; Shaffer, Clifford A; Ramakrishnan, Naren; Watson, Layne T; Tyson, John J

    2006-01-01

    Converting a biochemical reaction network to a set of kinetic rate equations is tedious and error prone. We describe known interface paradigms for inputting models of intracellular regulatory networks: graphical layout (diagrams), wizards, scripting languages, and direct entry of chemical equations. We present the JigCell Model Builder, which allows users to define models as a set of reaction equations using a spreadsheet (an example of direct entry of equations) and outputs model definitions in the Systems Biology Markup Language, Level 2. We present the results of two usability studies. The spreadsheet paradigm demonstrated its effectiveness in reducing the number of errors made by modelers when compared to hand conversion of a wiring diagram to differential equations. A comparison of representatives of the four interface paradigms for a simple model of the cell cycle was conducted which measured time, mouse clicks, and keystrokes to enter the model, and the number of screens needed to view the contents of the model. All four paradigms had similar data entry times. The spreadsheet and scripting language approaches require significantly fewer screens to view the models than do the wizard or graphical layout approaches.
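
The hand conversion the abstract calls tedious and error prone can be illustrated with a toy mass-action translator: each reaction contributes a negative rate term to its reactants and a positive one to its products. The reaction tuple format and species names below are invented for the example, not JigCell's actual input syntax:

```python
# Turn mass-action reaction equations into symbolic kinetic rate equations.
def reactions_to_odes(reactions):
    """reactions: list of (reactants, products, rate-constant name).
    Returns {species: right-hand side of dS/dt as a string}."""
    odes = {}
    for reactants, products, k in reactions:
        rate = k + "".join("*" + s for s in reactants)  # mass-action rate law
        for s in reactants:
            odes.setdefault(s, []).append("-" + rate)   # consumed
        for s in products:
            odes.setdefault(s, []).append("+" + rate)   # produced
    return {s: " ".join(terms) for s, terms in odes.items()}

model = [
    (["A", "B"], ["C"], "k1"),   # A + B -> C
    (["C"], ["A", "B"], "k2"),   # C -> A + B
]
odes = reactions_to_odes(model)
for species, rhs in sorted(odes.items()):
    print(f"d{species}/dt = {rhs}")
# -> dA/dt = -k1*A*B +k2*C  (and analogously for B and C)
```

Automating this bookkeeping, plus emitting SBML rather than strings, is the error-reduction the usability study measured.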

  15. Teams as Network Builders: Analysing Network Contacts in Finnish Elementary School Teacher Teams.

    ERIC Educational Resources Information Center

    Karkkainen, Merja

    2000-01-01

    Studied the preconditions and obstacles to building network contacts for two teams of five Finnish elementary school teachers each. Results show that breaking traditional patterns of teacher work, especially the tradition of single-handed lesson planning and implementation, results from team work in building a shared object. Results show the…

  16. Host Event Based Network Monitoring

    SciTech Connect

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and the effects on host performance. Current host based network monitoring tools work on polling which can miss activity if it occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. First is the new Windows Event Logging API, and, second, Windows 7 offers the ALE API within WFP. Any future work should focus on these methods.
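
The polling-versus-events gap the report identifies can be shown with a small model: a connection that opens and closes entirely between two polls is invisible to a polling monitor but captured by an event-driven one. The in-memory queue below stands in for the OS notification API (Windows Event Logging, ALE/WFP, or a Linux equivalent); the class and address names are assumptions for the sketch:

```python
# Contrast a polling monitor with an event-notified monitor.
import queue

class PollingMonitor:
    """Samples current connections; activity between polls is lost."""
    def __init__(self):
        self.seen = set()
    def poll(self, current_connections):
        self.seen |= set(current_connections)

class EventMonitor:
    """Receives every open/close notification asynchronously."""
    def __init__(self):
        self.events = queue.Queue()
        self.seen = set()
    def notify(self, event):          # stand-in for the OS event hook
        self.events.put(event)
    def drain(self):                  # analyze and log queued events
        while not self.events.empty():
            _kind, conn = self.events.get()
            self.seen.add(conn)

poller, eventer = PollingMonitor(), EventMonitor()

# A connection opens and closes entirely between two polls:
poller.poll([])                       # poll 1: nothing active yet
eventer.notify(("open", "10.0.0.5:445"))
eventer.notify(("close", "10.0.0.5:445"))
poller.poll([])                       # poll 2: connection already gone
eventer.drain()

print("poller saw:", poller.seen)     # -> set()
print("eventer saw:", eventer.seen)   # -> {'10.0.0.5:445'}
```

The event-driven tool reconstructs the complete connection history; the poller's record has a hole, which is precisely the SCADA-monitoring concern.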

  17. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that implements a concept of enabling users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database but rather in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. Because the software resides on the server, a user's computer needs no special software other than an Internet browser, and e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. Through this addition of "child process" nodes, a network that reflects the development of a project is generated.

  18. Controlling extreme events on complex networks

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-08-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network ``mobile'' can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed.
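
The transportation setting used in this work can be sketched numerically: random walkers hop on a network, and an "extreme event" is declared at a node whose instantaneous load far exceeds its average. The graph, walker count, and threshold below are illustrative assumptions, not the paper's parameters:

```python
# Random-walk transport on a small network, counting threshold crossings.
import random
from collections import Counter

random.seed(1)
# A 5-node ring with one shortcut (symmetric adjacency list):
edges = {0: [1, 4, 2], 1: [0, 2], 2: [1, 3, 0], 3: [2, 4], 4: [3, 0]}
WALKERS, STEPS, THRESHOLD = 50, 2000, 18  # mean load is 10 per node

positions = [random.choice(list(edges)) for _ in range(WALKERS)]
extreme = Counter()
for _ in range(STEPS):
    positions = [random.choice(edges[p]) for p in positions]  # one hop each
    load = Counter(positions)
    for node, n in load.items():
        if n > THRESHOLD:              # extreme event at this node
            extreme[node] += 1

print("extreme-event counts per node:", dict(extreme))
```

In the paper's scheme, "making the network mobile" (rewiring links over time) modulates how often these threshold crossings occur, with an optimal mobility minimizing them; the sketch only sets up the baseline statistic.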

  19. Controlling extreme events on complex networks.

    PubMed

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-01-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network "mobile" can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed. PMID:25131344

  20. Controlling extreme events on complex networks

    PubMed Central

    Chen, Yu-Zhong; Huang, Zi-Gang; Lai, Ying-Cheng

    2014-01-01

    Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. To develop effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network “mobile” can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding to current areas such as cybersecurity are discussed. PMID:25131344

  1. EventWeb: towards social life networks.

    PubMed

    Jain, Ramesh

    2013-03-28

    The Web has changed the way we live, work and socialize. The nodes in the current Web are documents and hence the current World Wide Web is a Document Web. Advances in technology and requirements of emerging applications require formation of a parallel and closely connected Web of events, the EventWeb, in which each node is an event. In this paper, we explore growth of EventWeb as a natural next step in the evolution of the Web with rich multimodal sensory information. Social networks use events extensively and have revolutionized communication among people. Mobile phones, equipped with myriads of sensors and being used by more than 75% of living humans, are bringing the next generation of social networks, not only to connect people with other people, but also to connect people with other people and essential life resources. We call these networks social life networks, and believe that this is the right time to focus efforts to discover and develop technology and infrastructure to design and build these networks and to apply them for solving some essential human problems.

  2. Emergence of event cascades in inhomogeneous networks

    NASA Astrophysics Data System (ADS)

    Onaga, Tomokatsu; Shinomoto, Shigeru

    2016-09-01

    There is a commonality among contagious diseases, tweets, and neuronal firings that past events facilitate the future occurrence of events. The spread of events has been extensively studied such that the systems exhibit catastrophic chain reactions if the interaction represented by the ratio of reproduction exceeds unity; however, their subthreshold states are not fully understood. Here, we report that these systems are possessed by nonstationary cascades of event-occurrences already in the subthreshold regime. Event cascades can be harmful in some contexts, when the peak-demand causes vaccine shortages, heavy traffic on communication lines, but may be beneficial in other contexts, such that spontaneous activity in neural networks may be used to generate motion or store memory. Thus it is important to comprehend the mechanism by which such cascades appear, and consider controlling a system to tame or facilitate fluctuations in the event-occurrences. The critical interaction for the emergence of cascades depends greatly on the network structure in which individuals are connected. We demonstrate that we can predict whether cascades may emerge, given information about the interactions between individuals. Furthermore, we develop a method of reallocating connections among individuals so that event cascades may be either impeded or impelled in a network.
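
The threshold at a reproduction ratio of unity, and the large subthreshold fluctuations the abstract emphasizes, can be seen in a minimal branching-process sketch. The offspring distribution and cap below are modeling assumptions for illustration, not the paper's network model:

```python
# Cascade sizes in a branching process with mean offspring number R:
# subcritical (R < 1) cascades stay finite but fluctuate; supercritical
# (R > 1) cascades can run away (capped here for safety).
import random

random.seed(2)

def cascade_size(R, max_events=10_000):
    """Total events triggered by one seed event."""
    pending, total = 1, 0
    while pending and total < max_events:
        pending -= 1
        total += 1
        # Geometric offspring draw with mean R:
        while random.random() < R / (1 + R):
            pending += 1
    return total

means = {}
for R in (0.5, 0.9, 1.2):
    sizes = [cascade_size(R) for _ in range(200)]
    means[R] = sum(sizes) / len(sizes)
    print(f"R={R}: mean cascade size {means[R]:.1f}")
```

Even at R = 0.9, well below threshold, occasional large cascades appear, which is the subthreshold nonstationarity the paper studies; on a network, the critical interaction additionally depends on the connection structure.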

  3. Emergence of event cascades in inhomogeneous networks.

    PubMed

    Onaga, Tomokatsu; Shinomoto, Shigeru

    2016-01-01

    There is a commonality among contagious diseases, tweets, and neuronal firings that past events facilitate the future occurrence of events. The spread of events has been extensively studied such that the systems exhibit catastrophic chain reactions if the interaction represented by the ratio of reproduction exceeds unity; however, their subthreshold states are not fully understood. Here, we report that these systems are possessed by nonstationary cascades of event-occurrences already in the subthreshold regime. Event cascades can be harmful in some contexts, when the peak-demand causes vaccine shortages, heavy traffic on communication lines, but may be beneficial in other contexts, such that spontaneous activity in neural networks may be used to generate motion or store memory. Thus it is important to comprehend the mechanism by which such cascades appear, and consider controlling a system to tame or facilitate fluctuations in the event-occurrences. The critical interaction for the emergence of cascades depends greatly on the network structure in which individuals are connected. We demonstrate that we can predict whether cascades may emerge, given information about the interactions between individuals. Furthermore, we develop a method of reallocating connections among individuals so that event cascades may be either impeded or impelled in a network. PMID:27625183

  4. Emergence of event cascades in inhomogeneous networks

    PubMed Central

    Onaga, Tomokatsu; Shinomoto, Shigeru

    2016-01-01

    There is a commonality among contagious diseases, tweets, and neuronal firings that past events facilitate the future occurrence of events. The spread of events has been extensively studied such that the systems exhibit catastrophic chain reactions if the interaction represented by the ratio of reproduction exceeds unity; however, their subthreshold states are not fully understood. Here, we report that these systems are possessed by nonstationary cascades of event-occurrences already in the subthreshold regime. Event cascades can be harmful in some contexts, when the peak-demand causes vaccine shortages, heavy traffic on communication lines, but may be beneficial in other contexts, such that spontaneous activity in neural networks may be used to generate motion or store memory. Thus it is important to comprehend the mechanism by which such cascades appear, and consider controlling a system to tame or facilitate fluctuations in the event-occurrences. The critical interaction for the emergence of cascades depends greatly on the network structure in which individuals are connected. We demonstrate that we can predict whether cascades may emerge, given information about the interactions between individuals. Furthermore, we develop a method of reallocating connections among individuals so that event cascades may be either impeded or impelled in a network. PMID:27625183

  5. Acoustic network event classification using swarm optimization

    NASA Astrophysics Data System (ADS)

    Burman, Jerry

    2013-05-01

    Classifying acoustic signals detected by distributed sensor networks is a difficult problem due to the wide variations that can occur in the transmission of terrestrial, subterranean, seismic and aerial events. An acoustic event classifier was developed that uses particle swarm optimization to perform a flexible time correlation of a sensed acoustic signature to reference data. In order to mitigate the effects from interference such as multipath, the classifier fuses signatures from multiple sensors to form a composite sensed acoustic signature and then automatically matches the composite signature with reference data. The approach can classify all types of acoustic events but is particularly well suited to explosive events such as gun shots, mortar blasts and improvised explosive devices that produce an acoustic signature having a shock wave component that is aperiodic and non-linear. The classifier was applied to field data and yielded excellent results in terms of reconstructing degraded acoustic signatures from multiple sensors and in classifying disparate acoustic events.
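
The core matching step, correlating a sensed signature against reference data over candidate time shifts, can be sketched with a brute-force lag search standing in for the particle swarm optimizer (the swarm search is what enables the paper's flexible, non-uniform alignment). The signals and names below are synthetic:

```python
# Find the time offset at which a reference signature best matches a
# sensed signal, by maximizing the sliding dot product.
def best_lag(sensed, reference):
    """Return the shift of `reference` that best matches `sensed`."""
    best, best_score = 0, float("-inf")
    for lag in range(len(sensed) - len(reference) + 1):
        score = sum(s * r for s, r in zip(sensed[lag:], reference))
        if score > best_score:
            best, best_score = lag, score
    return best

reference = [0.0, 1.0, 4.0, 1.0, 0.0]                    # idealized signature
sensed = [0.1, 0.0, 0.2, 0.1, 1.1, 3.9, 0.9, 0.1, 0.0]   # noisy, delayed copy
print("signature found at offset", best_lag(sensed, reference))  # -> 3
```

Fusing such aligned signatures from several sensors before matching is the multipath-mitigation step the abstract describes.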

  6. Neural network classification of questionable EGRET events

    NASA Technical Reports Server (NTRS)

    Meetre, C. A.; Norris, J. P.

    1992-01-01

    High energy gamma rays (greater than 20 MeV) pair producing in the spark chamber of the Energetic Gamma Ray Telescope Experiment (EGRET) give rise to a characteristic but highly variable 3-D locus of spark sites, which must be processed to decide whether the event is to be included in the database. A significant fraction (about 15 percent, or 10⁴ events/day) of the candidate events cannot be categorized (accept/reject) by an automated rule-based procedure; they are therefore tagged, and must be examined and classified manually by a team of expert analysts. We describe a feedforward, back-propagation neural network approach to the classification of the questionable events. The algorithm computes a set of coefficients using representative exemplars drawn from the preclassified set of questionable events. These coefficients map a given input event into a decision vector that, ideally, describes the correct disposition of the event. The net's accuracy is then tested using a different subset of preclassified events. Preliminary results demonstrate the net's ability to correctly classify a large proportion of the events for some categories of questionables. Current work includes the use of much larger training sets to improve the accuracy of the net.
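
A feedforward net trained by back-propagation on preclassified exemplars, as described, can be sketched in miniature. The two synthetic features and the 2-4-1 network below are stand-ins; the real inputs were derived from 3-D spark-site loci:

```python
# Toy feedforward/back-propagation classifier on synthetic two-class data.
import math
import random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# Synthetic exemplars: class 0 clustered near (0,0), class 1 near (1,1).
data = [([random.gauss(0, 0.2), random.gauss(0, 0.2)], 0.0) for _ in range(50)]
data += [([random.gauss(1, 0.2), random.gauss(1, 0.2)], 1.0) for _ in range(50)]

H = 4  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [sig(sum(w * xi for w, xi in zip(w1[j], x)) + b1[j]) for j in range(H)]
    return h, sig(sum(w2[j] * h[j] for j in range(H)) + b2)

for _ in range(300):                        # online gradient descent
    for x, t in data:
        h, out = forward(x)
        d_out = (out - t) * out * (1 - out)  # output-layer delta
        for j in range(H):
            d_h = d_out * w2[j] * h[j] * (1 - h[j])  # hidden delta
            w2[j] -= 0.5 * d_out * h[j]
            for i in range(2):
                w1[j][i] -= 0.5 * d_h * x[i]
            b1[j] -= 0.5 * d_h
        b2 -= 0.5 * d_out

correct = sum((forward(x)[1] > 0.5) == (t > 0.5) for x, t in data)
print(f"training accuracy: {correct / len(data):.2f}")
```

The trained coefficients play the role of the abstract's mapping from input event to decision vector; accuracy on a held-out subset is then the relevant figure of merit.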

  7. Builders Challenge High Performance Builder Spotlight - Kacin Homes, Pittsburgh, Pennsylvania

    SciTech Connect

    2008-01-01

    Building America/Builders Challenge fact sheet on Kacin Homes, an energy-efficient home builder in cold climate using airtight envelope, efficient lighting, and tankless water heater. Evaluates cost impacts.

  8. eProject Builder

    SciTech Connect

    2014-06-01

    eProject Builder enables Energy Services Companies (ESCOs) and their contracting agencies to: 1. upload and track project-level Information 2. generate basic project reports required by local, state, and/or federal agencies 3. benchmark new Energy Savings Performance Contract (ESPC) projects against historical data

  9. TrustBuilder2

    2007-07-20

    TrustBuilder2 is a flexible framework for supporting research in the area of trust negotiation protocols, designed to allow researchers to quickly prototype and experiment with various approaches to trust negotiation. In TrustBuilder2, the primary components of a trust negotiation system are represented using abstract interfaces. Any or all of these components can be implemented or extended by users of the TrustBuilder2 system, thereby making the system's functionality easily extensible. The TrustBuilder2 configuration files can be modified to load these custom components in place of the default system components; this facilitates the use of new features without modifications to the underlying runtime system. In our implementation, we provide support for one negotiation strategy, a policy compliance checker based on Jess (the Java Expert System Shell), query interfaces enabling access to disk-based credential and policy repositories, a credential chain construction algorithm, two credential chain verification routines, and both graphical and text-based logging facilities. TrustBuilder2 also supports the interposition of user-defined plug-ins at communication points between system components to allow for easy monitoring of system activity or the modification of messages passed between components.

  10. Digital Learning Network Event with Robotics Engineer Jonathan Rogers

    NASA Video Gallery

    Robotics engineer Jonathan Rogers and Public Affairs Officer Kylie Clem participate in a Digital Learning Network educational event, answering questions from students at Montgomery Middle School in...

  11. Man-machine interface builders at the Advanced Photon Source

    SciTech Connect

    Anderson, M.D.

    1991-12-31

    Argonne National Laboratory is constructing a 7-GeV Advanced Photon Source for use as a synchrotron radiation source in basic and applied research. The controls and computing environment for this accelerator complex includes graphical operator interfaces to the machine based on Motif, X11, and PHIGS/PEX. Construction and operation of the control system for this accelerator relies upon interactive interface builder and diagram/editor type tools, as well as a run-time environment for the constructed displays which communicate with the physical machine via network connections. This paper discusses our experience with several commercial GUI builders, the inadequacies found in these, motivation for the development of an application-specific builder, and design and implementation strategies employed in the development of our own Man-Machine Interface builder.

  12. Man-machine interface builders at the Advanced Photon Source

    SciTech Connect

    Anderson, M.D.

    1991-01-01

    Argonne National Laboratory is constructing a 7-GeV Advanced Photon Source for use as a synchrotron radiation source in basic and applied research. The controls and computing environment for this accelerator complex includes graphical operator interfaces to the machine based on Motif, X11, and PHIGS/PEX. Construction and operation of the control system for this accelerator rely upon interactive interface-builder and diagram/editor tools, as well as a run-time environment for the constructed displays, which communicate with the physical machine via network connections. This paper discusses our experience with several commercial GUI builders, the inadequacies found in these, the motivation for the development of an application-specific builder, and the design and implementation strategies employed in the development of our own Man-Machine Interface builder. 5 refs.

  13. Builders Challenge Quality Criteria Support Document

    SciTech Connect

    2009-06-01

    This document provides guidance to U.S. home builders participating in the Builders Challenge. To qualify for the Builders Challenge, a home must score 70 or less on the EnergySmart Home Scale (E-Scale). Homes must also meet the Builders Challenge Quality Criteria.

  14. Signaling communication events in a computer network

    DOEpatents

    Bender, Carl A.; DiNicola, Paul D.; Gildea, Kevin J.; Govindaraju, Rama K.; Kim, Chulho; Mirza, Jamshed H.; Shah, Gautam H.; Nieplocha, Jaroslaw

    2000-01-01

    A method, apparatus and program product for detecting a communication event in a distributed parallel data processing system in which a message is sent from an origin to a target. A low-level application programming interface (LAPI) is provided which has an operation for associating a counter with a communication event to be detected. The LAPI increments the counter upon the occurrence of the communication event. The number in the counter is monitored, and when the number increases, the event is detected. A completion counter in the origin is associated with the completion of a message being sent from the origin to the target. When the message is completed, LAPI increments the completion counter such that monitoring the completion counter detects the completion of the message. The completion counter may be used to insure that a first message has been sent from the origin to the target and completed before a second message is sent.
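    The counter-based detection scheme described in this record can be illustrated with ordinary threading primitives; this is a hedged sketch of the idea only, not the LAPI interface itself, and the names are invented.

```python
# Minimal sketch of counter-based completion detection in the spirit of the
# patent's LAPI counters; plain Python threading, not the actual LAPI.
import threading

class CompletionCounter:
    def __init__(self):
        self._value = 0
        self._cond = threading.Condition()

    def increment(self):
        # Called when the associated communication event occurs.
        with self._cond:
            self._value += 1
            self._cond.notify_all()

    def wait_for(self, target, timeout=5.0):
        # Monitor the counter; return True once it reaches `target`.
        with self._cond:
            return self._cond.wait_for(lambda: self._value >= target, timeout)

counter = CompletionCounter()

def send_message(counter):
    # ... transfer the message payload to the target here ...
    counter.increment()          # mark the send as complete

t = threading.Thread(target=send_message, args=(counter,))
t.start()
completed = counter.wait_for(1)  # a second message may be sent only after this
t.join()
```

    The final wait mirrors the patent's ordering guarantee: the origin observes the completion counter before issuing a dependent second message.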

  15. Builder's foundation handbook

    SciTech Connect

    Carmody, J. (Underground Space Center); Christian, J.; Labs, K.

    1991-05-01

    This handbook contains a worksheet for selecting insulation levels based on specific building construction, climate, HVAC equipment, insulation cost, and other economic considerations. The worksheet permits optimization of foundation insulation levels for new or retrofit applications. Construction details representing good practices for the design and installation of energy-efficient basement, crawl space, and slab-on-grade foundations are the focal point of the handbook. The construction details are keyed to lists of critical design information useful for specifying structural integrity; thermal and vapor control; subsurface drainage; waterproofing; and mold, mildew, odor, decay, termite, and radon control strategies. Another useful feature is the set of checklist chapter summaries covering major design considerations for each foundation type--basement, crawl space, and slab-on-grade. These checklist summaries are useful during design and construction inspection. The information in this handbook is drawn heavily from the first foundation handbook from the DOE/ORNL Building Envelope Systems and Materials Program, the Building Foundation Design Handbook (Labs et al., 1988), which is an extensive technical reference manual. This book presents "what to do in foundation design" in an inviting, concise format. This handbook is intended to serve the needs of active home builders; however, the information is pertinent to anyone involved in foundation design and construction decisions, including homeowners, architects, and engineers. 17 refs., 49 figs., 18 tabs.

  16. A convolutional neural network neutrino event classifier

    DOE PAGES

    Aurisano, A.; Radovic, A.; Rocco, D.; Himmel, A.; Messier, M. D.; Niner, E.; Pawloski, G.; Psihas, F.; Sousa, A.; Vahle, P.

    2016-09-01

    Here, convolutional neural networks (CNNs) have been widely applied in the computer vision community to solve complex problems in image recognition and analysis. We describe an application of the CNN technology to the problem of identifying particle interactions in sampling calorimeters used commonly in high energy physics and high energy neutrino physics in particular. Following a discussion of the core concepts of CNNs and recent innovations in CNN architectures related to the field of deep learning, we outline a specific application to the NOvA neutrino detector. This algorithm, CVN (Convolutional Visual Network), identifies neutrino interactions based on their topology without the need for detailed reconstruction and outperforms algorithms currently in use by the NOvA collaboration.

  17. Astronomical network event and observation notification

    NASA Astrophysics Data System (ADS)

    White, R. R.; Allan, A.; Barthelmy, S.; Bloom, J.; Graham, M.; Hessman, F. V.; Marka, S.; Rots, A.; Scholberg, K.; Seaman, R.; Stoughton, C.; Vestrand, W. T.; Williams, R.; Wozniak, P. R.

    2006-09-01

    Networks are becoming a key element in most current and all future telescope and observatory projects. The ability to easily and efficiently pass observation data, alert data and instrumentation requests between distributed systems could enable science as never before. However, any effective large-scale network or meta-network of astronomical resources will require a common communication format, or development resources will have to be continuously dedicated to creating interpreters. The necessary elements of any astronomy communication can be easily identified, efficiently described and rigidly formatted so that both robotic and human operations can use the same data. In this paper we explore the current state of notification, identify what requirements are essential to create a successful standard, and present a standard now under development by the International Virtual Observatory Alliance (IVOA), called the VOEvent.

  18. A convolutional neural network neutrino event classifier

    NASA Astrophysics Data System (ADS)

    Aurisano, A.; Radovic, A.; Rocco, D.; Himmel, A.; Messier, M. D.; Niner, E.; Pawloski, G.; Psihas, F.; Sousa, A.; Vahle, P.

    2016-09-01

    Convolutional neural networks (CNNs) have been widely applied in the computer vision community to solve complex problems in image recognition and analysis. We describe an application of the CNN technology to the problem of identifying particle interactions in sampling calorimeters used commonly in high energy physics and high energy neutrino physics in particular. Following a discussion of the core concepts of CNNs and recent innovations in CNN architectures related to the field of deep learning, we outline a specific application to the NOvA neutrino detector. This algorithm, CVN (Convolutional Visual Network) identifies neutrino interactions based on their topology without the need for detailed reconstruction and outperforms algorithms currently in use by the NOvA collaboration.
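    The convolutional building block at the heart of CVN-style classifiers can be sketched without a deep-learning stack: a 2-D convolution over a calorimeter hit map followed by a ReLU non-linearity. This is a toy illustration of the operation, not the NOvA network; the kernel and hit map are invented.

```python
# Toy sketch of the CNN building block (2-D convolution + ReLU) applied to a
# hit map; numpy only. Kernel and data are illustrative.
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as in most CNN libraries)."""
    kh, kw = kernel.shape
    H = image.shape[0] - kh + 1
    W = image.shape[1] - kw + 1
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return np.maximum(out, 0.0)          # ReLU non-linearity

# A vertical-edge kernel responds strongly to track-like topology.
hits = np.zeros((5, 5))
hits[:, 2] = 1.0                         # straight vertical "track"
edge = np.array([[-1.0, 0.0, 1.0]] * 3)
feature_map = conv2d(hits, edge)
```

    Stacking many such learned filters, with pooling between layers, is what lets the classifier respond to event topology without an explicit reconstruction step.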

  19. Seismic event classification using Self-Organizing Neural Networks

    SciTech Connect

    Maurer, W.J.; Dowla, F.U.; Jarpe, S.P.

    1991-10-15

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. We have studied Self-Organizing Neural Networks (SONNs) for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs were developed and tested with a moderately large set of real seismic events. Given the detection of a seismic event and the corresponding signal, we compute the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This preprocessed input is fed into the SONNs. The overall results based on 111 events (43 training and 68 test events) show that SONNs are able to group events that "look" similar. We also find that the ART algorithm has an advantage in that the types of cluster groups do not need to be predefined. When a new type of event is detected, the ART network is able to handle the event rather gracefully. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. A strategy to integrate a SONN into the interpretation of seismic events is also proposed.
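    The shift-invariant preprocessing step described above is compact enough to sketch directly: binarize the time-frequency distribution, then take the magnitude of its 2-D FFT. Since a circular shift of the input only multiplies the FFT by a phase factor, the magnitude is unchanged; the threshold and data below are arbitrary example values.

```python
# Sketch of the paper's preprocessing: |2-D FFT| of a binarized
# time-frequency distribution is invariant to circular time shifts.
import numpy as np

def shift_invariant_rep(tfd, threshold):
    binary = (tfd > threshold).astype(float)  # binary time-frequency map
    return np.abs(np.fft.fft2(binary))        # magnitude: shift invariant

rng = np.random.default_rng(0)
tfd = rng.random((8, 8))                      # toy time-frequency distribution
rep = shift_invariant_rep(tfd, 0.5)
# The same event arriving 3 samples later yields the same representation.
shifted = shift_invariant_rep(np.roll(tfd, 3, axis=1), 0.5)
```

    This invariance is what lets the SONN group events that "look" similar regardless of exactly when they arrive in the analysis window.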

  20. Bayesian-network-based soccer video event detection and retrieval

    NASA Astrophysics Data System (ADS)

    Sun, Xinghua; Jin, Guoying; Huang, Mei; Xu, Guangyou

    2003-09-01

    This paper presents an event-based soccer video retrieval method, where the scoring event is detected with a Bayesian network from six kinds of cue information: gate, face, audio, texture, caption, and text. The topology of the Bayesian network is predefined by hand according to domain knowledge, and the probability distributions are learned in the case of known structure and full observability. The resulting event probability from the Bayesian network is used as the feature vector to perform the video retrieval. Experiments show that the true and false detection ratios for the scoring event are about 90% and 16.67% respectively, and that video retrieval based on events is superior to retrieval based on low-level features in terms of human visual perception.
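    The cue-combination step can be illustrated with a naive-Bayes stand-in for the paper's hand-built network: each binary cue contributes a likelihood ratio to the posterior probability of a scoring event. The conditional probability values below are invented for the example, not taken from the paper.

```python
# Illustrative naive-Bayes stand-in for a hand-built Bayesian network that
# fuses binary cues into a scoring-event probability. CPT values invented.
P_EVENT = 0.1                                   # prior for a scoring event
# Per cue: (P(cue present | event), P(cue present | no event)).
CPT = {"gate": (0.9, 0.2), "face": (0.8, 0.3), "audio": (0.85, 0.25)}

def scoring_probability(observed):
    """observed maps each cue name to True/False."""
    p_e, p_ne = P_EVENT, 1.0 - P_EVENT
    for cue, (p_given_e, p_given_ne) in CPT.items():
        if observed[cue]:
            p_e *= p_given_e
            p_ne *= p_given_ne
        else:
            p_e *= 1.0 - p_given_e
            p_ne *= 1.0 - p_given_ne
    return p_e / (p_e + p_ne)                   # posterior P(event | cues)

p = scoring_probability({"gate": True, "face": True, "audio": True})
```

    The resulting posterior per shot is exactly the kind of event-probability feature the paper then uses as a retrieval vector.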

  1. Extreme events in multilayer, interdependent complex networks and control.

    PubMed

    Chen, Yu-Zhong; Huang, Zi-Gang; Zhang, Hai-Feng; Eisenberg, Daniel; Seager, Thomas P; Lai, Ying-Cheng

    2015-11-27

    We investigate the emergence of extreme events in interdependent networks. We introduce an inter-layer traffic resource competing mechanism to account for the limited capacity associated with distinct network layers. A striking finding is that, when the number of network layers and/or the overlap among the layers is increased, extreme events can emerge in a cascading manner on a global scale. Asymptotically, there are two stable absorption states: a state free of extreme events and a state full of extreme events, and the transition between them is abrupt. Our results indicate that internal interactions in the multiplex system can yield qualitatively distinct phenomena associated with extreme events that do not occur for independent network layers. An implication is that, e.g., public resource competition among different service providers can lead to a higher resource requirement than naively expected. We derive an analytical theory to understand the emergence of global-scale extreme events based on the concept of effective betweenness. We also articulate a cost-effective control scheme that increases the capacity of very few hubs to suppress the cascading process of extreme events, so as to protect the entire multi-layer infrastructure against global-scale breakdown.

  2. Network Event Recording Device: An automated system for Network anomaly detection, and notification. Draft

    SciTech Connect

    Simmons, D.G.; Wilkins, R.

    1994-09-01

    The goal of the Network Event Recording Device (NERD) is to provide a flexible autonomous system for network logging and notification when significant network anomalies occur. The NERD is also charged with increasing the efficiency and effectiveness of currently implemented network security procedures. While it has always been possible for network and security managers to review log files for evidence of network irregularities, the NERD provides real-time display of network activity, as well as constant monitoring and notification services for managers. Similarly, real-time display and notification of possible security breaches will provide improved effectiveness in combating resource infiltration from both inside and outside the immediate network environment.
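    The real-time notification behavior described for the NERD can be sketched as a sliding-window rate monitor: record events, and notify a manager when the count within the window exceeds a threshold. This is a hedged illustration of the pattern; all names and thresholds are invented.

```python
# Sketch of a NERD-style anomaly notifier: alert when the event rate in a
# sliding time window exceeds a threshold. Names and values are invented.
from collections import deque

class RateMonitor:
    def __init__(self, window_seconds, max_events, notify):
        self.window = window_seconds
        self.max_events = max_events
        self.notify = notify                  # callback for notifications
        self.times = deque()

    def record(self, timestamp, description):
        self.times.append(timestamp)
        # Drop events that have fallen out of the sliding window.
        while self.times and timestamp - self.times[0] > self.window:
            self.times.popleft()
        if len(self.times) > self.max_events:
            self.notify(f"anomaly at t={timestamp}: {description}")

alerts = []
mon = RateMonitor(window_seconds=10, max_events=3, notify=alerts.append)
for t in [0, 2, 4, 5, 6]:                     # burst of 5 events in 10 s
    mon.record(t, "failed login")
```

    A real deployment would feed this from log streams and route notifications to a display or paging service rather than a list.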

  3. Subsurface Event Detection and Classification Using Wireless Signal Networks

    PubMed Central

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T.

    2012-01-01

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks detected and classified the subsurface events. PMID:23202191

  4. Subsurface event detection and classification using Wireless Signal Networks.

    PubMed

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-01-01

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how radio propagation is affected by them in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks detected and classified the subsurface events. PMID:23202191
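    The minimum-distance step of such a window-based classifier is simple to sketch: average the signal-strength change over a detection window and assign the class whose prototype is nearest. The prototype values below are invented for illustration and are not from the paper.

```python
# Sketch of a minimum-distance classifier over a window of RSSI-change
# readings; class prototypes (mean change in dB) are invented examples.
PROTOTYPES = {
    "water_intrusion": -6.0,
    "density_change": -2.0,
    "no_event": 0.0,
}

def classify_window(readings):
    """Assign the window to the class with the closest prototype mean."""
    mean = sum(readings) / len(readings)
    return min(PROTOTYPES, key=lambda c: abs(mean - PROTOTYPES[c]))

label = classify_window([-5.5, -6.2, -5.9, -6.4])
```

    A Bayesian-decision-theoretic version would weight the distances by class priors and measurement variance rather than using the raw absolute difference.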

  5. Drag and drop display & builder

    SciTech Connect

    Bolshakov, Timofei B.; Petrov, Andrey D.; /Fermilab

    2007-12-01

    The Drag and Drop (DnD) Display & Builder is a component-oriented system that allows users to create visual representations of data received from data acquisition systems. It is an upgrade of a Synoptic Display mechanism used at Fermilab since 2002. Components can be graphically arranged and logically interconnected in the web-startable Project Builder. Projects can be either lightweight AJAX- and SVG-based web pages, or they can be started as Java applications. The new version was initiated as a response to discussions between the LHC Controls Group and Fermilab.

  6. Builders Challenge High Performance Builder Spotlight - Artistic Homes, Albuquerque, NM

    SciTech Connect

    2009-01-01

    Building America Builders Challenge fact sheet on Artistic Homes of Albuquerque, New Mexico. Describes the first true zero E-scale home in a hot-dry climate with ducts inside, R-50 attic insulation, roof-mounted photovoltaic power system, and solar thermal water heating.

  7. Comparison of Event Detection Methods for Centralized Sensor Networks

    NASA Technical Reports Server (NTRS)

    Sauvageon, Julien; Agogiono, Alice M.; Farhang, Ali; Tumer, Irem Y.

    2006-01-01

    The development of Integrated Vehicle Health Management (IVHM) for space vehicles has become a great concern. Smart sensor networks are one of the promising technologies attracting a lot of attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared regarding their ability to locate and evaluate the event under noise and sensor failures. The purpose of this study is to check whether the performance/computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is favorable compared to simpler methods.

  8. Communication: Analysing kinetic transition networks for rare events

    NASA Astrophysics Data System (ADS)

    Stevenson, Jacob D.; Wales, David J.

    2014-07-01

    The graph transformation approach is a recently proposed method for computing mean first passage times, rates, and committor probabilities for kinetic transition networks. Here we compare the performance to existing linear algebra methods, focusing on large, sparse networks. We show that graph transformation provides a much more robust framework, succeeding when numerical precision issues cause the other methods to fail completely. These are precisely the situations that correspond to rare event dynamics for which the graph transformation was introduced.

  9. Communication: Analysing kinetic transition networks for rare events.

    PubMed

    Stevenson, Jacob D; Wales, David J

    2014-07-28

    The graph transformation approach is a recently proposed method for computing mean first passage times, rates, and committor probabilities for kinetic transition networks. Here we compare the performance to existing linear algebra methods, focusing on large, sparse networks. We show that graph transformation provides a much more robust framework, succeeding when numerical precision issues cause the other methods to fail completely. These are precisely the situations that correspond to rare event dynamics for which the graph transformation was introduced. PMID:25084870
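    The graph transformation approach compared in these two records can be sketched concretely for a small discrete-time Markov chain: intermediate nodes are eliminated one at a time, renormalizing the branching probabilities and waiting times, until only the source and the absorbing target remain. This is a minimal sketch of the standard elimination rules, not the authors' production code.

```python
# Sketch of graph transformation for mean first passage times (MFPT):
# eliminate node x via P'_ij = P_ij + P_ix * P_xj / (1 - P_xx) and
# tau'_i = tau_i + P_ix * tau_x / (1 - P_xx).

def mfpt_graph_transform(P, tau, source, target):
    """P[i][j]: transition probability; tau[i]: waiting time at node i."""
    P = {i: dict(row) for i, row in P.items()}   # work on copies
    tau = dict(tau)
    for x in [n for n in P if n not in (source, target)]:
        loop = P[x].pop(x, 0.0)
        scale = 1.0 / (1.0 - loop)
        for i in list(P):                        # redirect flow around x
            if i == x or x not in P[i]:
                continue
            pix = P[i].pop(x)
            tau[i] += pix * scale * tau[x]
            for j, pxj in P[x].items():
                P[i][j] = P[i].get(j, 0.0) + pix * scale * pxj
        del P[x]
    loop = P[source].pop(source, 0.0)            # remaining self-loop
    return tau[source] / (1.0 - loop)            # mean escape time to target

# Three-state chain 0 -> 1 -> 2 with back-stepping; unit waiting times.
P = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.3, 1: 0.2, 2: 0.5}, 2: {}}
tau = {0: 1.0, 1: 1.0}
t_gt = mfpt_graph_transform(P, tau, source=0, target=2)
```

    For this chain, solving the linear system T0 = 1 + 0.5 T0 + 0.5 T1, T1 = 1 + 0.3 T0 + 0.2 T1 gives T0 = 5.2, matching the transformed result; the point made in the paper is that the elimination route stays numerically robust on large, sparse networks where direct linear algebra loses precision.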

  10. Modular Modeling System Model Builder

    SciTech Connect

    McKim, C.S.; Matthews, M.T.

    1996-12-31

    The latest release of the Modular Modeling System (MMS) Model Builder adds still more time-saving features to an already powerful MMS dynamic-simulation tool set. The Model Builder takes advantage of 32-bit architecture within the Microsoft Windows 95/NT{trademark} Operating Systems to better integrate a mature library of power-plant components. In addition, the MMS Library of components can now be modified and extended with a new tool named MMS CompGen{trademark}. The MMS Model Builder allows the user to quickly build a graphical schematic representation for a plant by selecting from a library of predefined power plant components to dynamically simulate their operation. In addition, each component has a calculation subroutine stored in a dynamic-link library (DLL), which facilitates the determination of a steady-state condition and performance of routine calculations for the component. These calculations, termed auto-parameterization, help avoid repetitive and often tedious hand calculations for model initialization. In striving to meet the needs for large models and increase user productivity, the MMS Model Builder has been completely revamped to make power plant model creation and maintainability easier and more efficient.

  11. Automatic classification of seismic events within a regional seismograph network

    NASA Astrophysics Data System (ADS)

    Tiira, Timo; Kortström, Jari; Uski, Marja

    2015-04-01

    A fully automatic method for seismic event classification within a sparse regional seismograph network is presented. The tool is based on a supervised pattern recognition technique, the Support Vector Machine (SVM), trained here to distinguish weak local earthquakes from a bulk of human-made or spurious seismic events. The classification rules rely on differences in signal energy distribution between natural and artificial seismic sources. Seismic records are divided into four windows: P, P coda, S, and S coda. For each signal window, the short-term average (STA) is computed in 20 narrow frequency bands between 1 and 41 Hz. The 80 discrimination parameters are used as training data for the SVM. The SVM models are calculated for 19 on-line seismic stations in Finland. The event data are compiled mainly from fully automatic event solutions that are manually classified after the automatic location process. The station-specific SVM training events include 11-302 positive (earthquake) and 227-1048 negative (non-earthquake) examples. The best voting rules for combining results from different stations are determined during an independent testing period. Finally, the network processing rules are applied to an independent evaluation period comprising 4681 fully automatic event determinations, of which 98% have been manually identified as explosions or noise and 2% as earthquakes. The SVM method correctly identifies 94% of the non-earthquakes and all of the earthquakes. The results imply that the SVM tool can identify and filter out blasts and spurious events from fully automatic event solutions with a high level of confidence. The tool helps to reduce the workload in manual seismic analysis by leaving only ~5% of the automatic event determinations, i.e. the probable earthquakes, for more detailed seismological analysis. The approach presented is easy to adjust to the requirements of a denser or wider high-frequency network, once enough training examples for building a station-specific data set are available.
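    The STA-style energy features underlying this kind of discrimination can be sketched with the classic STA/LTA ratio: a short-term average of signal power compared to a long-term average. The window lengths and synthetic trace below are arbitrary example values, not the paper's configuration.

```python
# Sketch of an STA/LTA energy feature: short-term over long-term average of
# the squared signal; window lengths are illustrative.
import numpy as np

def sta_lta(signal, n_sta, n_lta):
    power = np.asarray(signal, float) ** 2
    sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="valid")
    lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="valid")
    # Align so each STA window is compared with the LTA window ending at the
    # same sample; guard against division by zero.
    m = min(len(sta), len(lta))
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, 400)
trace[300:320] += 8.0                 # impulsive "event" onset in the noise
ratio = sta_lta(trace, n_sta=10, n_lta=100)
```

    In the paper's setup such averages are taken per window (P, P coda, S, S coda) and per narrow frequency band, and the resulting 80 parameters feed the SVM rather than a simple threshold.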

  12. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    PubMed Central

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394

  13. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-01-01

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady. PMID:26694394

  14. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks.

    PubMed

    Zhou, Zhangbing; Xing, Riliang; Duan, Yucong; Zhu, Yueqin; Xiang, Jianming

    2015-12-15

    With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified upon the edges reflects the extent of sensory data deviating from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.
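    The barycenter step of the technique described above can be sketched with a small weighted graph: the event source is the vertex minimizing the sum of shortest-path distances to all other vertices of the coverage graph. The graph below is an invented example, and the shortest-path routine is plain Dijkstra.

```python
# Sketch of event-source determination: build a weighted coverage graph of
# deviating sensor nodes and return its barycenter. Stdlib only.
import heapq

def shortest_paths(graph, start):
    """Dijkstra distances from `start`; graph[u] = [(v, weight), ...]."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def barycenter(graph):
    """Vertex minimizing the total shortest-path distance to all others."""
    return min(graph, key=lambda u: sum(shortest_paths(graph, u).values()))

# Invented coverage graph; node C sits in the middle of the deviating region.
graph = {
    "A": [("B", 1.0), ("C", 1.0)],
    "B": [("A", 1.0), ("C", 1.0)],
    "C": [("A", 1.0), ("B", 1.0), ("D", 1.0)],
    "D": [("C", 1.0)],
}
source = barycenter(graph)
```

    In the paper, the edge weights additionally encode how far each node's sensory data deviates from the normal range, so the barycenter is pulled toward the strongest deviations.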

  15. Automatic event detection based on artificial neural networks

    NASA Astrophysics Data System (ADS)

    Doubravová, Jana; Wiszniowski, Jan; Horálek, Josef

    2015-04-01

    The proposed algorithm was developed for Webnet, a local seismic network in West Bohemia built to monitor the West Bohemia/Vogtland swarm area. During earthquake swarms there is a large number of events which must be evaluated automatically to get a quick estimate of the current earthquake activity. Our focus is to get good automatic results prior to precise manual processing. With automatic data processing we may also reach a lower completeness magnitude. The first step of automatic seismic data processing is the detection of events. To achieve good detection performance we require a low number of false detections as well as a high number of correctly detected events. We used a single-layer recurrent neural network (SLRNN) trained on manual detections from swarms in West Bohemia in past years. As inputs of the SLRNN we use the STA/LTA of a half-octave filter bank fed by the vertical and horizontal components of seismograms. All stations were trained together to obtain the same network with the same neuron weights. We tried several architectures (different numbers of neurons) and different starting points for training. Networks giving the best results on the training set need not be optimal for unknown waveforms; therefore we test each network on a test set from a different swarm (but still with similar characteristics, i.e. location, focal mechanisms, magnitude range). We also apply coincidence verification for each event: we lower the number of false detections by rejecting events declared on one station only, and declare a network-wide event by coincidence on two or more stations. In further work we would like to retrain the network for each station individually, so that each station has its own set of coefficients (neural weights). We would also like to apply this method to data from the Reykjanet network located on the Reykjanes peninsula, Iceland. As soon as we have a reliable detection, we can proceed to
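    The coincidence-verification step described above reduces to a simple rule: accept a detection as a network event only if enough distinct stations trigger within a short time window. A minimal sketch, with invented station names and times:

```python
# Sketch of coincidence verification: a trigger becomes a network event only
# when at least `min_stations` distinct stations fire within `window` seconds.

def coincidence_events(triggers, window, min_stations):
    """triggers: list of (time, station); returns accepted event onset times."""
    triggers = sorted(triggers)
    events, i = [], 0
    while i < len(triggers):
        t0 = triggers[i][0]
        j = i
        while j < len(triggers) and triggers[j][0] <= t0 + window:
            j += 1
        stations = {s for _, s in triggers[i:j]}
        if len(stations) >= min_stations:
            events.append(t0)
            i = j            # consume the whole coincidence group
        else:
            i += 1           # lone trigger: likely a false detection

    return events

triggers = [(10.0, "ST1"), (10.4, "ST2"), (10.6, "ST3"),   # true event
            (50.0, "ST1")]                                  # single station
events = coincidence_events(triggers, window=2.0, min_stations=2)
```

    Requiring two or more stations discards the single-station trigger at t=50 while keeping the genuine multi-station event at t=10.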

  16. Seismic event interpretation using fuzzy logic and neural networks

    SciTech Connect

    Maurer, W.J.; Dowla, F.U.

    1994-01-01

    In the computer interpretation of seismic data, unknown sources of seismic events must be represented and reasoned about using measurements from the recorded signal. In this report, we develop the use of fuzzy logic to improve our ability to interpret weak seismic events. Processing strategies that use fuzzy set theory to represent vagueness and uncertainty, phenomena common in seismic data analysis, are developed. A fuzzy-assumption-based truth-maintenance inference engine is also developed. Preliminary results in interpreting seismic events using the fuzzy neural network knowledge-based system are presented.

  17. A framework for network-wide semantic event correlation

    NASA Astrophysics Data System (ADS)

    Hall, Robert T.; Taylor, Joshua

    2013-05-01

    An increasing need for situational awareness within network-deployed Systems Under Test has increased the desire for frameworks that facilitate system-wide data correlation and analysis. Massive event streams, which require tedious manual analysis, are generated from heterogeneous sensors. We present a framework for sensor data integration and event correlation based on Linked Data principles, Semantic Web reasoning technology, complex event processing, and blackboard architectures. Sensor data are encoded as RDF models, then processed by complex event processing agents (which incorporate domain-specific reasoners as well as general-purpose Semantic Web reasoning techniques). Agents can publish inferences on shared blackboards and generate new semantic events that are fed back into the system. We present AIS, Inc.'s Cyber Battlefield Training and Effectiveness Environment to demonstrate use of the framework.

  18. Event-based exponential synchronization of complex networks.

    PubMed

    Zhou, Bo; Liao, Xiaofeng; Huang, Tingwen

    2016-10-01

    In this paper, we consider exponential synchronization of complex networks. The information diffusions between nodes are driven by properly defined events. By employing the M-matrix theory, algebraic graph theory and the Lyapunov method, two kinds of distributed event-triggering laws are designed, which avoid continuous communications between nodes. Then, several criteria that ensure the event-based exponential synchronization are presented, and the exponential convergence rates are obtained as well. Furthermore, we prove that Zeno behavior of the event-triggering laws is excluded before synchronization is achieved, that is, the lower bounds of inter-event times are strictly positive. Finally, a simulation example is provided to illustrate the effectiveness of theoretical analysis.
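A minimal discrete-time sketch of the event-triggered idea: nodes update toward their neighbours' last broadcast states and only rebroadcast when their own state drifts past a threshold, so communication is intermittent rather than continuous. This assumes a simple consensus-style coupling, not the paper's exact synchronization laws; the gain, threshold and graph are illustrative:

```python
def simulate(adj, x0, eps=0.05, gain=0.3, steps=200):
    """Event-triggered consensus sketch on an undirected graph.

    adj[i] lists the neighbours of node i; xhat holds each node's
    last *broadcast* value. A node triggers (rebroadcasts) only when
    its true state drifts more than eps from its broadcast value.
    """
    n = len(x0)
    x = list(x0)
    xhat = list(x0)
    events = 0
    for _ in range(steps):
        # update using broadcast states only (no continuous link use)
        x = [x[i] + gain * sum(xhat[j] - xhat[i] for j in adj[i])
             for i in range(n)]
        for i in range(n):
            if abs(x[i] - xhat[i]) > eps:   # the triggering event
                xhat[i] = x[i]
                events += 1
    return x, events
```

The event count stays well below one broadcast per node per step once the states approach agreement, which is the communication saving the event-triggering laws formalize.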

  20. Characterizing interactions in online social networks during exceptional events

    NASA Astrophysics Data System (ADS)

    Omodei, Elisa; De Domenico, Manlio; Arenas, Alex

    2015-08-01

    Nowadays, millions of people interact on a daily basis on online social media like Facebook and Twitter, where they share and discuss information about a wide variety of topics. In this paper, we focus on a specific online social network, Twitter, and we analyze multiple datasets each one consisting of individuals' online activity before, during and after an exceptional event in terms of volume of the communications registered. We consider important events that occurred in different arenas that range from policy to culture or science. For each dataset, the users' online activities are modeled by a multilayer network in which each layer conveys a different kind of interaction, specifically: retweeting, mentioning and replying. This representation allows us to unveil that these distinct types of interaction produce networks with different statistical properties, in particular concerning the degree distribution and the clustering structure. These results suggest that models of online activity cannot discard the information carried by this multilayer representation of the system, and should account for the different processes generated by the different kinds of interactions. Secondly, our analysis unveils the presence of statistical regularities among the different events, suggesting that the non-trivial topological patterns that we observe may represent universal features of the social dynamics on online social networks during exceptional events.

  1. Mining the key predictors for event outbreaks in social networks

    NASA Astrophysics Data System (ADS)

    Yi, Chengqi; Bao, Yuanyuan; Xue, Yibo

    2016-04-01

    It will be beneficial to devise a method to predict a so-called event outbreak. Existing works mainly focus on exploring effective methods for improving the accuracy of predictions, while ignoring the underlying causes: What makes an event go viral? What factors significantly influence the prediction of an event outbreak in social networks? In this paper, we proposed a novel definition for an event outbreak, taking into account the structural changes to a network during the propagation of content. In addition, we investigated features that were sensitive to predicting an event outbreak. In order to investigate the universality of these features at different stages of an event, we split the entire lifecycle of an event into 20 equal segments according to the proportion of the propagation time. We extracted 44 features, including features related to content, users, structure, and time, from each segment of the event. Based on these features, we proposed a prediction method using supervised classification algorithms to predict event outbreaks. Experimental results indicate that, as time goes by, our method is highly accurate, with a precision rate ranging from 79% to 97% and a recall rate ranging from 74% to 97%. In addition, after applying a feature-selection algorithm, the top five selected features can considerably improve the accuracy of the prediction. Data-driven experimental results show that the entropy of the eigenvector centrality, the entropy of the PageRank, the standard deviation of the betweenness centrality, the proportion of re-shares without content, and the average path length are the key predictors for an event outbreak. Our findings are especially useful for further exploring the intrinsic characteristics of outbreak prediction.
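Two of the key predictors named above are entropies of centrality distributions. A generic sketch of one of them, the entropy of the PageRank, using plain power iteration; this is not the authors' pipeline, and the damping factor and iteration count are conventional defaults:

```python
import math

def pagerank(adj, d=0.85, iters=100):
    """Power-iteration PageRank on a directed adjacency list."""
    n = len(adj)
    pr = [1.0 / n] * n
    for _ in range(iters):
        nxt = [(1.0 - d) / n] * n
        for i, outs in enumerate(adj):
            if outs:
                share = d * pr[i] / len(outs)
                for j in outs:
                    nxt[j] += share
            else:                       # dangling node: spread uniformly
                for j in range(n):
                    nxt[j] += d * pr[i] / n
        pr = nxt
    return pr

def entropy(p):
    """Shannon entropy (bits) of a normalised score distribution."""
    s = sum(p)
    return -sum((v / s) * math.log2(v / s) for v in p if v > 0)
```

Intuitively, a cascade concentrated on a few hub accounts yields low entropy, while a broadly spread cascade yields entropy close to the uniform maximum, which is why such entropies can discriminate outbreak dynamics.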

  2. Event management for large scale event-driven digital hardware spiking neural networks.

    PubMed

    Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean

    2013-09-01

    The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms.
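A software stand-in can illustrate the role the event queue plays. The sketch below uses Python's binary-heap `heapq` in place of the paper's pipelined structured heap queue, with an idealized integrate-and-fire rule; the threshold, delay and weights are illustrative assumptions:

```python
import heapq

class EventDrivenSNN:
    """Tiny event-driven spiking network driven by a heap event queue."""

    def __init__(self, n, weights, threshold=1.0, delay=1.0):
        self.v = [0.0] * n            # membrane potentials
        self.w = weights              # w[i] = list of (target, weight)
        self.threshold = threshold
        self.delay = delay
        self.queue = []               # min-heap of (time, target, weight)
        self.spikes = []              # (time, neuron) record

    def inject(self, t, neuron, weight):
        heapq.heappush(self.queue, (t, neuron, weight))

    def run(self, t_end):
        # pop events in time order; each delivery may spawn new events
        while self.queue and self.queue[0][0] <= t_end:
            t, i, w = heapq.heappop(self.queue)
            self.v[i] += w
            if self.v[i] >= self.threshold:
                self.v[i] = 0.0       # reset after spike
                self.spikes.append((t, i))
                for j, wj in self.w[i]:
                    self.inject(t + self.delay, j, wj)
        return self.spikes
```

The scaling claims in the abstract concern exactly these two operations: every delivered spike costs one pop and up to fan-out pushes, so the queue's behaviour under load dominates the simulator.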

  3. Venezuelan Dam tests builder's skills

    SciTech Connect

    Stussman, H.B.

    1994-01-03

    Given half a chance, Venezuela's Macagua II hydroelectric project would have been finished by now. Handed less than half a chance by the cash-strapped owner, the builder of this nearly $1-billion, 2,540-MW power development has pushed the project to the 80% complete mark anyway. This article summarizes the status of their project in terms of construction progress, financing, and socio-economic factors in Venezuela.

  4. Information Spread of Emergency Events: Path Searching on Social Networks

    PubMed Central

    Hu, Hongzhi; Wu, Tunan

    2014-01-01

    Emergencies have attracted the global attention of governments and the public, and can easily trigger a series of serious social problems if their dissemination is not supervised effectively. In the Internet world, people communicate with each other and form various virtual communities based on social networks, leading to a complex and fast information spread pattern for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help us to guide and control the information dissemination of emergency events for early warning. PMID:24600323

  5. Information spread of emergency events: path searching on social networks.

    PubMed

    Dai, Weihui; Hu, Hongzhi; Wu, Tunan; Dai, Yonghui

    2014-01-01

    Emergencies have attracted the global attention of governments and the public, and can easily trigger a series of serious social problems if their dissemination is not supervised effectively. In the Internet world, people communicate with each other and form various virtual communities based on social networks, leading to a complex and fast information spread pattern for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help us to guide and control the information dissemination of emergency events for early warning.
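The improved IBF algorithm itself is not specified in the abstract. As a generic illustration of key-path search on a weighted spread graph, a standard Dijkstra shortest-path sketch (the graph structure and weights below are hypothetical):

```python
import heapq

def shortest_spread_path(graph, src, dst):
    """Dijkstra's algorithm; edge weights can be read as transmission
    costs or delays between accounts in the spread network."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None, float("inf")
    path, node = [dst], dst            # walk predecessors back to src
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]
```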

  6. Supporting Proactive Application Event Notification to Improve Sensor Network Performance

    NASA Astrophysics Data System (ADS)

    Merlin, Christophe J.; Heinzelman, Wendi B.

    As wireless sensor networks gain in popularity, many deployments are posing new challenges due to their diverse topologies and resource constraints. Previous work has shown the advantage of adapting protocols based on current network conditions (e.g., link status, neighbor status), in order to provide the best service in data transport. Protocols can similarly benefit from adaptation based on current application conditions. In particular, if proactively informed of the status of active queries in the network, protocols can adjust their behavior accordingly. In this paper, we propose a novel approach to provide such proactive application event notification to all interested protocols in the stack. Specifically, we use the existing interfaces and event signaling structure provided by the X-Lisa (Cross-layer Information Sharing Architecture) protocol architecture, augmenting this architecture with a Middleware Interpreter for managing application queries and performing event notification. Using this approach, we observe gains in Quality of Service of up to 40% in packet delivery ratios and a 75% decrease in packet delivery delay for the tested scenario.
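Proactive event notification of the kind described can be sketched as a small publish/subscribe bus through which the middleware announces application events (such as a query completing) to any protocol that registered interest. This is an illustrative stand-in, not the actual X-Lisa interfaces or Middleware Interpreter:

```python
class EventBus:
    """Minimal publish/subscribe sketch of cross-layer notification."""

    def __init__(self):
        self.subscribers = {}          # event type -> list of callbacks

    def subscribe(self, event_type, callback):
        """A protocol registers interest in an application event."""
        self.subscribers.setdefault(event_type, []).append(callback)

    def publish(self, event_type, payload):
        """The middleware pushes an event to every interested protocol."""
        for cb in self.subscribers.get(event_type, []):
            cb(payload)
```

A routing layer could, for example, subscribe to a hypothetical "query_done" event and stop forwarding packets for that query immediately, rather than discovering the change reactively.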

  7. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

    Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments) which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead: a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. We show how lookahead can be computed for FCFS queueing network simulations, give performance data demonstrating the method's effectiveness under moderate to heavy loads, and discuss performance tradeoffs between the quality of lookahead and the cost of computing it.
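Why FCFS servers admit lookahead can be seen from the Lindley recursion for departure times: once the last departure is known, no later arrival can depart before a computable bound, so a processor can promise ("make an appointment with") its neighbour that no event will arrive earlier. A sketch, with the appointment interpretation in the comment being our reading of the abstract:

```python
def fcfs_departures(arrivals, services):
    """Departure times of an FCFS single-server queue.

    Lindley recursion: d_k = max(a_k, d_{k-1}) + s_k. The lookahead
    ('appointment') follows because d_{k-1} lower-bounds every future
    departure the neighbouring processor could receive.
    """
    departures, last = [], 0.0
    for a, s in zip(arrivals, services):
        last = max(a, last) + s
        departures.append(last)
    return departures
```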

  8. Event Networks and the Identification of Crime Pattern Motifs

    PubMed Central

    2015-01-01

    In this paper we demonstrate the use of network analysis to characterise patterns of clustering in spatio-temporal events. Such clustering is of both theoretical and practical importance in the study of crime, and forms the basis for a number of preventative strategies. However, existing analytical methods show only that clustering is present in data, while offering little insight into the nature of the patterns present. Here, we show how the classification of pairs of events as close in space and time can be used to define a network, thereby generalising previous approaches. The application of graph-theoretic techniques to these networks can then offer significantly deeper insight into the structure of the data than previously possible. In particular, we focus on the identification of network motifs, which have clear interpretation in terms of spatio-temporal behaviour. Statistical analysis is complicated by the nature of the underlying data, and we provide a method by which appropriate randomised graphs can be generated. Two datasets are used as case studies: maritime piracy at the global scale, and residential burglary in an urban area. In both cases, the same significant 3-vertex motif is found; this result suggests that incidents tend to occur not just in pairs, but in fact in larger groups within a restricted spatio-temporal domain. In the 4-vertex case, different motifs are found to be significant in each case, suggesting that this technique is capable of discriminating between clustering patterns at a finer granularity than previously possible. PMID:26605544
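The construction of the event network and the 3-vertex motif count can be sketched directly. The distance and time thresholds and the Euclidean metric below are illustrative assumptions, not the paper's calibrated values:

```python
from itertools import combinations

def event_network(events, d_max, t_max):
    """Link two events when they are close in both space and time.

    events: list of (x, y, t) tuples. Returns an undirected graph as
    a dict of adjacency sets, one vertex per event.
    """
    adj = {i: set() for i in range(len(events))}
    for i, j in combinations(range(len(events)), 2):
        (x1, y1, t1), (x2, y2, t2) = events[i], events[j]
        near = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 <= d_max
        if near and abs(t1 - t2) <= t_max:
            adj[i].add(j)
            adj[j].add(i)
    return adj

def count_triangles(adj):
    """Count the fully connected 3-vertex motif (triangles)."""
    return sum(1 for a, b, c in combinations(adj, 3)
               if b in adj[a] and c in adj[a] and c in adj[b])
```

A significant excess of triangles over randomised graphs is the signature the paper reports: incidents clustering in groups of three or more within a restricted spatio-temporal domain.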

  9. Builders Challenge High Performance Builder Spotlight - Centex Corporation, San Ramon, California

    SciTech Connect

    2008-01-01

    Building America/Builders Challenge fact sheet on Centex, an energy-efficient home builder in hot/mixed dry climate using advanced insulation techniques, engineered headers, and tankless water heaters.

  10. Builders Challenge High Performance Builder Spotlight: Yavapai College, Chino Valley, Arizona

    SciTech Connect

    2009-12-22

    Building America Builders Challenge fact sheet on Yavapai College of Chino Valley, Arizona. These college students built a Building America Builders Challenge house that achieved the remarkably low HERS score of -3 and achieved a tight building envelope.

  11. Builders Challenge High Performance Builder Spotlight: Ecofutures Building Inc., Boulder, Colorado

    SciTech Connect

    2009-12-22

    Building America Builders Challenge fact sheet on Ecofutures Building Inc. of Boulder, Colorado. Ecofutures’ first Builders Challenge house has been equipped with extensive energy monitoring equipment and many energy-efficient features.

  12. Builders Challenge High Performance Builder Spotlight - Community Development Corporation of Utah

    SciTech Connect

    2008-01-01

    Building America/Builders Challenge fact sheet on Community Development Corp, an energy-efficient home builder in cold climate using advanced framing and compact duct design. Evaluates impacts to cost.

  13. Builders Challenge High Performance Builder Spotlight - Martha Rose Construction, Inc., Seattle, Washington

    SciTech Connect

    2008-01-01

    Building America/Builders Challenge fact sheet on Martha Rose Construction, an energy-efficient home builder in marine climate using the German Passiv Haus design, improved insulation, and solar photovoltaics.

  14. Convincing the home builder to build solar homes: Evaluation of passive solar workshop for builders

    NASA Astrophysics Data System (ADS)

    Klein, S.

    1981-09-01

    In 1979-80, a pilot series of workshops for home builders was offered throughout the United States. The primary objectives of the project were to educate the home builder in passive solar design techniques, to promote favorable attitudes toward solar concepts, and to encourage builders to use solar design in residential construction. Builders and designers were targeted because they are the decision makers in the progress of commercializing solar design in new construction. The need is outlined for continued and expanded programs for builders to facilitate usage of solar design in the residential sector, based on information provided in nearly 1100 pre- and post-training forms returned.

  15. Event-driven approach of layered multicast to network adaptation in RED-based IP networks

    NASA Astrophysics Data System (ADS)

    Nahm, Kitae; Li, Qing; Kuo, C.-C. J.

    2003-11-01

    In this work, we investigate the congestion control problem for layered video multicast in IP networks with active queue management (AQM), using a simple random early detection (RED) queue model. AQM support from networks improves the visual quality of video streaming but makes network adaptation more difficult for existing layered video multicast protocols that use the event-driven timer-based approach. We perform a simplified analysis of the response of the RED algorithm to burst traffic. The analysis shows that the primary problem lies in the weak correlation between the network feedback and the actual network congestion status when the RED queue is driven by burst traffic. Finally, a design guideline for the layered multicast protocol is proposed to overcome this problem.
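The burst-traffic behaviour the analysis targets is visible in the two standard RED equations: the exponentially weighted average of the queue length and the piecewise-linear drop probability derived from it. The parameter values below are conventional examples, not the paper's settings:

```python
def red_average(samples, w=0.002, avg0=0.0):
    """EWMA of the instantaneous queue length used by RED.

    The small weight w makes RED deliberately sluggish, so the average
    (and hence the congestion feedback) lags the real queue during a
    burst -- the weak correlation discussed in the abstract.
    """
    avg, out = avg0, []
    for q in samples:
        avg = (1.0 - w) * avg + w * q
        out.append(avg)
    return out

def red_drop_prob(avg, min_th=5.0, max_th=15.0, max_p=0.1):
    """RED marking/drop probability as a function of the average."""
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)
```

With w = 0.002, even 20 consecutive samples of a queue 100 packets deep leave the average below a min_th of 5, so a short burst produces no feedback at all.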

  16. Rome: sinkhole events and network of underground cavities (Italy)

    NASA Astrophysics Data System (ADS)

    Nisio, Stefania; Ciotoli, Giancarlo

    2016-04-01

    The anthropogenic sinkholes in the city of Rome are closely linked to the network of underground cavities produced by human activities over more than two thousand years of history. Over the past fifteen years, the increased frequency of intense rainfall events has favored sinkhole formation. Assessing the risk induced by anthropogenic sinkholes is very difficult; however, the susceptibility of the territory to sinkholes can be determined more easily, as the probability that an event occurs in a given space, with its particular geological-morphological characteristics, over an unbounded time. A sinkhole susceptibility map of the Rome territory, up to the ring road, has been constructed using Geographically Weighted Regression and geostatistics. The spatial regression model includes more than 2700 anthropogenic sinkholes (recorded from 1875 to 2015), as well as geological, morphological, hydrological and predisposing anthropogenic characteristics of the study area. The numerous available data (underground cavities, ancient quarry entrances, bunkers, etc.) facilitated the creation of a series of maps. The density map of the cavities, updated to 2015, shows that more than 20 km2 of the Roman territory is affected by underground cavities, and the census of sinkholes (over 2700) shows that over 30 km2 has been affected by sinkholes. The final susceptibility map highlights that inside the Ring Road about 40 km2 of the territory (about 11%) has a very high probability of triggering a sinkhole event. The susceptibility map was also compared with ground subsidence data (InSAR) to obtain a predictive model.

  17. Management of a Complex Open Channel Network During Flood Events

    NASA Astrophysics Data System (ADS)

    Franchini, M.; Valiani, A.; Schippa, L.; Mascellani, G.

    2003-04-01

    Most of the area around Ferrara (Italy) is below mean sea level, and an extensive drainage system combined with several pump stations allows the use of this area for urban development as well as industrial and agricultural activities. The three main channels of this hydraulic system constitute the Ferrara Inland Waterway (total length approximately 70 km), which connects the Po river near Ferrara to the sea. Because of the level difference between the upstream and downstream ends of the waterway, three locks are located along it, each combined with a set of gates to control the water levels. During rainfall events, most of the water of the basin flows into the waterway, and heavy precipitation sometimes causes flooding in several areas, owing to the insufficient dimensions of the channel network and inadequate manual operation of the gates. This study presents a hydrological-hydraulic model for the entire Ferrara basin and a system of rules for operating the gates. In particular, their opening is designed to be regulated in real time by monitoring the water level at several sections along the channels. Besides flood peak attenuation, this operation strategy also helps maintain a constant water level for irrigation and fluvial navigation during dry periods. With reference to the flood event of May 1996, it is shown that this floodgate operation policy, unlike the one actually adopted during that event, would lead to a significant flood peak attenuation, avoiding flooding in the area upstream of Ferrara.

  18. The CMS Remote Analysis Builder (CRAB)

    SciTech Connect

    Spiga, D.; Cinquilli, M.; Servoli, L.; Lacaprara, S.; Fanzago, F.; Dorigo, A.; Merlo, M.; Farina, F.; Fanfani, A.; Codispoti, G.; Bacchi, W.; /INFN, Bologna /Bologna U /CERN /INFN, CNAF /INFN, Trieste /Fermilab

    2008-01-22

    The CMS experiment will produce several PBytes of data every year, to be distributed over many computing centers geographically distributed in different countries. Analysis of this data will also be performed in a distributed way, using grid infrastructure. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that allows transparent access to distributed data for end-user physicists. Very limited knowledge of the underlying technicalities is required of the user. CRAB interacts with the local user environment, the CMS Data Management services and the Grid middleware; it is able to use WLCG, gLite and OSG middleware. CRAB has been in production and in routine use by end-users since Spring 2004. It has been extensively used in studies to prepare the Physics Technical Design Report (PTDR) and in the analysis of reconstructed event samples generated during the Computing Software and Analysis Challenge (CSA06), which involved generating thousands of jobs per day at peak rates. In this paper we discuss the current implementation of CRAB, the experience of using it in production, and the plans to improve it in the immediate future.

  19. Builders Challenge High Performance Builder Spotlight: David Weekley Homes, Houston, Texas

    SciTech Connect

    2009-12-22

    Building America Builders Challenge fact sheet on David Weekley Homes of Houston, Texas. The builder plans homes as a "system," with features such as wood-framed walls that are air-sealed then insulated with R-13 unfaced fiberglass batts plus an external covering of R-2 polyisocyanurate rigid foam sheathing.

  20. Builders Challenge High Performance Builder Spotlight - Masco Environments for Living, Las Vegas, Nevada

    SciTech Connect

    2009-01-01

    Building America Builders Challenge fact sheet on Masco’s Environments for Living Certified Green demo home at the 2009 International Builders Show in Las Vegas. The home has a Home Energy Rating System (HERS) index score of 44 and a right-sized air conditioning system.

  1. Building America Best Practices Series Volume 8: Builders Challenge Quality Criteria Support Document

    SciTech Connect

    Baechler, Michael C.; Bartlett, Rosemarie; Gilbride, Theresa L.

    2010-11-01

    The U.S. Department of Energy (DOE) has posed a challenge to the homebuilding industry—to build 220,000 high-performance homes by 2012. Through the Builders Challenge, participating homebuilders will have an easy way to differentiate their best energy-performing homes from other products in the marketplace, and to make the benefits clear to buyers. This document was prepared by Pacific Northwest National Laboratory for DOE to provide guidance to U.S. home builders who want to accept the challenge. To qualify for the Builders Challenge, a home must score 70 or less on the EnergySmart Home Scale (E-Scale). The E-Scale is based on the well-established Home Energy Rating System (HERS) index, developed by the Residential Energy Services Network (RESNET). The E-Scale allows homebuyers to understand – at a glance – how the energy performance of a particular home compares with the performance of others. To learn more about the index and HERS Raters, visit www.natresnet.org. Homes also must meet the Builders Challenge criteria described in this document. To help builders meet the Challenge, guidance is provided in this report for each of the 29 criteria. Included with the guidance for each criterion are resources for more information and references for relevant codes and standards. The Builders Challenge Quality Criteria were originally published in Dec. 2008. They were revised and published as PNNL-18009 Rev 1.2 in Nov. 2009. This is version 1.3, published Nov. 2010. Changes from the Nov. 2009 version include adding a title page and updating the Energy Star windows criteria to the Version 5.0 criteria approved April 2009 and effective January 4, 2010. This document and other information about the Builders Challenge is available online at www.buildingamerica.gov/challenge.

  2. Interactive effects of elevation, species richness and extreme climatic events on plant-pollinator networks.

    PubMed

    Hoiss, Bernhard; Krauss, Jochen; Steffan-Dewenter, Ingolf

    2015-11-01

    Plant-pollinator interactions are essential for the functioning of terrestrial ecosystems, but are increasingly affected by global change. The risks to such mutualistic interactions from increasing temperature and more frequent extreme climatic events such as drought or advanced snow melt are assumed to depend on network specialization, species richness, local climate and associated parameters such as the amplitude of extreme events. Even though elevational gradients provide valuable model systems for climate change and are accompanied by changes in species richness, responses of plant-pollinator networks to climatic extreme events under different environmental and biotic conditions are currently unknown. Here, we show that elevational climatic gradients, species richness and experimentally simulated extreme events interactively change the structure of mutualistic networks in alpine grasslands. We found that the degree of specialization in plant-pollinator networks (H2') decreased with elevation. Nonetheless, network specialization increased after advanced snow melt at high elevations, whereas changes in network specialization after drought were most pronounced at sites with low species richness. Thus, changes in network specialization after extreme climatic events depended on climatic context and were buffered by high species richness. In our experiment, only generalized plant-pollinator networks changed in their degree of specialization after climatic extreme events. This indicates that contrary to our assumptions, network generalization may not always foster stability of mutualistic interaction networks.

  3. Marketing Career Speed Networking: A Classroom Event to Foster Career Awareness

    ERIC Educational Resources Information Center

    Buff, Cheryl L.; O'Connor, Suzanne

    2012-01-01

    This paper describes the design, implementation, and evaluation of a marketing career speed networking event held during class time in two sections of the consumer behavior class. The event was coordinated through a partnering effort with marketing faculty and the college's Career Center. A total of 57 students participated in the event, providing…

  4. Impact assessment of extreme storm events using a Bayesian network

    USGS Publications Warehouse

    den Heijer, C.(Kees); Knipping, Dirk T.J.A.; Plant, Nathaniel G.; van Thiel de Vries, Jaap S. M.; Baart, Fedor; van Gelder, Pieter H. A. J. M.

    2012-01-01

    This paper describes an investigation on the usefulness of Bayesian Networks in the safety assessment of dune coasts. A network has been created that predicts the erosion volume based on hydraulic boundary conditions and a number of cross-shore profile indicators. Field measurement data along a large part of the Dutch coast has been used to train the network. Corresponding storm impact on the dunes was calculated with an empirical dune erosion model named duros+. Comparison between the Bayesian Network predictions and the original duros+ results, here considered as observations, results in a skill up to 0.88, provided that the training data covers the range of predictions. Hence, the predictions from a deterministic model (duros+) can be captured in a probabilistic model (Bayesian Network) such that both the process knowledge and uncertainties can be included in impact and vulnerability assessments.
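A common way to obtain a skill value like the 0.88 quoted above is an MSE-based skill score against a reference prediction. This generic sketch is an assumption about the metric, not the paper's exact definition:

```python
def skill(pred, obs, baseline=None):
    """MSE-based skill score: 1 is perfect, 0 is no better than the
    baseline (the mean of the observations by default), negative is
    worse than the baseline."""
    if baseline is None:
        m = sum(obs) / len(obs)
        baseline = [m] * len(obs)
    mse = sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)
    ref = sum((b - o) ** 2 for b, o in zip(baseline, obs)) / len(obs)
    return 1.0 - mse / ref
```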

  5. High Performance Builder Spotlight: Treasure Homes Inc.

    SciTech Connect

    2011-01-01

    Treasure Homes, Inc., achieved a HERS rating of 46 without PV on its prototype “Gem” home, located on the shores of Lake Michigan in northern Indiana, thanks in part to training received from a Building America partner, the National Association of Home Builders Research Center.

  6. High Performance Builder Spotlight: Cobblestone Homes

    SciTech Connect

    2011-01-01

    Cobblestone Homes of Freeland, MI's quest to understand building science led to the construction in 2010 of the "Vision Zero Project," a demonstration home that has earned a DOE Builders Challenge certification and achieved a HERS index of -4 with photovoltaics and 37 without PV.

  7. JOB BUILDER remote batch processing subsystem

    NASA Technical Reports Server (NTRS)

    Orlov, I. G.; Orlova, T. L.

    1980-01-01

    The functions of the JOB BUILDER remote batch processing subsystem are described. Instructions are given for using it as a component of a display system developed by personnel of the System Programming Laboratory, Institute of Space Research, USSR Academy of Sciences.

  8. High Performance Builder Spotlight: Imagine Homes

    SciTech Connect

    2011-01-01

    Imagine Homes, working with the DOE's Building America research team member IBACOS, has developed a system that can be replicated by other contractors to build affordable, high-performance homes. Imagine Homes has used the system to produce more than 70 Builders Challenge-certified homes per year in San Antonio over the past five years.

  9. [Informatics, networks and technological current events in cancerology].

    PubMed

    Labrèze, Laurent

    2003-05-01

    Despite a gloomy economic situation, new technologies passed a significant milestone in 2001 and 2002: deployment. A great number of networks were set up or are about to be, and the integration and use of these technologies now go beyond mere experimentation. A press review of the principal notable developments of 2002 in telemedicine and oncology is presented, together with a survey of the networks and of the tools for training, communication and healthcare.

  10. Social network changes and life events across the life span: a meta-analysis.

    PubMed

    Wrzus, Cornelia; Hänel, Martha; Wagner, Jenny; Neyer, Franz J

    2013-01-01

    For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-analysis on age-related social network changes and the effects of life events on social networks using 277 studies with 177,635 participants from adolescence to old age. Cross-sectional as well as longitudinal studies consistently showed that (a) the global social network increased up until young adulthood and then decreased steadily, (b) both the personal network and the friendship network decreased throughout adulthood, (c) the family network was stable in size from adolescence to old age, and (d) other networks with coworkers or neighbors were important only in specific age ranges. Studies focusing on life events that occur at specific ages, such as transition to parenthood, job entry, or widowhood, demonstrated network changes similar to such age-related network changes. Moderator analyses detected that the type of network assessment affected the reported size of global, personal, and family networks. Period effects on network sizes occurred for personal and friendship networks, which have decreased in size over the last 35 years. Together the findings are consistent with the view that a portion of normative, age-related social network changes are due to normative, age-related life events. We discuss how these patterns of normative social network development inform research in social, evolutionary, cultural, and personality psychology.

  11. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-12-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.

  12. Forecasting ENSO events: A neural network-extended EOF approach

    SciTech Connect

    Tangang, F.T.; Tang, B.; Monahan, A.H.; Hsieh, W.W.

    1998-01-01

    The authors constructed neural network models to forecast the sea surface temperature anomalies (SSTA) for three regions: Nino 4, Nino 3.5, and Nino 3, representing the western-central, the central, and the eastern-central parts of the equatorial Pacific Ocean, respectively. The inputs were the extended empirical orthogonal functions (EEOF) of the sea level pressure (SLP) field that covered the tropical Indian and Pacific Oceans and evolved for a duration of 1 yr. The EEOFs greatly reduced the size of the neural networks from those of the authors' earlier papers using EOFs. The Nino 4 region appeared to be the best-forecast region, with useful skill up to a year lead time for the 1982-93 forecast period. By network pruning analysis and spectral analysis, four important inputs were identified: modes 1, 2, and 6 of the SLP EEOFs and the SSTA persistence. Mode 1 characterized the low-frequency oscillation (LFO, with a 4-5-yr period) and was seen as the typical ENSO signal, while mode 2, with a period of 2-5 yr, characterized the quasi-biennial oscillation (QBO) plus the LFO. Mode 6 was dominated by decadal and interdecadal variations. Thus, forecasting ENSO required information from the QBO and the decadal-interdecadal oscillations. The nonlinearity of the networks tended to increase with lead time and to become stronger for the eastern regions of the equatorial Pacific Ocean. 35 refs., 14 figs., 4 tabs.
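
The regression described above (a handful of EEOF-mode predictors feeding a small feedforward network that outputs an SSTA forecast) can be sketched generically. This is a minimal illustration of the architecture class, not the authors' model; the layer sizes, weights, and input values are all invented for the example.

```python
import math
import random

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """One forward pass of a single-hidden-layer network with tanh units,
    the generic architecture used for this kind of SSTA regression."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(w, x)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    return sum(wo * h for wo, h in zip(w_out, hidden)) + b_out

random.seed(0)
n_inputs, n_hidden = 4, 3   # e.g. SLP EEOF modes 1, 2, 6 plus SSTA persistence
w_hidden = [[random.gauss(0, 0.5) for _ in range(n_inputs)]
            for _ in range(n_hidden)]
b_hidden = [0.0] * n_hidden
w_out = [random.gauss(0, 0.5) for _ in range(n_hidden)]

x = [0.8, -0.2, 0.1, 0.5]   # hypothetical standardized predictor values
print(forward(x, w_hidden, b_hidden, w_out, 0.0))
```

In the paper the weights are of course fitted to the 1982-93 record rather than drawn at random; the point here is only the small input dimension that the EEOF compression makes possible.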

  13. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    SciTech Connect

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present the application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.
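
The discrete-event paradigm the abstract refers to reduces to a priority queue of timestamped events processed in time order. As a minimal sketch (a toy FIFO queueing model, not the OMNEST simulation itself; the arrival times and service time are invented):

```python
import heapq

def simulate(arrivals, service_time):
    """Minimal discrete-event loop: a single switch output port serving
    fixed-length cells in FIFO order. Returns each cell's departure time."""
    events = [(t, i) for i, t in enumerate(arrivals)]  # (arrival time, cell id)
    heapq.heapify(events)
    free_at = 0.0      # time at which the output port next becomes idle
    departures = {}
    while events:
        t, cell = heapq.heappop(events)
        start = max(t, free_at)          # wait if the port is busy
        free_at = start + service_time
        departures[cell] = free_at
    return departures

print(simulate([0.0, 0.1, 0.15, 1.0], service_time=0.2))
```

Real DES tool chains such as OMNEST layer modules, channels, and statistics collection on top of exactly this kind of event loop.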

  14. Cough event classification by pretrained deep neural network

    PubMed Central

    2015-01-01

    Background Cough is an essential symptom in respiratory diseases. In the measurement of cough severity, an accurate and objective cough monitor is expected by the respiratory disease society. This paper aims to introduce a better-performing algorithm, the pretrained deep neural network (DNN), to the cough classification problem, which is a key step in the cough monitor. Method The deep neural network models are built in two steps, pretraining and fine-tuning, followed by a Hidden Markov Model (HMM) decoder to capture temporal information of the audio signals. By unsupervised pretraining of a deep belief network, a good initialization for a deep neural network is learned. The fine-tuning step is then a back-propagation pass tuning the neural network so that it can predict the observation probability associated with each HMM state, where the HMM states are originally obtained by forced alignment with a Gaussian Mixture Model Hidden Markov Model (GMM-HMM) on the training samples. Three cough HMMs and one noncough HMM are employed to model coughs and noncoughs respectively. The final decision is made based on the Viterbi decoding algorithm, which generates the most likely HMM sequence for each sample. A sample is labeled as cough if a cough HMM is found in the sequence. Results The experiments were conducted on a dataset collected from 22 patients with respiratory diseases. Patient-dependent (PD) and patient-independent (PI) experimental settings were used to evaluate the models. Five criteria, sensitivity, specificity, F1, macro average and micro average, are shown to depict different aspects of the models. On the overall evaluation criteria, the DNN-based methods are superior to the traditional GMM-HMM based method on F1 and micro average, with maximal 14% and 11% error reduction in PD and 7% and 10% in PI, while keeping similar performance on macro average. They also surpass the GMM-HMM model on specificity with maximal 14% error reduction on both PD and PI. Conclusions In this paper, we
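
The Viterbi decoding step the method relies on can be written in a few lines. This is a generic textbook implementation in log space, not the paper's code; the two-state example (0 = noncough, 1 = cough) and all probabilities are invented, standing in for the DNN's per-frame observation scores.

```python
def viterbi(obs_logprob, trans_logprob, init_logprob):
    """Viterbi decoding: most likely state sequence given per-frame
    log-probabilities (in the paper, the DNN supplies obs_logprob[t][s])."""
    n_states = len(init_logprob)
    score = [init_logprob[s] + obs_logprob[0][s] for s in range(n_states)]
    back = []
    for t in range(1, len(obs_logprob)):
        new_score, ptr = [], []
        for s in range(n_states):
            best_prev = max(range(n_states),
                            key=lambda p: score[p] + trans_logprob[p][s])
            ptr.append(best_prev)
            new_score.append(score[best_prev] + trans_logprob[best_prev][s]
                             + obs_logprob[t][s])
        score, back = new_score, back + [ptr]
    # backtrack from the best final state
    state = max(range(n_states), key=lambda s: score[s])
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return path[::-1]

obs = [[-0.1, -2.0], [-2.0, -0.1], [-2.0, -0.1]]   # log P(frame | state)
trans = [[-0.3, -1.5], [-1.5, -0.3]]               # log transition matrix
init = [-0.2, -1.8]                                # log initial distribution
print(viterbi(obs, trans, init))                   # -> [0, 1, 1]
```

In the paper's setup the decoded sequence would then be scanned for any cough-HMM state to label the sample.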

  15. WebDB Component Builder - Lessons Learned

    SciTech Connect

    Macedo, C.

    2000-02-15

    Oracle WebDB is the easiest way to produce web-enabled, lightweight, and enterprise-centric applications. This concept from Oracle has tantalized our taste for simple web development by using a purely web-based tool that lives nowhere else but in the database. The online wizards, templates, and query builders, which produce PL/SQL behind the curtains, can be used straight ''out of the box'' by both novice and seasoned developers. This presentation will introduce lessons learned from developing and deploying applications built using the WebDB Component Builder in conjunction with custom PL/SQL code to empower a hybrid application. There are two kinds of WebDB components: those that display data to end users via reporting, and those that let end users update data in the database via entry forms. The presentation will also discuss various methods within the Component Builder to enhance the applications pushed to the desktop. The demonstrated example is an application entitled HOME (Helping Other's More Effectively) that was built to manage a yearly United Way Campaign effort. Our task was to build an end-to-end application that could manage approximately 900 non-profit agencies, an average of 4,100 individual contributions, and $1.2 million. Using WebDB, the shell of the application was put together in a matter of a few weeks. However, we did encounter some hurdles that WebDB, in its infancy (v2.0), could not solve for us directly. Together with custom PL/SQL, WebDB's Component Builder became a powerful tool that enabled us to produce a very flexible hybrid application.

  16. XAL Application Framework and Bricks GUI Builder

    SciTech Connect

    Pelaia II, Tom

    2007-01-01

    The XAL [1] Application Framework is a framework for rapidly developing document based Java applications with a common look and feel along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC) compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.

  17. Event Building in Future Daq Architectures Using ATM Networks

    NASA Astrophysics Data System (ADS)

    Doughty, David C.; Game, David; Holt, Stephanie; Mitchell, Lisa; Banta, Paul; Heyes, Graham; Putnam, Theodore; Watson, W. A.

    ATM switches and links have been investigated for use as the event building fabric for future data acquisition architectures and will be used in CEBAF's CLAS detector. To avoid contention problems and cell-loss, a linked dual token passing algorithm has been devised, with two different types of tokens being passed through the switch. This algorithm leads to a `barrel shifter' type of parallel data transfer. We describe the hardware architecture and the dual token algorithm, and present simulation and test results.
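
The `barrel shifter' transfer pattern mentioned above has a simple structure: in time slot t, source i sends its fragment toward destination (i + t) mod N, so every switch output receives exactly one transfer per slot and no two cells contend for the same port. A minimal sketch of that schedule (an illustration of the pattern, not the CLAS token-passing implementation):

```python
def barrel_shifter_schedule(n_sources):
    """Return, for each time slot, the destination assigned to each source.
    Row t is the permutation used in slot t: source i -> (i + t) mod N."""
    return [[(i + t) % n_sources for i in range(n_sources)]
            for t in range(n_sources)]

for slot, dests in enumerate(barrel_shifter_schedule(4)):
    # each row is a permutation: no two sources share a destination,
    # which is what prevents output contention and cell loss in the switch
    assert len(set(dests)) == len(dests)
    print(slot, dests)
```

After N slots every source has visited every destination once, which is why a full round delivers one complete event fragment set to each event builder.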

  18. Event-triggered H∞ filter design for delayed neural network with quantization.

    PubMed

    Liu, Jinliang; Tang, Jia; Fei, Shumin

    2016-10-01

    This paper is concerned with H∞ filter design for a class of neural network systems with an event-triggered communication scheme and quantization. Firstly, a new event-triggered communication scheme is introduced to determine whether or not the current sampled sensor data should be broadcast and transmitted to the quantizer, which can save the limited communication resource. Secondly, a logarithmic quantizer is used to quantize the sampled data, which can reduce the data transmission rate in the network. Thirdly, considering the influence of the constrained network resource, we investigate the problem of H∞ filter design for a class of event-triggered neural network systems with quantization. By using Lyapunov functional and linear matrix inequality (LMI) techniques, some delay-dependent stability conditions for the existence of the desired filter are obtained. Furthermore, the explicit expression is given for the designed filter parameters in terms of LMIs. Finally, a numerical example is given to show the usefulness of the obtained theoretical results.
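
Event-triggered schemes of this family typically transmit a new sample only when it deviates sufficiently from the last transmitted one. The sketch below uses the common relative-error rule (send x_k when ||x_k - x_last||² > σ||x_k||²) as an illustrative stand-in for the paper's scheme; the signal values and threshold σ are invented.

```python
def event_triggered(samples, sigma=0.1):
    """Relative-error event trigger: a sample is transmitted only when its
    squared deviation from the last transmitted sample exceeds
    sigma * (squared norm of the current sample)."""
    sent = [samples[0]]               # the first sample is always transmitted
    last = samples[0]
    for x in samples[1:]:
        err = sum((a - b) ** 2 for a, b in zip(x, last))
        if err > sigma * sum(a * a for a in x):
            sent.append(x)
            last = x
    return sent

signal = [(1.0, 0.0), (1.01, 0.0), (1.5, 0.2), (1.52, 0.21)]
print(len(signal), "samples ->", len(event_triggered(signal)), "transmissions")
```

Small fluctuations around the last transmitted value are suppressed, which is exactly where the communication savings come from; the filter design then has to guarantee H∞ performance despite the withheld samples.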

  19. Network-based event-triggered filtering for Markovian jump systems

    NASA Astrophysics Data System (ADS)

    Wang, Huijiao; Shi, Peng; Agarwal, Ramesh K.

    2016-06-01

    The problem of event-triggered H∞ filtering for networked Markovian jump systems is studied in this paper. A dynamic discrete event-triggered scheme is designed to choose the transmitted data for different Markovian jumping modes. A time-delay modelling method is employed to describe the event-triggered scheme and the network-induced behaviour, such as transmission delay, data packet dropout and disorder, as a networked Markovian time-delay jump system. Furthermore, a sufficient condition is derived to guarantee that the resulting filtering error system is stochastically stable with a prescribed performance index. A co-design method for the H∞ filter and the event-triggered scheme is then proposed. The effectiveness and potential of the theoretic results obtained are illustrated by a simulation example.

  20. Percolation Features on Climate Network under Attacks of El Niño Events

    NASA Astrophysics Data System (ADS)

    Lu, Z.

    2015-12-01

    Percolation theory under different attacks is one of the main research areas in complex networks, but it has never been applied to climate networks. In this study, for the first time, we construct a climate network from the surface air temperature field to analyze its percolation features. Here, we regard an El Niño event as a kind of natural attack, generated in the Pacific Ocean, on the climate network above it. We find that an El Niño event induces an abrupt percolation phase transition in the climate network, suddenly splitting it and making it unstable. Comparing the results of the climate network under three different forms of attack, namely most-connected attack (MA), localized attack (LA), and random attack (RA), we find that both MA and LA lead to a first-order transition while RA leads to a second-order transition. Furthermore, we find that most real attacks combine all three forms. As an El Niño event emerges, the ratios of LA and MA increase and come to dominate the style of attack, while that of RA decreases. This means the percolation phase transition due to El Niño events is close to a first-order transition, mostly driven by LA and MA. Our research may help us further understand, from the perspective of network percolation, two questions: (1) why El Niño events, rather than all warming in the Pacific Ocean, affect the climate, and (2) why the climate affected by El Niño events changes abruptly.
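
The percolation-under-attack analysis described above amounts to deleting nodes according to some rule and tracking the size of the largest connected component. A minimal sketch on a random graph stand-in (an Erdős-Rényi toy network, not the actual climate network; n, p, and the attack sizes are invented):

```python
import random
from collections import defaultdict, deque

def giant_fraction(adj, removed):
    """Size of the largest connected component, as a fraction of all nodes,
    after deleting the attacked nodes."""
    alive = set(adj) - removed
    best, seen = 0, set()
    for start in alive:
        if start in seen:
            continue
        queue, comp = deque([start]), 0
        seen.add(start)
        while queue:
            u = queue.popleft()
            comp += 1
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, comp)
    return best / len(adj)

random.seed(1)
n, p = 200, 0.03
adj = defaultdict(set)
for i in range(n):
    adj[i]           # ensure isolated nodes are also registered
    for j in range(i + 1, n):
        if random.random() < p:
            adj[i].add(j)
            adj[j].add(i)

# most-connected attack (MA): remove the k highest-degree nodes first
degree_order = sorted(adj, key=lambda u: len(adj[u]), reverse=True)
for k in (0, 20, 60):
    print(k, round(giant_fraction(adj, set(degree_order[:k])), 2))
```

Plotting the giant-component fraction against the attacked fraction for MA, LA, and RA is what reveals whether the transition is first-order (an abrupt drop) or second-order (a continuous decline).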

  1. A Fuzzy-Decision Based Approach for Composite Event Detection in Wireless Sensor Networks

    PubMed Central

    Zhang, Shukui; Chen, Hao; Zhu, Qiaoming

    2014-01-01

    Event detection is one of the fundamental research problems in wireless sensor networks (WSNs). Because it accounts for the various properties that reflect an event's status, the composite event is more consistent with the objective world, and its study is thus more realistic. In this paper, we analyze the characteristics of the composite event; then we propose a criterion to determine the area of the composite event and put forward a dominating-set based network topology construction algorithm under random deployment. Given the unreliability of partial data in the detection process and the inherent fuzziness of event definitions, we propose a cluster-based two-dimensional τ-GAS algorithm and a fuzzy-decision based composite event decision mechanism. When the sensory data of most nodes are normal, the two-dimensional τ-GAS algorithm can filter faulty node data effectively and reduce the influence of erroneous data on the event determination. The fuzzy-decision based composite event judgment mechanism retains the advantages of fuzzy-logic based algorithms; moreover, it does not need the support of a huge rule base and its computational complexity is small. Compared to the CollECT and CDS algorithms, this algorithm improves detection accuracy and reduces traffic. PMID:25136690

  2. Nonthreshold-based event detection for 3d environment monitoring in sensor networks

    SciTech Connect

    Li, M.; Liu, Y.H.; Chen, L.

    2008-12-15

    Event detection is a crucial task for wireless sensor network applications, especially environment monitoring. Existing approaches for event detection are mainly based on some predefined threshold values and, thus, are often inaccurate and incapable of capturing complex events. For example, in coal mine monitoring scenarios, gas leakage or water osmosis can hardly be described by the overrun of specified attribute thresholds but some complex pattern in the full-scale view of the environmental data. To address this issue, we propose a nonthreshold-based approach for the real 3D sensor monitoring environment. We employ energy-efficient methods to collect a time series of data maps from the sensor network and detect complex events through matching the gathered data to spatiotemporal data patterns. Finally, we conduct trace-driven simulations to prove the efficacy and efficiency of this approach on detecting events of complex phenomena from real-life records.

  3. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model.

    PubMed

    Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo

    2015-11-01

    Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed 'quasi-orbits', which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network's firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms.

  4. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model.

    PubMed

    Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo

    2015-11-01

    Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed 'quasi-orbits', which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network's firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms. PMID:26558616

  5. Scale-Free Brain Networks Based on the Event-Related Potential during Visual Spatial Attention

    NASA Astrophysics Data System (ADS)

    Li, Ling; Jin, Zhen-Lan

    2011-04-01

    The human brain is thought of as one of the most complex dynamical systems in the universe. The network view of the dynamical system has emerged since the discovery of scale-free networks. Brain functional networks, which represent functional associations among brain regions, are extracted by measuring the temporal correlations from electroencephalogram data. We measure the topological properties of the brain functional network, including degree distribution, average degree, clustering coefficient and the shortest path length, to compare the networks of multi-channel event-related potential activity between visual spatial attention and unattention conditions. It is found that the degree distribution of the brain functional networks under both the conditions is a power law distribution, which reflects a scale-free property. Moreover, the scaling exponent of the attention condition is significantly smaller than that of the unattention condition. However, the degree distribution of equivalent random networks does not follow the power law distribution. In addition, the clustering coefficient of these random networks is smaller than those of brain networks, and the shortest path length of these random networks is large and comparable with those of brain networks. Our results, typical of scale-free networks, indicate that the scaling exponent of brain activity could reflect different cognitive processes.
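
The core measurement in the study above is whether a network's degree distribution follows a power law and with what scaling exponent. A common way to estimate the exponent from a degree list is the continuous maximum-likelihood approximation; the sketch below uses it on an invented degree sequence (this is a generic estimator, not the paper's analysis pipeline).

```python
import math

def powerlaw_exponent(degrees, k_min=1):
    """Maximum-likelihood estimate of the scaling exponent alpha of a
    power-law degree distribution P(k) ~ k^(-alpha), using the continuous
    approximation alpha = 1 + n / sum(ln(k / (k_min - 1/2)))."""
    ks = [k for k in degrees if k >= k_min]
    return 1.0 + len(ks) / sum(math.log(k / (k_min - 0.5)) for k in ks)

# hypothetical degree sequence with a heavy tail
degrees = [1] * 50 + [2] * 20 + [4] * 8 + [8] * 3
print(round(powerlaw_exponent(degrees), 2))
```

A smaller exponent means a heavier tail (relatively more hub nodes), which is the quantity the paper compares between the attention and unattention conditions.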

  6. Dyadic Event Attribution in Social Networks with Mixtures of Hawkes Processes

    PubMed Central

    Li, Liangda; Zha, Hongyuan

    2014-01-01

    In many applications in social network analysis, it is important to model the interactions and infer the influence between pairs of actors, leading to the problem of dyadic event modeling, which has attracted increasing interest recently. In this paper we focus on the problem of dyadic event attribution, an important missing data problem in dyadic event modeling where one needs to infer the missing actor-pairs of a subset of dyadic events based on their observed timestamps. Existing works either use fixed model parameters and heuristic rules for event attribution, or assume the dyadic events across actor-pairs are independent. To address those shortcomings we propose a probabilistic model based on mixtures of Hawkes processes that simultaneously tackles event attribution and network parameter inference, taking into consideration the dependency among dyadic events that share at least one actor. We also investigate using additive models to incorporate regularization to avoid overfitting. Our experiments on both synthetic and real-world data sets on international armed conflicts suggest that the proposed new method significantly improves accuracy when compared with the state-of-the-art for dyadic event attribution. PMID:24917494
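
The building block of the model above is the Hawkes process, whose conditional intensity is a baseline rate plus a self-exciting sum over past events. A minimal sketch with the standard exponential kernel (the generic definition, not the paper's mixture model; all parameter values are invented):

```python
import math

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity of a Hawkes process with exponential kernel:
    lambda(t) = mu + alpha * sum over past events t_i of exp(-beta*(t - t_i)).
    Each past event (here, a dyadic interaction) excites future events."""
    return mu + alpha * sum(math.exp(-beta * (t - ti))
                            for ti in events if ti < t)

past = [0.5, 1.0, 1.1]          # hypothetical event timestamps for one pair
print(hawkes_intensity(1.2, past, mu=0.2, alpha=0.8, beta=2.0))
```

The attribution problem then becomes: given an unlabeled timestamp, which actor-pair's intensity (under a mixture of such processes) best explains it.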

  7. Dyadic Event Attribution in Social Networks with Mixtures of Hawkes Processes.

    PubMed

    Li, Liangda; Zha, Hongyuan

    2013-01-01

    In many applications in social network analysis, it is important to model the interactions and infer the influence between pairs of actors, leading to the problem of dyadic event modeling, which has attracted increasing interest recently. In this paper we focus on the problem of dyadic event attribution, an important missing data problem in dyadic event modeling where one needs to infer the missing actor-pairs of a subset of dyadic events based on their observed timestamps. Existing works either use fixed model parameters and heuristic rules for event attribution, or assume the dyadic events across actor-pairs are independent. To address those shortcomings we propose a probabilistic model based on mixtures of Hawkes processes that simultaneously tackles event attribution and network parameter inference, taking into consideration the dependency among dyadic events that share at least one actor. We also investigate using additive models to incorporate regularization to avoid overfitting. Our experiments on both synthetic and real-world data sets on international armed conflicts suggest that the proposed new method significantly improves accuracy when compared with the state-of-the-art for dyadic event attribution. PMID:24917494

  8. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model

    PubMed Central

    Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo

    2015-01-01

    Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed ‘quasi-orbits’, which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network’s firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms. PMID:26558616

  9. Airway reopening through catastrophic events in a hierarchical network

    PubMed Central

    Baudoin, Michael; Song, Yu; Manneville, Paul; Baroud, Charles N.

    2013-01-01

    When you reach with your straw for the final drops of a milkshake, the liquid forms a train of plugs that flow slowly initially because of the high viscosity. They then suddenly rupture and are replaced with a rapid airflow with the characteristic slurping sound. Trains of liquid plugs also are observed in complex geometries, such as porous media during petroleum extraction, in microfluidic two-phase flows, or in flows in the pulmonary airway tree under pathological conditions. The dynamics of rupture events in these geometries play the dominant role in the spatial distribution of the flow and in determining how much of the medium remains occluded. Here we show that the flow of a train of plugs in a straight channel is always unstable to breaking through a cascade of ruptures. Collective effects considerably modify the rupture dynamics of plug trains: Interactions among nearest neighbors take place through the wetting films and slow down the cascade, whereas global interactions, through the total resistance to flow of the train, accelerate the dynamics after each plug rupture. In a branching tree of microchannels, similar cascades occur along paths that connect the input to a particular output. This divides the initial tree into several independent subnetworks, which then evolve independently of one another. The spatiotemporal distribution of the cascades is random, owing to strong sensitivity to the plug divisions at the bifurcations. PMID:23277557

  10. Novel algorithms for improved pattern recognition using the US FDA Adverse Event Network Analyzer.

    PubMed

    Botsis, Taxiarchis; Scott, John; Goud, Ravi; Toman, Pamela; Sutherland, Andrea; Ball, Robert

    2014-01-01

    The medical review of adverse event reports for medical products requires the processing of "big data" stored in spontaneous reporting systems, such as the US Vaccine Adverse Event Reporting System (VAERS). VAERS data are not well suited to traditional statistical analyses so we developed the FDA Adverse Event Network Analyzer (AENA) and three novel network analysis approaches to extract information from these data. Our new approaches include a weighting scheme based on co-occurring triplets in reports, a visualization layout inspired by the islands algorithm, and a network growth methodology for the detection of outliers. We explored and verified these approaches by analysing the historical signal of Intussusception (IS) after the administration of RotaShield vaccine (RV) in 1999. We believe that our study supports the use of AENA for pattern recognition in medical product safety and other clinical data. PMID:25160375

  11. Event Detection in Aerospace Systems using Centralized Sensor Networks: A Comparative Study of Several Methodologies

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.

    2006-01-01

    Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.

  12. Novel algorithms for improved pattern recognition using the US FDA Adverse Event Network Analyzer.

    PubMed

    Botsis, Taxiarchis; Scott, John; Goud, Ravi; Toman, Pamela; Sutherland, Andrea; Ball, Robert

    2014-01-01

    The medical review of adverse event reports for medical products requires the processing of "big data" stored in spontaneous reporting systems, such as the US Vaccine Adverse Event Reporting System (VAERS). VAERS data are not well suited to traditional statistical analyses so we developed the FDA Adverse Event Network Analyzer (AENA) and three novel network analysis approaches to extract information from these data. Our new approaches include a weighting scheme based on co-occurring triplets in reports, a visualization layout inspired by the islands algorithm, and a network growth methodology for the detection of outliers. We explored and verified these approaches by analysing the historical signal of Intussusception (IS) after the administration of RotaShield vaccine (RV) in 1999. We believe that our study supports the use of AENA for pattern recognition in medical product safety and other clinical data.

  13. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

    National security has gained vital importance due to increasing number of suspicious and terrorist events across the globe. Use of different subfields of information technology has also gained much attraction of researchers and practitioners to design systems which can detect main members which are actually responsible for such kind of events. In this paper, we present a novel method to predict key players from a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies novel hybrid classifier for detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies to destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies including two publicly available datasets and one local network. PMID:25136674
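
The per-node centrality measures such a key-player detector feeds to its classifier are straightforward to compute. The sketch below shows one of them, closeness centrality, via BFS on a toy graph (an illustration of the measure only; the paper's specific measures, classifier, and datasets are not reproduced here, and the network is invented).

```python
from collections import deque

def closeness(adj, node):
    """Closeness centrality: number of other reachable nodes divided by the
    sum of shortest-path distances to them (higher = more central)."""
    dist, queue = {node: 0}, deque([node])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    others = [d for n, d in dist.items() if n != node]
    return (len(others) / sum(others)) if others else 0.0

# Toy covert network: node "a" brokers between a triangle and a chain.
adj = {"a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b"},
       "d": {"a", "e"}, "e": {"d"}}
scores = {n: closeness(adj, n) for n in adj}
print(max(scores, key=scores.get))   # the most central candidate key player
```

In the paper's framework, several such centrality scores per node form the feature vector that the hybrid classifier uses to flag key players.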

  14. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    PubMed

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where the synchronization occurs within one group, while there is no synchronization among different groups. In this paper, under event-based mechanism, pinning cluster synchronization in an array of coupled neural networks is studied. A new event-triggered sampled-data transmission strategy, where only local and event-triggering states are utilized to update the broadcasting state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants. Hence, this will reduce the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results.
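
The event-triggered broadcasting idea, where an agent re-broadcasts its state only at its own triggering instants, can be sketched as follows; the scalar dynamics and threshold are hypothetical, not the paper's coupled neural network model:

```python
def broadcast_sim(steps=50, threshold=0.2):
    """One agent with stable local dynamics re-broadcasts its state only
    when the gap between the true state and the last broadcast value
    exceeds `threshold`. Neighbours would always use the last broadcast
    value, so communication is event-driven rather than per-step.
    """
    x, xb = 1.0, 1.0          # true state and last broadcast value
    transmissions = 0
    for _ in range(steps):
        x = 0.9 * x           # illustrative stable local dynamics
        if abs(x - xb) > threshold:
            xb = x            # event: broadcast the current state
            transmissions += 1
    return transmissions

n = broadcast_sim()
# only 4 broadcasts over 50 sampling steps
```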

  15. Builders Challenge High Performance Builder Spotlight - NextGen Home, Las Vegas, Nevada

    SciTech Connect

    2009-01-01

    Building America Builders Challenge fact sheet on the NextGen demo home built in Las Vegas. The home has a Home Energy Rating System (HERS) index score of 44, with R-40 spray foam attic insulation, R-40 insulated concrete walls, and a 4-kW DC solar laminate.

  16. Builders Challenge High Performance Builder Spotlight: Artistic Homes, Albuquerque, New Mexico

    SciTech Connect

    2009-12-22

    Building America Builders Challenge fact sheet on Artistic Homes of Albuquerque, New Mexico. Standard features of their homes include advanced framed 2x6 24-inch on center walls, R-21 blown insulation in the walls, and high-efficiency windows.

  17. Energy Conservation for the Home Builder: A Course for Residential Builders. Course Outline and Instructional Materials.

    ERIC Educational Resources Information Center

    Koenigshofer, Daniel R.

    Background information, handouts and related instructional materials comprise this manual for conducting a course on energy conservation for home builders. Information presented in the five- and ten-hour course is intended to help residential contractors make appropriate and cost-effective decisions in constructing energy-efficient dwellings.…

  18. A Database of Tornado Events as Perceived by the USArray Transportable Array Network

    NASA Astrophysics Data System (ADS)

    Tytell, J. E.; Vernon, F.; Reyes, J. C.

    2015-12-01

    Over the course of the deployment of Earthscope's USArray Transportable Array (TA) network, there have been numerous tornado events within the changing footprint of the network. The Array Network Facility based in San Diego, California, has compiled a database of these tornado events based on data provided by the NOAA Storm Prediction Center (SPC). The SPC data consist of parameters such as start-end point track data for each event, maximum EF intensities, and maximum track widths. Our database is Antelope-driven and combines these data from the SPC with detailed station information from the TA network. We are now able to list all available TA stations during any specific tornado event date and also provide a single calculated "nearest" TA station per individual tornado event. We aim to provide this database as a starting resource for those with an interest in investigating tornado signatures within surface pressure and seismic response data. On a larger scale, the database may be of particular interest to the infrasound research community.

  19. Proton Single Event Effects (SEE) Testing of the Myrinet Crossbar Switch and Network Interface Card

    NASA Technical Reports Server (NTRS)

    Howard, James W., Jr.; LaBel, Kenneth A.; Carts, Martin A.; Stattel, Ronald; Irwin, Timothy L.; Day, John H. (Technical Monitor)

    2002-01-01

    As part of the Remote Exploration and Experimentation Project (REE), a proton SEE (Single Event Effect) evaluation of the Myricom network protocol system (Myrinet) was performed. This testing included the evaluation of the Myrinet crossbar switch and the Network Interface Card (NIC). To this end, two crossbar switch devices and five components in the NIC were exposed to the proton beam at the University of California at Davis Crocker Nuclear Laboratory (CNL).

  20. Coarse-grained event tree analysis for quantifying Hodgkin-Huxley neuronal network dynamics.

    PubMed

    Sun, Yi; Rangan, Aaditya V; Zhou, Douglas; Cai, David

    2012-02-01

    We present an event tree analysis for studying the dynamics of Hodgkin-Huxley (HH) neuronal networks. Our study relies on a coarse-grained projection to event trees and to the event chains that comprise these trees by using a statistical collection of spatial-temporal sequences of relevant physiological observables (such as the spiking sequences of multiple neurons). This projection can retain information about network dynamics that covers multiple features, swiftly and robustly. We demonstrate that for even small differences in inputs, some dynamical regimes of HH networks contain sufficiently distinct higher-order statistics as reflected in event chains within the event tree analysis. Therefore, this analysis is effective in discriminating small differences in inputs. Moreover, we use event trees to analyze the results computed from an efficient library-based numerical method proposed in our previous work, where a pre-computed high resolution data library of typical neuronal trajectories during the interval of an action potential (spike) allows us to avoid resolving the spikes in detail. In this way, we can evolve the HH networks using time steps one order of magnitude larger than the typical time steps used for resolving the trajectories without the library, while achieving comparable statistical accuracy in terms of average firing rate and power spectra of voltage traces. Our numerical simulation results show that the library method is efficient in the sense that the results generated by using this numerical method with much larger time steps contain sufficiently high-order statistical structure of firing events similar to the ones obtained using a regular HH solver. We use our event tree analysis to demonstrate these statistical similarities.
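
A minimal sketch of such a coarse-grained projection: count length-k event chains (which neuron fired next) from a time-ordered spike record. The representation of spikes as (time, neuron) pairs is an assumption for illustration:

```python
from collections import Counter

def event_chains(spikes, k):
    """Count length-k event chains: consecutive subsequences of which
    neuron fired, taken from a time-ordered list of (time, neuron) spike
    events. The resulting Counter is a coarse-grained statistic that can
    discriminate dynamical regimes without resolving full trajectories.
    """
    order = [n for _, n in sorted(spikes)]
    return Counter(tuple(order[i:i + k]) for i in range(len(order) - k + 1))

spikes = [(0.1, "A"), (0.4, "B"), (0.7, "A"), (0.9, "B"), (1.2, "C")]
chains = event_chains(spikes, 2)
# the chain ("A", "B") occurs twice; ("B", "C") once
```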

  1. Event-triggered H∞ reliable control for offshore structures in network environments

    NASA Astrophysics Data System (ADS)

    Zhang, Bao-Lin; Han, Qing-Long; Zhang, Xian-Ming

    2016-04-01

    This paper investigates the network-based modeling and event-triggered H∞ reliable control for an offshore structure. First, a network-based model of the offshore structure subject to external wave force and actuator faults is presented. Second, an event-triggering mechanism is proposed such that during the control implementation, only requisite sampled-data is transmitted over networks. Third, an event-triggered H∞ reliable control problem for the offshore structure is solved by employing the Lyapunov-Krasovskii functional approach, and the desired controller can be derived. It is shown through simulation results that for possible actuator failures, the networked controller is capable of guaranteeing the stability of the offshore structure. In addition, compared with the H∞ control scheme without network settings, the proposed controller can suppress the vibration of the offshore structure to almost the same level as the H∞ controller, while the former requires less control cost. Furthermore, under the network-based controller, the communication resources can be saved significantly.

  2. Non-linear time series analysis of precipitation events using regional climate networks for Germany

    NASA Astrophysics Data System (ADS)

    Rheinwalt, Aljoscha; Boers, Niklas; Marwan, Norbert; Kurths, Jürgen; Hoffmann, Peter; Gerstengarbe, Friedrich-Wilhelm; Werner, Peter

    2016-02-01

    Synchronous occurrences of heavy rainfall events and the study of their relation in time and space are of great socio-economic relevance, for instance for the agricultural and insurance sectors, but also for the general well-being of the population. In this study, the spatial synchronization structure is analyzed as a regional climate network constructed from precipitation event series. The similarity between event series is determined by the number of synchronous occurrences. We propose a novel standardization of this number that results in synchronization scores which are not biased by the number of events in the respective time series. Additionally, we introduce a new version of the network measure directionality that measures the spatial directionality of weighted links by also taking into account the effects of the spatial embedding of the network. This measure provides an estimate of heavy precipitation isochrones by pointing out directions along which rainfall events synchronize. We propose a climatological interpretation of this measure in terms of propagating fronts or event traces and confirm it for Germany by comparing our results to known atmospheric circulation patterns.
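
Counting synchronous occurrences between two event series can be sketched with a greedy two-pointer sweep; the tolerance `tau` and the one-to-one matching rule are illustrative, and the paper's bias-removing standardization is not reproduced here:

```python
def synchronous_count(a, b, tau):
    """Count pairs of events (one from each series) no more than tau apart.

    `a` and `b` are sorted lists of event times (e.g. days with heavy rain
    at two stations). Each event is matched at most once.
    """
    i = j = count = 0
    while i < len(a) and j < len(b):
        if abs(a[i] - b[j]) <= tau:
            count += 1
            i += 1
            j += 1
        elif a[i] < b[j]:
            i += 1
        else:
            j += 1
    return count

a = [1, 5, 9, 20]
b = [2, 6, 15]
n_sync = synchronous_count(a, b, 1)
# the pairs (1, 2) and (5, 6) fall within tau = 1, so n_sync == 2
```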

  3. GeoBuilder: a geometric algorithm visualization and debugging system for 2D and 3D geometric computing.

    PubMed

    Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai

    2009-01-01

    Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, computer networks, etc., to facilitate algorithmic researchers in testing their ideas, demonstrating new findings, and teaching algorithm design in the classroom. Within the broad applications of algorithm visualization, there still remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effect in 3D environments. Using modern technologies of Java programming, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, engagement of collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications. PMID:19147888

  4. Life-Course Events, Social Networks, and the Emergence of Violence among Female Gang Members

    ERIC Educational Resources Information Center

    Fleisher, Mark S.; Krienert, Jessie L.

    2004-01-01

    Using data gathered from a multi-year field study, this article identifies specific life-course events shared by gang-affiliated women. Gangs emerge as a cultural adaptation or pro-social community response to poverty and racial isolation. Through the use of a social-network approach, data show that violence dramatically increases in the period…

  5. Event-triggered asynchronous intermittent communication strategy for synchronization in complex dynamical networks.

    PubMed

    Li, Huaqing; Liao, Xiaofeng; Chen, Guo; Hill, David J; Dong, Zhaoyang; Huang, Tingwen

    2015-06-01

    This paper presents a new framework for synchronization of complex networks by introducing a mechanism of event-triggering distributed sampling information. A kind of event which avoids continuous communication between neighboring nodes is designed to drive the controller update of each node. The advantage of the event-triggering strategy is the significant decrease in the number of controller updates for synchronization tasks of complex networks involving embedded microprocessors with limited on-board resources. To describe the system's ability to reach synchronization, a concept of generalized algebraic connectivity is introduced for strongly connected networks and then extended to the strongly connected components of a directed network containing a directed spanning tree. Two sufficient conditions are presented to reveal the underlying relationships of corresponding parameters to reach global synchronization based on algebraic graph theory, matrix theory and the Lyapunov control method. A positive lower bound for inter-event times is derived to guarantee the absence of Zeno behavior. Finally, a numerical simulation example is provided to demonstrate the theoretical results.

  6. The magnetic network location of explosive events observed in the solar transition region

    NASA Technical Reports Server (NTRS)

    Porter, J. G.; Dere, K. P.

    1991-01-01

    Compact short-lived explosive events have been observed in solar transition region lines with the High-Resolution Telescope and Spectrograph (HRTS) flown by the Naval Research Laboratory on a series of rockets and on Spacelab 2. Data from Spacelab 2 are coaligned with a simultaneous magnetogram and near-simultaneous He I 10,830 Å spectroheliogram obtained at the National Solar Observatory at Kitt Peak. The comparison shows that the explosive events occur in the solar magnetic network lanes at the boundaries of supergranular convective cells. However, the events occur away from the larger concentrations of magnetic flux in the network, in contradiction to the observed tendency of the more energetic solar phenomena to be associated with the stronger magnetic fields.

  7. Rare events statistics of random walks on networks: localisation and other dynamical phase transitions

    NASA Astrophysics Data System (ADS)

    De Bacco, Caterina; Guggiola, Alberto; Kühn, Reimer; Paga, Pierre

    2016-05-01

    Rare event statistics for random walks on complex networks are investigated using the large deviation formalism. Within this formalism, rare events are realised as typical events in a suitably deformed path-ensemble, and their statistics can be studied in terms of spectral properties of a deformed Markov transition matrix. We observe two different types of phase transition in such systems: (i) rare events which are singled out for sufficiently large values of the deformation parameter may correspond to localised modes of the deformed transition matrix; (ii) ‘mode-switching transitions’ may occur as the deformation parameter is varied. Details depend on the nature of the observable for which the rare event statistics is studied, as well as on the underlying graph ensemble. In the present paper we report results on rare events statistics for path averages of random walks in Erdős-Rényi and scale free networks. Large deviation rate functions and localisation properties are studied numerically. For observables of the type considered here, we also derive an analytical approximation for the Legendre transform of the large deviation rate function, which is valid in the large connectivity limit. It is found to agree well with simulations.
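
In the large deviation formalism described, the scaled cumulant generating function is the logarithm of the leading eigenvalue of a tilted (deformed) transition matrix, here taken as W(s)[i][j] = p[i][j]·exp(s·f(j)) for a path observable f. A pure-Python power-iteration sketch, with a toy matrix and observable that are assumptions for illustration:

```python
import math

def scgf(p, f, s, iters=500):
    """Scaled cumulant generating function lambda(s): the log of the
    leading eigenvalue of the tilted matrix W[i][j] = p[i][j]*exp(s*f[j]),
    estimated by power iteration.
    """
    n = len(p)
    w = [[p[i][j] * math.exp(s * f[j]) for j in range(n)] for i in range(n)]
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        v = [sum(w[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in v)      # dominant-eigenvalue estimate
        v = [x / lam for x in v]
    return math.log(lam)

# Two-state symmetric walk observing f = (0, 1). At s = 0 the tilted
# matrix is stochastic, so its leading eigenvalue is 1 and scgf(0) = 0.
p = [[0.5, 0.5], [0.5, 0.5]]
f = [0.0, 1.0]
```

For this rank-one example the leading eigenvalue is 0.5 + 0.5·e^s, so the sketch can be checked in closed form; the Legendre transform of lambda(s) then yields the rate function discussed in the abstract.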

  8. Event-driven model predictive control of sewage pumping stations for sulfide mitigation in sewer networks.

    PubMed

    Liu, Yiqi; Ganigué, Ramon; Sharma, Keshab; Yuan, Zhiguo

    2016-07-01

    Chemicals such as Mg(OH)2 and iron salts are widely dosed to sewage for mitigating sulfide-induced corrosion and odour problems in sewer networks. The chemical dosing rate is usually not automatically controlled but profiled based on experience of operators, often resulting in over- or under-dosing. Even though on-line control algorithms for chemical dosing in single pipes have been developed recently, network-wide control algorithms are currently not available. The key challenge is that a sewer network is typically wide-spread comprising many interconnected sewer pipes and pumping stations, making network-wide sulfide mitigation with a relatively limited number of dosing points challenging. In this paper, we propose and demonstrate an Event-driven Model Predictive Control (EMPC) methodology, which controls the flows of sewage streams containing the dosed chemical to ensure desirable distribution of the dosed chemical throughout the pipe sections of interests. First of all, a network-state model is proposed to predict the chemical concentration in a network. An EMPC algorithm is then designed to coordinate sewage pumping station operations to ensure desirable chemical distribution in the network. The performance of the proposed control methodology is demonstrated by applying the designed algorithm to a real sewer network simulated with the well-established SeweX model using real sewage flow and characteristics data. The EMPC strategy significantly improved the sulfide mitigation performance with the same chemical consumption, compared to the current practice. PMID:27124127

  10. Distributed estimation in networked systems under periodic and event-based communication policies

    NASA Astrophysics Data System (ADS)

    Millán, Pablo; Orihuela, Luis; Jurado, Isabel; Vivas, Carlos; Rubio, Francisco R.

    2015-01-01

    This paper's aim is to present a novel design technique for distributed estimation in networked systems. The problem assumes a network of interconnected agents each one having partial access to measurements from a linear plant and broadcasting their estimations to their neighbours. The objective is to reach a reliable estimation of the plant state from every agent location. The observer's structure implemented in each agent is based on local Luenberger-like observers in combination with consensus strategies. The paper focuses on the following network related issues: delays, packet dropouts and communication policy (time and event-driven). The design problem is solved via linear matrix inequalities and stability proofs are provided. The technique is of application for sensor networks and large scale systems where centralized estimation schemes are not advisable and energy-aware implementations are of interest. Simulation examples are provided to show the performance of the proposed methodologies.

  11. Event-triggered H∞ filter design for delayed neural network with quantization.

    PubMed

    Liu, Jinliang; Tang, Jia; Fei, Shumin

    2016-10-01

    This paper is concerned with H∞ filter design for a class of neural network systems with an event-triggered communication scheme and quantization. Firstly, a new event-triggered communication scheme is introduced to determine whether or not the current sampled sensor data should be broadcasted and transmitted to the quantizer, which can save the limited communication resource. Secondly, a logarithmic quantizer is used to quantize the sampled data, which can reduce the data transmission rate in the network. Thirdly, considering the influence of the constrained network resource, we investigate the problem of H∞ filter design for a class of event-triggered neural network systems with quantization. By using Lyapunov functional and linear matrix inequality (LMI) techniques, some delay-dependent stability conditions for the existence of the desired filter are obtained. Furthermore, the explicit expression is given for the designed filter parameters in terms of LMIs. Finally, a numerical example is given to show the usefulness of the obtained theoretical results. PMID:27459409
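
A logarithmic quantizer can be sketched as follows; the density `rho` and scaling `u0` are illustrative, and the paper's exact sector-bound construction may differ from this nearest-level-in-log-space variant:

```python
import math

def log_quantize(v, rho=0.5, u0=1.0):
    """Logarithmic quantizer: levels are +/- u0 * rho**j for integer j.
    Maps v to the geometrically nearest level, so the *relative* (not
    absolute) quantization error stays roughly constant across magnitudes,
    which is why few levels suffice over a wide dynamic range.
    """
    if v == 0:
        return 0.0
    j = round(math.log(abs(v) / u0, rho))   # nearest exponent in log space
    return math.copysign(u0 * rho ** j, v)

v = 0.7
q = log_quantize(v)          # snaps to the level 0.5
rel_err = abs(q - v) / abs(v)
```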

  12. A novel decentralised event-triggered H∞ control for network control systems with communication delays

    NASA Astrophysics Data System (ADS)

    Li, Fuqiang; Fu, Jingqi; Du, Dajun

    2016-10-01

    This paper studies a novel decentralised event-triggered H∞ control for network control systems with communication delays and external disturbances. To overcome the drawbacks that the relative event-triggered mechanism (ETM) generates many events when the system is close to the origin and the absolute ETM produces many events when the system is far away from the origin, a novel decentralised sampled-data-based ETM is first proposed. By using both local state-dependent and state-independent information, the decentralised ETM can effectively reduce network loads in each channel during the whole operation time. Then, a novel general system model with parameters of the decentralised ETM, communication delays and external disturbances is presented, and sufficient conditions for the ultimately bounded stability and asymptotic stability of the closed-loop system are obtained. Specifically, the quantitative relationship between the boundedness of the stability region and the parameters of the decentralised ETM is established. Moreover, to overcome the inconvenience of the two-step design method, in which controllers are required to be given a priori, a co-design scheme is presented to design the decentralised event generators and the output-based controller simultaneously. Finally, numerical examples confirm the effectiveness of the proposed method.
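
The mixed trigger described, combining a state-dependent (relative) term with a state-independent (absolute) floor, can be sketched as follows; the quadratic form and parameter values are assumptions for illustration:

```python
def should_transmit(x, x_last, sigma=0.1, eps=1e-3):
    """Mixed event-trigger: transmit when the sampling error exceeds a
    state-dependent (relative) threshold plus a state-independent
    (absolute) floor. Near the origin the eps term dominates, far from
    it the sigma term does, avoiding the drawback of either pure scheme.
    """
    err = sum((a - b) ** 2 for a, b in zip(x, x_last))      # ||x - x_last||^2
    thresh = sigma * sum(a * a for a in x) + eps            # sigma*||x||^2 + eps
    return err > thresh

fire = should_transmit([10.0, 0.0], [6.0, 0.0])    # far from origin: fires
quiet = should_transmit([0.01, 0.0], [0.0, 0.0])   # near origin: the floor
                                                   # suppresses chatter
```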

  13. Sequence-of-events-driven automation of the deep space network

    NASA Technical Reports Server (NTRS)

    Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.

    1996-01-01

    In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.
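
Executing a temporal dependency network amounts to running each operation once all of its predecessors have completed; a minimal sketch using Kahn's topological sort (the toy track procedure is hypothetical, not an actual DSN operations product):

```python
from collections import deque

def execution_order(deps):
    """Kahn's algorithm: `deps` maps each operation to the set of
    operations that must complete first. Returns one valid execution
    order, or raises if the dependency network contains a cycle.
    """
    indeg = {op: len(pre) for op, pre in deps.items()}
    succ = {op: [] for op in deps}
    for op, pre in deps.items():
        for p in pre:
            succ[p].append(op)
    ready = deque(op for op, d in indeg.items() if d == 0)
    order = []
    while ready:
        op = ready.popleft()
        order.append(op)
        for nxt in succ[op]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(deps):
        raise ValueError("cycle in dependency network")
    return order

# Toy procedure: configure before calibrating, calibrate before tracking.
deps = {"configure": set(),
        "calibrate": {"configure"},
        "track": {"configure", "calibrate"}}
order = execution_order(deps)
```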

  14. Sequence-of-Events-Driven Automation of the Deep Space Network

    NASA Astrophysics Data System (ADS)

    Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.

    1995-10-01

    In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.

  15. Seismic network detection probability assessment using waveforms and accounting to event association logic

    NASA Astrophysics Data System (ADS)

    Pinsky, Vladimir; Shapira, Avi

    2016-05-01

    The geographical area in which a seismic event of magnitude M ≥ Mt is detected by a seismic station network, with a specified probability, is derived from each station's probability of detection, estimated as a function of epicentral distance. The latter is determined from both the bulletin data and the waveforms recorded by the station during the occurrence of the event, with and without band-pass filtering. To simulate the real detection process, the waveforms are processed using the conventional Carl Johnson detection and association algorithm. We attempt to account for the association time criterion in addition to the conventional approach adopted by the known PMC method.
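
The network-level detection probability can be sketched by combining per-station detection curves, declaring an event when at least m stations detect it; the logistic curve, its parameters, and the independence assumption are illustrative, not the paper's fitted model:

```python
import math
from itertools import combinations

def station_prob(dist_km, d50=150.0, k=0.03):
    """Hypothetical per-station detection curve: logistic in epicentral
    distance, equal to 0.5 at d50 kilometres."""
    return 1.0 / (1.0 + math.exp(k * (dist_km - d50)))

def network_prob(dists, m):
    """Probability that at least m stations detect the event, assuming
    independent stations (m mimics the association criterion: an event
    is declared only when several stations' detections associate)."""
    ps = [station_prob(d) for d in dists]
    total = 0.0
    for r in range(m, len(ps) + 1):            # exactly r detecting stations
        for idx in combinations(range(len(ps)), r):
            prob = 1.0
            for i, pi in enumerate(ps):
                prob *= pi if i in idx else (1 - pi)
            total += prob
    return total

# Three nearby stations, requiring 2-of-3 associations.
p = network_prob([50.0, 80.0, 120.0], 2)
```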

  16. Passive solar design guidelines and evaluation criteria for builders

    NASA Astrophysics Data System (ADS)

    Conkling, M. L.

    1983-11-01

    The Showcase of Solar Homes, which features two model solar villages in Eldorado at Santa Fe and Rio Rancho, New Mexico, is described. Key features of the project, including its private-sector character, marketplace considerations, and the need for uniform solar home quality, are pointed out. Guidelines balancing conservation and passive solar measures were set, and the issue of occupant comfort was addressed. The guidelines were used to establish criteria and worksheets for builders seeking entry into the Showcase project. The designs become examples of the transfer of research to practice: the guidelines for solar builders directly transfer passive solar research and technology to the builders.

  17. Automated generation of discrete event controllers for dynamic reconfiguration of autonomous sensor networks

    NASA Astrophysics Data System (ADS)

    Damiani, Sarah; Griffin, Christopher; Phoha, Shashi

    2003-12-01

    Autonomous Sensor Networks have the potential for broad applicability to national security, intelligent transportation, industrial production and environmental and hazardous process control. Distributed sensors may be used for detecting bio-terrorist attacks, for contraband interdiction, border patrol, monitoring building safety and security, battlefield surveillance, or may be embedded in complex dynamic systems for enabling fault tolerant operations. In this paper we present algorithms and automation tools for constructing discrete event controllers for complex networked systems that restrict the dynamic behavior of the system according to given specifications. In our previous work we modeled a dynamic system as a discrete event automaton whose open-loop behavior is represented as a language L of strings generated from the alphabet Σ of all possible atomic events that cause state transitions in the network. The controlled behavior is represented by a sublanguage K, contained in L, that restricts the behavior of the system according to the specifications of the controller. We have developed the algebraic structure of controllable sublanguages as perfect right partial ideals that satisfy a precontrollability condition. In this paper we develop an iterative algorithm that takes an ad hoc specification described in natural language and formulates a complete specification that results in a controllable sublanguage. A supervisory controller, modeled as an automaton that runs synchronously with the open-loop system in the sense of Ramadge and Wonham, is automatically generated to restrict the behavior of the open-loop system to the controllable sublanguage. A battlefield surveillance scenario illustrates the iterative evolution of ad hoc specifications for controlling an autonomous sensor network and the generation of a controller that reconfigures the sensor network to dynamically adapt to environmental perturbations.

  18. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    PubMed Central

    Nguyen, Vinh Hao; Suh, Young Soo

    2009-01-01

    This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD), sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than the time-triggered one in some situations, especially in network bandwidth improvement. However, it cannot detect packet dropout situations because data transmission and reception do not use a periodic time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme called modified SOD in which sensor data are sent when either the change of sensor output exceeds a given threshold or the time elapses more than a given interval. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen. PMID:22574063
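
The modified SOD rule described, transmit when the output changes by more than a threshold or when a maximum interval elapses, can be sketched as follows (threshold and interval values are illustrative):

```python
def modified_sod(samples, delta=0.5, max_gap=5):
    """Modified send-on-delta: transmit sample k when the output has moved
    more than `delta` since the last transmission, OR when more than
    `max_gap` sampling periods have elapsed. The periodic fallback lets
    the receiver notice packet dropouts, which pure SOD cannot.
    """
    sent = [0]                          # always send the first sample
    last_val, last_k = samples[0], 0
    for k, v in enumerate(samples[1:], start=1):
        if abs(v - last_val) > delta or k - last_k >= max_gap:
            sent.append(k)
            last_val, last_k = v, k
    return sent

samples = [0.0, 0.1, 0.2, 0.9, 0.95, 1.0, 1.0, 1.0, 1.0, 1.0]
idx = modified_sod(samples)
# index 3 fires on the delta rule; index 8 fires on the timeout rule
```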

  19. The Knowledge-Integrated Network Biomarkers Discovery for Major Adverse Cardiac Events

    PubMed Central

    Jin, Guangxu; Zhou, Xiaobo; Wang, Honghui; Zhao, Hong; Cui, Kemi; Zhang, Xiang-Sun; Chen, Luonan; Hazen, Stanley L.; Li, King; Wong, Stephen T. C.

    2010-01-01

    The mass spectrometry (MS) technology in clinical proteomics is very promising for the discovery of new biomarkers for disease management. To overcome the obstacle of data noise in MS analysis, we proposed a new approach of knowledge-integrated biomarker discovery using data from Major Adverse Cardiac Events (MACE) patients. We first built a cardiovascular-related network based on protein information coming from protein annotations in Uniprot, protein–protein interaction (PPI), and signal transduction databases. Distinct from previous machine learning methods in MS data processing, we then used statistical methods to discover biomarkers in the cardiovascular-related network. Through the tradeoff between known protein information and data noise in mass spectrometry data, we could finally identify high-confidence biomarkers. Most importantly, aided by the protein–protein interaction network, that is, the cardiovascular-related network, we proposed a new type of biomarker, that is, network biomarkers, composed of a set of proteins and the interactions among them. The candidate network biomarkers can classify the two groups of patients more accurately than current single ones without consideration of biological molecular interactions. PMID:18665624

  20. Secret Forwarding of Events over Distributed Publish/Subscribe Overlay Network.

    PubMed

    Yoon, Young; Kim, Beom Heyn

    2016-01-01

    Publish/subscribe is a communication paradigm where loosely-coupled clients communicate in an asynchronous fashion. Publish/subscribe supports the flexible development of large-scale, event-driven and ubiquitous systems. Publish/subscribe is prevalent in a number of application domains such as social networking, distributed business processes and real-time mission-critical systems. Many publish/subscribe applications are sensitive to message loss and violation of privacy. To overcome such issues, we propose a novel method of using secret sharing and replication techniques. This is to reliably and confidentially deliver decryption keys along with encrypted publications even under the presence of several Byzantine brokers across publish/subscribe overlay networks. We also propose a framework for dynamically and strategically allocating broker replicas based on flexibly definable criteria for reliability and performance. Moreover, a thorough evaluation is done through a case study on social networks using the real trace of interactions among Facebook users. PMID:27367610
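
The secret-sharing building block can be illustrated with a minimal Shamir (t, n) split/reconstruct over a prime field; the paper's actual scheme and parameters are not specified here, so the field size and key are illustrative. With shares spread over n brokers, any t honest ones suffice to recover the decryption key:

```python
import random

P = 2**31 - 1   # a Mersenne prime; the field for share arithmetic

def split(secret, t, n):
    """Shamir (t, n) sharing: a random degree t-1 polynomial with the
    secret as constant term, evaluated at x = 1..n. Any t shares
    reconstruct the secret; any t-1 reveal nothing."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):       # Horner evaluation mod P
            y = (y * x + c) % P
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = 123456789
shares = split(key, t=3, n=5)
# any 3 of the 5 brokers' shares recover the key
```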

  3. Identifying positions from affiliation networks: Preserving the duality of people and events

    PubMed Central

    Field, Sam; Frank, Kenneth A.; Schiller, Kathryn; Riegle-Crumb, Catherine; Muller, Chandra

    2010-01-01

    Frank’s [Frank, K.A., 1995. Identifying cohesive subgroups. Social Networks 17, 27–56] clustering technique for one-mode social network data is adapted to identify positions in affiliation networks by drawing on recent extensions of p* models to two-mode data. The algorithm is applied to the classic Deep South data on southern women and the social events in which they participated with results comparable to other algorithms. Monte Carlo simulations are used to generate sampling distributions to test for the presence of clustering in new data sets and to evaluate the performance of the algorithm. The algorithm and simulation results are then applied to high school students’ transcripts from one school from the Adolescent Health and Academic Achievement (AHAA) extension of the National Longitudinal Study of Adolescent Health. PMID:20354579

  4. Virtualization of event sources in wireless sensor networks for the internet of things.

    PubMed

    Lucas Martínez, Néstor; Martínez, José-Fernán; Hernández Díaz, Vicente

    2014-12-01

    Wireless Sensor Networks (WSNs) are generally used to collect information from the environment. The gathered data are delivered mainly to sinks or gateways that become the endpoints where applications can retrieve and process such data. However, applications would also expect from a WSN an event-driven operational model, so that they can be notified whenever specific environmental changes occur instead of continuously analyzing the data provided periodically. In either operational model, WSNs represent a collection of interconnected objects, as outlined by the Internet of Things. Additionally, in order to fulfill the Internet of Things principles, Wireless Sensor Networks must have a virtual representation that allows indirect access to their resources, a model that should also include the virtualization of event sources in a WSN. Thus, in this paper a model for a virtual representation of event sources in a WSN is proposed. They are modeled as internet resources that are accessible by any internet application, following an Internet of Things approach. The model has been tested in a real implementation where a WSN has been deployed in an open neighborhood environment. Different event sources have been identified in the proposed scenario, and they have been represented following the proposed model. PMID:25470489
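The idea of exposing a WSN event source as an internet resource can be sketched as a small class that carries a resource representation and notifies subscribed applications when the event fires. All class, method and field names here are illustrative inventions, not the paper's API.

```python
class VirtualEventSource:
    """Sketch of a WSN event source exposed as an internet resource."""

    def __init__(self, source_id, description):
        self.source_id = source_id
        self.description = description
        self.subscribers = []

    def representation(self):
        # JSON-style resource representation an internet client could GET
        return {"id": self.source_id,
                "description": self.description,
                "subscribers": len(self.subscribers)}

    def subscribe(self, callback):
        # An internet application registers to be notified of the event.
        self.subscribers.append(callback)

    def fire(self, payload):
        # The WSN detects the environmental change and pushes it upward.
        for cb in self.subscribers:
            cb(self.source_id, payload)
```

An application then subscribes once and is notified on each environmental change, instead of polling the gateway for periodic readings.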

  7. Detail of builder's plate at northeast end. Waterville Bridge, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of builder's plate at northeast end. - Waterville Bridge, Spanning Swatara Creek at Appalachian Trail (moved from Little Pine Creek at State Route 44, Waterville, Lycoming County), Green Point, Lebanon County, PA

  8. 8. DETAIL OF BUILDER'S PLATE, PROCLAIMING THE INVENTOR OF THIS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF BUILDER'S PLATE, PROCLAIMING THE INVENTOR OF THIS BRIDGE TYPE, WILLIAM SCHERZER. - Pennsylvania Railroad, "Eight-track" Bascule Bridge, Spanning Sanitary & Ship Canal, west of Western Avenue, Chicago, Cook County, IL

  9. Discrete-event simulation of a wide-area health care network.

    PubMed Central

    McDaniel, J G

    1995-01-01

    OBJECTIVE: Predict the behavior and estimate the telecommunication cost of a wide-area message store-and-forward network for health care providers that uses the telephone system. DESIGN: A tool with which to perform large-scale discrete-event simulations was developed. Network models for star and mesh topologies were constructed to analyze the differences in performances and telecommunication costs. The distribution of nodes in the network models approximates the distribution of physicians, hospitals, medical labs, and insurers in the Province of Saskatchewan, Canada. Modeling parameters were based on measurements taken from a prototype telephone network and a survey conducted at two medical clinics. Simulation studies were conducted for both topologies. RESULTS: For either topology, the telecommunication cost of a network in Saskatchewan is projected to be less than $100 (Canadian) per month per node. The estimated telecommunication cost of the star topology is approximately half that of the mesh. Simulations predict that a mean end-to-end message delivery time of two hours or less is achievable at this cost. A doubling of the data volume results in an increase of less than 50% in the mean end-to-end message transfer time. CONCLUSION: The simulation models provided an estimate of network performance and telecommunication cost in a specific Canadian province. At the expected operating point, network performance appeared to be relatively insensitive to increases in data volume. Similar results might be anticipated in other rural states and provinces in North America where a telephone-based network is desired. PMID:7583646
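A store-and-forward star topology of the kind simulated here can be sketched as a toy discrete-event model: messages arrive at a central hub, and each recipient picks them up at its next periodic poll. The parameters below (poll interval, day length) are invented for illustration, not the study's calibrated values.

```python
import heapq
import math
import random

def simulate_star(n_msgs=1000, poll_interval=120.0, seed=42):
    """Toy discrete-event simulation of a star store-and-forward network.

    Messages reach the hub at random times over a 1440-minute day; each
    recipient polls the hub every `poll_interval` minutes at a random phase,
    so a message waits at the hub until the recipient's next poll.
    Returns the mean end-to-end delay in minutes."""
    random.seed(seed)
    heap = []  # min-heap of (arrival_time, message_id): the event queue
    for msg in range(n_msgs):
        heapq.heappush(heap, (random.uniform(0.0, 1440.0), msg))
    phase = {msg: random.uniform(0.0, poll_interval) for msg in range(n_msgs)}
    delays = []
    while heap:
        arrival, msg = heapq.heappop(heap)  # process events in time order
        k = math.ceil((arrival - phase[msg]) / poll_interval)
        pickup = phase[msg] + k * poll_interval  # next poll at or after arrival
        delays.append(pickup - arrival)
    return sum(delays) / len(delays)
```

With random arrivals and random poll phases, the mean wait converges to roughly half the poll interval, which is the kind of relationship a full simulation lets one trade off against telecommunication cost.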

  10. A Tool for Modelling the Impact of Triggered Landslide Events on Road Networks

    NASA Astrophysics Data System (ADS)

    Taylor, F. E.; Santangelo, M.; Marchesini, I.; Malamud, B. D.; Guzzetti, F.

    2014-12-01

    In the minutes to weeks after a landslide trigger such as an earthquake or heavy rain, tens to thousands of landslides may occur across a region, resulting in simultaneous blockages across the road network, which can impact recovery efforts. In this paper, we show the development, application and comparison with observed data of a model to semi-stochastically simulate triggered landslide events and their impact on road network topologies. In this model, "synthetic" triggered landslide event inventories are created by randomly selecting landslide sizes and shapes from already established statistical distributions. The landslides are then semi-randomly distributed over the region's road network, where they are more or less likely to land based on a landslide susceptibility map. The number, size and network impact of the road blockages are then calculated. This process is repeated in a Monte Carlo type simulation to assess a range of scenarios. Due to the generally applicable statistical distributions used to create the synthetic triggered landslide event inventories and the relatively minimal data requirements to run the model, the model is theoretically applicable to many regions of the world where triggered landslide events occur. Current work focuses on applying the model to two regions: (i) the Collazzone basin (79 km2) in Central Italy, where 422 landslides were triggered by rapid snowmelt in January 1997; (ii) the Oat Mountain quadrangle (155 km2) in California, USA, where 1,350 landslides were triggered by the Northridge Earthquake (M = 6.7) in January 1994. When appropriate adjustments are made to susceptibility in the immediate vicinity of the roads, model results match observations reasonably well. In Collazzone (length of road = 153 km, landslide density = 5.2 landslides km-2), the median number of road blockages over 100 model runs was 5 (±2.5 s.d.), compared to the observed number of 5. In Northridge (length of road = 780 km, landslide density = 8
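The Monte Carlo idea described here can be sketched in miniature: sample landslide areas from a heavy-tailed distribution, drop them at random over a region, and count how many intersect the road. Every parameter below (region size, minimum area, tail exponent) is invented for illustration; the actual model uses published landslide statistics and a susceptibility map.

```python
import math
import random

def simulate_blockages(n_runs=200, n_landslides=400, region=2000.0,
                       area_min=50.0, beta=2.4, seed=7):
    """Toy Monte Carlo of triggered-landslide road blockages.

    Landslide areas (m^2) follow a Pareto-like heavy tail sampled by the
    inverse-CDF method; the road is the line y = 0 across a square region,
    and a circular landslide blocks it when the line passes within its
    radius. Returns the mean number of blockages per simulated event."""
    random.seed(seed)
    counts = []
    for _ in range(n_runs):
        blocked = 0
        for _ in range(n_landslides):
            u = random.random()
            area = area_min * u ** (-1.0 / (beta - 1.0))  # inverse-CDF sample
            radius = math.sqrt(area / math.pi)
            y = random.uniform(-region / 2.0, region / 2.0)  # landslide centre
            if abs(y) < radius:
                blocked += 1
        counts.append(blocked)
    return sum(counts) / len(counts)
```

Repeating the event many times, as above, yields the distribution of blockage counts (median, spread) that the paper compares against the observed inventories.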

  11. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data

    PubMed Central

    2014-01-01

    Background Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. Methods This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Results Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. Conclusions The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest. PMID:25209121
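The joint-synthesis idea can be sketched as a single Weibull log-likelihood combining both data types: individual times (with censoring) contribute density/survival terms, while each aggregate arm (r events of n by follow-up T) contributes a binomial term with success probability F(T). Function and argument names here are illustrative, not taken from the paper, and treatment effects are omitted.

```python
import math

def weibull_cdf(t, shape, scale):
    # F(t) = 1 - exp(-(t/scale)^shape)
    return 1.0 - math.exp(-((t / scale) ** shape))

def joint_loglik(shape, scale, times, events, agg):
    """Joint log-likelihood over individual and aggregate time-to-event data.

    times/events: individual patient times with flags (1 = event, 0 = censored).
    agg: list of (r, n, T) aggregate arms -- r events out of n by follow-up T."""
    ll = 0.0
    for t, d in zip(times, events):
        log_surv = -((t / scale) ** shape)  # log S(t) for a Weibull
        if d:  # observed event time: log of the Weibull density
            ll += math.log(shape / scale) + (shape - 1.0) * math.log(t / scale) + log_surv
        else:  # censored: log of the survival function
            ll += log_surv
    for r, n, T in agg:
        p = weibull_cdf(T, shape, scale)  # probability of event by follow-up
        ll += r * math.log(p) + (n - r) * math.log(1.0 - p)
    return ll
```

Maximising (or placing priors on) `shape` and `scale` against this combined likelihood is the essence of assuming one underlying time-to-healing distribution for both data sources.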

  12. Using the relational event model (REM) to investigate the temporal dynamics of animal social networks

    PubMed Central

    Tranmer, Mark; Marcum, Christopher Steven; Morton, F. Blake; Croft, Darren P.; de Kort, Selvino R.

    2015-01-01

    Social dynamics are of fundamental importance in animal societies. Studies on nonhuman animal social systems often aggregate social interaction event data into a single network within a particular time frame. Analysis of the resulting network can provide a useful insight into the overall extent of interaction. However, through aggregation, information is lost about the order in which interactions occurred, and hence the sequences of actions over time. Many research hypotheses relate directly to the sequence of actions, such as the recency or rate of action, rather than to their overall volume or presence. Here, we demonstrate how the temporal structure of social interaction sequences can be quantified from disaggregated event data using the relational event model (REM). We first outline the REM, explaining why it is different from other models for longitudinal data, and how it can be used to model sequences of events unfolding in a network. We then discuss a case study on the European jackdaw, Corvus monedula, in which temporal patterns of persistence and reciprocity of action are of interest, and present and discuss the results of a REM analysis of these data. One of the strengths of a REM analysis is its ability to take into account different ways in which data are collected. Having explained how to take into account the way in which the data were collected for the jackdaw study, we briefly discuss the application of the model to other studies. We provide details of how the models may be fitted in the R statistical software environment and outline some recent extensions to the REM framework. PMID:26190856
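The information that aggregation destroys can be made concrete with a small sketch: walking an ordered sequence of directed interaction events and counting, for each event, whether it repeats (persistence) or reverses (reciprocity) an earlier one. This is a toy illustration of the sequence statistics a REM models, not the model itself (which the authors fit in R).

```python
def sequence_stats(events):
    """Count persistence (repeats of a prior directed tie) and reciprocity
    (reversals of a prior directed tie) in an ordered event sequence."""
    seen = set()
    persist = recip = 0
    for a, b in events:  # each event is (sender, receiver), in time order
        if (a, b) in seen:
            persist += 1
        if (b, a) in seen:
            recip += 1
        seen.add((a, b))
    return persist, recip
```

Aggregating the same events into one network would keep only tie presence and volume; the counts above depend on event order, which is exactly what the REM exploits.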

  13. Supplement consumption in body builder athletes

    PubMed Central

    Karimian, Jahangir; Esfahani, Parivash Shekarchizadeh

    2011-01-01

    BACKGROUND: Widespread use of supplements is observed among world athletes in different fields. The aim of this study was to estimate the prevalence and determinants of using supplements among body builder athletes. METHODS: This cross-sectional study was conducted on 250 men and 250 women from 30 different bodybuilding clubs. Participants were asked to complete a self-administered standardized anonymous check-list. RESULTS: Forty nine percent of the respondents declared supplement use. Men were more likely to take supplements than women (86.8% vs. 11.2%, p = 0.001). Reasons for using supplements were reported to be for health (45%), enhancing the immune system (40%) and improving athletic performance (25%). Most athletes (72%) had access to a nutritionist but underused this resource. Coaches (65%) had the greatest influence on supplementation practices followed by nutritionists (30%) and doctors (25%) after them. CONCLUSIONS: The prevalence of supplement use among bodybuilders was high. Sex, health-related issues and sport experts were determinant factors of supplement use. PMID:22973330

  14. Feedback between Accelerator Physicists and magnet builders

    SciTech Connect

    Peggs, S.

    1995-12-31

    Our task is not to record history but to change it. (K. Marx (paraphrased)) How should Accelerator Physicists set magnet error specifications? In a crude social model, they place tolerance limits on undesirable nonlinearities and errors (higher order harmonics, component alignments, etc.). The Magnet Division then goes away for a suitably lengthy period of time, and comes back with a working magnet prototype that is reproduced in industry. A better solution is to set no specifications. Accelerator Physicists begin by evaluating expected values of harmonics, generated by the Magnet Division, before and during prototype construction. Damaging harmonics are traded off against innocuous harmonics as the prototype design evolves, lagging one generation behind the evolution of expected harmonics. Finally, the real harmonics are quickly evaluated during early industrial production, allowing a final round of performance trade-offs, using contingency scenarios prepared earlier. This solution assumes a close relationship and rapid feedback between the Accelerator Physicists and the magnet builders. What follows is one perspective of the way that rapid feedback was used to "change history" (improve linear and dynamic aperture) at RHIC, to great benefit.

  15. Liquid phase structure within an unsaturated fracture network beneath a surface infiltration event: Field experiment

    NASA Astrophysics Data System (ADS)

    Glass, Robert J.; Nicholl, Michael J.; Ramirez, Abelardo L.; Daily, William D.

    2002-10-01

    We conducted a simple field experiment to elucidate structure (i.e., geometry) of the liquid phase (water) resulting from ponded infiltration into a pervasive fracture network that dissected a nearly impermeable rock matrix. Over a 46 min period, dyed water was infiltrated from a surface pond while electrical resistance tomography (ERT) was employed to monitor the rapid invasion of the initially dry fracture network and subsequent drainage. We then excavated the rock mass to a depth of ~5 m, mapping the fracture network and extent of dye staining over a series of horizontal pavements located directly beneath the pond. Near the infiltration surface, flow was dominated by viscous forces, and the fracture network was fully stained. With increasing depth, flow transitioned to unsaturated conditions, and the phase structure became complicated, exhibiting evidence of fragmentation, preferential flow, fingers, irregular wetting patterns, and varied behavior at fracture intersections. ERT images demonstrate that water spanned the instrumented network rapidly on ponding and also rapidly drained after ponding was terminated. Estimates suggest that our excavation captured from ~15% to 1% or less of the rock volume interrogated by our infiltration slug, and thus the penetration depth from our short ponding event could have been quite large.

  16. Input sensitivity analysis of neural network models for flood event prediction in ungauged catchments

    NASA Astrophysics Data System (ADS)

    Dawson, Christian W.; Abrahart, Robert J.

    2010-05-01

    Artificial neural networks have now been applied to problems within hydrology for nearly twenty years - primarily in rainfall-runoff modelling and flood forecasting. In recent years the scope of this research has expanded to encompass more theoretical issues and address some of the earlier criticisms of such models - including the internal behaviour of neural networks and the link with physically-based models. While there has been some work on the application of neural network models to predicting flood events in ungauged catchments, such research is limited to only a few studies in a handful of regions worldwide. In this paper neural network models are developed using the UK Environment Agency's HiFlows-UK dataset released in 2008. This dataset provides catchment descriptors and annual maximum series for over 900 sites across the UK. The neural network models predict the index flood (median flood) based on four catchment characteristics: area, standard average annual rainfall, index of flood attenuation due to reservoirs and lakes, and baseflow index. These models are assessed using a novel sensitivity analysis procedure that is designed to expose the internal relationship that has been implemented between each catchment characteristic and the index flood. Results provide some physical explanation of model behaviour - linking catchment characteristics to the calculated index flood. The results are compared with the FEH QMED mathematical model and with older equivalent models developed on the original FEH data set.
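The sensitivity analysis described here, varying one catchment characteristic while the others are held fixed, can be sketched generically as a one-at-a-time sweep over any trained model. The function and argument names below are generic inventions, not the paper's procedure in detail.

```python
def one_at_a_time_sensitivity(model, medians, ranges, n=21):
    """Sweep each input across its range while holding the other inputs at
    their medians; returns one (inputs, responses) curve per input.

    `model` is any callable taking a dict of named inputs (e.g. a trained
    neural network wrapped in a prediction function)."""
    curves = {}
    for name in medians:
        lo, hi = ranges[name]
        xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
        ys = []
        for v in xs:
            inputs = dict(medians)  # all other inputs pinned at their medians
            inputs[name] = v
            ys.append(model(inputs))
        curves[name] = (xs, ys)
    return curves
```

Plotting each curve exposes the internal relationship the network has learned between a catchment characteristic (area, rainfall, attenuation index, baseflow index) and the predicted index flood.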

  17. Consensus analysis of networks with time-varying topology and event-triggered diffusions.

    PubMed

    Han, Yujuan; Lu, Wenlian; Chen, Tianping

    2015-11-01

    This paper studies the consensus problem for networks with time-varying topology. Event-triggered rules are employed in the diffusion coupling terms to reduce the updating load of the coupled system. Two strategies are considered: the event-triggered strategy, in which each node observes state information continuously to determine the next triggering time, and the self-triggered strategy, in which each node only needs to observe state information at event times to predict the next triggering time. For each strategy, two kinds of algorithms are considered: the pull-based algorithm, in which the diffusion coupling term of every node is updated with the latest observations of its neighborhood at its own triggering times, and the push-based algorithm, in which the diffusion coupling term of every node uses the state information of its neighborhood at their latest triggering times. It is proved that if the coupling matrix across time intervals of length less than some given constant has spanning trees, then the proposed algorithms realize consensus. Examples with numerical simulations are provided to show the effectiveness of the theoretical results.
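A push-based, event-triggered diffusion can be sketched on a small fixed topology: each node diffuses toward its neighbours' last broadcast states and re-broadcasts only when its own state drifts past a threshold. This is a toy fixed-topology analogue of the algorithms described above, with invented parameters, not the paper's exact time-varying scheme.

```python
def event_triggered_consensus(x0, adj, steps=200, eps=0.1, thresh=0.05):
    """Event-triggered diffusion toward consensus on a fixed graph.

    Nodes update from neighbours' last *broadcast* values (xhat) and
    broadcast anew only when |x_i - xhat_i| exceeds `thresh`.
    Returns the final spread max(x) - min(x) and the broadcast count."""
    x = list(x0)
    xhat = list(x0)  # last broadcast value of each node
    events = 0
    for _ in range(steps):
        x = [x[i] + eps * sum(xhat[j] - xhat[i] for j in adj[i])
             for i in range(len(x))]
        for i in range(len(x)):
            if abs(x[i] - xhat[i]) > thresh:  # triggering condition
                xhat[i] = x[i]
                events += 1
    return max(x) - min(x), events

# Undirected ring of four nodes
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
spread, n_events = event_triggered_consensus([0.0, 1.0, 2.0, 3.0], ring)
```

The states converge to a small neighbourhood of agreement while the nodes broadcast far fewer than `steps` times each, which is the updating-load saving event-triggered rules aim for.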

  18. Words analysis of online Chinese news headlines about trending events: a complex network perspective.

    PubMed

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

    Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies who must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headline keywords and word relationships in online Chinese news using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and adjacency relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we develop an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly. PMID:25807376
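The words-as-nodes, adjacency-as-edges construction can be sketched with weighted counters. Whitespace tokenisation stands in here for the Chinese word segmentation step used in the study; the function name is illustrative.

```python
from collections import Counter

def headline_word_network(headlines):
    """Build a word network from headlines: nodes are words weighted by
    frequency, undirected edges link words adjacent within a headline."""
    nodes = Counter()
    edges = Counter()
    for h in headlines:
        words = h.lower().split()  # stand-in for Chinese word segmentation
        nodes.update(words)
        for a, b in zip(words, words[1:]):
            edges[tuple(sorted((a, b)))] += 1  # undirected adjacency edge
    return nodes, edges
```

Running this per month as well as over the whole sample gives the two levels of word networks whose features (degrees, weights) the paper analyses.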

  20. Targeting Planetary Anomalies in Microlensing Events with the Las Cumbres Observatory Global Telescope Network

    NASA Astrophysics Data System (ADS)

    Street, Rachel; RoboNet Microlensing Team

    2007-12-01

    By the nature of these transient, non-repeating phenomena, observing microlensing events requires a fast, responsive system of telescopes distributed over a range of longitudes. The Las Cumbres Observatory Global Telescope Network currently consists of the 2m Faulkes Telescopes North and South. Over the course of the next few years LCOGT will expand this network to a complement of 44 telescopes, including two 2m, eighteen 1m and twenty-four 0.4m instruments, which will be sited in clusters of 3-4 telescopes such that at least one site is in darkness at any given time, enabling us to provide 24-hour coverage of any transient event. The telescopes are controlled via a robotic scheduler, allowing a fast response to alerts from eStar or other robotic agents or to manual override. Both 2m telescopes have been engaged in robotically-controlled follow-up of 222 OGLE and MOA alerts during the 2007 Bulge season and in intensive observations of two events displaying clear anomalies. We summarise here the results to date.

  1. The Waveform Correlation Event Detection System project, Phase II: Testing with the IDC primary network

    SciTech Connect

    Young, C.J.; Beiriger, J.I.; Moore, S.G.

    1998-04-01

    Further improvements to the Waveform Correlation Event Detection System (WCEDS) developed by Sandia Laboratory have made it possible to test the system on the accepted Comprehensive Test Ban Treaty (CTBT) seismic monitoring network. For our test interval we selected a 24-hour period from December 1996, and chose to use the Reviewed Event Bulletin (REB) produced by the Prototype International Data Center (PIDC) as ground truth for evaluating the results. The network is heterogeneous, consisting of array and three-component sites, and as a result requires more flexible waveform processing algorithms than were available in the first version of the system. For simplicity and superior performance, we opted to use the spatial coherency algorithm of Wagner and Owens (1996) for both types of sites. Preliminary tests indicated that the existing version of WCEDS, which ignored directional information, could not achieve satisfactory detection or location performance for many of the smaller events in the REB, particularly those in the south Pacific where the network coverage is unusually sparse. To achieve an acceptable level of performance, we made modifications to include directional consistency checks for the correlations, making the regions of high correlation much less ambiguous. These checks require the production of continuous azimuth and slowness streams for each station, which is accomplished by means of FK processing for the arrays and power polarization processing for the three-component sites. In addition, we added the capability to use multiple frequency-banded data streams for each site to increase sensitivity to phases whose frequency content changes as a function of distance.
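A basic building block of waveform-correlation processing is finding the lag at which one trace best matches another by normalised cross-correlation. The sketch below is a generic illustration of that block, not WCEDS itself or its spatial coherency algorithm.

```python
def best_lag(a, b, max_lag):
    """Return the integer lag at which trace `b` best matches trace `a`,
    scored by normalised cross-correlation over the overlapping samples."""
    def ncc(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a)) if 0 <= i + lag < len(b)]
        xs, ys = zip(*pairs)
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den if den else 0.0  # flat overlap: treat as uncorrelated
    return max(range(-max_lag, max_lag + 1), key=ncc)
```

A detector stacks such correlation measures across the stations of a network, and, as the record describes, directional consistency checks (azimuth and slowness) are needed to disambiguate the regions of high correlation.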

  2. Disentangling the attention network test: behavioral, event related potentials, and neural source analyses

    PubMed Central

    Galvao-Carmona, Alejandro; González-Rosa, Javier J.; Hidalgo-Muñoz, Antonio R.; Páramo, Dolores; Benítez, María L.; Izquierdo, Guillermo; Vázquez-Marrufo, Manuel

    2014-01-01

    Background: The study of the attentional system remains a challenge for current neuroscience. The “Attention Network Test” (ANT) was designed to study simultaneously three different attentional networks (alerting, orienting, and executive) based on subtraction of different experimental conditions. However, some studies recommend caution with these calculations due to the interactions between the attentional networks. In particular, it is highly relevant that several interpretations about attentional impairment have arisen from these calculations in diverse pathologies. Event related potentials (ERPs) and neural source analysis can be applied to disentangle the relationships between these attentional networks not specifically shown by behavioral measures. Results: This study shows that there is a basic level of alerting (tonic alerting) in the no cue (NC) condition, represented by a slow negative trend in the ERP trace prior to the onset of the target stimuli. A progressive increase in the CNV amplitude related to the amount of information provided by the cue conditions is also shown. Neural source analysis reveals specific modulations of the CNV related to a task-related expectancy presented in the NC condition; a late modulation triggered by the central cue (CC) condition and probably representing a generic motor preparation; and an early and late modulation for the spatial cue (SC) condition suggesting specific motor and sensory preactivation. Finally, the first component in the information processing of the target stimuli modulated by the interaction between the orienting network and the executive system can be represented by N1. Conclusions: The ANT is useful as a paradigm to study specific attentional mechanisms and their interactions. However, calculation of network effects is based on subtractions of non-comparable experimental conditions, as evidenced by the present data, which can induce misinterpretations in the study of the attentional capacity in human

  3. [Analysis of policies in activating the Infectious Disease Specialist Network (IDSN) for bioterrorism events].

    PubMed

    Kim, Yang Soo

    2008-07-01

    Bioterrorism events have worldwide impacts, not only in terms of security and public health policy, but also in other related sectors. Many countries, including Korea, have set up new administrative and operational structures and adapted their preparedness and response plans in order to deal with new kinds of threats. Korea has dual surveillance systems for the early detection of bioterrorism. The first is syndromic surveillance, which typically monitors non-specific clinical information that may indicate possible bioterrorism-associated diseases before specific diagnoses are made. The other is an infectious disease specialist network that diagnoses and responds to specific illnesses caused by the intentional release of biologic agents. Infectious disease physicians, clinical microbiologists, and infection control professionals play critical and complementary roles in these networks. Infectious disease specialists should develop practical and realistic response plans for their institutions in partnership with local and state health departments, in preparation for a real or suspected bioterrorism attack.

  4. HOS network-based classification of power quality events via regression algorithms

    NASA Astrophysics Data System (ADS)

    Palomares Salas, José Carlos; González de la Rosa, Juan José; Sierra Fernández, José María; Pérez, Agustín Agüera

    2015-12-01

    This work compares seven regression algorithms implemented in artificial neural networks (ANNs) supported by 14 power-quality features, which are based on higher-order statistics. Combining time- and frequency-domain estimators to deal with non-stationary measurement sequences, the final goal of the system is implementation in the future smart grid to guarantee compatibility among all connected equipment. The principal results are based on spectral kurtosis measurements, which easily adapt to the impulsive nature of power quality events. These results verify that the proposed technique is capable of offering interesting results for power quality (PQ) disturbance classification. The best results are obtained using radial basis networks, generalized regression, and multilayer perceptrons, mainly due to the non-linear nature of the data.

  5. Morphogenesis in sea urchin embryos: linking cellular events to gene regulatory network states

    PubMed Central

    Lyons, Deidre; Kaltenbach, Stacy; McClay, David R.

    2013-01-01

    Gastrulation in the sea urchin begins with ingression of the primary mesenchyme cells (PMCs) at the vegetal pole of the embryo. After entering the blastocoel the PMCs migrate, form a syncytium, and synthesize the skeleton of the embryo. Several hours after the PMCs ingress, the vegetal plate buckles to initiate invagination of the archenteron. That morphogenetic process occurs in several steps. The non-skeletogenic cells produce the initial inbending of the vegetal plate. Endoderm cells then rearrange and extend the length of the gut across the blastocoel to a target near the animal pole. Finally, cells that will form part of the midgut and hindgut are added to complete gastrulation. Later, the stomodeum invaginates from the oral ectoderm and fuses with the foregut to complete the archenteron. In advance of, and during, these morphogenetic events, an increasingly complex gene regulatory network controls the specification and the cell biological events that conduct the gastrulation movements. PMID:23801438

  6. Digital Learning Network Education Events for the Desert Research and Technology Studies

    NASA Technical Reports Server (NTRS)

    Paul, Heather L.; Guillory, Erika R.

    2007-01-01

    NASA's Digital Learning Network (DLN) reaches out to thousands of students each year through video conferencing and webcasting. As part of NASA's Strategic Plan to reach the next generation of space explorers, the DLN develops and delivers educational programs that reinforce principles in the areas of science, technology, engineering and mathematics. The DLN has created a series of live education videoconferences connecting the Desert Research and Technology Studies (RATS) field test to students across the United States. The programs are also extended to students around the world via live webcasting. The primary focus of the events is the Vision for Space Exploration. During the programs, Desert RATS engineers and scientists inform and inspire students about the importance of exploration and share the importance of the field test as it correlates with plans to return to the Moon and explore Mars. This paper describes the events that took place in September 2006.

  7. Characterization of computer network events through simultaneous feature selection and clustering of intrusion alerts

    NASA Astrophysics Data System (ADS)

    Chen, Siyue; Leung, Henry; Dondo, Maxwell

    2014-05-01

    As computer network security threats increase, many organizations implement multiple Network Intrusion Detection Systems (NIDS) to maximize the likelihood of intrusion detection and provide a comprehensive understanding of intrusion activities. However, NIDS trigger a massive number of alerts on a daily basis. This can be overwhelming for computer network security analysts since it is a slow and tedious process to manually analyse each alert produced. Thus, automated and intelligent clustering of alerts is important to reveal the structural correlation of events by grouping alerts with common features. As the nature of computer network attacks, and therefore alerts, is not known in advance, unsupervised alert clustering is a promising approach to achieve this goal. We propose a joint optimization technique for feature selection and clustering to aggregate similar alerts and to reduce the number of alerts that analysts have to handle individually. More precisely, each identified feature is assigned a binary value, which reflects the feature's saliency. This value is treated as a hidden variable and incorporated into a likelihood function for clustering. Since computing the optimal solution of the likelihood function directly is analytically intractable, we use the Expectation-Maximisation (EM) algorithm to iteratively update the hidden variable and use it to maximize the expected likelihood. Our empirical results, using a labelled Defense Advanced Research Projects Agency (DARPA) 2000 reference dataset, show that the proposed method gives better results than the EM clustering without feature selection in terms of the clustering accuracy.
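The paper's exact likelihood with per-feature saliency variables is not spelled out in the abstract; as an illustration of the general idea of EM-based alert clustering, the following is a minimal sketch using a Bernoulli mixture over binary alert feature vectors (the function name, two-cluster default, and initialization are hypothetical, and the saliency variables of the actual method are omitted):

```python
import math
import random

def em_bernoulli_mixture(alerts, k=2, iters=50, seed=0):
    """Cluster binary alert feature vectors with EM for a Bernoulli mixture.

    alerts: list of equal-length 0/1 feature vectors.
    Returns (assignments, theta), where theta[j][f] estimates the
    probability that feature f equals 1 in cluster j.
    """
    rng = random.Random(seed)
    d = len(alerts[0])
    pi = [1.0 / k] * k                                   # mixing weights
    theta = [[rng.uniform(0.25, 0.75) for _ in range(d)] for _ in range(k)]
    resp = [[0.0] * k for _ in alerts]
    for _ in range(iters):
        # E-step: responsibilities p(cluster j | alert x), via log-probs
        for i, x in enumerate(alerts):
            logp = []
            for j in range(k):
                lp = math.log(pi[j])
                for f in range(d):
                    p = min(max(theta[j][f], 1e-9), 1 - 1e-9)
                    lp += math.log(p) if x[f] else math.log(1 - p)
                logp.append(lp)
            m = max(logp)
            w = [math.exp(l - m) for l in logp]
            s = sum(w)
            resp[i] = [v / s for v in w]
        # M-step: re-estimate mixing weights and Bernoulli parameters
        n = len(alerts)
        for j in range(k):
            nj = sum(resp[i][j] for i in range(n))
            pi[j] = nj / n
            for f in range(d):
                theta[j][f] = sum(resp[i][j] * alerts[i][f]
                                  for i in range(n)) / max(nj, 1e-9)
    assign = [max(range(k), key=lambda j: resp[i][j]) for i in range(len(alerts))]
    return assign, theta
```

On clearly separable synthetic alerts (e.g. five copies of `[1,1,0,0]` and five of `[0,0,1,1]`), the two groups end up in distinct clusters; the real method additionally learns which features are salient while clustering.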

  8. Discrete event command and control for networked teams with multiple missions

    NASA Astrophysics Data System (ADS)

    Lewis, Frank L.; Hudas, Greg R.; Pang, Chee Khiang; Middleton, Matthew B.; McMurrough, Christopher

    2009-05-01

    During mission execution in military applications, the TRADOC Pamphlet 525-66 Battle Command and Battle Space Awareness capabilities prescribe expectations that networked teams will perform in a reliable manner under changing mission requirements, varying resource availability and reliability, and resource faults. In this paper, a Command and Control (C2) structure is presented that allows for computer-aided execution of the networked team decision-making process, control of force resources, shared resource dispatching, and adaptability to change based on battlefield conditions. A mathematically justified networked computing environment is provided called the Discrete Event Control (DEC) Framework. DEC has the ability to provide the logical connectivity among all team participants including mission planners, field commanders, war-fighters, and robotic platforms. The proposed data management tools are developed and demonstrated on a simulation study and an implementation on a distributed wireless sensor network. The results show that the tasks of multiple missions are correctly sequenced in real-time, and that shared resources are suitably assigned to competing tasks under dynamically changing conditions without conflicts and bottlenecks.

  9. How events determine spreading patterns: information transmission via internal and external influences on social networks

    NASA Astrophysics Data System (ADS)

    Liu, Chuang; Zhan, Xiu-Xiu; Zhang, Zi-Ke; Sun, Gui-Quan; Hui, Pak Ming

    2015-11-01

    Recently, information transmission models motivated by classical epidemic propagation have been applied to a wide range of social systems; they generally assume that information transmits mainly among individuals via peer-to-peer interactions on social networks. In this paper, we consider one more approach for users to get information: the out-of-social-network influence. Empirical analyses of the diffusion of eight typical events on a very large micro-blogging system, Sina Weibo, show that the external influence has a significant impact on information spreading along with social activities. In addition, we propose a theoretical model to interpret the spreading process via both internal and external channels, considering three essential properties: (i) memory effect; (ii) role of spreaders; and (iii) non-redundancy of contacts. Experimental and mathematical results indicate that information indeed spreads much more quickly and broadly under the mutual effects of the internal and external influences. More importantly, the present model reveals that the event characteristics largely determine the essential spreading patterns once the network structure is established. The results may shed some light on the in-depth understanding of the underlying dynamics of information transmission on real social networks.

  10. An event-based neural network architecture with an asynchronous programmable synaptic memory.

    PubMed

    Moradi, Saber; Indiveri, Giacomo

    2014-02-01

    We present a hybrid analog/digital very large scale integration (VLSI) implementation of a spiking neural network with programmable synaptic weights. The synaptic weight values are stored in an asynchronous Static Random Access Memory (SRAM) module, which is interfaced to a fast current-mode event-driven DAC for producing synaptic currents with the appropriate amplitude values. These currents are further integrated by current-mode integrator synapses to produce biophysically realistic temporal dynamics. The synapse output currents are then integrated by compact and efficient integrate and fire silicon neuron circuits with spike-frequency adaptation and adjustable refractory period and spike-reset voltage settings. The fabricated chip comprises a total of 32 × 32 SRAM cells, 4 × 32 synapse circuits and 32 × 1 silicon neurons. It acts as a transceiver, receiving asynchronous events in input, performing neural computation with hybrid analog/digital circuits on the input spikes, and eventually producing digital asynchronous events in output. Input, output, and synaptic weight values are transmitted to/from the chip using a common communication protocol based on the Address Event Representation (AER). Using this representation it is possible to interface the device to a workstation or a micro-controller and explore the effect of different types of Spike-Timing Dependent Plasticity (STDP) learning algorithms for updating the synaptic weights values in the SRAM module. We present experimental results demonstrating the correct operation of all the circuits present on the chip.

  11. Diet Segregation between Cohabiting Builder and Inquiline Termite Species

    PubMed Central

    Florencio, Daniela Faria; Marins, Alessandra; Rosa, Cassiano Sousa; Cristaldo, Paulo Fellipe; Araújo, Ana Paula Albano; Silva, Ivo Ribeiro; DeSouza, Og

    2013-01-01

    How do termite inquilines manage to cohabit termitaria along with the termite builder species? With this in mind, we analysed one of the several strategies that inquilines could use to circumvent conflicts with their hosts, namely, the use of distinct diets. We inspected overlapping patterns for the diets of several cohabiting Neotropical termite species, as inferred from carbon and nitrogen isotopic signatures of termite individuals. Cohabitant communities from distinct termitaria presented overlapping diet spaces, indicating that they exploited similar diets at the regional scale. When such communities were split into their components, full diet segregation could be observed between builders and inquilines, at regional (environment-wide) and local (termitarium) scales. Additionally, diet segregation among the inquilines themselves was also observed in the vast majority of inspected termitaria. Inquiline species distribution among termitaria was not random. Environment-wide diet similarity, coupled with local diet segregation and deterministic inquiline distribution, could suggest interactions over feeding resources. However, inquilines and builders not sharing the same termitarium, and thus not subject to potential conflicts, still exhibited distinct diets. Moreover, the area of the builder's diet space and that of its inquilines did not correlate negatively. Accordingly, the diet areas of builders which hosted inquilines were on average as large as the areas of builders hosting no inquilines. Such results indicate the possibility that dietary partitioning by these cohabiting termites was not majorly driven by current interactive constraints. Rather, it seems to be a result of traits fixed previously in the evolutionary past of the cohabitants. PMID:23805229

  12. Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network

    NASA Technical Reports Server (NTRS)

    Kuhn, D. Richard; Kacker, Raghu; Lei, Yu

    2010-01-01

    This study compared random and t-way combinatorial inputs to a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or for determining modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs. In this study, we compare random with combinatorial generation of inputs. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
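The coverage argument above can be illustrated with a toy computation: for a hypothetical configuration space of 4 parameters with 3 values each, the sketch below counts how many of the possible 2-way (parameter, value) pairs a small random test set covers, compared with exhaustive testing (the parameter space and counts are illustrative, not taken from the paper):

```python
import itertools
import random

def pairs_covered(tests, num_params):
    """Return the set of 2-way (param_i, value_i, param_j, value_j)
    combinations covered by a list of test tuples."""
    covered = set()
    for t in tests:
        for (i, j) in itertools.combinations(range(num_params), 2):
            covered.add((i, t[i], j, t[j]))
    return covered

# Hypothetical space: 4 parameters, 3 values each -> 81 exhaustive tests.
values = [0, 1, 2]
num_params = 4
all_tests = list(itertools.product(values, repeat=num_params))
all_pairs = pairs_covered(all_tests, num_params)   # C(4,2) * 3*3 = 54 pairs

# Random generation: 20 random tests cover some fraction of those pairs.
random.seed(1)
random_tests = [tuple(random.choice(values) for _ in range(num_params))
                for _ in range(20)]
frac = len(pairs_covered(random_tests, num_params)) / len(all_pairs)
```

A dedicated 2-way covering array would hit all 54 pairs with roughly 9-12 tests, whereas random tests typically need more to reach full pairwise coverage; this is the efficiency gap the study quantifies for t = 2, 3, and 4.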

  13. How activation, entanglement, and searching a semantic network contribute to event memory.

    PubMed

    Nelson, Douglas L; Kitto, Kirsty; Galea, David; McEvoy, Cathy L; Bruza, Peter D

    2013-08-01

    Free-association norms indicate that words are organized into semantic/associative neighborhoods within a larger network of words and links that bind the net together. We present evidence indicating that memory for a recent word event can depend on implicitly and simultaneously activating related words in its neighborhood. Processing a word during encoding primes its network representation as a function of the density of the links in its neighborhood. Such priming increases recall and recognition and can have long-lasting effects when the word is processed in working memory. Evidence for this phenomenon is reviewed in extralist-cuing, primed free-association, intralist-cuing, and single-item recognition tasks. The findings also show that when a related word is presented in order to cue the recall of a studied word, the cue activates the target in an array of related words that distract and reduce the probability of the target's selection. The activation of the semantic network produces priming benefits during encoding, and search costs during retrieval. In extralist cuing, recall is a negative function of cue-to-distractor strength, and a positive function of neighborhood density, cue-to-target strength, and target-to-cue strength. We show how these four measures derived from the network can be combined and used to predict memory performance. These measures play different roles in different tasks, indicating that the contribution of the semantic network varies with the context provided by the task. Finally, we evaluate spreading-activation and quantum-like entanglement explanations for the priming effects produced by neighborhood density.

  14. The First Documented Space Weather Event That Perturbed the Communication Networks in Iberia

    NASA Astrophysics Data System (ADS)

    Ribeiro, P.; Vaquero, J. M.; Gallego, M. C.; Trigo, R. M.

    2016-07-01

    In this work, we review the first space weather event that significantly affected a number of communication networks in the Iberian Peninsula (Southwest of Europe). The event took place on 31 October 1903, during the ascending phase of solar cycle 14 (the lowest since the Dalton Minimum). We describe the widespread problems that occurred in the telegraph communication network of two midlatitude countries (Portugal and Spain), which was practically interrupted from 09h30 to 21h00 UT. Different impacts on telegraphic communication are described and shown to be dependent on the large-scale orientation of the wires. In order to put these results into a wider context, we provide measurements of the concurrent geomagnetic field that are available from the observatories of Coimbra (Portugal) and San Fernando (Spain). The measurements confirm the simultaneous occurrence of large geomagnetic disturbances. In particular, the magnetograms recorded in Coimbra show a clear and large-amplitude storm sudden commencement around 05h30. The main phase, with a maximum range of ~500 nT in H (the horizontal component of the geomagnetic field), started approximately 1 h later and lasted for almost 10 h, suggesting that the interplanetary magnetic field was strongly southward for a long time.

  15. An automated standardized system for managing adverse events in clinical research networks.

    PubMed

    Richesson, Rachel L; Malloy, Jamie F; Paulus, Kathleen; Cuthbertson, David; Krischer, Jeffrey P

    2008-01-01

    Multi-site clinical protocols and clinical research networks require tools to manage and monitor adverse events (AEs). To be successful, these tools must be designed to comply with applicable regulatory requirements, reflect current data standards, international directives and advances in pharmacovigilance, and be convenient and adaptable to multiple needs. We describe an Adverse Event Data Management System (AEDAMS) that is used across multiple study designs in the various clinical research networks and multi-site studies for which we provide data and technological support. Investigators enter AE data using a standardized and structured web-based data collection form. The automated AEDAMS forwards the AE information to individuals in designated roles (investigators, sponsors, Data Safety and Monitoring Boards) and manages subsequent communications in real time, as the entire reporting, review and notification is done by automatically generated emails. The system was designed to adhere to timelines and data requirements in compliance with Good Clinical Practice (International Conference on Harmonisation E6) reporting standards and US federal regulations, and can be configured to support AE management for many types of study designs and adhere to various domestic or international reporting requirements. This tool allows AEs to be collected in a standard way by multiple distributed users, facilitates accurate and timely AE reporting and reviews, and allows the centralized management of AEs. Our design justification and experience with the system are described.

  16. A community-based event delivery protocol in publish/subscribe systems for delay tolerant sensor networks.

    PubMed

    Liu, Nianbo; Liu, Ming; Zhu, Jinqi; Gong, Haigang

    2009-01-01

    The basic operation of a Delay Tolerant Sensor Network (DTSN) is to perform pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, the extension of Pub/Sub systems to DTSNs has become a promising research topic. However, due to the unique frequent-partitioning characteristic of DTSNs, extending a Pub/Sub system to a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, to improve the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped, to minimize the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.
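The abstract does not give CED's exact queue-management rule; as a hedged sketch of the idea, one plausible reading is that an event is kept in the queue only while its remaining survival time still exceeds the expected time needed to deliver it, and is dropped otherwise to save transmission overhead (the function and field names below are hypothetical):

```python
def manage_queue(events, now, expected_delivery_time):
    """Split queued events into (keep, drop) lists.

    Each event is a dict with a 'created' timestamp and a 'ttl'
    (survival time). An event is kept only if its remaining lifetime
    still exceeds the expected delivery time; otherwise delivering it
    would likely waste transmissions, so it is dropped.
    """
    keep, drop = [], []
    for e in events:
        remaining = e["created"] + e["ttl"] - now
        (keep if remaining > expected_delivery_time else drop).append(e)
    return keep, drop
```

For example, at `now=5` with an expected delivery time of 20, an event created at time 0 with `ttl=100` is kept, while one with `ttl=10` is dropped because it would expire before it could plausibly reach a subscriber.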

  17. Statistical Patterns of Triggered Landslide Events and their Application to Road Networks

    NASA Astrophysics Data System (ADS)

    Taylor, Faith E.; Malamud, Bruce D.; Santangelo, Michele; Marchesini, Ivan; Guzzetti, Fausto

    2015-04-01

    In the minutes to weeks after a landslide trigger such as an earthquake or heavy rainfall, as part of a triggered landslide event, one individual to tens of thousands of landslides may occur across a region. If in the region, one or more roads become blocked by landslides, this can cause extensive detours and delay rescue and recovery operations. In this paper, we show the development, application and confrontation with real data of a model to simulate triggered landslide events and their impacts upon road networks. This is done by creating a 'synthetic' triggered landslide event inventory by randomly sampling landslide areas and shapes from already established statistical distributions. These landslides are then semi-randomly dropped across a given study region, conditioned by that region's landslide susceptibility. The resulting synthetic triggered landslide event inventory is overlaid with the region's road network map and the number, size, location and network impact of road blockages and landslides near roads calculated. This process is repeated hundreds of times in a Monte Carlo type simulation. The statistical distributions and approaches used in the model are thought to be generally applicable for low-mobility triggered landslides in many medium to high-topography regions throughout the world. The only local data required to run the model are a road network map, a landslide susceptibility map, a map of the study area boundary and a digital elevation model. Coupled with an Open Source modelling approach (in GRASS-GIS), this model may be applied to many regions where triggered landslide events are an issue. We present model results and confrontation with observed data for two study regions where the model has been applied: Collazzone (Central Italy) where rapid snowmelt triggered 413 landslides in January 1997 and Oat Mountain (Northridge, USA), where the Northridge Earthquake triggered 1,356 landslides in January 1994. We find that when the landslide

  18. Impacts of the January 2014 extreme rainfall event on transportation network in the Alps Maritimes (France)

    NASA Astrophysics Data System (ADS)

    Voumard, Jeremie; Penna, Ivanna; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    Road networks in mountain areas are highly inter-dependent systems, and hillslope processes such as landslides are main drivers of infrastructure damage and transportation disruptions. Besides structural damage, economic losses are also related to road and surrounding-slope maintenance, as well as to the disruption of transportation of goods, inaccessibility of tourist resorts, etc. On 16-17 January 2014, an intense rainfall event was recorded in the Alpes Maritimes in southern France. According to meteorological data, it was the highest since the 1970s. This rainfall triggered numerous landslides (rockfalls, earth flows and debris flows), mostly on January 17th. There were no casualties registered due to hillslope processes, but several houses were damaged, some populations living in the Var valley along the RM 2205 road were isolated, and several roads were partially or totally blocked. 1.5 km upstream of the village of Saint-Sauveur-sur-Tinée, 150 m3 of rock detached from the slope and blocked the road, after which temporary traffic interruptions due to road works lasted around one week. In the Menton area, where hillslopes are highly urbanized, the volume of rocks involved in slope failures was so large that materials removed to re-establish the traffic had to be placed in transitory storage sites. The average landslide volume was estimated at around 100 m3 (some single events reached around 400 m3). Most of the landslides occurred in slopes cut during road and house construction. Several trucks were needed to clean up materials, giving rise to traffic jams, etc. The aim of this study is to document the impact on transportation networks caused by this rainfall event. Damages and consequences for the traffic were documented during a field visit, obtained from secondary information, as well as with the aid of a drone in the case of inaccessible areas.

  19. Digital Learning Network Education Events of NASA's Extreme Environments Mission Operations

    NASA Technical Reports Server (NTRS)

    Paul, Heather; Guillory, Erika

    2007-01-01

    NASA's Digital Learning Network (DLN) reaches out to thousands of students each year through video conferencing and webcasting. The DLN has created a series of live education videoconferences connecting NASA's Extreme Environment Mission Operations (NEEMO) team to students across the United States. The programs are also extended to students around the world via live webcasting. The primary focus of the events is the vision for space exploration. During the programs, NEEMO crewmembers including NASA astronauts, engineers and scientists inform and inspire students about the importance of exploration and share the impact of the project as it correlates with plans to return to the Moon and explore the planet Mars. These events highlight interactivity. Students talk live with the aquanauts in Aquarius, the National Oceanic and Atmospheric Administration's underwater laboratory. With this program, NASA continues the Agency's tradition of investing in the nation's education programs. It is directly tied to the Agency's major education goal of attracting and retaining students in science, technology, and engineering disciplines. Before connecting with the aquanauts, the students conduct experiments of their own designed to coincide with mission objectives. This paper describes the events that took place in September 2006.

  20. Networks of recurrent events, a theory of records, and an application to finding causal signatures in seismicity.

    PubMed

    Davidsen, Jörn; Grassberger, Peter; Paczuski, Maya

    2008-06-01

    We propose a method to search for signs of causal structure in spatiotemporal data making minimal a priori assumptions about the underlying dynamics. To this end, we generalize the elementary concept of recurrence for a point process in time to recurrent events in space and time. An event is defined to be a recurrence of any previous event if it is closer to it in space than all the intervening events. As such, each sequence of recurrences for a given event is a record breaking process. This definition provides a strictly data driven technique to search for structure. Defining events to be nodes, and linking each event to its recurrences, generates a network of recurrent events. Significant deviations in statistical properties of that network compared to networks arising from (acausal) random processes allows one to infer attributes of the causal dynamics that generate observable correlations in the patterns. We derive analytically a number of properties for the network of recurrent events composed by a random process in space and time. We extend the theory of records to treat not only the variable where records happen, but also time as continuous. In this way, we construct a fully symmetric theory of records leading to a number of results. Those analytic results are compared in detail to the properties of a network synthesized from time series of epicenter locations for earthquakes in Southern California. Significant disparities from the ensemble of acausal networks that can be plausibly attributed to the causal structure of seismicity are as follows. (1) Invariance of network statistics with the time span of the events considered. (2) The appearance of a fundamental length scale for recurrences, independent of the time span of the catalog, which is consistent with observations of the "rupture length." (3) Hierarchy in the distances and times of subsequent recurrences. 
As expected, almost all of the statistical properties of a network constructed from a surrogate
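The recurrence definition above is concrete enough to sketch directly: for each event, every later event that lies closer to it in space than all intervening events breaks a distance record and is linked as a recurrence (a minimal illustration with 2-D coordinates; `recurrence_links` is a hypothetical name, not from the paper):

```python
import math

def recurrence_links(events):
    """Build the network of recurrent events from a time-ordered list of
    (x, y) epicentre-like coordinates.

    Event j is linked back to event i < j exactly when j is closer to i
    than every event that occurred between them, i.e. when j breaks the
    running distance record for event i.
    """
    links = []
    for i in range(len(events)):
        best = math.inf  # closest distance to event i seen so far
        for j in range(i + 1, len(events)):
            d = math.dist(events[i], events[j])
            if d < best:              # j is a record-breaking recurrence of i
                links.append((i, j))
                best = d
    return links
```

For the toy sequence `[(0, 0), (10, 0), (1, 0), (0.5, 0)]`, every later event gets steadily closer to event 0, so event 0 accumulates three recurrences, while event 3 is not a recurrence of event 1 because event 2 intervenes at a shorter distance.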

  1. Building America Top Innovations 2012: Reduced Call-Backs with High-Performance Production Builders

    SciTech Connect

    none,

    2013-01-01

    This Building America Top Innovations profile describes ways Building America teams have helped builders cut call-backs. A Harvard University study found that builders who worked with Building America had a 50% drop in call-backs. One builder reported a 50-fold reduction in the incidence of pipe freezing, a 50% reduction in drywall cracking, and a 60% decline in call-backs.

  2. A neural network model to predict the wastewater inflow incorporating rainfall events.

    PubMed

    El-Din, Ahmed Gamal; Smith, Daniel W

    2002-03-01

    Under steady-state conditions, a wastewater treatment plant usually has a satisfactory performance because these conditions are similar to design conditions. However, load variations constitute a large portion of the operating life of a treatment facility, and most of the observed problems in complying with permit requirements occur during these load transients. During storm events, upsets to the different physical and biological processes may take place in a wastewater treatment plant, and therefore the ability to predict the hydraulic load to a treatment facility during such events is very beneficial for the optimization of the treatment process. Most of the hydrologic and hydraulic models describing sewage collection systems are deterministic. Such models require detailed knowledge of the system and usually rely on a large number of parameters, some of which are uncertain or difficult to determine. Presented in this paper is an artificial neural network (ANN) model used to make short-term predictions of the wastewater inflow rate entering the Gold Bar Wastewater Treatment Plant (GBWWTP), the largest plant in the Edmonton area (Alberta, Canada). The neural model uses rainfall data, observed in the collection system discharging to the plant, as inputs. The building process of the model was conducted in a systematic way that allowed the identification of a parsimonious model that is able to learn (and not memorize) from past data and generalize very well to unseen data that was used to validate the model. The neural network model gave excellent results. The potential of using the model as part of a real-time process control system is also discussed.

  3. E-Classical Fairy Tales: Multimedia Builder as a Tool

    ERIC Educational Resources Information Center

    Eteokleous, Nikleia; Ktoridou, Despo; Tsolakidis, Symeon

    2011-01-01

    The study examines pre-service teachers' experiences in delivering a traditional-classical fairy tale using the Multimedia Builder software, in other words an e-fairy tale. A case study approach was employed, collecting qualitative data through classroom observations and focus groups. The results focus on pre-service teachers' reactions, opinions,…

  4. New Whole-House Solutions Case Study: Pine Mountain Builders

    SciTech Connect

    none,

    2013-02-01

    Pine Mountain Builders achieved HERS scores as low as 59 and electric bills as low as $50/month with extensive air sealing (blower door tests = 1.0 to 1.8 ACH 50), R-3 XPS sheathing instead of OSB, and higher efficiency heat pumps.

  5. 4. DETAIL OF BUILDER'S PLATES: '1901, CARTER H. HARRISON, COMMISSIONER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. DETAIL OF BUILDER'S PLATES: '1901, CARTER H. HARRISON, COMMISSIONER OF PUBLIC WORKS, MAYOR F.W. BLOCKI, JOHN ERICSON, CITY ENGINEER'; 'FITZSIMONS AND CONNELL CO. SUBSTRUCTURE'; 'AMERICAN BRIDGE COMPANY, LASSIG PLANT, CONTRACTOR FOR SUPERSTRUCTURE' - Chicago River Bascule Bridge, West Cortland Street, Spanning North Branch of Chicago River at West Cortland Street, Chicago, Cook County, IL

  6. Play on Words: A Reader, Speller, & Vocabulary Builder.

    ERIC Educational Resources Information Center

    McGann, Thomas Daniel

    Designed to be used primarily as a reader for first and second graders, but also as a speller and vocabulary builder for any student who needs help in language arts mastery, this book combines phonics with the "Look and Say" methods to present a step-by-step learning guide. Following an introduction explaining the five-step procedure teachers can…

  7. Polyol and Amino Acid-Based Biosurfactants, Builders, and Hydrogels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This chapter reviews different detergent materials which have been synthesized from natural agricultural commodities. Background information, which gives reasons why the use of biobased materials may be advantageous, is presented. Detergent builders from L-aspartic acid, citric acid and D-sorbitol...

  8. 10. DETAIL OF BUILDER'S PLATE AT NORTH PORTAL. PLATE READS: ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. DETAIL OF BUILDER'S PLATE AT NORTH PORTAL. PLATE READS: 1889, BUILT BY THE BERLIN IRON BRIDGE CO. EAST BERLIN CONN. DOUGLAS & JARVIS PAT. APT. 16, 1878, AP'L 17, 1885. A.P. FORESMAN, WM. S. STARR, T.J. STREBEIGH, COMMISSIONERS. - Pine Creek Bridge, River Road spanning Pine Creek, Jersey Shore, Lycoming County, PA

  9. Marketing and promoting solar water heaters to home builders

    SciTech Connect

    Keller, C.; Ghent, P.

    1999-12-06

    This is the final report of a four-task project to develop a marketing plan designed for businesses interested in marketing solar water heaters in the new home industry. This report outlines suggested marketing communication materials and other promotional tools focused on selling products to the new home builder. Information relevant to promoting products to the new home buyer is also included.

  10. SDMProjectBuilder: SWAT Setup for Nutrient Fate and Transport

    EPA Science Inventory

    This tutorial reviews some of the screens, icons, and basic functions of the SDMProjectBuilder (SDMPB) and explains how one uses SDMPB output to populate the Soil and Water Assessment Tool (SWAT) input files for nutrient fate and transport modeling in the Salt River Basin. It dem...

  11. 5. DETAIL OF BUILDER'S PLATE, WHICH READS '1898, THE SANITARY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. DETAIL OF BUILDER'S PLATE, WHICH READS '1898, THE SANITARY DISTRICT OF CHICAGO, BOARD OF TRUSTEES, WILLIAM BOLDENWECK, JOSEPH C. BRADEN, ZINA R. CARTER, BERNARD A. ECKART, ALEXANDER J. JONES, THOMAS KELLY, JAMES P. MALLETTE, THOMAS SMYTHE, FRANK WINTER; ISHAM RANDOLPH, CHIEF ENGINEER.' - Santa Fe Railroad, Sanitary & Ship Canal Bridge, Spanning Sanitary & Ship Canal east of Harlem Avenue, Chicago, Cook County, IL

  12. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network

    PubMed Central

    Hao, Xiaoqing; An, Haizhong; Zhang, Lijia; Li, Huajiao; Wei, Guannan

    2015-01-01

    To study the sentiment diffusion of online public opinions about hot events, we collected people’s posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiment: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed a sentiment mode complex network of online public opinions (SMCOP), with modes as nodes and the chronological conversion relations between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys a power law. Most posts’ sentiments are weakly positive and neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups, with ppppp and ooooo as the core modes, respectively. Few modes have large betweenness centrality values, and most modes convert to each other with these high-betweenness modes as mediums. Therefore, relevant persons or institutions can take measures to guide people’s sentiments regarding online hot events according to the sentiment diffusion mechanism. PMID:26462230
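
    The coarse-graining step described above can be sketched as follows; the sentiment thresholds and the five-post window width are illustrative assumptions, not the paper's dictionary values.

```python
from collections import Counter

# Coarse-grain sentiment values into the five orientations: P, p, o, n, N
# (thresholds are assumptions for illustration).
def orientation(v):
    if v >= 0.6:
        return "P"
    if v >= 0.2:
        return "p"
    if v > -0.2:
        return "o"
    if v > -0.6:
        return "n"
    return "N"

def modes(sentiments, width=5):
    """Slide a window over the post sequence; each window is one mode (a node)."""
    labels = [orientation(v) for v in sentiments]
    return ["".join(labels[i:i + width]) for i in range(len(labels) - width + 1)]

def edge_weights(mode_seq):
    """Edges are chronological transitions between consecutive modes; repeated
    transitions add weight, so a node's strength sums its incident weights."""
    return Counter(zip(mode_seq, mode_seq[1:]))

seq = modes([0.3, 0.1, 0.25, 0.0, 0.3, 0.4, -0.7])
print(seq)  # → ['popop', 'opopp', 'poppN']
```

    Network measures such as clustering coefficient and betweenness centrality would then be computed on the weighted directed graph that `edge_weights` summarizes.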

  13. Real-time Monitoring Network to Characterize Anthropogenic and Natural Events Affecting the Hudson River, NY

    NASA Astrophysics Data System (ADS)

    Islam, M. S.; Bonner, J. S.; Fuller, C.; Kirkey, W.; Ojo, T.

    2011-12-01

    The Hudson River watershed spans 34,700 km² predominantly in New York State, including agricultural, wilderness, and urban areas. The Hudson River supports shipping, supplies water for municipal, commercial, and agricultural uses, and is an important recreational resource. As the population increases within this watershed, so does the anthropogenic impact on this natural system. To address the impacts of anthropogenic and natural activities on this ecosystem, the River and Estuary Observatory Network (REON) is being developed through a joint venture between the Beacon Institute, Clarkson University, General Electric Inc. and IBM Inc. to monitor New York's Hudson and Mohawk Rivers in real time. REON uses four sensor platform types, with multiple nodes within the network, to capture environmentally relevant episodic events. Sensor platform types include: 1) fixed robotic vertical profiler (FRVP); 2) mobile robotic undulating platform (MRUP); 3) fixed acoustic Doppler current profiler (FADCP); and 4) autonomous underwater vehicle (AUV). The FRVP periodically generates a vertical profile with respect to water temperature, salinity, dissolved oxygen, particle concentration and size distribution, and fluorescence. The MRUP utilizes an undulating tow-body tethered behind a research vessel to measure the same set of water parameters as the FRVP, but does so 'synchronically' over a highly resolved spatial regime. The fixed ADCP provides continuous water current profiles. The AUV maps four-dimensional (time, latitude, longitude, depth) variation of water quality, water currents and bathymetry along a pre-determined transect route. REON data can be used to identify episodic events, both anthropogenic and natural, that impact the Hudson River. For example, a strong heat signature associated with cooling water discharge from the Indian Point nuclear power plant was detected with the MRUP. The FRVP monitoring platform at Beacon, NY, located in the

  14. Application of Parallel Discrete Event Simulation to the Space Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jefferson, D.; Leek, J.

    2010-09-01

    In this paper we describe how and why we chose parallel discrete event simulation (PDES) as the paradigm for modeling the Space Surveillance Network (SSN) in our modeling framework, TESSA (Testbed Environment for Space Situational Awareness). DES is a simulation paradigm appropriate for systems dominated by discontinuous state changes at times that must be calculated dynamically. It is used primarily for complex man-made systems like telecommunications, vehicular traffic, computer networks, economic models, etc., although it is also useful for natural systems that are not described by equations, such as particle systems, population dynamics, epidemics, and combat models. It is much less well known than simple time-stepped simulation methods, but has the great advantage of being time-scale independent, so that one can freely mix processes that operate at time scales over many orders of magnitude with no runtime performance penalty. In simulating the SSN we model in some detail: (a) the orbital dynamics of up to 10^5 objects, (b) their reflective properties, (c) the ground- and space-based sensor systems in the SSN, (d) the recognition of orbiting objects and determination of their orbits, (e) the cueing and scheduling of sensor observations, (f) the 3-d structure of satellites, and (g) the generation of collision debris. TESSA is thus a mixed continuous-discrete model. But because many different types of discrete objects are involved with such a wide variation in time scale (milliseconds for collisions, hours for orbital periods) it is suitably described using discrete events. The PDES paradigm is surprising and unusual. In any instantaneous runtime snapshot, some parts may be far ahead in simulation time while others lag behind, yet the required causal relationships are always maintained and synchronized correctly, exactly as if the simulation were executed sequentially. The TESSA simulator is custom-built, conservatively synchronized, and designed to scale to
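
    A minimal sequential discrete event loop (not the parallel, conservatively synchronized TESSA engine) conveys the time-scale independence described above: the clock jumps directly to the next scheduled event, so hour-scale observations and a one-off collision coexist at no runtime cost. Event names and times here are hypothetical.

```python
import heapq

def simulate(initial_events, horizon):
    """Pop the earliest event, run its handler, push any events it schedules."""
    queue, log = list(initial_events), []
    heapq.heapify(queue)
    while queue:
        clock, name, handler = heapq.heappop(queue)
        if clock > horizon:
            break
        log.append((clock, name))
        for new_event in handler(clock):
            heapq.heappush(queue, new_event)
    return log

# Hypothetical mix of time scales: an observation every hour, one collision.
def observe(t):
    return [(t + 3600.0, "observation", observe)]

def collision(t):
    return []  # debris generation would schedule further events here

log = simulate([(0.0, "observation", observe), (5000.0, "collision", collision)],
               horizon=8000.0)
print(log)
```

    Note that no time step is ever chosen: the collision at t = 5000 s fires between the hourly observations without forcing the whole simulation onto a finer grid.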

  15. Molecular Insights into Reprogramming-Initiation Events Mediated by the OSKM Gene Regulatory Network

    PubMed Central

    Liao, Mei-Chih; Prigione, Alessandro; Jozefczuk, Justyna; Lichtner, Björn; Wolfrum, Katharina; Haltmeier, Manuela; Flöttmann, Max; Schaefer, Martin; Hahn, Alexander; Mrowka, Ralf; Klipp, Edda; Andrade-Navarro, Miguel A.; Adjaye, James

    2011-01-01

    Somatic cells can be reprogrammed to induced pluripotent stem cells by over-expression of OCT4, SOX2, KLF4 and c-MYC (OSKM). With the aim of unveiling the early mechanisms underlying the induction of pluripotency, we have analyzed transcriptional profiles at 24, 48 and 72 hours post-transduction of OSKM into human foreskin fibroblasts. Experiments confirmed that upon viral transduction, the immediate response is innate immunity, which induces free radical generation, oxidative DNA damage, p53 activation, senescence, and apoptosis, ultimately leading to a reduction in the reprogramming efficiency. Conversely, nucleofection of OSKM plasmids does not elicit the same cellular stress, suggesting viral response as an early reprogramming roadblock. Additional initiation events include the activation of surface markers associated with pluripotency and the suppression of epithelial-to-mesenchymal transition. Furthermore, reconstruction of an OSKM interaction network highlights intermediate path nodes as candidates for improvement intervention. Overall, the results suggest three strategies to improve reprogramming efficiency employing: 1) anti-inflammatory modulation of innate immune response, 2) pre-selection of cells expressing pluripotency-associated surface antigens, 3) activation of specific interaction paths that amplify the pluripotency signal. PMID:21909390

  16. An extreme events laboratory to provide network centric collaborative situation assessment and decision making

    NASA Astrophysics Data System (ADS)

    Panulla, Brian J.; More, Loretta D.; Shumaker, Wade R.; Jones, Michael D.; Hooper, Robert; Vernon, Jeffrey M.; Aungst, Stanley G.

    2009-05-01

    Rapid improvements in communications infrastructure and sophistication of commercial hand-held devices provide a major new source of information for assessing extreme situations such as environmental crises. In particular, ad hoc collections of humans can act as "soft sensors" to augment data collected by traditional sensors in a net-centric environment (in effect, "crowd-sourcing" observational data). A need exists to understand how to task such soft sensors, characterize their performance and fuse the data with traditional data sources. In order to quantitatively study such situations, as well as study distributed decision-making, we have developed an Extreme Events Laboratory (EEL) at The Pennsylvania State University. This facility provides a network-centric, collaborative situation assessment and decision-making capability by supporting experiments involving human observers, distributed decision making and cognition, and crisis management. The EEL spans the information chain from energy detection via sensors, human observations, signal and image processing, pattern recognition, statistical estimation, multi-sensor data fusion, visualization and analytics, and modeling and simulation. The EEL command center combines COTS and custom collaboration tools in innovative ways, providing capabilities such as geo-spatial visualization and dynamic mash-ups of multiple data sources. This paper describes the EEL and several on-going human-in-the-loop experiments aimed at understanding the new collective observation and analysis landscape.

  17. Development and application of CATIA-GDML geometry builder

    NASA Astrophysics Data System (ADS)

    Belogurov, S.; Berchun, Yu; Chernogorov, A.; Malzacher, P.; Ovcharenko, E.; Schetinin, V.

    2014-06-01

    Due to the conceptual difference between geometry descriptions in Computer-Aided Design (CAD) systems and particle transport Monte Carlo (MC) codes, direct conversion of detector geometry in either direction is not feasible. The paper presents an update on the functionality and application practice of the CATIA-GDML geometry builder, first introduced at CHEP2010. This set of CATIA v5 tools has been developed for building an MC-optimized GEANT4/ROOT-compatible geometry based on an existing CAD model. The model can be exported via the Geometry Description Markup Language (GDML). The builder also allows import and visualization of GEANT4/ROOT geometries in CATIA. The structure of a GDML file, including replicated volumes, volume assemblies and variables, is mapped into a part specification tree. A dedicated file template, a wide range of primitives, tools for measurement and implicit calculation of parameters, different types of multiple volume instantiation, mirroring, positioning and quality checks have been implemented. Several use cases are discussed.
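
    To illustrate the kind of GDML structure that gets mapped into a part specification tree, the sketch below parses a hand-written minimal GDML fragment and recovers the volume hierarchy. The solid names and dimensions are invented, and the fragment omits the materials and setup sections a complete file would carry.

```python
import xml.etree.ElementTree as ET

# A minimal GDML-like fragment: two solids, and a daughter volume positioned
# inside a world volume (illustrative, not exported from a real model).
gdml = """
<gdml>
  <solids>
    <box name="WorldBox" x="100" y="100" z="100"/>
    <tube name="DetTube" rmax="10" z="50" deltaphi="360"/>
  </solids>
  <structure>
    <volume name="Detector"><solidref ref="DetTube"/></volume>
    <volume name="World">
      <solidref ref="WorldBox"/>
      <physvol><volumeref ref="Detector"/><position x="0" y="0" z="25"/></physvol>
    </volume>
  </structure>
</gdml>
"""

root = ET.fromstring(gdml)
volumes = {v.get("name"): v for v in root.find("structure")}

def children(name):
    """Daughter volumes referenced by a logical volume's physvol placements."""
    return [pv.find("volumeref").get("ref")
            for pv in volumes[name].findall("physvol")]

print(children("World"))  # → ['Detector']
```

    The builder's mapping works in this direction too: each `volume` becomes a node in the specification tree, with `physvol` placements as its children.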

  18. HOT METAL BRIDGE (NOTE: BUILDERS: JONES AND LAUGHLIN STEEL CA. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT METAL BRIDGE (NOTE: BUILDERS: JONES AND LAUGHLIN STEEL CA. 1890), SOUTH PORTAL. THREE PIN CONNECTED CAMELBACK TRUSS SPANS, ONE SKEWED THROUGH TRUSS SPAN ON NORTH SIDE TRUSS BRIDGE, EAST OF HOT METAL BRIDGE BUILT BY AMERICAN BRIDGE COMPANY CA. 1910. (RIVETED MULTI-SPAN TRUSS). - Jones & Laughlin Steel Corporation, Pittsburgh Works, Morgan Billet Mill Engine, 550 feet north of East Carson Street, opposite South Twenty-seventh Street, Pittsburgh, Allegheny County, PA

  19. Best Practices Case Study: Pine Mountain Builders - Pine Mountain, GA

    SciTech Connect

    2011-09-01

    Case study of Pine Mountain Builders who worked with DOE’s IBACOS team to achieve HERS scores of 59 on 140 homes built around a wetlands in Georgia. The team used taped rigid foam exterior sheathing and spray foam insulation in the walls and on the underside of the attic for a very tight 1.0 to 1.8 ACH 50 building shell.

  20. Solar project description for Gill Harrop, Builders, single-family detached residence, Big Flats, New York

    SciTech Connect

    1982-04-23

    The Gill Harrop Builders Site is a house with approximately 1360 square feet of conditioned space heated by a direct gain system with manually operated insulated curtains. Solar heating is augmented by electric resistance heating, and a wood burning stove may be installed. Sunlight is admitted through both south facing windows and through clerestory collector panels and is absorbed and stored as heat in a concrete floor and wall. Heat is then distributed by natural convection and radiation. Temperature regulation is assisted by earth berms. Three modes of operation are described: collector-to-storage, storage-to-space heating, and passive space cooling, which is accomplished by shading, movable insulation, and ventilation. The instrumentation for the National Solar Data Network is described. The solar energy portion of the construction costs is estimated to be $7000. (LEW)

  1. Action Learning: Developing Innovative Networks of Practice ... for Ideas Worth Sharing--Design Event Held at Mersey Care NHS Trust Liverpool on 6 November, 2008

    ERIC Educational Resources Information Center

    Harvey, Brendon

    2009-01-01

    In this article, the author talks about the highlights of the first design event of the "Northern Action Learning Network," held at Mersey Care NHS Trust Liverpool on 6 November 2008. The intent of the event was to tap into the diverse and flourishing action learning (AL) community by growing an innovative network of practitioners keen to create…

  2. Onset and Offset of Aversive Events Establish Distinct Memories Requiring Fear and Reward Networks

    ERIC Educational Resources Information Center

    Andreatta, Marta; Fendt, Markus; Muhlberger, Andreas; Wieser, Matthias J.; Imobersteg, Stefan; Yarali, Ayse; Gerber, Bertram; Pauli, Paul

    2012-01-01

    Two things are worth remembering about an aversive event: What made it happen? What made it cease? If a stimulus precedes an aversive event, it becomes a signal for threat and will later elicit behavior indicating conditioned fear. However, if the stimulus is presented upon cessation of the aversive event, it elicits behavior indicating…

  3. Evaluation of the U.S. Department of Energy Challenge Home Program Certification of Production Builders

    SciTech Connect

    Kerrigan, P.; Loomis, H.

    2014-09-01

    The purpose of this project was to evaluate integrated packages of advanced measures in individual test homes to assess their performance with respect to Building America Program goals, specifically compliance with the DOE Challenge Home Program. BSC consulted on the construction of five test houses by three Cold Climate production builders in three separate US cities. BSC worked with the builders to develop a design package tailored to the cost-related impacts for each builder. Therefore, the resulting design packages do vary from builder to builder. BSC provided support through this research project on the design, construction and performance testing of the five test homes. Overall, the builders have concluded that the energy related upgrades (either through the prescriptive or performance path) represent reasonable upgrades. The builders commented that while not every improvement in specification was cost effective (as in a reasonable payback period), many were improvements that could improve the marketability of the homes and serve to attract more energy efficiency discerning prospective homeowners. However, the builders did express reservations on the associated checklists and added certifications. An increase in administrative time was observed with all builders. The checklists and certifications also inherently increase cost due to: 1. Adding services to the scope of work for various trades, such as HERS Rater, HVAC contractor; 2. Increased material costs related to the checklists, especially the EPA Indoor airPLUS and EPA WaterSense(R) Efficient Hot Water Distribution requirement.

  4. Development of an event search and download system for analyzing waveform data observed at seafloor seismic network, DONET

    NASA Astrophysics Data System (ADS)

    Takaesu, M.; Horikawa, H.; Sueki, K.; Kamiya, S.; Nakamura, T.; Nakano, M.; Takahashi, N.; Sonoda, A.; Tsuboi, S.

    2014-12-01

    Mega-thrust earthquakes are anticipated to occur in the Nankai Trough in southwest Japan. In the source areas, we installed the seafloor seismic network DONET (Dense Ocean-floor Network System for Earthquakes and Tsunamis) in 2010 in order to monitor seismicity, crustal deformation, and tsunamis. The DONET system consists of 20 stations in total, each composed of six kinds of sensors: strong-motion and broadband seismometers, quartz and differential pressure gauges, a hydrophone, and a thermometer. The stations are densely distributed with an average spatial interval of 15-20 km and cover the area from near the coast to the trench axis. Observed data are transferred to a land station through a fiber-optic cable and then to the JAMSTEC (Japan Agency for Marine-Earth Science and Technology) data management center through a private network in real time. The data are carried in WIN32 format on the private network and finally archived in SEED format at the management center, which combines the waveform data with related metadata. We are developing a web-based application system to easily download DONET seismic waveform data. In this system, users can select 20 Hz broadband (BH type) and 200 Hz strong-motion (EH type) data and download them in SEED. Users can also search for events by time period, magnitude, source area and depth in a GUI platform. Event data are produced by referring to event catalogues from the USGS and JMA (Japan Meteorological Agency). The magnitude thresholds for production are M6 for far-field events using the USGS list and M4 for local events using the JMA list. Available data lengths depend on magnitudes and epicentral distances. In this presentation, we briefly introduce the DONET stations and then present the application system we have developed. We are opening DONET data through this system in the hope that they will become widely used and analyzed. We also discuss plans for further development of the system.
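
    The catalogue-driven selection described above (M6 for far-field events, M4 for local events) can be sketched as a simple filter; the distance-based notion of "local" and the record field names are assumptions for illustration.

```python
def select_events(catalogue, local_radius_km=500.0):
    """Keep events meeting the magnitude threshold for their distance class."""
    selected = []
    for event in catalogue:
        threshold = 4.0 if event["distance_km"] <= local_radius_km else 6.0
        if event["magnitude"] >= threshold:
            selected.append(event["id"])
    return selected

catalogue = [
    {"id": "far_small",   "magnitude": 5.5, "distance_km": 9000.0},
    {"id": "far_large",   "magnitude": 7.1, "distance_km": 9000.0},
    {"id": "local_small", "magnitude": 4.2, "distance_km": 80.0},
]
print(select_events(catalogue))  # → ['far_large', 'local_small']
```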

  5. Discrimination Analysis of Earthquakes and Man-Made Events Using ARMA Coefficients Determination by Artificial Neural Networks

    SciTech Connect

    AllamehZadeh, Mostafa

    2011-12-15

    A Quadratic Neural Network (QNN) model has been developed for the problem of identifying seismic source classes at regional distances, using ARMA coefficients determined by Artificial Neural Networks (ANNs). We have devised a supervised neural system to discriminate between earthquakes and chemical explosions, with filter coefficients obtained from windowed P-wave phase spectra (15 s). First, we preprocess the recorded signals to cancel out instrumental and site attenuation effects and obtain a compact representation of the seismic records. Second, we use the QNN system to obtain ARMA coefficients for feature extraction in the discrimination problem. The derived coefficients are then applied to the neural system for training and classification. In this study, we explore the possibility of using single-station three-component (3C) covariance matrix traces from a priori known explosion sites (learning) for automatically recognizing subsequent explosions from the same sites. The results have shown that this feature extraction gives the best classifier for seismic signals and performs significantly better than other classification methods. The events tested include 36 chemical explosions at the Semipalatinsk test site in Kazakhstan and 61 earthquakes (mb = 5.0-6.5) recorded by the Iranian National Seismic Network (INSN). Fully correct (100%) decisions were obtained between site explosions and some of the non-site events. The above approach to event discrimination is very flexible, as we can combine several 3C stations.
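
    As a classical stand-in for the feature-extraction step above (the paper determines its ARMA coefficients with a neural network), the sketch below fits AR(2) coefficients to a windowed signal via the Yule-Walker equations; coefficients like these are the kind of compact feature a discriminator could consume.

```python
import math

def autocorr(x, lag):
    """Biased normalized sample autocorrelation at the given lag."""
    mean = sum(x) / len(x)
    num = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(len(x) - lag))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def ar2_coefficients(x):
    """Solve the order-2 Yule-Walker system [[1, r1], [r1, 1]] a = [r1, r2]."""
    r1, r2 = autocorr(x, 1), autocorr(x, 2)
    det = 1.0 - r1 * r1
    return (r1 - r1 * r2) / det, (r2 - r1 * r1) / det

# A pure sinusoid obeys x_t = 2*cos(w)*x_{t-1} - x_{t-2}, so the fit should
# land near (2*cos(0.3), -1); finite-sample bias pulls the values toward zero.
signal = [math.sin(0.3 * t) for t in range(100)]
print(ar2_coefficients(signal))
```

    Signals with different generating processes (an explosion's impulsive P wave versus an earthquake's) yield distinct coefficient vectors, which is what makes them usable as classifier inputs.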

  6. Adaptive and context-aware detection and classification of potential QoS degradation events in biomedical wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Abreu, Carlos; Miranda, Francisco; Mendes, Paulo M.

    2016-06-01

    The use of wireless sensor networks in healthcare has the potential to enhance the services provided to citizens. In particular, they play an important role in the development of state-of-the-art patient monitoring applications. Nevertheless, due to the critical nature of the data conveyed by such patient monitoring applications, they have to fulfil high standards of quality of service in order to obtain the confidence of all players in the healthcare industry. In this context, this work presents an adaptive and context-aware method to detect and classify degradation events in the quality of service provided by a wireless sensor network. The proposed method has the ability to catch the most significant and damaging variations in the metrics used to quantify the quality of service provided by the network, without overreacting to small and innocuous variations in the metrics' values.
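
    A minimal detector in the spirit of the method described above can be built around an exponentially weighted running baseline: a sample raises an alert only when it deviates strongly from the adapted mean, so small innocuous fluctuations pass silently. The smoothing factor, threshold, and packet-delivery-ratio samples are illustrative assumptions, not the paper's method.

```python
def detect_degradation(samples, alpha=0.2, k=3.0):
    """Flag indices whose deviation exceeds k adaptive standard deviations."""
    mean, var, alerts = samples[0], 0.0, []
    for i, x in enumerate(samples[1:], start=1):
        deviation = x - mean
        if var > 0 and abs(deviation) > k * var ** 0.5:
            alerts.append(i)                  # significant QoS change
        mean += alpha * deviation             # adapt the baseline
        var = (1 - alpha) * (var + alpha * deviation ** 2)
    return alerts

# Packet-delivery ratio with one sharp drop (hypothetical data).
pdr = [0.98, 0.97, 0.98, 0.97, 0.98, 0.70, 0.97, 0.98]
print(detect_degradation(pdr))  # → [5]
```

    Because the baseline keeps adapting, the detector tolerates slow drift in the metric while still catching abrupt degradation events.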

  7. Combining Neural Networks with Existing Methods to Estimate 1 in 100-Year Flood Event Magnitudes

    NASA Astrophysics Data System (ADS)

    Newson, A.; See, L.

    2005-12-01

    Over the last fifteen years, artificial neural networks (ANNs) have been shown to be advantageous for the solution of many hydrological modelling problems. The use of ANNs for flood magnitude estimation in ungauged catchments, however, is a relatively new and under-researched area. In this paper ANNs are used to make estimates of the magnitude of the 100-year flood event (Q100) for a number of ungauged catchments. The data used in this study were provided by the Centre for Ecology and Hydrology's Flood Estimation Handbook (FEH), which contains information on catchments across the UK. Sixteen catchment descriptors for 719 catchments were used to train an ANN, with the data split into training, validation and test sets. The goodness-of-fit statistics on the test data set indicated good model performance, with an r-squared value of 0.8 and a coefficient of efficiency of 79 percent. Data for twelve ungauged catchments were then put through the trained ANN to produce estimates of Q100. Two other accepted methodologies were also employed: the FEH statistical method and the FSR (Flood Studies Report) design storm technique, both of which are used to produce flood frequency estimates. The advantage of developing an ANN model is that it provides a third figure to aid a hydrologist in making an accurate estimate. For six of the twelve catchments, there was a relatively low spread between estimates. In these instances, an estimate of Q100 could be made with a fair degree of certainty. Of the remaining six catchments, three had areas greater than 1000 km2, which means the FSR design storm estimate cannot be used. Armed with the ANN model and the FEH statistical method, the hydrologist still has two possible estimates to consider. For these three catchments, the estimates were also fairly similar, providing additional confidence to the estimation. In summary, the findings of this study have shown that an accurate estimation of Q100 can be made using the catchment descriptors of
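
    The coefficient of efficiency reported above is, in hydrological practice, the Nash-Sutcliffe measure; its computation is sketched below with made-up Q100 values rather than FEH data.

```python
def nash_sutcliffe(observed, predicted):
    """1 minus the ratio of residual variance to the variance of the observations."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

q100_obs = [120.0, 340.0, 95.0, 210.0]    # hypothetical Q100 values (m^3/s)
q100_pred = [130.0, 320.0, 100.0, 205.0]
print(round(nash_sutcliffe(q100_obs, q100_pred), 3))  # → 0.985
```

    A value of 1 indicates perfect agreement; the 79 percent figure quoted in the abstract corresponds to a coefficient of 0.79 on the test set.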

  8. Whole-House Approach Benefits Builders, Buyers, and the Environment

    SciTech Connect

    Not Available

    2001-05-01

    This document provides an overview of the U.S. Department of Energy's Building America program. Building America works with the residential building industry to develop and implement innovative building processes and technologies-innovations that save builders and homeowners millions of dollars in construction and energy costs. This industry-led, cost-shared partnership program aims to reduce energy use by 50% and reduce construction time and waste, improve indoor air quality and comfort, encourage a systems engineering approach for design and construction of new homes, and accelerate the development and adoption of high performance in production housing.

  9. Friend me or you'll strain us: understanding negative events that occur over social networking sites.

    PubMed

    Tokunaga, Robert S

    2011-01-01

    Social networking sites (SNSs) provide the ideal infrastructure for the maintenance of existing relationships and the development of new contacts. Although these Web-based technologies supplement offline relationships, several of their characteristics have the potential to provoke negative experiences. The interpersonal strain and other relational problems stemming from negative events have recently gained notoriety. This investigation examines personal accounts of users who have experienced these negative events, defined as any encounter or behavior by others on SNSs that instigates interpersonal strain, in order to better understand the nature of this phenomenon. Using a mixed-methods approach, open coding of open-ended responses revealed 10 negative event types that surface during participation on SNSs. Quantitative coding was then used to identify a cut-off point for the most frequently experienced negative events. The findings reveal that the three most commonly experienced negative event types are ignoring or denying friend requests, deleting public messages or identification tags, and identifying ranking disparities on Top Friends applications. The practical, theoretical, and negative social implications of participation on SNSs are discussed.

  10. What would you do? Managing a metro network during mass crowd events.

    PubMed

    Barr, Andy C; Lau, Raymond C M; Ng, Nelson W H; da Silva, Marco Antônio; Baptista, Marcia; Oliveira, Vinícius Floriano; Barbosa, Maria Beatriz; Batistini, Estela; de Toledo Ramos, Nancy

    2010-03-01

    Major public events, such as sporting events, carnivals and festivals, are common occurrences in urban and city environments. They are characterised by the mass movement of people in relatively small areas, far in excess of normal daily activity. This section reviews how different metro systems across the globe respond to such peaks of activity, ensuring that people are moved swiftly, efficiently and safely. To this end, representatives from four major public metro systems (London, Hong Kong, Rio de Janeiro and São Paulo) describe how their respective metro systems respond to the capacity demands of a major annual event. PMID:20494882

  11. CHARMM-GUI Membrane Builder Toward Realistic Biological Membrane Simulations

    PubMed Central

    Wu, Emilia L.; Cheng, Xi; Jo, Sunhwan; Rui, Huan; Song, Kevin C.; Dávila-Contreras, Eder M.; Qi, Yifei; Lee, Jumin; Monje-Galvan, Viviana; Venable, Richard M.; Klauda, Jeffery B.; Im, Wonpil

    2014-01-01

    CHARMM-GUI Membrane Builder, http://www.charmm-gui.org/input/membrane, is a web-based user interface designed to interactively build all-atom protein/membrane or membrane-only systems for molecular dynamics simulation through an automated optimized process. In this work, we describe the new features and major improvements in Membrane Builder that allow users to robustly build realistic biological membrane systems, including (1) addition of new lipid types such as phosphoinositides, cardiolipin, sphingolipids, bacterial lipids, and ergosterol, yielding more than 180 lipid types, (2) enhanced building procedure for lipid packing around protein, (3) reliable algorithm to detect lipid tail penetration to ring structures and protein surface, (4) distance-based algorithm for faster initial ion displacement, (5) CHARMM inputs for P21 image transformation, and (6) NAMD equilibration and production inputs. The robustness of these new features is illustrated by building and simulating a membrane model of the polar and septal regions of the E. coli membrane, which contains five lipid types: cardiolipin lipids with two types of acyl chains and phosphatidylethanolamine lipids with three types of acyl chains. It is our hope that CHARMM-GUI Membrane Builder becomes a useful tool for simulation studies to better understand the structure and dynamics of proteins and lipids in realistic biological membrane environments. PMID:25130509

  12. Solar installer training: Home Builders Institute Job Corps

    SciTech Connect

    Hansen, K.; Mann, R.

    1996-10-01

    The instructors describe the solar installation training program operated since 1979 by the Home Builders Institute, the Educational Arm of the National Association of Home Builders for the US Department of Labor, Job Corps in San Diego, CA. The authors are the original instructors and have developed the program since its inception by a co-operative effort between the Solar Energy Industries Association, NAHB and US DOL. Case studies of a few of the 605 students who have gone to work over the years after the training are included. It is one of the most successful programs under the elaborate Student Performance Monitoring Information System used by all Job Corps programs. Job Corps is a federally funded residential job training program for low income persons 16--24 years of age. Discussion details the curriculum and methods used in the program including classroom, shop and community service projects. Solar technologies including all types of hot water heating, swimming pool and spa as well as photovoltaics are included.

  13. Canadian paediatricians’ approaches to managing patients with adverse events following immunization: The role of the Special Immunization Clinic network

    PubMed Central

    Top, Karina A; Zafack, Joseline; De Serres, Gaston; Halperin, Scott A

    2014-01-01

    BACKGROUND: When moderate or severe adverse events occur after vaccination, physicians and patients may have concerns about future immunizations. Similar concerns arise in patients with underlying conditions whose risk for adverse events may differ from the general population. The Special Immunization Clinic (SIC) network was established in 2013 at 13 sites in Canada to provide expertise in the clinical evaluation and vaccination of these patients. OBJECTIVES: To assess referral patterns for patients with vaccine adverse events or potential vaccine contraindications among paediatricians and to assess the anticipated utilization of an SIC. METHODS: A 12-item questionnaire was distributed to paediatricians and subspecialists participating in the Canadian Paediatric Surveillance Program through monthly e-mail and mail contacts. RESULTS: The response rate was 24% (586 of 2490). Fifty-three percent of respondents practiced general paediatrics exclusively and 52% reported that they administer vaccines. In the previous 12 months, 26% of respondents had encountered children with challenging adverse events or potential vaccine contraindications in their practice and 29% had received referrals for such patients, including 27% of subspecialists. Overall, 69% of respondents indicated that they would be likely or very likely to refer patients to an SIC, and 34% indicated that they would have referred at least one patient to an SIC in the previous 12 months. CONCLUSIONS: Patients who experience challenging adverse events following immunization or potential vaccine contraindications are encountered by paediatricians and subspecialists in all practice settings. The SIC network will be able to respond to a clinical need and support paediatricians in managing these patients. PMID:25332661

  14. Detecting and mitigating abnormal events in large scale networks: budget constrained placement on smart grids

    SciTech Connect

    Santhi, Nandakishore; Pan, Feng

    2010-10-19

    Several scenarios exist in the modern interconnected world which call for an efficient network interdiction algorithm. Applications are varied, including various monitoring and load shedding applications on large smart energy grids, computer network security, preventing the spread of Internet worms and malware, policing international smuggling networks, and controlling the spread of diseases. In this paper we consider some natural network optimization questions related to the budget constrained interdiction problem over general graphs, specifically focusing on the sensor/switch placement problem for large-scale energy grids. Many of these questions turn out to be computationally hard to tackle. We present a particular form of the interdiction question which is practically relevant and which we show as computationally tractable. A polynomial-time algorithm will be presented for solving this problem.

  15. Practices and Processes of Leading High Performance Home Builders in the Upper Midwest

    SciTech Connect

    Von Thoma, E.; Ojczyk, C.

    2012-12-01

    The NorthernSTAR Building America Partnership team proposed this study to gain insight into the business, sales, and construction processes of successful high performance builders. The knowledge gained by understanding the high performance strategies used by individual builders, as well as the process each followed to move from traditional builder to high performance builder, will be beneficial in proposing more in-depth research to yield specific action items to assist the industry at large in transforming to high performance new home construction. This investigation identified the best practices of three successful high performance builders in the upper Midwest. In-depth field analysis of the performance levels of their homes, their business models, and their strategies for market acceptance were explored. All three builders commonly seek ENERGY STAR certification on their homes and implement strategies that would allow them to meet the requirements for the Building America Builders Challenge program. Their desire for continuous improvement, willingness to seek outside assistance, and ambition to be leaders in their field are common themes. Problem solving to overcome challenges was accepted as part of doing business. It was concluded that crossing the gap from code-based building to high performance building was a natural evolution for these leading builders.

  16. Wildlife Scenario Builder and User's Guide (Version 1.0, Beta Test)

    EPA Science Inventory

    The Wildlife Scenario Builder (WSB) was developed to improve the quality of wildlif...

  17. DOE Zero Energy Ready Home Case Study: New Town Builders, Denver, Colorado

    SciTech Connect

    none,

    2013-09-01

    All homes in the Stapleton community must be ENERGY STAR certified; New Town Builders has announced that it will build 250–300 new homes over the next 7–10 years, all of which will be Challenge Homes. New Town received a 2013 Housing Innovation Award in the production builder category.

  18. 78 FR 42122 - Bridge Builder Trust and Olive Street Investment Advisers, LLC; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-15

    ... COMMISSION Bridge Builder Trust and Olive Street Investment Advisers, LLC; Notice of Application July 9, 2013... requirements. APPLICANTS: Bridge Builder Trust (the ``Trust'') and Olive Street Investment Advisers (the.... Neuberger, 2020 East Financial Way, Suite 100, Glendora, CA 91741; The Adviser: James A. Tricarico,...

  19. Application of gene expression programming and neural networks to predict adverse events of radical hysterectomy in cervical cancer patients.

    PubMed

    Kusy, Maciej; Obrzut, Bogdan; Kluska, Jacek

    2013-12-01

    The aim of this article was to compare the gene expression programming (GEP) method with three types of neural networks in the prediction of adverse events of radical hysterectomy in cervical cancer patients. One hundred and seven patients treated by radical hysterectomy were analyzed. Each record representing a single patient consisted of 10 parameters. The occurrence and lack of perioperative complications imposed a two-class classification problem. In the simulations, the GEP algorithm was compared to a multilayer perceptron (MLP), a radial basis function neural network, and a probabilistic neural network. The generalization ability of the models was assessed on the basis of their accuracy, the sensitivity, the specificity, and the area under the receiver operating characteristic curve (AUROC). The GEP classifier provided the best results in the prediction of the adverse events with an accuracy of 71.96 %. Comparable but slightly worse outcomes were obtained using MLP, i.e., 71.87 %. For each of the measured indices: accuracy, sensitivity, specificity, and the AUROC, the standard deviation was the smallest for the models generated by the GEP classifier.
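    The four figures of merit used to compare the classifiers can be reproduced from first principles. The sketch below, using made-up toy labels and scores rather than the study's patient data, computes accuracy, sensitivity, specificity, and a rank-based (Mann-Whitney) estimate of the AUROC.

```python
def confusion_metrics(y_true, y_pred):
    # Counts for a two-class problem (1 = complication occurred).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return accuracy, sensitivity, specificity

def auroc(y_true, scores):
    # Probability that a random positive outscores a random negative
    # (ties count one half), which equals the area under the ROC curve.
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 8 hypothetical patients, classifier scores in [0, 1].
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
scores = [0.9, 0.7, 0.4, 0.3, 0.6, 0.2, 0.8, 0.1]
y_pred = [1 if s >= 0.5 else 0 for s in scores]
print(confusion_metrics(y_true, y_pred))  # -> (0.75, 0.75, 0.75)
print(auroc(y_true, scores))              # -> 0.9375
```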

  1. Building America Case Study: New Town Builders' Power of Zero Energy Center, Denver, Colorado (Brochure)

    SciTech Connect

    Not Available

    2014-10-01

    New Town Builders, a builder of energy efficient homes in Denver, Colorado, offers a zero energy option for all the homes it builds. To attract a wide range of potential homebuyers to its energy efficient homes, New Town Builders created a 'Power of Zero Energy Center' linked to its model home in the Stapleton community of Denver. This case study presents New Town Builders' marketing approach, which is targeted to appeal to homebuyers' emotions rather than overwhelming homebuyers with scientific details about the technology. The exhibits in the Power of Zero Energy Center focus on reduced energy expenses for the homeowner, improved occupant comfort, the reputation of the builder, and the fact that homebuyers need not sacrifice desired design features to achieve zero net energy in the home. The case study also contains customer and realtor testimonials related to the effectiveness of the Center in influencing homebuyers to purchase a zero energy home.

  2. Storm Event Variability in Particulate Organic Matter Source, Size, and Carbon and Nitrogen Content Along a Forested Drainage Network

    NASA Astrophysics Data System (ADS)

    Rowland, R. D.; Inamdar, S. P.; Parr, T. B.

    2015-12-01

    Coupled inputs of carbon and nitrogen comprise an important energy and nutrient subsidy for aquatic ecosystems. Large storm events can mobilize substantial amounts of these elements, especially in particulate form. While the role of storms in mobilizing allochthonous particulate organic matter (POM) is well recognized, less is known about the changes in source, particle size, and composition of POM as it is routed through the fluvial network. Questions we addressed include: (a) How do the source, size, and C and N content of suspended POM vary with storm magnitude and intensity? (b) How do POM size and C and N content evolve along the drainage network? (c) How accurate are high-frequency, in-situ sensors in characterizing POM? We conducted this study in a 79 ha forested catchment in the Piedmont region of Maryland. Event sampling for suspended POM was performed using automated stream water samplers and in-situ, high-frequency sensors (s::can specto::lyser and YSI EXO 2; 30 minute intervals) at 12 and 79 ha drainage locations. Composite storm-event sediment samples were also collected using passive samplers at five catchment drainage scales. Data are available for multiple storms since August 2014. Samples were partitioned into three discrete particle size classes (coarse: 1000-2000 µm, medium: 250-1000 µm, fine: < 250 µm) for organic C and N determination. Suspended sediments and seven soil end members were also analyzed for stable 13C and 15N isotope ratios to characterize the evolution in sediment sources through the drainage network. Contrary to our expectations, preliminary results suggest finer suspended sediments in the upstream portion of the catchment, and that these may contain more POM. Unsurprisingly, the sensors' ability to estimate the coarser particle classes via turbidity is weak compared to the finer class, but this is less pronounced in organic-rich sediments. Distinct patterns in in-situ absorbance spectra may suggest an ability to discern

  3. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Standard form of War Risk Builder's Risk Insurance... TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Builder's Risk Insurance § 308.409 Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk...

  4. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Standard form of War Risk Builder's Risk Insurance... TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Builder's Risk Insurance § 308.409 Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk...

  5. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Standard form of War Risk Builder's Risk Insurance... TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Builder's Risk Insurance § 308.409 Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk...

  6. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Standard form of War Risk Builder's Risk Insurance... TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Builder's Risk Insurance § 308.409 Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk...

  7. 46 CFR 308.409 - Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Standard form of War Risk Builder's Risk Insurance... TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Builder's Risk Insurance § 308.409 Standard form of War Risk Builder's Risk Insurance Policy, Form MA-283. The standard form of War Risk...

  8. Reward and Novelty Enhance Imagination of Future Events in a Motivational-Episodic Network

    PubMed Central

    Bulganin, Lisa; Wittmann, Bianca C.

    2015-01-01

    Thinking about personal future events is a fundamental cognitive process that helps us make choices in daily life. We investigated how the imagination of episodic future events is influenced by implicit motivational factors known to guide decision making. In a two-day functional magnetic resonance imaging (fMRI) study, we controlled learned reward association and stimulus novelty by pre-familiarizing participants with two sets of words in a reward learning task. Words were repeatedly presented and consistently followed by monetary reward or no monetary outcome. One day later, participants imagined personal future events based on previously rewarded, unrewarded and novel words. Reward association enhanced the perceived vividness of the imagined scenes. Reward and novelty-based construction of future events were associated with higher activation of the motivational system (striatum and substantia nigra/ventral tegmental area) and hippocampus, and functional connectivity between these areas increased during imagination of events based on reward-associated and novel words. These data indicate that implicit past motivational experience contributes to our expectation of what the future holds in store. PMID:26599537

  9. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.
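    The building block of such a model can be sketched in a few lines. The temporal noisy-OR below is illustrative of the gate family NPEDTs build on, with an invented per-slice trigger probability; it is not NasoNet's actual parameterization.

```python
def temporal_noisy_or(cause_times, trigger_prob, horizon):
    """P(effect has occurred by each discrete time slice 0..horizon),
    when every cause that is already active independently triggers the
    effect with probability `trigger_prob` in each slice."""
    p_not_yet = 1.0  # probability the effect has not occurred so far
    cumulative = []
    for t in range(horizon + 1):
        active = sum(1 for ct in cause_times if ct <= t)
        p_not_yet *= (1.0 - trigger_prob) ** active
        cumulative.append(1.0 - p_not_yet)
    return cumulative

# One cause active from slice 0, 50% chance of triggering per slice:
print(temporal_noisy_or([0], 0.5, 2))  # -> [0.5, 0.75, 0.875]
```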

  10. A replica exchange transition interface sampling method with multiple interface sets for investigating networks of rare events

    NASA Astrophysics Data System (ADS)

    Swenson, David W. H.; Bolhuis, Peter G.

    2014-07-01

    The multiple state transition interface sampling (TIS) framework in principle allows the simulation of a large network of complex rare event transitions, but in practice suffers from convergence problems. To improve convergence, we combine multiple state TIS [J. Rogal and P. G. Bolhuis, J. Chem. Phys. 129, 224107 (2008)] with replica exchange TIS [T. S. van Erp, Phys. Rev. Lett. 98, 268301 (2007)]. In addition, we introduce multiple interface sets, which allow more than one order parameter to be defined for each state. We illustrate the methodology on a model system of multiple independent dimers, each with two states. For reaction networks with up to 64 microstates, we determine the kinetics in the microcanonical ensemble, and discuss the convergence properties of the sampling scheme. For this model, we find that the kinetics depend on the instantaneous composition of the system. We explain this dependence in terms of the system's potential and kinetic energy.

  11. RoboNet-II: Follow-up observations of microlensing events with a robotic network of telescopes

    NASA Astrophysics Data System (ADS)

    Tsapras, Y.; Street, R.; Horne, K.; Snodgrass, C.; Dominik, M.; Allan, A.; Steele, I.; Bramich, D. M.; Saunders, E. S.; Rattenbury, N.; Mottram, C.; Fraser, S.; Clay, N.; Burgdorf, M.; Bode, M.; Lister, T. A.; Hawkins, E.; Beaulieu, J. P.; Fouqué, P.; Albrow, M.; Menzies, J.; Cassan, A.; Dominis-Prester, D.

    2009-01-01

    RoboNet-II uses a global network of robotic telescopes to perform follow-up observations of microlensing events in the Galactic Bulge. The current network consists of three 2 m telescopes located in Hawaii and Australia (owned by Las Cumbres Observatory) and the Canary Islands (owned by Liverpool John Moores University). In future years the network will be expanded by deploying clusters of 1 m telescopes in other suitable locations. A principal scientific aim of the RoboNet-II project is the detection of cool extra-solar planets by the method of gravitational microlensing. These detections will provide crucial constraints to models of planetary formation and orbital migration. RoboNet-II acts in coordination with the PLANET microlensing follow-up network and uses an optimization algorithm (``web-PLOP'') to select the targets and a distributed scheduling paradigm (eSTAR) to execute the observations. Continuous automated assessment of the observations and anomaly detection is provided by the ARTEMiS system.

  12. Assessing the Influence of an Individual Event in Complex Fault Spreading Network Based on Dynamic Uncertain Causality Graph.

    PubMed

    Dong, Chunling; Zhao, Yue; Zhang, Qin

    2016-08-01

    Identifying the pivotal causes and highly influential spreaders in fault propagation processes is crucial for improving the maintenance decision making for complex systems under abnormal and emergency situations. A dynamic uncertain causality graph-based method is introduced in this paper to explicitly model the uncertain causalities among system components, identify fault conditions, locate the fault origins, and predict the spreading tendency by means of probabilistic reasoning. A new algorithm is proposed to assess the impacts of an individual event by investigating the corresponding node's time-variant betweenness centrality and the strength of global causal influence in the fault propagation network. The algorithm does not depend on the whole original and static network but on the real-time spreading behaviors and dynamics, which makes the algorithm to be specifically targeted and more efficient. Experiments on both simulated networks and real-world systems demonstrate the accuracy, effectiveness, and comprehensibility of the proposed method for the fault management of power grids and other complex networked systems.
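    The betweenness component of the influence score can be illustrated on a small graph. The brute-force sketch below is hypothetical, not the paper's DUCG algorithm (which also weighs causal-strength terms); recomputing it on each snapshot of the spreading network is what makes the measure time-variant.

```python
from collections import deque
from itertools import combinations

def betweenness(nodes, edges):
    # Brute-force betweenness on a small undirected graph: for every
    # pair (s, t), each intermediate node earns the fraction of the
    # shortest s-t paths that pass through it.
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def shortest_paths(s, t):
        # BFS that enumerates every shortest path from s to t.
        paths, best = [], None
        queue = deque([[s]])
        while queue:
            path = queue.popleft()
            if best is not None and len(path) > best:
                continue
            if path[-1] == t:
                best = len(path)
                paths.append(path)
                continue
            for nb in adj[path[-1]]:
                if nb not in path:
                    queue.append(path + [nb])
        return paths

    score = {n: 0.0 for n in nodes}
    for s, t in combinations(nodes, 2):
        paths = shortest_paths(s, t)
        for p in paths:
            for mid in p[1:-1]:
                score[mid] += 1.0 / len(paths)
    return score

# Chain a-b-c-d: the interior nodes mediate all long-range paths.
print(betweenness("abcd", [("a", "b"), ("b", "c"), ("c", "d")]))
```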

  13. Aeolian dust event in Korea observed by an EZ Lidar in the frame of global lidar networks.

    NASA Astrophysics Data System (ADS)

    Lolli, Simone

    2010-05-01

    Dust storms and sandstorms regularly devastate Northeast Asia and cause considerable damage to transportation systems and public health; furthermore, these events are considered one of the important indices for estimating global warming and desertification. Previously, yellow sand events were considered natural phenomena that originate in deserts and arid areas. However, the greater scale and frequency of these events in recent years are considered to be the result of human activities such as overgrazing and over-cultivation. Japan, Korea, China and Mongolia are directly concerned with preventing and controlling these storms and have been able to some extent to provide forecasts and early warnings. In this framework, to improve forecasting accuracy, a compact and rugged eye-safe lidar, the EZ LIDAR™, developed jointly by the Laboratoire des Sciences du Climat et l'Environnement (LSCE) (CEA-CNRS) and LEOSPHERE (France) to study structural and optical properties of clouds and aerosols, building on the strong know-how of CEA and CNRS in air quality measurements and cloud observation and analysis, was deployed in Seoul, Korea to detect and study yellow sand events using its depolarization channel and scanning capabilities. The preliminary results of this measurement campaign, presented in this paper, show that the EZ Lidar, given its ability to operate unattended day and night under all atmospheric conditions, is mature enough to be deployed in a global network to study long-range transport, which is crucial for forecasting models.

  14. Characterization and Analysis of Networked Array of Sensors for Event Detection (CANARY-EDS)

    2011-05-27

    CANARY-EDS provides probabilistic event detection based on analysis of time-series data from water quality or other sensors. CANARY can also compare patterns against a library of previously seen data to indicate that a certain pattern has reoccurred, suppressing what would otherwise be considered an event. CANARY can be configured to analyze previously recorded data from files or databases, or it can run in real-time mode directly from a database or through the US EPA EDDIES software.
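    A minimal sketch in the spirit of windowed, probabilistic event detection (illustrative only, not CANARY's actual algorithm): each reading is compared against a trailing window, and an event is declared only after several consecutive outliers, which is what suppresses isolated noise.

```python
from statistics import mean, pstdev

def detect_events(series, window=6, z_thresh=2.0, min_outliers=3):
    # Flag index i as an event when the last `min_outliers` readings
    # all deviated from their trailing-window baseline by more than
    # `z_thresh` standard deviations.
    events, recent = [], []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), pstdev(hist) or 1e-9
        outlier = abs(series[i] - mu) / sigma > z_thresh
        recent = (recent + [outlier])[-min_outliers:]
        if len(recent) == min_outliers and all(recent):
            events.append(i)
    return events

# A step change in a hypothetical water-quality signal:
print(detect_events([1.0] * 10 + [9.0] * 5, min_outliers=2))  # -> [11]
```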

  15. Age differences in the Attention Network Test: Evidence from behavior and event-related potentials.

    PubMed

    Williams, Ryan S; Biel, Anna Lena; Wegier, Pete; Lapp, Leann K; Dyson, Benjamin J; Spaniol, Julia

    2016-02-01

    The Attention Network Test (ANT) is widely used to capture group and individual differences in selective attention. Prior behavioral studies with younger and older adults have yielded mixed findings with respect to age differences in three putative attention networks (alerting, orienting, and executive control). To overcome the limitations of behavioral data, the current study combined behavioral and electrophysiological measures. Twenty-four healthy younger adults (aged 18-29years) and 24 healthy older adults (aged 60-76years) completed the ANT while EEG data were recorded. Behaviorally, older adults showed reduced alerting, but did not differ from younger adults in orienting or executive control. Electrophysiological components related to alerting and orienting (P1, N1, and CNV) were similar in both age groups, whereas components related to executive control (N2 and P3) showed age-related differences. Together these results suggest that comparisons of network effects between age groups using behavioral data alone may not offer a complete picture of age differences in selective attention, especially for alerting and executive control networks.
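    For reference, the three network scores the ANT yields are simple reaction-time contrasts (the standard subtractions from Fan et al.'s design); the sketch below applies them to invented mean RTs in milliseconds, not data from this study.

```python
def ant_network_effects(rt):
    # Standard ANT subtractions on per-condition mean reaction times (ms):
    return {
        "alerting": rt["no_cue"] - rt["double_cue"],        # cue presence
        "orienting": rt["center_cue"] - rt["spatial_cue"],  # cue location
        "executive": rt["incongruent"] - rt["congruent"],   # flanker conflict
    }

# Hypothetical condition means for illustration:
rts = {"no_cue": 560, "double_cue": 520, "center_cue": 540,
       "spatial_cue": 495, "incongruent": 640, "congruent": 555}
print(ant_network_effects(rts))
# -> {'alerting': 40, 'orienting': 45, 'executive': 85}
```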

  17. Social Network Changes and Life Events across the Life Span: A Meta-Analysis

    ERIC Educational Resources Information Center

    Wrzus, Cornelia; Hanel, Martha; Wagner, Jenny; Neyer, Franz J.

    2013-01-01

    For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-analysis on age-related social network…

  18. DEVELOPMENT, EVALUATION AND APPLICATION OF AN AUTOMATED EVENT PRECIPITATION SAMPLER FOR NETWORK OPERATION

    EPA Science Inventory

    In 1993, the University of Michigan Air Quality Laboratory (UMAQL) designed a new wet-only precipitation collection system that was utilized in the Lake Michigan Loading Study. The collection system was designed to collect discrete mercury and trace element samples on an event b...

  19. Event-Related fMRI of Category Learning: Differences in Classification and Feedback Networks

    ERIC Educational Resources Information Center

    Little, Deborah M.; Shin, Silvia S.; Sisco, Shannon M.; Thulborn, Keith R.

    2006-01-01

    Eighteen healthy young adults underwent event-related (ER) functional magnetic resonance imaging (fMRI) of the brain while performing a visual category learning task. The specific category learning task required subjects to extract the rules that guide classification of quasi-random patterns of dots into categories. Following each classification…

  20. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  2. Systematic observations of long-range transport events and climatological backscatter profiles with the DWD ceilometer network

    NASA Astrophysics Data System (ADS)

    Mattis, Ina; Müller, Gerhard; Wagner, Frank; Hervo, Maxime

    2015-04-01

    The German Meteorological Service (DWD) operates a network of about 60 CHM15K-Nimbus ceilometers for cloud base height observations. These very powerful ceilometers allow for the detection and characterization of aerosol layers. Raw data of all network ceilometers are transferred online to DWD's data analysis center at the Hohenpeißenberg Meteorological Observatory. There, the occurrence of aerosol layers from long-range transport events in the free troposphere is systematically monitored on a daily basis for each station. Where possible, the origin of the aerosol layers is determined manually from analysis of the meteorological situation and model output. We use backward trajectories as well as the output of the MACC and DREAM models to decide whether the observed layer originated in the Sahara region, from forest fires in North America, or from another, unknown source. Further, the magnitude of the observed layers is qualitatively estimated, taking into account the geometrical layer depth, signal intensity, model output, and nearby sun photometer or lidar observations (where available). All observed layers are attributed to one of the categories 'faint', 'weak', 'medium', 'strong', or 'extreme'. We started this analysis in August 2013 and plan to continue this systematic documentation of long-range transport events of aerosol layers to Germany on a long-term basis in the framework of our GAW activities. Most of the observed aerosol layers have been advected from the Sahara region to Germany. In the 15 months between August 2013 and November 2014 we observed on average 46 days with Sahara dust layers per station, but only 16 days with aerosol layers from forest fires. The occurrence of Sahara dust layers varies with latitude. We observed only 28 dusty days in the north, close to the coasts of the North Sea and Baltic Sea. In contrast, in southern Germany, in the Bavarian Pre-Alps and the Black Forest mountains, we observed up to 59 days with dust. At

  3. Evaluation of the U.S. Department of Energy Challenge Home Program Certification of Production Builders

    SciTech Connect

    Kerrigan, P.; Loomis, H.

    2014-09-01

    The purpose of this project was to evaluate integrated packages of advanced measures in individual test homes to assess their performance with respect to Building America program goals, specifically compliance with the DOE Challenge Home Program. BSC consulted on the construction of five test houses by three cold climate production builders in three U.S. cities and worked with the builders to develop a design package tailored to the cost-related impacts for each builder. BSC also provided support through performance testing of the five test homes. Overall, the builders have concluded that the energy-related upgrades (either through the prescriptive or performance path) represent reasonable upgrades. The builders commented that while not every improvement in specification was cost effective (as in a reasonable payback period), many were improvements that could improve the marketability of the homes and serve to attract prospective homeowners who value energy efficiency. However, the builders did express reservations about the associated checklists and added certifications. An increase in administrative time was observed with all builders. The checklists and certifications also inherently increase cost by adding services to the scope of work for various trades (such as the HERS rater and HVAC contractor) and by increasing material costs related to the checklists, especially the EPA Indoor airPLUS and EPA WaterSense® Efficient Hot Water Distribution requirements.

  4. Enriched encoding: reward motivation organizes cortical networks for hippocampal detection of unexpected events.

    PubMed

    Murty, Vishnu P; Adcock, R Alison

    2014-08-01

    Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expectancy violations. During functional magnetic resonance imaging, participants performed a reaction-time task in which goal-irrelevant expectancy violations were encountered during states of high- or low-reward motivation. Motivation amplified hippocampal activation to and declarative memory for expectancy violations. Connectivity of the ventral tegmental area (VTA) with medial prefrontal, ventrolateral prefrontal, and visual cortices preceded and predicted this increase in hippocampal sensitivity. These findings elucidate a novel mechanism whereby reward motivation can enhance hippocampus-dependent memory: anticipatory VTA-cortical-hippocampal interactions. Further, the findings integrate literatures on dopaminergic neuromodulation of prefrontal function and hippocampus-dependent memory. We conclude that during reward motivation, VTA modulation induces distributed neural changes that amplify hippocampal signals and records of expectancy violations to improve predictions-a potentially unique contribution of the hippocampus to reward learning.

  5. Response control networks are selectively modulated by attention to rare events and memory load regardless of the need for inhibition.

    PubMed

    Wijeakumar, Sobanawartiny; Magnotta, Vincent A; Buss, Aaron T; Ambrose, Joseph P; Wifall, Timothy A; Hazeltine, Eliot; Spencer, John P

    2015-10-15

    Recent evidence has sparked debate about the neural bases of response selection and inhibition. In the current study, we employed two reactive inhibition tasks, the Go/Nogo (GnG) and Simon tasks, to examine questions central to these debates. First, we investigated whether a fronto-cortical-striatal system was sensitive to the need for inhibition per se or the presentation of infrequent stimuli, by manipulating the proportion of trials that do not require inhibition (Go/Compatible trials) relative to trials that require inhibition (Nogo/Incompatible trials). A cortico-subcortical network composed of insula, putamen, and thalamus showed greater activation on salient and infrequent events, regardless of the need for inhibition. Thus, consistent with recent findings, key parts of the fronto-cortical-striatal system are engaged by salient events and do not appear to play a selective role in response inhibition. Second, we examined how the fronto-cortical-striatal system is modulated by working memory demands by varying the number of stimulus-response (SR) mappings. Right inferior parietal lobule showed decreasing activation as the number of SR mappings increased, suggesting that a form of associative memory - rather than working memory - might underlie performance in these tasks. A broad motor planning and control network showed similar trends that were also modulated by the number of motor responses required in each task. Finally, bilateral lingual gyri were more robustly engaged in the Simon task, consistent with the role of this area in shifts of visuo-spatial attention. The current study sheds light on how the fronto-cortical-striatal network is selectively engaged in reactive control tasks and how control is modulated by manipulations of attention and memory load. PMID:26190403

  7. Differential Network Analyses of Alzheimer’s Disease Identify Early Events in Alzheimer’s Disease Pathology

    DOE PAGES

    Xia, Jing; Rocke, David M.; Perry, George; Ray, Monika

    2014-01-01

    In late-onset Alzheimer’s disease (AD), multiple brain regions are not affected simultaneously. Comparing the gene expression of the affected regions to identify the differences in the biological processes perturbed can lead to greater insight into AD pathogenesis and early characteristics. We identified differentially expressed (DE) genes from single cell microarray data of four AD affected brain regions: entorhinal cortex (EC), hippocampus (HIP), posterior cingulate cortex (PCC), and middle temporal gyrus (MTG). We organized the DE genes in the four brain regions into region-specific gene coexpression networks. Differential neighborhood analyses in the coexpression networks were performed to identify genes with low topological overlap (TO) of their direct neighbors. The low TO genes were used to characterize the biological differences between two regions. Our analyses show that increased oxidative stress, along with alterations in lipid metabolism in neurons, may be some of the very early events occurring in AD pathology. Cellular defense mechanisms try to intervene but fail, finally resulting in AD pathology as the disease progresses. Furthermore, disease annotation of the low TO genes in two independent protein interaction networks revealed associations with cancer, diabetes, renal diseases, and cardiovascular diseases.
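
    The differential neighborhood analysis above compares the direct-neighbor topological overlap of a gene between two region-specific networks. A minimal sketch of one common TO variant (shared direct neighbors normalized by the smaller neighborhood; the study's exact formula may differ, and the gene names below are invented for illustration):

```python
# Hypothetical illustration: topological overlap of a node's direct
# neighborhoods in two coexpression networks (adjacency as dict of sets).
def neighborhood_overlap(net_a, net_b, gene):
    """Fraction of direct neighbors of `gene` shared between two networks.

    A low value flags genes whose local wiring differs between brain
    regions. This is one simple TO variant, not necessarily the exact
    formula used in the study.
    """
    na = net_a.get(gene, set())
    nb = net_b.get(gene, set())
    if not na or not nb:
        return 0.0
    return len(na & nb) / min(len(na), len(nb))

# Toy networks: EC vs HIP coexpression neighborhoods (made-up wiring).
ec = {"APP": {"APOE", "CLU", "PSEN1"}, "SOD1": {"CAT", "GPX1"}}
hip = {"APP": {"APOE", "MAPT"}, "SOD1": {"CAT", "GPX1"}}

print(neighborhood_overlap(ec, hip, "APP"))   # 1 shared of min(3, 2) -> 0.5
print(neighborhood_overlap(ec, hip, "SOD1"))  # identical neighborhoods -> 1.0
```

    Genes whose overlap falls near zero would be the candidates for the region-to-region characterization described in the abstract.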

  8. Stochastic switching in gene networks can occur by a single-molecule event or many molecular steps.

    PubMed

    Choi, Paul J; Xie, X Sunney; Shakhnovich, Eugene I

    2010-02-12

    Due to regulatory feedback, biological networks can exist stably in multiple states, leading to heterogeneous phenotypes among genetically identical cells. Random fluctuations in protein numbers, tuned by specific molecular mechanisms, have been hypothesized to drive transitions between these different states. We develop a minimal theoretical framework to analyze the limits of switching in terms of simple experimental parameters. Our model identifies and distinguishes between two distinct molecular mechanisms for generating stochastic switches. In one class of switches, the stochasticity of a single-molecule event, a specific and rare molecular reaction, directly controls the macroscopic change in a cell's state. In the second class, no individual molecular event is significant, and stochasticity arises from the propagation of biochemical noise through many molecular pathways and steps. As an example, we explore switches based on protein-DNA binding fluctuations and predict relations between transcription factor kinetics, absolute switching rate, robustness, and efficiency that differentiate between switching by single-molecule events or many molecular steps. Finally, we apply our methods to recent experimental data on switching in Escherichia coli lactose metabolism, providing quantitative interpretations of a single-molecule switching mechanism. PMID:19931280
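
    The two switch classes above leave different statistical fingerprints: a single rare reaction gives exponentially distributed switching times (coefficient of variation near 1), while many sequential steps give Erlang-like times with much lower relative variability. A small sketch of that distinction, with invented rates chosen so both mechanisms share the same mean switching time:

```python
import random
import statistics

random.seed(42)

def single_event_times(rate, n_samples):
    # Switch fires when one rare reaction occurs: exponential waiting time.
    return [random.expovariate(rate) for _ in range(n_samples)]

def multi_step_times(rate_per_step, n_steps, n_samples):
    # Switch requires n_steps sequential reactions: Erlang waiting time.
    return [sum(random.expovariate(rate_per_step) for _ in range(n_steps))
            for _ in range(n_samples)]

def cv(xs):
    # Coefficient of variation: standard deviation / mean.
    return statistics.pstdev(xs) / statistics.fmean(xs)

single = single_event_times(rate=0.1, n_samples=5000)
multi = multi_step_times(rate_per_step=1.0, n_steps=10, n_samples=5000)

# Same mean switching time (about 10), very different variability:
print(round(cv(single), 2))  # near 1.0 for a single-molecule event
print(round(cv(multi), 2))   # near 1/sqrt(10), about 0.32, for many steps
```

    Measuring the relative spread of switching times is thus one way, in the spirit of the framework above, to discriminate between the two mechanisms from data.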

  9. Design of simulation builder software to support the enterprise modeling and simulation task of the AMTEX program

    SciTech Connect

    Nolan, M.; Lamont, A.; Chang, L.

    1995-12-12

    This document describes the implementation of the Simulation Builder developed as part of the Enterprise Modeling and Simulation (EM&S) portion of the Demand Activated Manufacturing Architecture (DAMA) project. The Simulation Builder software allows users to develop simulation models using pre-defined modules from a library. The Simulation Builder provides the machinery to allow the modules to link together and communicate information during the simulation run. This report describes the basic capabilities and structure of the Simulation Builder to assist a user in reviewing and using the code. It also describes the basic steps to follow when developing modules to take advantage of the capabilities provided by the Simulation Builder. The Simulation Builder software is written in C++. The discussion in this report assumes a sound understanding of the C++ language. Although this report describes the steps to follow when using the Simulation Builder, it is not intended to be a tutorial for a user unfamiliar with C++.

  10. Network Systems Technician.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 17 subjects appropriate for use in a competency list for the occupation of network systems technician, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 17 units are as follows:…

  11. Automatic detection of epileptiform events in EEG by a three-stage procedure based on artificial neural networks.

    PubMed

    Acir, Nurettin; Oztura, Ibrahim; Kuntalp, Mehmet; Baklan, Bariş; Güzeliş, Cüneyt

    2005-01-01

    This paper introduces a three-stage procedure based on artificial neural networks for the automatic detection of epileptiform events (EVs) in a multichannel electroencephalogram (EEG) signal. In the first stage, two discrete perceptrons fed by six features are used to classify EEG peaks into three subgroups: 1) definite epileptiform transients (ETs); 2) definite non-ETs; and 3) possible ETs and possible non-ETs. The pre-classification done in the first stage not only reduces the computation time but also increases the overall detection performance of the procedure. In the second stage, the peaks falling into the third group are aimed to be separated from each other by a nonlinear artificial neural network that would function as a postclassifier whose input is a vector of 41 consecutive sample values obtained from each peak. Different networks, i.e., a backpropagation multilayer perceptron and two radial basis function networks trained by a hybrid method and a support vector method, respectively, are constructed as the postclassifier and then compared in terms of their classification performances. In the third stage, multichannel information is integrated into the system for contributing to the process of identifying an EV by the electroencephalographers (EEGers). After the integration of multichannel information, the overall performance of the system is determined with respect to EVs. Visual evaluation, by two EEGers, of 19 channel EEG records of 10 epileptic patients showed that the best performance is obtained with a radial basis support vector machine providing an average sensitivity of 89.1%, an average selectivity of 85.9%, and a false detection rate (per hour) of 7.5.
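
    The staged routing described above can be sketched as follows. This is a hypothetical skeleton only: the real system trains its perceptrons on EEG features and uses an RBF/SVM postclassifier on 41-sample waveforms, whereas the weights, thresholds, and placeholder rule below are invented for illustration.

```python
# Hedged sketch of the three-subgroup pre-classification in stage 1,
# with uncertain peaks forwarded to a stand-in stage-2 postclassifier.

def perceptron(weights, bias, features):
    # Linear threshold unit on a feature vector.
    return sum(w * f for w, f in zip(weights, features)) + bias > 0

def stage1_route(features, det_et, det_non_et):
    """Pre-classify a peak from six features into three subgroups."""
    if perceptron(*det_et, features):
        return "definite-ET"        # accepted without further computation
    if perceptron(*det_non_et, features):
        return "definite-non-ET"    # rejected without further computation
    return "uncertain"              # forwarded to the stage-2 classifier

def stage2_postclassify(waveform):
    """Stand-in for the nonlinear postclassifier on 41 samples per peak."""
    # Placeholder rule: classify on peak amplitude only (invented).
    return "ET" if max(waveform) > 0.8 else "non-ET"

# Invented perceptrons: (weights over six features, bias).
det_et = ([1.0, 0.5, 0.0, 0.0, 0.0, 0.0], -1.2)
det_non_et = ([-1.0, 0.0, 0.0, 0.0, 0.0, 0.5], 0.1)

sharp_peak = [0.9, 0.9, 0.1, 0.2, 0.0, 0.3]
ambiguous_peak = [0.5, 0.5, 0.0, 0.0, 0.0, 0.0]
print(stage1_route(sharp_peak, det_et, det_non_et))
print(stage1_route(ambiguous_peak, det_et, det_non_et))
```

    The benefit claimed in the abstract follows from this structure: only the "uncertain" peaks incur the cost of the nonlinear stage-2 classifier.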

  12. CTBT infrasound network performance to detect the 2013 Russian fireball event

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Le Pichon, Alexis; Mialle, Pierrick; Garcés, Milton A.

    2015-04-01

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty-International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and nondetections from short to long distances, using the Chelyabinsk meteorite as global reference event. Investigated parameters influencing the detection capability are the directivity of the line source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  13. A comparison between National Healthcare Safety Network laboratory-identified event reporting versus traditional surveillance for Clostridium difficile infection.

    PubMed

    Durkin, Michael J; Baker, Arthur W; Dicks, Kristen V; Lewis, Sarah S; Chen, Luke F; Anderson, Deverick J; Sexton, Daniel J; Moehring, Rebekah W

    2015-02-01

    OBJECTIVE: Hospitals in the National Healthcare Safety Network began reporting laboratory-identified (LabID) Clostridium difficile infection (CDI) events in January 2013. Our study quantified the differences between the LabID and traditional surveillance methods. DESIGN: Cohort study. SETTING: A cohort of 29 community hospitals in the southeastern United States. METHODS: A period of 6 months (January 1, 2013, to June 30, 2013) of prospectively collected data using both LabID and traditional surveillance definitions were analyzed. CDI events with mismatched surveillance categories between LabID and traditional definitions were identified and characterized further. Hospital-onset CDI (HO-CDI) rates for the entire cohort of hospitals were calculated using each method, then hospital-specific HO-CDI rates and standardized infection ratios (SIRs) were calculated. Hospital rankings based on each CDI surveillance measure were compared. RESULTS: A total of 1,252 incident LabID CDI events were identified during 708,551 patient-days; 286 (23%) mismatched CDI events were detected. The overall HO-CDI rate was 6.0 vs 4.4 per 10,000 patient-days for LabID and traditional surveillance, respectively (P<.001); of 29 hospitals, 25 (86%) detected a higher CDI rate using LabID compared with the traditional method. Hospital rank in the cohort differed greatly between surveillance measures. A rank change of at least 5 places occurred in 9 of 28 hospitals (32%) between LabID and traditional CDI surveillance methods, and for SIR. CONCLUSIONS: LabID surveillance resulted in a higher hospital-onset CDI incidence rate than did traditional surveillance. Hospital-specific rankings varied based on the HO-CDI surveillance measure used. A clear understanding of differences in CDI surveillance measures is important when interpreting national and local CDI data.
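
    The surveillance arithmetic behind the figures above is straightforward: a rate normalized per 10,000 patient-days, and an SIR comparing observed to expected counts. A minimal sketch (the per-hospital counts in the second example are hypothetical, not taken from the study):

```python
# Sketch of standard infection-surveillance arithmetic.

def rate_per_10k(events, patient_days):
    # Infection rate normalized per 10,000 patient-days.
    return 10_000 * events / patient_days

def sir(observed, expected):
    # Standardized infection ratio: > 1 means more infections than the
    # baseline risk-adjustment model predicts.
    return observed / expected

# Cohort-wide figures from the abstract: 1,252 incident LabID CDI events
# over 708,551 patient-days (all incident events, not only hospital-onset).
print(round(rate_per_10k(1252, 708551), 1))

# Hypothetical single hospital: 30 observed HO-CDI vs 24 expected.
print(round(sir(30, 24), 2))
```

    Because LabID counts more events as hospital-onset than chart-review surveillance does, the same hospital can land at a noticeably different rate, SIR, and cohort rank under the two definitions, which is the abstract's central point.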

  14. A twenty-first century California observing network for monitoring extreme weather events

    USGS Publications Warehouse

    White, A.B.; Anderson, M.L.; Dettinger, M.D.; Ralph, F.M.; Hinojosa, A.; Cayan, D.R.; Hartman, R.K.; Reynolds, D.W.; Johnson, L.E.; Schneider, T.L.; Cifelli, R.; Toth, Z.; Gutman, S.I.; King, C.W.; Gehrke, F.; Johnston, P.E.; Walls, C.; Mann, Dorte; Gottas, D.J.; Coleman, T.

    2013-01-01

    During Northern Hemisphere winters, the West Coast of North America is battered by extratropical storms. The impact of these storms is of paramount concern to California, where aging water supply and flood protection infrastructures are challenged by increased standards for urban flood protection, an unusually variable weather regime, and projections of climate change. Additionally, there are inherent conflicts between releasing water to provide flood protection and storing water to meet requirements for water supply, water quality, hydropower generation, water temperature and flow for at-risk species, and recreation. In order to improve reservoir management and meet the increasing demands on water, improved forecasts of precipitation, especially during extreme events, is required. Here we describe how California is addressing their most important and costliest environmental issue – water management – in part, by installing a state-of-the-art observing system to better track the area’s most severe wintertime storms.

  15. DOE Zero Energy Ready Home Case Study: Preferred Builders, Old Greenwich, Connecticut

    SciTech Connect

    none,

    2013-04-01

    The first Challenge Home built in New England features cool-roof shingles, HERS 20–42, and walls densely packed with blown fiberglass. This house won a 2013 Housing Innovation Award in the custom builder category.

  16. 77 FR 2310 - Notice of Submission of Proposed Information Collection to OMB; Builder's Certification/Guarantee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ... termite hazards. Builders certify and guarantee that all required treatment for termites is performed and that there is no infestation of treated areas for a year. Also,...

  17. CarbBuilder: Software for building molecular models of complex oligo- and polysaccharide structures.

    PubMed

    Kuttel, Michelle M; Ståhle, Jonas; Widmalm, Göran

    2016-08-15

    CarbBuilder is a portable software tool for producing three-dimensional molecular models of carbohydrates from the simple text specification of a primary structure. CarbBuilder can generate a wide variety of carbohydrate structures, ranging from monosaccharides to large, branched polysaccharides. Version 2.0 of the software, described in this article, supports monosaccharides of both mammalian and bacterial origin and a range of substituents for derivatization of individual sugar residues. This improved version has a sophisticated building algorithm to explore the range of possible conformations for a specified carbohydrate molecule. Illustrative examples of models of complex polysaccharides produced by CarbBuilder demonstrate the capabilities of the software. CarbBuilder is freely available under the Artistic License 2.0 from https://people.cs.uct.ac.za/~mkuttel/Downloads.html. © 2016 Wiley Periodicals, Inc.

  18. Best Practices Case Study: Devoted Builders, LLC, Mediterranean Villas, Pasco, WA

    SciTech Connect

    2010-12-01

    Devoted Builders of Kennewick, WA worked with Building America's BIRA team to achieve the 50% Federal tax credit level energy savings on 81 homes at its Mediterranean Villas community in eastern Washington.

  19. Building with passive solar: an application guide for the southern homeowner and builder

    SciTech Connect

    1981-03-01

    This instructional material was prepared for training workshops for builders and home designers. It includes: fundamental definitions and equations, climate and site studies, building components, passive systems and techniques, and design tools. (MHR)

  20. DETAIL OF CORNERSTONE, WHICH STATES "J.J. DANIELS, BUILDER 1861." NOTE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF CORNERSTONE, WHICH STATES "J.J. DANIELS, BUILDER 1861." NOTE ALSO IRON STRAP AT EAST CORNER OF ABUTMENT. - Jackson Covered Bridge, Spanning Sugar Creek, CR 775N (Changed from Spanning Sugar Creek), Bloomingdale, Parke County, IN

  2. Building America Top Innovations 2012: DOE Challenge Home (Formerly Builders Challenge)

    SciTech Connect

    none,

    2013-01-01

    This Building America Top Innovations profile describes DOE’s Builders Challenge. Hundreds of leading-edge builders across the country signed on to the Challenge and more than 14,000 homes earned the label, saving homeowners over $10 million a year in utility bills. DOE’s new program, the DOE Challenge Home, increases the rigor of the guidelines including requiring homes to be Zero Net-Energy Ready.

  3. ABodyBuilder: Automated antibody structure prediction with data–driven accuracy estimation

    PubMed Central

    Leem, Jinwoo; Dunbar, James; Georges, Guy; Shi, Jiye; Deane, Charlotte M.

    2016-01-01

    Computational modeling of antibody structures plays a critical role in therapeutic antibody design. Several antibody modeling pipelines exist, but no freely available methods currently model nanobodies, provide estimates of expected model accuracy, or highlight potential issues with the antibody's experimental development. Here, we describe our automated antibody modeling pipeline, ABodyBuilder, designed to overcome these issues. The algorithm itself follows the standard 4 steps of template selection, orientation prediction, complementarity-determining region (CDR) loop modeling, and side chain prediction. ABodyBuilder then annotates the ‘confidence’ of the model as a probability that a component of the antibody (e.g., CDRL3 loop) will be modeled within a root-mean-square deviation threshold. It also flags structural motifs on the model that are known to cause issues during in vitro development. ABodyBuilder was tested on 4 separate datasets, including the 11 antibodies from the Antibody Modeling Assessment-II competition. ABodyBuilder builds models that are of similar quality to other methodologies, with sub-Angstrom predictions for the ‘canonical’ CDR loops. Its ability to model nanobodies, and rapidly generate models (∼30 seconds per model) widens its potential usage. ABodyBuilder can also help users in decision-making for the development of novel antibodies because it provides model confidence and potential sequence liabilities. ABodyBuilder is freely available at http://opig.stats.ox.ac.uk/webapps/abodybuilder. PMID:27392298

  4. Fast oscillations in cortical-striatal networks switch frequency following rewarding events and stimulant drugs

    PubMed Central

    Berke, J.D.

    2009-01-01

    Oscillations may organize communication between components of large-scale brain networks. Although gamma-band oscillations have been repeatedly observed in cortical-basal ganglia circuits, their functional roles are not yet clear. Here I show that, in behaving animals, distinct frequencies of ventral striatal local field potential oscillations show coherence with different cortical inputs. ~50Hz gamma oscillations that normally predominate in awake ventral striatum are coherent with piriform cortex, while ~80-100Hz high-gamma oscillations are consistently coherent with frontal cortex. Within striatum, entrainment to gamma rhythms is selective to fast-spiking interneurons (FSIs), with distinct FSI populations entrained to different gamma frequencies. Administration of the psychomotor stimulant amphetamine or the dopamine agonist apomorphine causes a prolonged decrease in ~50Hz power and increase in ~80-100Hz power. The same frequency switch is observed for shorter epochs spontaneously in awake, undrugged animals, and is consistently provoked for <1s following reward receipt. Individual striatal neurons can participate in these brief high-gamma bursts with, or without, substantial changes in firing rate. Switching between discrete oscillatory states may allow different modes of information processing during decision-making and reinforcement-based learning, and may also be an important systems-level process by which stimulant drugs affect cognition and behavior. PMID:19659455

  5. Modeling the Energy Performance of Event-Driven Wireless Sensor Network by Using Static Sink and Mobile Sink

    PubMed Central

    Chen, Jiehui; Salim, Mariam B.; Matsumoto, Mitsuji

    2010-01-01

    Wireless Sensor Networks (WSNs) designed for mission-critical applications suffer from limited sensing capacities, particularly fast energy depletion. Regarding this, mobile sinks can be used to balance the energy consumption in WSNs, but the frequent location updates of the mobile sinks can lead to data collisions and rapid energy consumption for some specific sensors. This paper explores an optimal barrier coverage based sensor deployment for event driven WSNs where a dual-sink model was designed to evaluate the energy performance of not only static sensors, but also the Static Sink (SS) and Mobile Sinks (MSs) simultaneously, based on parameters such as sensor transmission range r and the velocity of the mobile sink v, etc. Moreover, an MS mobility model was developed to enable the SS and MSs to collaborate effectively, while achieving spatiotemporal energy performance efficiency by using the knowledge of the cumulative density function (cdf), Poisson process and M/G/1 queue. The simulation results clearly demonstrated the improved energy performance of the whole network and showed that our eDSA algorithm is more efficient than the static-sink model, reducing energy consumption by approximately half. Moreover, we demonstrate that our results are robust to realistic sensing models and validate their correctness through extensive simulations. PMID:22163503
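
    Energy models of this kind typically lean on classical M/G/1 results such as the Pollaczek-Khinchine mean-waiting-time formula. A minimal sketch of that formula (symbols and the sanity-check numbers are generic, not taken from the paper):

```python
# Pollaczek-Khinchine formula for the mean queueing delay of an M/G/1
# queue: W = lambda * E[S^2] / (2 * (1 - rho)), with rho = lambda * E[S].

def mg1_mean_wait(arrival_rate, mean_service, second_moment_service):
    """Mean time a packet waits in queue before service (M/G/1)."""
    rho = arrival_rate * mean_service
    if rho >= 1:
        raise ValueError("queue is unstable: utilization >= 1")
    return arrival_rate * second_moment_service / (2 * (1 - rho))

# Sanity check against M/M/1: exponential service with mean 1/mu has
# E[S^2] = 2/mu^2, so W should reduce to rho / (mu - lambda).
lam, mu = 0.5, 1.0
w = mg1_mean_wait(lam, 1 / mu, 2 / mu**2)
print(w)  # rho / (mu - lambda) = 0.5 / 0.5 = 1.0
```

    In a sink model, the arrival rate would come from the event process at the sensors and the service moments from the transmission-time distribution, so delay and the energy spent holding packets can be traded off against sink velocity.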

  6. Structuring the Future: Anticipated Life Events, Peer Networks, and Adolescent Sexual Behavior

    PubMed Central

    Soller, Brian; Haynie, Dana L.

    2013-01-01

    While prior research has established associations between individual expectations of future events and risk behavior among adolescents, the potential effects of peers’ future perceptions on risk-taking have been overlooked. We extend prior research by testing whether peers’ anticipation of college completion is associated with adolescent sexual risk-taking. We also examine whether adolescents’ perceptions of the negative consequences of pregnancy and idealized romantic relationship scripts mediate the association between peers’ anticipation of college completion and sexual risk-taking. Results from multivariate regression models with data from the National Longitudinal Study of Adolescent Health (Add Health) indicate peers’ anticipation of college completion is negatively associated with a composite measure of sexual risk-taking and positively associated with the odds of abstaining from sexual intercourse and only engaging in intercourse with a romantic partner (compared to having intercourse with a non-romantic partner). In addition, perceptions of the negative consequences of pregnancy and sexualized relationship scripts appear to mediate a large portion of the association between peers’ anticipation of future success and sexual risk-taking and the likelihood of abstaining (but not engaging in romantic-only intercourse). Results from our study underscore the importance of peers in shaping adolescent sexual behavior. PMID:24223438

  7. Librarians as Knowledge Builders: Strategic Partnering for Service and Advocacy

    SciTech Connect

    Kreitz, P

    2003-12-15

    In their article on the challenges facing the postmodern library, authors Elteto and Frank warn that "the relevancy of academic libraries are at stake as a result of dramatic budget reductions and ongoing changes in the use of libraries." Recognizing the fiscal crisis facing libraries, many leaders in the profession are calling for libraries to strengthen their core roles in supporting campus research, teaching, and learning and to become more proactive and effective communicators of the critical role the library plays in supporting institutional goals. Responding to this difficult period facing academia and interested in highlighting the creative ways academic libraries around the country are responding, ACRL President Tyrone Cannon has chosen "Partnerships and Connections: the Learning Community as Knowledge Builders" as the theme for his presidential year. His intention is to foster opportunities for libraries to "play a key role in developing, defining and enhancing learning communities central to campus life." Focusing our efforts on supporting the core business of academia will ensure that academic libraries continue to be places of "opportunity, interaction, serendipity and strong collections and remain central to the knowledge building process."

  8. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control of single-input and single-output uncertain nonlinear discrete-time systems under event-sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned, in an aperiodic manner at the event-sampled instants. After reviewing the NN approximation property with event-sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event-sampled context. The SE is viewed as a model, and its approximated dynamics and the state vector between any two events are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event-sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time, with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
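    The dead-zone trigger mechanics described in this abstract can be illustrated with a deliberately simple linear toy system. This is an illustrative sketch, not the paper's NN controller: the system, the gains, and the fixed 0.05 threshold are arbitrary choices (the paper's threshold is adaptive and NN-based).

```python
# Toy event-triggered feedback for x_{k+1} = 0.9*x_k + u_k: the controller
# only receives the state at event instants, i.e. when the gap between the
# current state and the last transmitted state leaves a dead-zone.
x = 5.0                # initial state
x_last = x             # last transmitted (event-sampled) state
threshold = 0.05       # dead-zone width (fixed here; adaptive in the paper)
events = 0
for k in range(200):
    if abs(x - x_last) > threshold:   # trigger condition: transmit the state
        x_last = x
        events += 1
    u = -0.5 * x_last                 # feedback uses event-sampled state only
    x = 0.9 * x + u

print(f"{events} events in 200 steps, final state {x:.4f}")
```

    Matching the behavior the abstract describes, events cluster early while the state is large and stop once the state settles inside the dead-zone, so the state stays ultimately bounded rather than being driven exactly to zero.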

  9. Life events, social network, life-style, and health: an analysis of the 1979 National Survey of Personal Health Practices and Consequences.

    PubMed

    Gottlieb, N H; Green, L W

    1984-01-01

    The relationships among social structure, stress, social support, life-style health behavior, and health status are explored in this multivariate analysis of data from the National Survey of Personal Health Practices and Consequences. Path analyses showed social structural factors to influence life-style practices both directly and indirectly through social network and negative life events. For women, social network and life events had direct relationships to health related life-style practices, while age and income acted both directly and indirectly through social network and, for income, through life events. Education was also directly related to life-style. For men, social network and education had the only direct effects on health practices, and age and income had indirect effects through network. We then examined the relative contributions of the social network index elements, life events, and demographic variables to each of the life-style practices. These analyses confirmed the importance of gender, education, age, and income to predicting life-style behaviors. Negative life events were associated with smoking for both men and women, sleep for women only, and physical activity and alcohol use for men, which suggests sex-specific norms for coping with stress. For both sexes, church attendance and marriage were associated with favorable smoking and alcohol use, implicating cognitive social support or social control as a mediator of health promotion. Finally, analyses for each gender using health status as the outcome variable indicated that age, income, education, and life events affected health directly, while the effects of church attendance and marriage were likely mediated through smoking and alcohol behaviors.

  10. Network analysis of possible anaphylaxis cases reported to the US vaccine adverse event reporting system after H1N1 influenza vaccine.

    PubMed

    Botsis, Taxiarchis; Ball, Robert

    2011-01-01

    The identification of signals from spontaneous reporting systems plays an important role in monitoring the safety of medical products. Network analysis (NA) allows the representation of complex interactions among the key elements of such systems. We developed a network for a subset of the US Vaccine Adverse Event Reporting System (VAERS) by representing the vaccines/adverse events (AEs) and their interconnections as the nodes and the edges, respectively; the subset we focused on comprised the possible anaphylaxis reports submitted for the H1N1 influenza vaccine. Subsequently, we calculated the main metrics that characterize the connectivity of the nodes and applied the island algorithm to identify the densest region in the network and, thus, potential safety signals. AEs associated with anaphylaxis formed a dense region in the 'anaphylaxis' network, demonstrating the strength of NA techniques for pattern recognition. Additional validation and development of this approach are needed to improve future pharmacovigilance efforts. PMID:21893812
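    The construction sketched in this abstract, with vaccines and AEs as nodes and co-reports as weighted edges, can be mocked up in a few lines. The reports below are invented for illustration and are not VAERS data; the island-algorithm step is only hinted at by picking the strongest edge.

```python
from collections import defaultdict

# Hypothetical reports: (vaccine, [adverse events mentioned in the report]).
reports = [
    ("H1N1", ["urticaria", "dyspnoea", "hypotension"]),
    ("H1N1", ["urticaria", "dyspnoea"]),
    ("H1N1", ["urticaria"]),
    ("SEASONAL_FLU", ["injection site pain"]),
]

# Edge weight = number of reports co-mentioning the vaccine/AE pair.
edge_weight = defaultdict(int)
for vaccine, aes in reports:
    for ae in aes:
        edge_weight[(vaccine, ae)] += 1

# Node degree = number of distinct neighbours; the densest, highest-weight
# neighbourhood is what the island algorithm would single out as a signal.
degree = defaultdict(set)
for v, ae in edge_weight:
    degree[v].add(ae)
    degree[ae].add(v)

strongest = max(edge_weight.items(), key=lambda kv: kv[1])
print(len(degree["H1N1"]), strongest)
```

    On this toy input the H1N1 node connects to three AEs and its strongest edge is the urticaria pair, mirroring how a dense AE neighbourhood around one vaccine surfaces as a candidate signal.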

  11. Data scanner and event builder for the SMVD of the KEK B-factory

    SciTech Connect

    Tanaka, Manobu; Ikeda, Hirokazu; Ikeda, Mitsuo; Fujita, Youichi; Ozaki, Hitoshi . Physics Div.); Fukunaga, Chikara . Dept. of Physics)

    1994-02-01

    The authors are designing a readout system for a silicon microstrip vertex detector (SMVD) to be used in the KEK B-factory experiment. In order to obtain optimum numbers for the buffer size and the processing time, they have constructed a simplified model using the Verilog Hardware Description Language (HDL). They estimated the live-time fractions of the SMVD readout system for various configurations.

  12. Transitioning to High Performance Homes: Successes and Lessons Learned From Seven Builders

    SciTech Connect

    Widder, Sarah H.; Kora, Angela R.; Baechler, Michael C.; Fonorow, Ken; Jenkins, David W.; Stroer, Dennis

    2013-03-01

    As homebuyers become increasingly concerned about rising energy costs and the impact of fossil fuels as a major source of greenhouse gases, the returning new home market is beginning to demand energy-efficient and comfortable high-performance homes. In response, some innovative builders are gaining market share because they are able to market their homes’ comfort, better indoor air quality, and aesthetics, in addition to energy efficiency. The success and marketability of these high-performance homes is creating a builder demand for house plans and information about how to design, build, and sell their own low-energy homes. To help make these and other builders more successful in the transition to high-performance construction techniques, Pacific Northwest National Laboratory (PNNL) partnered with seven interested builders in the hot-humid and mixed-humid climates to provide technical and design assistance through two building science firms, Florida Home Energy and Resources Organization (FL HERO) and Calcs-Plus, and a designer that offers a line of stock plans designed specifically for energy efficiency, called Energy Smart Home Plans (ESHP). This report summarizes the findings of research on cost-effective high-performance whole-house solutions, focusing on real-world implementation and challenges and identifying effective solutions. The ensuing sections provide project background; profile each of the builders who participated in the program; describe their houses’ construction characteristics and the key challenges the builders encountered during the construction and transaction process; and present primary lessons learned to be applied to future projects. As a result of this technical assistance, 17 homes have been built featuring climate-appropriate efficient envelopes, ducts in conditioned space, and correctly sized and controlled heating, ventilation, and air-conditioning systems. In addition, most builders intend to integrate high

  13. Automated 3D Damaged Cavity Model Builder for Lower Surface Acreage Tile on Orbiter

    NASA Technical Reports Server (NTRS)

    Belknap, Shannon; Zhang, Michael

    2013-01-01

    The 3D Automated Thermal Tool for Damaged Acreage Tile Math Model builder was developed to quickly and accurately perform 3D thermal analyses on damaged lower surface acreage tiles, and on the structures beneath the damaged locations, on a Space Shuttle Orbiter. The 3D model builder created both TRASYS geometric math models (GMMs) and SINDA thermal math models (TMMs) to simulate an idealized damaged cavity in the damaged tile(s). The GMMs are processed in TRASYS to generate radiation conductors between the surfaces in the cavity. The radiation conductors are inserted into the TMMs, which are processed in SINDA to generate temperature histories for all of the nodes on each layer of the TMM. The invention allows a thermal analyst to quickly and accurately create a 3D model of a damaged lower surface tile on the orbiter. The 3D model builder can generate a GMM and the corresponding TMM in one or two minutes, with the damaged cavity included in the tile material. A separate program creates a configuration file, which takes a couple of minutes to edit. This configuration file is read by the model builder program to determine the location of the damage, the correct tile type, the tile thickness, structure thickness, and SIP thickness at the damage site, so that the model builder program can build an accurate model at the specified location. Once the models are built, they are processed by TRASYS and SINDA.

  14. Widespread prevalence of cryptic Symbiodinium D in the key Caribbean reef builder, Orbicella annularis

    NASA Astrophysics Data System (ADS)

    Kennedy, Emma V.; Foster, Nicola L.; Mumby, Peter J.; Stevens, Jamie R.

    2015-06-01

    Symbiodinium D, a relatively rare clade of algal endosymbiont with a global distribution, has attracted interest as some of its sub-cladal types induce increased thermal tolerance and associated trade-offs, including reduced growth rate in its coral hosts. Members of Symbiodinium D are increasingly reported to comprise low-abundance 'cryptic' (<10 %) proportions of mixed coral endosymbiont communities, with unknown ecological implications. Real-time PCR (RT-PCR) targeted to specific types is sufficiently sensitive to detect these background symbiont levels. In this study, RT-PCR was employed to screen 552 colonies of the key Caribbean reef builder Orbicella annularis sampled across a 5.4 million km² range for the presence of cryptic Symbiodinium 'D1' (i.e., the principal Caribbean ITS2 variants, D1 and D1-4). All but one out of 33 populations analysed were shown to host low abundances of Symbiodinium D1, with an average of >30 % of corals per site found to harbour the symbiont. When the same samples were analysed using the conventional screening technique, denaturing gradient gel electrophoresis, Symbiodinium D1 was only detected in 12 populations and appeared to be hosted by <12 % of colonies where present (in agreement with other reported low prevalence/absences in O. annularis). Cryptic Symbiodinium D1 showed a mainly uniform distribution across the wider Caribbean region, although significantly more Mesoamerican Barrier Reef corals hosted cryptic Symbiodinium D1 than might be expected by chance, possibly as a consequence of intense warming in the region in 1998. Widespread prevalence of thermally tolerant Symbiodinium in O. annularis may potentially reflect a capacity for the coral to temporarily respond to warming events through symbiont shuffling. However, association with reduced coral calcification means that the ubiquitous nature of Symbiodinium D1 in O. annularis populations is unlikely to prevent long-term declines in reef health, at a time when

  15. Computer (PC/Network) Coordinator.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 22 subjects appropriate for use in a competency list for the occupation of computer (PC/network) coordinator, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 22 units are as…

  16. DOE Zero Energy Ready Home Case Study: New Town Builders — The ArtiZEN Plan, Denver, CO

    SciTech Connect

    none,

    2014-09-01

    The Grand Winner in the Production Builder category of the 2014 Housing Innovation Awards, this builder plans to convert all of its product lines to DOE Zero Energy Ready Home construction by the end of 2015. This home achieves HERS 38 without photovoltaics (PV) and HERS -3 with 8.0 kW of PV.

  17. Impaired target detection in schizophrenia and the ventral attentional network: Findings from a joint event-related potential-functional MRI analysis.

    PubMed

    Wynn, Jonathan K; Jimenez, Amy M; Roach, Brian J; Korb, Alexander; Lee, Junghee; Horan, William P; Ford, Judith M; Green, Michael F

    2015-01-01

    Schizophrenia patients have abnormal neural responses to salient, infrequent events. We integrated event-related potentials (ERP) and fMRI to examine the contributions of the ventral (salience) and dorsal (sustained) attention networks to this dysfunctional neural activation. Twenty-one schizophrenia patients and 22 healthy controls were assessed in separate sessions with ERP and fMRI during a visual oddball task. Visual P100, N100, and P300 ERP waveforms and fMRI activation were assessed. A joint independent components analysis (jICA) on the ERP and fMRI data was conducted. Patients exhibited reduced P300, but not P100 or N100, amplitudes to targets and reduced fMRI neural activation in both dorsal and ventral attentional networks compared with controls. However, the jICA revealed that the P300 was linked specifically to activation in the ventral (salience) network, including anterior cingulate, anterior insula, and temporal parietal junction, with patients exhibiting significantly lower activation. The P100 and N100 were linked to activation in the dorsal (sustained) network, with no group differences in level of activation. This joint analysis approach revealed the nature of target detection deficits that were not discernable by either imaging methodology alone, highlighting the utility of a multimodal fMRI and ERP approach to understand attentional network deficits in schizophrenia. PMID:26448909
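    The "joint" step of a joint decomposition like jICA amounts to concatenating each subject's ERP and fMRI features into a single row before decomposing, so the recovered components jointly span both modalities. A dependency-free sketch with synthetic data; SVD is used below as a stand-in for the ICA step, and all sizes and signal levels are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_erp, n_fmri = 22, 50, 200

# One shared latent source drives both modalities (plus noise).
source = rng.normal(size=(n_subjects, 1))
erp = source @ rng.normal(size=(1, n_erp)) + 0.1 * rng.normal(size=(n_subjects, n_erp))
fmri = source @ rng.normal(size=(1, n_fmri)) + 0.1 * rng.normal(size=(n_subjects, n_fmri))

# Joint matrix: subjects x (ERP features + fMRI features), mean-centered.
joint = np.hstack([erp, fmri])
joint -= joint.mean(axis=0)

# First joint component: subject loadings U[:, 0] plus one pattern Vt[0]
# that splits back into an ERP part and an fMRI part.
U, s, Vt = np.linalg.svd(joint, full_matrices=False)
erp_pattern, fmri_pattern = Vt[0, :n_erp], Vt[0, n_erp:]

# The recovered subject loadings should track the true shared source.
r = abs(np.corrcoef(U[:, 0], source[:, 0])[0, 1])
print(f"|corr(loading, source)| = {r:.2f}")
```

    The point of the concatenation is visible in the split of `Vt[0]`: a single set of subject loadings is tied simultaneously to an ERP pattern and an fMRI pattern, which is how jICA links an ERP waveform (here, the P300) to activation in a specific network.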

  18. Impaired target detection in schizophrenia and the ventral attentional network: Findings from a joint event-related potential–functional MRI analysis

    PubMed Central

    Wynn, Jonathan K.; Jimenez, Amy M.; Roach, Brian J.; Korb, Alexander; Lee, Junghee; Horan, William P.; Ford, Judith M.; Green, Michael F.

    2015-01-01

    Schizophrenia patients have abnormal neural responses to salient, infrequent events. We integrated event-related potentials (ERP) and fMRI to examine the contributions of the ventral (salience) and dorsal (sustained) attention networks to this dysfunctional neural activation. Twenty-one schizophrenia patients and 22 healthy controls were assessed in separate sessions with ERP and fMRI during a visual oddball task. Visual P100, N100, and P300 ERP waveforms and fMRI activation were assessed. A joint independent components analysis (jICA) on the ERP and fMRI data was conducted. Patients exhibited reduced P300, but not P100 or N100, amplitudes to targets and reduced fMRI neural activation in both dorsal and ventral attentional networks compared with controls. However, the jICA revealed that the P300 was linked specifically to activation in the ventral (salience) network, including anterior cingulate, anterior insula, and temporal parietal junction, with patients exhibiting significantly lower activation. The P100 and N100 were linked to activation in the dorsal (sustained) network, with no group differences in level of activation. This joint analysis approach revealed the nature of target detection deficits that were not discernable by either imaging methodology alone, highlighting the utility of a multimodal fMRI and ERP approach to understand attentional network deficits in schizophrenia. PMID:26448909

  19. New Whole-House Solutions Case Study: New Town Builders' Power of Zero Energy Center - Denver, Colorado

    SciTech Connect

    2014-10-01

    New Town Builders, a builder of energy efficient homes in Denver, Colorado, offers a zero energy option for all the homes it builds. To attract a wide range of potential homebuyers to its energy efficient homes, New Town Builders created a "Power of Zero Energy Center" linked to its model home in the Stapleton community. This case study presents New Town Builders' marketing approach, which is targeted to appeal to homebuyers' emotions rather than overwhelming homebuyers with scientific details about the technology. The exhibits in the Power of Zero Energy Center focus on reduced energy expenses for the homeowner, improved occupant comfort, the reputation of the builder, and the fact that homebuyers need not sacrifice their desired design features to achieve zero net energy in the home. This case study also contains customer and realtor testimonials related to the effectiveness of the Center in influencing homebuyers to purchase a zero energy home.

  20. Regional seismic event identification and improved locations with small arrays and networks. Final report, 7 May 1993-30 September 1995

    SciTech Connect

    Vernon, F.L.; Minster, J.B.; Orcutt, J.A.

    1995-09-20

    This final report contains a summary of our work on the use of seismic networks and arrays to improve locations and identify small seismic events. We have developed techniques to migrate 3-component array records of local, regional, and teleseismic wavetrains to directly image buried two- and three-dimensional heterogeneities (e.g., layer irregularities, volumetric heterogeneities) in the vicinity of the array. We have developed a technique to empirically characterize local and regional seismic coda by binning and stacking network recordings of dense aftershock sequences. The principal motivation for this work was to look for robust coda phases dependent on source depth. We have extended our ripple-fired event discriminant (based on the time-independence of coda produced by ripple firing) by looking for an independence of the coda from the recording direction (also indicative of ripple firing).
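    The "binning and stacking" step mentioned above can be sketched on synthetic records: many aftershock recordings share a coherent, slowly decaying coda, each buried in independent noise, and averaging the aligned traces suppresses the noise by roughly the square root of the number of records. The sampling rate, decay constant, and record counts below are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n_records, n_samples = 64, 500
t = np.arange(n_samples) / 50.0                      # assumed 50 Hz sampling
coda = np.exp(-0.5 * t) * np.sin(2 * np.pi * 3 * t)  # common coda shape

# Each synthetic record = shared coda + independent unit-variance noise.
records = coda + rng.normal(scale=1.0, size=(n_records, n_samples))
stack = records.mean(axis=0)                          # the stacked trace

# Residual noise before vs after stacking.
noise_single = np.std(records[0] - coda)
noise_stack = np.std(stack - coda)
ratio = noise_single / noise_stack
print(f"noise suppression: {ratio:.1f}x (sqrt(64) = 8 expected)")
```

    The stacked trace preserves the common coda shape while the incoherent noise averages out, which is what makes depth-dependent coda phases visible in the binned stacks.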

  1. 14 CFR 65.104 - Repairman certificate-experimental aircraft builder-Eligibility, privileges and limitations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrator that the individual has the requisite skill to determine whether the aircraft is in a condition... country who has lawfully been admitted for permanent residence in the United States. (b) The holder of a repairman certificate (experimental aircraft builder) may perform condition inspections on the...

  2. CHARMM-GUI HMMM Builder for Membrane Simulations with the Highly Mobile Membrane-Mimetic Model.

    PubMed

    Qi, Yifei; Cheng, Xi; Lee, Jumin; Vermaas, Josh V; Pogorelov, Taras V; Tajkhorshid, Emad; Park, Soohyung; Klauda, Jeffery B; Im, Wonpil

    2015-11-17

    Slow diffusion of the lipids in conventional all-atom simulations of membrane systems makes it difficult to sample large rearrangements of lipids and protein-lipid interactions. Recently, Tajkhorshid and co-workers developed the highly mobile membrane-mimetic (HMMM) model with accelerated lipid motion by replacing the lipid tails with small organic molecules. The HMMM model accelerates lipid diffusion by one to two orders of magnitude, and is particularly useful in studying membrane-protein associations. However, building an HMMM simulation system is not easy, as it requires sophisticated treatment of the lipid tails. In this study, we have developed CHARMM-GUI HMMM Builder (http://www.charmm-gui.org/input/hmmm) to provide users with ready-to-go input files for simulating HMMM membrane systems with/without proteins. Various lipid-only and protein-lipid systems are simulated to validate the quality of the systems generated by HMMM Builder, with a focus on the basic properties and advantages of the HMMM model. HMMM Builder supports all lipid types available in CHARMM-GUI and also provides a module to convert back and forth between an HMMM membrane and a full-length membrane. We expect HMMM Builder to be a useful tool in studying membrane systems with enhanced lipid diffusion. PMID:26588561

  3. Heating, Ventilation, and Air Conditioning Design Strategy for a Hot-Humid Production Builder

    SciTech Connect

    Kerrigan, P.

    2014-03-01

    BSC worked directly with the David Weekley Homes - Houston division to redesign three floor plans in order to locate the HVAC system in conditioned space. The purpose of this project is to develop a cost effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses. This is in preparation for the upcoming code changes in 2015. The builder wishes to develop an upgrade package that will allow for a seamless transition to the new code mandate. The following research questions were addressed by this research project: 1. What is the most cost effective, best performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one and two story single family detached residences? 2. What is a cost effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post construction actual cost? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic to conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.

  4. Captivate MenuBuilder: Creating an Online Tutorial for Teaching Software

    ERIC Educational Resources Information Center

    Yelinek, Kathryn; Tarnowski, Lynn; Hannon, Patricia; Oliver, Susan

    2008-01-01

    In this article, the authors, students in an instructional technology graduate course, describe a process to create an online tutorial for teaching software. They created the tutorial for a cyber school's use. Five tutorial modules were linked together through one menu screen using the MenuBuilder feature in the Adobe Captivate program. The…

  5. BioBuilder as a database development and functional annotation platform for proteins

    PubMed Central

    Navarro, J Daniel; Talreja, Naveen; Peri, Suraj; Vrushabendra, BM; Rashmi, BP; Padma, N; Surendranath, Vineeth; Jonnalagadda, Chandra Kiran; Kousthub, PS; Deshpande, Nandan; Shanker, K; Pandey, Akhilesh

    2004-01-01

    Background The explosion in biological information creates the need for databases that are easy to develop, easy to maintain and can be easily manipulated by annotators who are most likely to be biologists. However, deployment of scalable and extensible databases is not an easy task and generally requires substantial expertise in database development. Results BioBuilder is a Zope-based software tool that was developed to facilitate intuitive creation of protein databases. Protein data can be entered and annotated through web forms, along with the flexibility to add customized annotation features to protein entries. A built-in review system permits a global team of scientists to coordinate their annotation efforts. We have already used BioBuilder to develop the Human Protein Reference Database, a comprehensive annotated repository of the human proteome. The data can be exported in extensible markup language (XML) format, which is rapidly becoming the standard format for data exchange. Conclusions As the proteomic data for several organisms begin to accumulate, BioBuilder will prove to be an invaluable platform for functional annotation and development of customizable protein-centric databases. BioBuilder is open source and is available under the terms of the LGPL. PMID:15099404

  6. Improving Water Management: Applying ModelBuilder to site water impoundments using AEM survey data

    SciTech Connect

    Sams, J.I.; Lipinski, B.A.; Harbert, W.P.; Ackman, T.E.

    2007-01-01

    ArcGIS ModelBuilder was used to create a GIS-based decision support model that incorporated digital elevation data and helicopter-borne electromagnetic geophysical survey results to screen potential sites for disposal impoundments for water produced during coal bed natural gas extraction.

  7. HVAC Design Strategy for a Hot-Humid Production Builder, Houston, Texas (Fact Sheet)

    SciTech Connect

    Not Available

    2014-03-01

    BSC worked directly with the David Weekley Homes - Houston division to redesign three floor plans in order to locate the HVAC system in conditioned space. The purpose of this project is to develop a cost effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses. This is in preparation for the upcoming code changes in 2015. The builder wishes to develop an upgrade package that will allow for a seamless transition to the new code mandate. The following research questions were addressed by this research project: 1. What is the most cost effective, best performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one and two story single family detached residences? 2. What is a cost effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post construction actual cost? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic to conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.

  8. Three Adapted Science Skill Builders for Junior and Senior High School Orthopaedically Handicapped Students.

    ERIC Educational Resources Information Center

    Cardullias, Peter J.; And Others

    The study was designed to determine how standard science skill builder activities can be modified or adapted for use by orthopedically handicapped students. Nine secondary level science experiments were selected for initial review and from these, three were selected for adaptation--use of the microscope, use of graduated cylinders, and use of the…

  9. 77 FR 28411 - Adrenalina, Affinity Technology Group, Inc., Braintech, Inc., Builders Transport, Incorporated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... COMMISSION Adrenalina, Affinity Technology Group, Inc., Braintech, Inc., Builders Transport, Incorporated... Exchange Commission that there is a lack of current and accurate information concerning the securities of... appears to the Securities and Exchange Commission that there is a lack of current and accurate...

  10. DOE Zero Energy Ready Home Case Study: BPC Green Builders, New Fairfield, Connecticut

    SciTech Connect

    none,

    2013-09-01

    This LEED Platinum home was built on the site of a 60-year-old bungalow that was demolished. It boasts views of Candlewood Lake, a great deal of daylight, and projected annual energy savings of almost $3,000. This home was awarded a 2013 Housing Innovation Award in the custom builder category.

  11. Enacting Social Justice to Teach Social Justice: The Pedagogy of Bridge Builders

    ERIC Educational Resources Information Center

    Eifler, Karen E.; Kerssen-Griep, Jeff; Thacker, Peter

    2008-01-01

    This article describes a particular endeavor, the Bridge Builders Academic Mentoring Program (BAMP), a partnership between a school of education in a Catholic university in the Northwest and a community-based rites of passage program for adolescent African American males. The partnership exemplifies tenets of Catholic social teaching, in that it…

  12. 13 CFR 120.391 - What is the Builders Loan Program?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What is the Builders Loan Program? 120.391 Section 120.391 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS...)(9) of the Act, SBA may make or guarantee loans to finance small general contractors to construct...

  13. Builder 3 & 2. Naval Education and Training Command Rate Training Manual and Nonresident Career Course. Revised.

    ERIC Educational Resources Information Center

    Countryman, Gene L.

    This Rate Training Manual (Textbook) and Nonresident Career Course form a correspondence, self-study package to provide information related to tasks assigned to Builders Third and Second Class. Focus is on constructing, maintaining, and repairing wooden, concrete, and masonry structures, concrete pavement, and waterfront and underwater structures;…

  14. A multi-station matched filter and coherent network processing approach to the automatic detection and relative location of seismic events

    NASA Astrophysics Data System (ADS)

    Gibbons, Steven J.; Näsholm, Sven Peter; Kværna, Tormod

    2014-05-01

    Correlation detectors facilitate seismic monitoring in the near vicinity of previously observed events at far lower detection thresholds than are possible using the methods applied in most existing processing pipelines. The use of seismic arrays has been demonstrated to be highly beneficial in pressing down the detection threshold, due to superior noise suppression, and also in eliminating vast numbers of false alarms by performing array processing on the multi-channel output of the correlation detectors. This last property means that it is highly desirable to run continuous detectors for sites of repeating seismic events on a single-array basis for many arrays across a global network. Spurious detections for a given signal template on a single array can however still occur when an unrelated wavefront crosses the array from a very similar direction to that of the master event wavefront. We present an algorithm which scans automatically the output from multiple stations - both array and 3-component - for coherence between the individual station correlator outputs that is consistent with a disturbance in the vicinity of the master event. The procedure results in a categorical rejection of an event hypothesis in the absence of support from stations other than the one generating the trigger and provides a fully automatic relative event location estimate when patterns in the correlation detector outputs are found to be consistent with a common event. This coherence-based approach removes the need to make explicit measurements of the time-differences for single stations and this eliminates a potential source of error. The method is demonstrated for the North Korea nuclear test site and the relative event location estimates obtained for the 2006, 2009, and 2013 events are compared with previous estimates from different station configurations.
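    A single-channel version of the correlation detector this abstract builds on can be sketched with synthetic data. Real pipelines run this per array channel and then demand coherence across channels and stations, as described above; the template length, noise level, and 0.6 threshold here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
template = rng.normal(size=100)        # stand-in for a master-event waveform
trace = rng.normal(scale=0.5, size=2000)
trace[700:800] += template             # a buried repeat of the master event

# Slide the template over the trace and compute the normalized
# cross-correlation (sample correlation coefficient) at each lag.
nt = len(template)
tz = (template - template.mean()) / template.std()
cc = np.empty(len(trace) - nt + 1)
for i in range(len(cc)):
    w = trace[i:i + nt]
    cc[i] = np.dot(tz, (w - w.mean()) / w.std()) / nt   # in [-1, 1]

detections = np.flatnonzero(cc > 0.6)   # trigger where correlation is high
print("peak CC %.2f at sample %d" % (cc.max(), cc.argmax()))
```

    The buried repeat produces a sharp correlation peak at its onset sample while uncorrelated noise stays near zero, which is why correlation detectors reach far lower detection thresholds than energy-based pipelines, and why a single spurious near-threshold peak on one station motivates the multi-station coherence check the abstract proposes.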

  15. Performance evaluation of the retrieval of a two hours rainfall event through microwave tomography applied to a network of radio-base stations

    NASA Astrophysics Data System (ADS)

    Facheris, L.; Cuccoli, F.; Baldini, L.

    2012-04-01

    Critical precipitation events occurred over the Italian territory have been often characterized by high intensity and very fast development, frequently over small catchment areas. The detection of this kind of phenomena is a major issue that poses remarkable problems that cannot be tackled completely only with 'standard' instrumentation (even when available), such as a weather radars or raingauges. Indeed, the rainfall sampling modalities of these instruments may jeopardize the attempts to provide a sufficiently fast risk alert: - the point-like, time-integrated way of sampling of raingauges can completely/partially miss local rainfall cores of high intensity developing in the neighborhoods. Moreover, raingauges provide cumulated rainfall measurements intrinsically affected by a time delay. - In the case of weather radars, several factors may limit the advantages brought by range resolution and instantaneous sampling: precipitation might be sampled at an excessive height due to the distance of the radar site and/or the orography surrounding the valleys/catchments where the aforementioned kind of events is more likely to form up; distance may limit the resolution in the cross-range direction; beam screening due to orography causes a loss of power that is interpreted in the farther range bins as a reduced precipitation intensity. In this context, a positive role for flagging the criticality of a precipitation event can be played by signal attenuation measurements made along microwave links, as available through the infrastructure of a mobile communications network. 
Such networks have three interesting features: 1) communications among radio-base stations occur where point-to-point electromagnetic visibility is guaranteed, namely along valleys or between the tops/flanks of hills or mountains; 2) the extension of these links (a few kilometres) is perfectly compatible with the detection of severe but localized precipitation events; 3) measurements can be made on a
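
A minimal sketch of the link-attenuation idea: subtract a dry-weather baseline from the measured path loss and invert the standard power-law relation k = a·R^b between specific attenuation (dB/km) and rain rate (mm/h). The coefficients a and b below are illustrative stand-ins; operational values depend on link frequency and polarization (e.g. per ITU-R P.838):

```python
def rain_rate_from_attenuation(path_loss_db, baseline_db, length_km, a=0.12, b=1.05):
    """Invert the power-law model k = a * R**b (dB/km) to obtain a
    path-averaged rain rate R (mm/h) from the rain-induced attenuation
    measured on a microwave link.  a and b are illustrative values only."""
    rain_attenuation_db = max(path_loss_db - baseline_db, 0.0)
    specific_attenuation = rain_attenuation_db / length_km  # dB/km
    return (specific_attenuation / a) ** (1.0 / b)

# 6 dB of excess attenuation on a 3 km link
print(round(rain_rate_from_attenuation(106.0, 100.0, 3.0), 1))
```

Because the estimate is a path average, a dense mesh of short links (as in a mobile network) is what turns these point estimates into a tomographic rain field.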

  16. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively
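
The importance-sampling idea behind such weighted algorithms can be illustrated on the simplest of the four systems, the birth-death process: bias reaction selection toward the rare direction and carry the likelihood ratio along each trajectory so the probability estimate stays unbiased. This is a minimal sketch with illustrative rates and a fixed bias; it does not implement the ABSIS look-ahead scheme itself:

```python
import random

def rare_event_prob(x0=5, target=25, lam=0.9, mu=1.0, bias=2.0, trials=20000, seed=1):
    """Importance-sampling estimate of the probability that a birth-death
    chain (birth rate lam*x, death rate mu*x) reaches `target` before 0.
    Birth selection is inflated by `bias`; each trajectory carries the
    likelihood ratio of true to biased choices, keeping the estimator
    unbiased.  (Illustrative sketch, not the ABSIS look-ahead method.)"""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x, w = x0, 1.0
        while 0 < x < target:
            birth, death = lam * x, mu * x
            p_birth = birth / (birth + death)                 # true probability
            q_birth = bias * birth / (bias * birth + death)   # biased probability
            if rng.random() < q_birth:
                w *= p_birth / q_birth
                x += 1
            else:
                w *= (1.0 - p_birth) / (1.0 - q_birth)
                x -= 1
        if x == target:
            total += w
    return total / trials

print(rare_event_prob())
```

For these rates the exact gambler's-ruin probability is about 0.054; the biased sampler hits the rare event on most trials, whereas an unbiased simulation would need far more runs for the same accuracy.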

  17. On the impact of RN network coverage on event selection and data fusion during the 2009 National Data Centres Preparedness Exercise

    NASA Astrophysics Data System (ADS)

    Becker, Andreas; Krysta, Monika; Auer, Matthias; Brachet, Nicolas; Ceranna, Lars; Gestermann, Nicolai; Nikkinen, Mika; Zähringer, Matthias

    2010-05-01

    The so-called National Data Centres (NDCs) to the Provisional Technical Secretariat (PTS) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Organization are in charge of providing the final judgement on the CTBT relevance of explosion events encountered in the PTS International Monitoring System (IMS). The latter is a 321-station network set up by the PTS (completion level to date: 80%) to globally monitor for the occurrence of CTBT-relevant seismo-acoustic and radionuclide signals. In doing so, NDCs learn about any seismo-acoustic or radionuclide event by active retrieval of, or subscription to, corresponding event lists and products provided by the International Data Centre (IDC) of the PTS. To prepare for their instrumental role in case of a CTBT-relevant event, the NDCs jointly conduct annual NDC Preparedness Exercises. In 2009, NDC Germany was in charge of leading the exercise and of choosing a seismo-acoustic event from the list of events provided by the PTS (Gestermann et al., EGU2010-13067). The novelty of this procedure was that the infrasound readings and the monitoring coverage of existing (certified) radionuclide stations in the area of consideration were also taken into account during the event selection process (Coyne et al., EGU2010-12660). The event finally chosen and examined took place near the Kara-Zhyra mine in Eastern Kazakhstan on 28 November 2009 around 07:20:31 UTC (Event-ID 5727516). NDC Austria performed forward atmospheric transport modelling to predict the RN measurements that should have occurred in the radionuclide IMS for the fictitious case of a radionuclide release at the same location (Wotawa and Schraik, 2010; EGU2010-4907) with a strength typical of a non-contained nuclear explosion. 
The stations indicated should then be analysed for their actual radionuclide readings in order to confirm the non nuclear character of

  18. Experimental evidence of the synergistic effects of warming and invasive algae on a temperate reef-builder coral

    PubMed Central

    Kersting, Diego K; Cebrian, Emma; Casado, Clara; Teixidó, Núria; Garrabou, Joaquim; Linares, Cristina

    2015-01-01

    In the current global climate change scenario, stressors overlap in space and time, and knowledge on the effects of their interaction is highly needed to understand and predict the response and resilience of organisms. Corals, among many other benthic organisms, are affected by an increasing number of global change-related stressors including warming and invasive species. In this study, the cumulative effects between warming and invasive algae were experimentally assessed on the temperate reef-builder coral Cladocora caespitosa. We first investigated the potential local adaptation to thermal stress in two distant populations subjected to contrasting thermal and necrosis histories. No significant differences were found between populations. Colonies from both populations suffered no necrosis after long-term exposure to temperatures up to 29 °C. Second, we tested the effects of the interaction of both warming and the presence of invasive algae. The combined exposure triggered critical synergistic effects on photosynthetic efficiency and tissue necrosis. At the end of the experiment, over 90% of the colonies subjected to warming and invasive algae showed signs of necrosis. The results are of particular concern when considering the predicted increase of extreme climatic events and the spread of invasive species in the Mediterranean and other seas in the future. PMID:26692424

  19. Experimental evidence of the synergistic effects of warming and invasive algae on a temperate reef-builder coral.

    PubMed

    Kersting, Diego K; Cebrian, Emma; Casado, Clara; Teixidó, Núria; Garrabou, Joaquim; Linares, Cristina

    2015-12-22

    In the current global climate change scenario, stressors overlap in space and time, and knowledge on the effects of their interaction is highly needed to understand and predict the response and resilience of organisms. Corals, among many other benthic organisms, are affected by an increasing number of global change-related stressors including warming and invasive species. In this study, the cumulative effects between warming and invasive algae were experimentally assessed on the temperate reef-builder coral Cladocora caespitosa. We first investigated the potential local adaptation to thermal stress in two distant populations subjected to contrasting thermal and necrosis histories. No significant differences were found between populations. Colonies from both populations suffered no necrosis after long-term exposure to temperatures up to 29 °C. Second, we tested the effects of the interaction of both warming and the presence of invasive algae. The combined exposure triggered critical synergistic effects on photosynthetic efficiency and tissue necrosis. At the end of the experiment, over 90% of the colonies subjected to warming and invasive algae showed signs of necrosis. The results are of particular concern when considering the predicted increase of extreme climatic events and the spread of invasive species in the Mediterranean and other seas in the future.

  20. Asian Dust Storm Events of 2001 and Associated Pollution Observed in New England by the AIRMAP Monitoring Network

    NASA Astrophysics Data System (ADS)

    Debell, L. J.; Vozzella, M. E.; Talbot, R. W.; Dibb, J. E.

    2002-12-01

    The Atmospheric Investigation, Regional Modeling, Analysis and Prediction (AIRMAP) program is operating 4 monitoring sites in New Hampshire, located at Fort Constitution (FC) (43.07°N, 70.71°W, 5 m elevation), Thompson Farm (TF) (43.11°N, 70.95°W, 21 m elevation), Castle Springs (CS) (43.75°N, 71.35°W, 406 m elevation) and Mount Washington (MW) (44.267°N, 71.30°W, 1909 m elevation). Three chemically distinct, statistically extreme, regional-scale dust aerosol events were observed at all four AIRMAP monitoring stations in NH between 4/18/01 and 5/13/01 (UTC). All three events, at all four sites, had days where the 24 hr bulk aerosol samples had Ca2+ concentrations that exceeded at least the 95th percentile of the site-specific, multi-year datasets. NO3- and SO42- were also enhanced above typical levels, ranging from above the 75th to above the 99th percentile. During all three events, mixing ratios of the gas-phase pollutants O3 and CO were compared to mixing ratios on either side of the events. During event 1, enhancements above background levels were approximately 130 ppbv for CO and 30 ppbv for O3, very similar to the CO values in apparent Asian dust plumes sampled over Colorado at 6-7 km by aircraft measurements (http://www.cmdl.noaa.gov/info/asiandust.html); enhancements during events 2 and 3 were similar to event 1. The maximum elemental carbon value ever observed at TF, 0.97 μg/m3, occurred during the peak day of event 1. Elemental carbon was not substantially elevated during event 2 and no data were collected during event 3. Elemental ratios, determined by PIXE, on filters from events 1 and 3 were compared pairwise to each other and to published samples attributed to Asian dust storms. The AIRMAP samples collected on the same date at different sites showed good statistical agreement, whereas samples collected at the same site on different dates show only moderate correlation. Of 17 published samples of Asian dust storm aerosol, collected well outside of the major

  1. Effect of densifying the GNSS GBAS network on monitoring the troposphere zenith total delay and precipitable water vapour content during severe weather events

    NASA Astrophysics Data System (ADS)

    Kapłon, Jan; Stankunavicius, Gintautas

    2016-04-01

    Dense ground-based augmentation networks can provide important information for monitoring the state of the neutral atmosphere. The GNSS&METEO research group at Wroclaw University of Environmental and Life Sciences (WUELS) operates a self-developed near-real-time service estimating troposphere parameters from GNSS data for the area of Poland. The service has been operational since December 2012, and its results, calculated from data of the ASG-EUPOS GBAS network (120 stations), support the EGVAP (http://egvap.dmi.dk) project. At first the zenith troposphere delays (ZTD) were calculated at hourly intervals, but in September 2015 the service was upgraded to include the SmartNet GBAS network (Leica Geosystems Polska - 150 stations). The upgrade also included shortening the result interval to 30 minutes, an upgrade from Bernese GPS Software v. 5.0 to Bernese GNSS Software v. 5.2, and estimation of the ZTD and its horizontal gradients. Processing now includes 270 stations. Densifying the network from a mean inter-station distance of 70 km to 40 km created the opportunity to investigate its impact on the resolution of the estimated ZTD and integrated water vapour (IWV) fields during high-intensity weather events. The increased density of ZTD measurements allows better definition of meso-scale features within different synoptic systems (e.g. frontal waves, meso-scale convective systems, squall lines etc). These meso-scale structures are, as a rule, short-lived but fast-developing and hardly predictable by numerical models. Even so, such limited-size systems can produce very hazardous phenomena - like widespread squalls and thunderstorms, tornadoes, heavy rains, snowfalls, hail etc. - because of the prevalence of Cb clouds with a high concentration of IWV. 
The study deals with two meteorological events: 2015-09-01, when devastating squalls and rainfall brought 2M Euro in property losses in northern Poland, and 2015-10-12, when a very active front brought
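
The ZTD-to-IWV step referred to above is commonly done by subtracting a Saastamoinen-type hydrostatic delay and scaling the wet remainder by a dimensionless factor Π(Tm) ≈ 0.15. A hedged sketch with textbook-style constants (the values are illustrative and not those used in the WUELS processing):

```python
import math

def ztd_to_iwv(ztd_m, pressure_hpa, lat_deg, height_m, tm_kelvin=273.0):
    """Convert a GNSS zenith total delay (m) to integrated water vapour
    (kg/m^2): subtract the Saastamoinen zenith hydrostatic delay, then
    scale the wet delay by the factor Pi(Tm).  Constants follow the usual
    Bevis-style formulation and are illustrative."""
    # Zenith hydrostatic delay (Saastamoinen), pressure in hPa, height in m
    zhd = 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg))
        - 2.8e-7 * height_m)
    zwd = ztd_m - zhd                    # zenith wet delay (m)
    # Dimensionless conversion factor Pi (~0.15 for Tm ~ 273 K)
    rho_w, r_v = 1000.0, 461.5           # kg/m^3, J/(kg K)
    k2_prime, k3 = 0.1648, 3776.0        # K/Pa, K^2/Pa
    pi_factor = 1.0e6 / (rho_w * r_v * (k3 / tm_kelvin + k2_prime))
    return rho_w * pi_factor * zwd

print(round(ztd_to_iwv(2.40, 1013.0, 51.1, 120.0), 1))
```

The sensitivity is roughly 6.5 mm of wet delay per kg/m² of IWV, which is why a 30-minute, 40-km ZTD field translates directly into a usable IWV field.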

  2. Heating, Ventilation, and Air Conditioning Design Strategy for a Hot-Humid Production Builder

    SciTech Connect

    Kerrigan, P.

    2014-03-01

    Building Science Corporation (BSC) worked directly with the David Weekley Homes - Houston division to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses in preparation for the upcoming code changes in 2015. This research project addressed the following questions: 1. What is the most cost-effective, best-performing, and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one- and two-story single-family detached residences? 2. What is a cost-effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole-house cost estimates compared to confirmed post-construction actual costs?

  3. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations.

    PubMed

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  4. The GlycanBuilder and GlycoWorkbench glycoinformatics tools: updates and new developments.

    PubMed

    Damerell, David; Ceroni, Alessio; Maass, Kai; Ranzinger, Rene; Dell, Anne; Haslam, Stuart M

    2012-11-01

    During the EUROCarbDB project our group developed the GlycanBuilder and GlycoWorkbench glycoinformatics tools. This short communication summarizes the capabilities of these two tools and the updates made since the original publications in 2007 and 2008. GlycanBuilder is a tool that allows fast and intuitive drawing of glycan structures; it can be used standalone, embedded in web pages, and integrated into other programs. GlycoWorkbench has been designed to semi-automatically annotate glycomics data. This tool can be used to annotate mass spectrometry (MS) and MS/MS spectra of free oligosaccharides, N- and O-linked glycans, GAGs (glycosaminoglycans) and glycolipids, as well as MS spectra of glycoproteins. PMID:23109548

  5. The Lexicon Builder Web service: Building Custom Lexicons from two hundred Biomedical Ontologies

    PubMed Central

    Parai, Gautam K.; Jonquet, Clement; Xu, Rong; Musen, Mark A.; Shah, Nigam H.

    2010-01-01

    Domain-specific biomedical lexicons are extensively used by researchers for natural language processing tasks. Currently these lexicons are created manually by expert curators, and there is a pressing need for automated methods to compile such lexicons. The Lexicon Builder Web service addresses this need and reduces the investment of time and effort involved in lexicon maintenance. The service has three components: Inclusion – selects one or several ontologies (or their branches) and includes preferred names and synonym terms; Exclusion – filters terms based on the term’s Medline frequency, syntactic type, UMLS semantic type and match with stopwords; Output – aggregates information, handles compression and output formats. Evaluation demonstrates that the service has high accuracy and runtime performance. It is currently being evaluated for several use cases to establish its utility in biomedical information processing tasks. The Lexicon Builder promotes collaboration, sharing and standardization of lexicons amongst researchers by automating the creation, maintenance and cross-referencing of custom lexicons. PMID:21347046
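
The Inclusion/Exclusion/Output stages can be sketched with toy data structures (the field names, stopword handling, and frequency cutoff below are hypothetical and do not reflect the service's actual API):

```python
def build_lexicon(ontology_terms, stopwords, min_frequency, frequencies):
    """Mirror the service's pipeline: pull preferred names and synonyms
    from selected ontologies (inclusion), then drop stopwords and terms
    whose corpus frequency falls below a cutoff (exclusion), returning a
    sorted lexicon (output).  Toy structures; the real service filters on
    Medline frequency, syntactic type, and UMLS semantic type."""
    included = set()
    for entry in ontology_terms:                  # inclusion stage
        included.add(entry["preferred"])
        included.update(entry.get("synonyms", []))
    return sorted(t for t in included             # exclusion + output stages
                  if t.lower() not in stopwords
                  and frequencies.get(t, 0) >= min_frequency)

terms = [{"preferred": "myocardial infarction", "synonyms": ["heart attack", "MI"]}]
freqs = {"myocardial infarction": 900, "heart attack": 1200, "MI": 3}
print(build_lexicon(terms, {"mi"}, 10, freqs))
```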

  6. LipidBuilder: A Framework To Build Realistic Models for Biological Membranes.

    PubMed

    Bovigny, Christophe; Tamò, Giorgio; Lemmin, Thomas; Maïno, Nicolas; Dal Peraro, Matteo

    2015-12-28

    The physical and chemical characterization of biological membranes is of fundamental importance for understanding the functional role of lipid bilayers in shaping cells and organelles, steering vesicle trafficking and promoting membrane-protein signaling. Molecular dynamics simulations stand as a powerful tool to probe the properties of membranes at the atomistic level. However, the biological membrane is highly complex, and closely mimicking its physiological constitution in silico is not a straightforward task. Here, we present LipidBuilder, a framework for creating and storing models of biologically relevant phospholipid species with acyl tails of heterogeneous composition. LipidBuilder also enables the assembly of these database-stored lipids into realistic bilayers featuring an asymmetric distribution across leaflets and concentrations of given membrane constituents as defined, for example, by lipidomics experiments. The ability of LipidBuilder to assemble robust membrane models was validated by simulating membranes of homogeneous lipid composition for which experimental data are available. Furthermore, taking advantage of the extensive lipid headgroup repertoire, we assembled models of membranes of heterogeneous nature as naturally found in viral (phage PRD1), bacterial (Salmonella enterica; Laurinavicius, S.; Kakela, R.; Somerharju, P.; Bamford, D. H. Virology 2004, 322, 328-336) and plant (Chlorella kessleri; Rezanka, T.; Podojil, M. J. Chromatogr. 1989, 463, 397-408) organisms. These realistic membrane models were built using a near-exact lipid composition revealed by analytical chemistry experiments. We suggest LipidBuilder as a useful tool to model biological membranes of near-biological complexity, and as a robust complement to current efforts to characterize the biophysical properties of biological membranes using molecular simulation. PMID:26606666

  7. Development of a beam builder for automatic fabrication of large composite space structures

    NASA Technical Reports Server (NTRS)

    Bodle, J. G.

    1979-01-01

    The composite-material beam builder, which will produce triangular beams from pre-consolidated graphite/glass/thermoplastic composite material through automated mechanical processes, is presented. Its automated processes include roll forming, side member storage, feed and positioning, ultrasonic welding, and beam cutoff. Each process lends itself to modular subsystem development. Initial development is concentrated on the key processes for roll forming and ultrasonic welding of composite thermoplastic materials. The construction and test of an experimental roll forming machine and ultrasonic welding process control techniques are described.

  8. DOE Zero Energy Ready Home Case Study: BPC Green Builders — Trolle Residence, Danbury, CT

    SciTech Connect

    none,

    2014-09-01

    The builder of this 1,650-ft2 cabin won a Custom Home honor in the 2014 Housing Innovations Awards. The home meets Passive House Standards with 5.5 in. of foil-faced polyisocyanurate foam boards lining the outside walls, R-55 of rigid EPS foam under the slab, R-86 of blown cellulose in the attic, triple-pane windows, and a single ductless heat pump to heat and cool the entire home.

  9. Measurement of sediment loads during flash flood events: 14 years of results from a six stream monitoring network on the southern Colorado Plateau

    NASA Astrophysics Data System (ADS)

    Griffiths, R. E.; Topping, D. J.

    2015-12-01

    In arid and semi-arid environments, short-duration, high-intensity rainfall events—flash floods—are the primary driver of sediment transport in ephemeral streams. The spatial and temporal variability of these rainfall events results in episodic and irregular stream flow and sediment transport. As a result of limited flow durations, measuring discharge and collecting suspended-sediment samples on ephemeral streams in arid regions is difficult and time-consuming. Because of these limitations, few sediment-monitoring programs on ephemeral streams have been developed; examples of sediment-monitoring gages and gaging networks on arid ephemeral streams include Walnut Gulch, United States; Nahal Yael, Israel; and the Luni River Basin, India. The difficulty of making measurements of discharge and suspended-sediment concentration on arid ephemeral streams has led many researchers to use methods such as regional sediment-yield equations, sediment-rating curves, and peak-discharge to total-sediment-load relations. These methods can provide a cost-effective estimate of sediment yield from ungaged tributaries. However, these approaches are limited by, among other factors, time averaging, hysteresis, and differences in local and regional geology, rainfall, and vegetation. A monitoring network was established in 2000 on six ephemeral tributaries of the Colorado River in lower Glen and upper Marble canyons. Results from this monitoring network show that annual suspended-sediment loads for individual streams can vary by 5 orders of magnitude while the annual suspended-sediment load for the entire network may vary by 2 orders of magnitude; that suspended-sediment loads during an individual flood event do not typically correlate with discharge; and that local geology has a strong control on the sediment yield of a drainage basin. Comparing our results to previous estimates of sediment load from these drainages found that previous, indirect, methods
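
One of the indirect methods mentioned, the sediment-rating curve Qs = a·Q^b, is typically fit by linear least squares in log-log space. A minimal sketch (synthetic data; the abstract's caveats about hysteresis and time averaging apply to any such fit):

```python
import math

def fit_rating_curve(discharge, sediment_load):
    """Fit the power-law sediment-rating curve Qs = a * Q**b by ordinary
    least squares on log-transformed data, returning (a, b)."""
    xs = [math.log(q) for q in discharge]
    ys = [math.log(s) for s in sediment_load]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = math.exp(mean_y - b * mean_x)
    return a, b

# Data generated exactly from Qs = 0.5 * Q**2 is recovered exactly.
q = [1.0, 2.0, 4.0, 8.0]
a, b = fit_rating_curve(q, [0.5 * x ** 2 for x in q])
print(round(a, 3), round(b, 3))
```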

  10. Event-based distributed set-membership filtering for a class of time-varying non-linear systems over sensor networks with saturation effects

    NASA Astrophysics Data System (ADS)

    Wei, Guoliang; Liu, Shuai; Wang, Licheng; Wang, Yongxiong

    2016-07-01

    In this paper, based on an event-triggered mechanism, the problem of distributed set-membership filtering is considered for a class of time-varying non-linear systems over sensor networks subject to saturation effects. Unlike the traditional periodic sampled-data approach, the filter is updated only when a predefined event is satisfied, where the event is defined according to the measurement output. For each node, the proposed novel event-triggered mechanism can reduce unnecessary information transmission between sensors and filters. The purpose of the addressed problem is to design a series of distributed set-membership filters such that, for all admissible unknown-but-bounded noises, non-linearities and sensor saturation, the set of all possible states can be determined. The desired filter parameters are obtained by solving a recursive linear matrix inequality that can be computed using the available MATLAB toolbox. Finally, a simulation example is exploited to show the effectiveness of the proposed design approach.
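
The flavor of such an event-triggered update can be shown with the simplest send-on-delta rule: a sensor transmits a measurement only when it deviates sufficiently from the last transmitted value. This is an illustrative trigger only; the paper's set-membership triggering condition is more elaborate:

```python
def event_triggered_stream(measurements, delta=0.5):
    """Send-on-delta event trigger: transmit measurement k only when it
    differs from the last transmitted value by more than delta, reducing
    sensor-to-filter traffic.  Returns the (index, value) pairs sent."""
    sent = []
    last = None
    for k, y in enumerate(measurements):
        if last is None or abs(y - last) > delta:
            sent.append((k, y))
            last = y
    return sent

ys = [0.0, 0.1, 0.2, 0.9, 1.0, 1.7, 1.75]
print(event_triggered_stream(ys))
```

Between transmissions the filter simply propagates its prediction, which is exactly the traffic/accuracy trade-off the event condition is designed to control.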

  11. Solving a real-world problem using an evolving heuristically driven schedule builder.

    PubMed

    Hart, E; Ross, P; Nelson, J

    1998-01-01

    This work addresses the real-life scheduling problem of a Scottish company that must produce daily schedules for the catching and transportation of large numbers of live chickens. The problem is complex and highly constrained. We show that it can be successfully solved by division into two subproblems and solving each using a separate genetic algorithm (GA). We address the problem of whether this produces locally optimal solutions and how to overcome this. We extend the traditional approach of evolving a "permutation + schedule builder" by concentrating on evolving the schedule builder itself. This results in a unique schedule builder being built for each daily scheduling problem, each individually tailored to deal with the particular features of that problem. This results in a robust, fast, and flexible system that can cope with most of the circumstances imaginable at the factory. We also compare the performance of a GA approach to several other evolutionary methods and show that population-based methods are superior to both hill-climbing and simulated annealing in the quality of solutions produced. Population-based methods also have the distinct advantage of producing multiple, equally fit solutions, which is of particular importance when considering the practical aspects of the problem. PMID:10021741
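
The "permutation + schedule builder" idea can be sketched as a tiny permutation GA on a toy job-ordering problem. All parameters and the cost model below are illustrative stand-ins, not the paper's poultry-scheduling formulation:

```python
import random

def evolve_schedule(durations, pop_size=30, generations=60, seed=7):
    """Tiny permutation GA: evolve job orderings to minimize total
    completion time.  Order crossover (OX) keeps children valid
    permutations; a fixed schedule builder (cost) evaluates each one."""
    rng = random.Random(seed)
    n = len(durations)

    def cost(perm):                      # schedule builder + evaluator
        t = total = 0
        for j in perm:
            t += durations[j]
            total += t
        return total

    def order_crossover(p1, p2):
        i, j = sorted(rng.sample(range(n), 2))
        child = [None] * n
        child[i:j] = p1[i:j]             # copy a slice from parent 1
        fill = [g for g in p2 if g not in child]
        for k in range(n):               # fill gaps in parent-2 order
            if child[k] is None:
                child[k] = fill.pop(0)
        return child

    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]    # elitist selection
        children = [order_crossover(rng.choice(parents), rng.choice(parents))
                    for _ in range(pop_size - len(parents))]
        for c in children:               # occasional swap mutation
            if rng.random() < 0.2:
                a, b = rng.sample(range(n), 2)
                c[a], c[b] = c[b], c[a]
        pop = parents + children
    return min(pop, key=cost)

print(evolve_schedule([5, 1, 4, 2, 3]))
```

The paper's twist is to evolve the schedule builder itself rather than only the permutation; here the builder is fixed to keep the sketch short.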

  12. Solving a real-world problem using an evolving heuristically driven schedule builder.

    PubMed

    Hart, E; Ross, P; Nelson, J

    1998-01-01

    This work addresses the real-life scheduling problem of a Scottish company that must produce daily schedules for the catching and transportation of large numbers of live chickens. The problem is complex and highly constrained. We show that it can be successfully solved by division into two subproblems and solving each using a separate genetic algorithm (GA). We address the problem of whether this produces locally optimal solutions and how to overcome this. We extend the traditional approach of evolving a "permutation + schedule builder" by concentrating on evolving the schedule builder itself. This results in a unique schedule builder being built for each daily scheduling problem, each individually tailored to deal with the particular features of that problem. This results in a robust, fast, and flexible system that can cope with most of the circumstances imaginable at the factory. We also compare the performance of a GA approach to several other evolutionary methods and show that population-based methods are superior to both hill-climbing and simulated annealing in the quality of solutions produced. Population-based methods also have the distinct advantage of producing multiple, equally fit solutions, which is of particular importance when considering the practical aspects of the problem.

  13. Time-related patient data retrieval for the case studies from the pharmacogenomics research network.

    PubMed

    Zhu, Qian; Tao, Cui; Ding, Ying; Chute, Christopher G

    2012-11-01

    There are many question-based data elements from the pharmacogenomics research network (PGRN) studies, and many of them contain temporal information. Representing these elements semantically, so that they are machine-processable, is a challenging problem for the following reasons: (1) the designers of these studies usually do not know computer modeling and query languages, so the original data elements are usually represented in spreadsheets in human languages; and (2) the time aspects in these data elements can be too complex to represent faithfully in a machine-understandable way. In this paper, we introduce our efforts to represent these data elements using semantic web technologies. We have developed an ontology, CNTRO, for representing clinical events and their temporal relations in the web ontology language (OWL). Here we use CNTRO to represent the time aspects of the data elements. We have evaluated 720 time-related data elements from PGRN studies. We adapted and extended the knowledge representation requirements of EliXR-TIME to categorize our data elements. A CNTRO-based SPARQL query builder has been developed to let users customize their own SPARQL queries for each knowledge representation requirement. The SPARQL query builder has been evaluated with a simulated EHR triple store to ensure its functionality.
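
A query builder of this kind essentially assembles SPARQL text from a user's choices. A toy sketch in Python string templating (the cntro: property and class names here are placeholders, not the published CNTRO vocabulary):

```python
def build_temporal_query(event_type, relation, ref_time):
    """Assemble a SPARQL query for clinical events standing in a given
    temporal relation to a reference time.  The cntro: names and prefix
    IRI are illustrative placeholders, not the real CNTRO ontology."""
    patterns = {
        "before": "?t cntro:before \"%s\"^^xsd:dateTime ." % ref_time,
        "after":  "?t cntro:after \"%s\"^^xsd:dateTime ." % ref_time,
    }
    return "\n".join([
        "PREFIX cntro: <http://example.org/cntro#>",
        "PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>",
        "SELECT ?event WHERE {",
        "  ?event a cntro:%s ;" % event_type,
        "         cntro:hasTime ?t .",
        "  " + patterns[relation],
        "}",
    ])

print(build_temporal_query("MedicationEvent", "before", "2012-01-01T00:00:00"))
```

Generating the query text from a small set of validated choices is what lets non-technical study designers query temporal data without writing SPARQL by hand.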

  14. Modulation of a Fronto-Parietal Network in Event-Based Prospective Memory: An rTMS Study

    ERIC Educational Resources Information Center

    Bisiacchi, P. S.; Cona, G.; Schiff, S.; Basso, D.

    2011-01-01

    Event-based prospective memory (PM) is a multi-component process that requires remembering the delayed execution of an intended action in response to a pre-specified PM cue, while being actively engaged in an ongoing task. Some neuroimaging studies have suggested that both prefrontal and parietal areas are involved in the maintenance and…

  15. Detection of unidentified events through T-phase observed by the Dense Oceanfloor Network System for Earthquakes and Tsunamis (DONET)

    NASA Astrophysics Data System (ADS)

    Nakamura, T.; To, A.; Nakano, M.; Tsuboi, S.; Watanabe, T.; Kaneda, Y.

    2010-12-01

    DONET is a network of cabled ocean-bottom observatories with real-time recording systems deployed in the seismogenic zone of M8-class mega-thrust earthquakes. The network consists of 20 stations with an interval of 15-20 km. Each DONET station includes a broadband seismometer, strong-motion seismometer, quartz-type pressure gauge, differential pressure gauge, hydrophone and thermometer. We started preliminary observations using one of the stations in March 2010. The full operation based on all 20 stations will start in the next several months. The array of DONET stations, whose state-of-the-art instruments are placed on the deep-sea floor (about 2000 meters below sea level), constitutes a monitoring system for acoustic waves in the ocean, including those of seismic and artificial origin. In the Pacific coastal areas of Japan, there are also other seafloor observation networks operated by JMA (Japan Meteorological Agency) and ERI (Earthquake Research Institute). These deep-sea floor networks form an array of submarine instruments with a large aperture (~1500 km) which enables us to study the generation sources of far-field ocean acoustic waves. We analyzed the waveform data recorded by DONET and the ocean-bottom seismometers of JMA. All the stations are distributed within 100 km of each other and are placed near the Nankai trough. We detected many acoustic waves with durations of approximately 100 seconds propagating from the southwest. Using the JMA hypocenter bulletin, most of them are identified as T-phases associated with earthquakes that occurred along the Ryukyu trench. We also found many acoustic waves whose sources are unidentified and presumably located outside the area of the JMA bulletin. In the presentation, we show the waveforms observed by the DONET stations and the distribution of the sources of T-phases.

  16. Adverse events and treatment failure leading to discontinuation of recently approved antipsychotic drugs in schizophrenia: A network meta-analysis.

    PubMed

    Tonin, Fernanda S; Piazza, Thais; Wiens, Astrid; Fernandez-Llimos, Fernando; Pontarolo, Roberto

    2015-12-01

    Objective: We aimed to gather evidence on the discontinuation rates owing to adverse events or treatment failure for four recently approved antipsychotics (asenapine, blonanserin, iloperidone, and lurasidone). Methods: A systematic review followed by pairwise meta-analysis and mixed treatment comparison meta-analysis (MTC) was performed, including randomized controlled trials (RCTs) that compared the use of the above-mentioned drugs versus placebo in patients with schizophrenia. An electronic search was conducted in PubMed, Scopus, Science Direct, Scielo, the Cochrane Library, and International Pharmaceutical Abstracts (January 2015). The included trials were at least single-blinded. The main outcome measures extracted were discontinuation owing to adverse events and discontinuation owing to treatment failure. Results: Fifteen RCTs were identified (n = 5400 participants) and 13 of them were amenable for use in our meta-analyses. No significant differences were observed between any of the four drugs and placebo as regards discontinuation owing to adverse events, whether in pairwise meta-analysis or in MTC. All drugs presented a better profile than placebo on discontinuation owing to treatment failure, both in pairwise meta-analysis and MTC. Asenapine was found to be the best therapy in terms of tolerability owing to failure, while lurasidone was the worst treatment in terms of adverse events. The evidence around blonanserin is weak. Conclusion: MTCs allowed the creation of two different rank orders of these four antipsychotic drugs in two outcome measures. This evidence-generating method allows direct and indirect comparisons, supporting approval and pricing decisions when sufficient direct head-to-head trials are lacking.
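
The pairwise side of such an analysis rests on inverse-variance pooling of study effect sizes. A minimal fixed-effect sketch on the log odds ratio scale (the numbers are illustrative, not the paper's data):

```python
import math

def pooled_log_or(study_log_ors, variances):
    """Fixed-effect inverse-variance pooling: weight each study's log
    odds ratio by 1/variance, then report the pooled estimate and its
    95% confidence interval."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, study_log_ors)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci

# Two hypothetical trials: log OR 0.2 (variance 0.04) and 0.5 (variance 0.01)
pooled, (lo, hi) = pooled_log_or([0.2, 0.5], [0.04, 0.01])
print(round(pooled, 3), round(lo, 3), round(hi, 3))
```

An MTC extends this by combining such direct comparisons with indirect ones through the shared placebo arm, which is what produces the rank orderings described in the conclusion.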

  17. Large ice-wedge networks and tundra gley horizons in Northern France Upper Pleistocene loess: evidences of extreme cold events and cyclic millennial changes

    NASA Astrophysics Data System (ADS)

    Antoine, Pierre; Moine, Olivier; Guerin, Gilles

    2015-04-01

    Northern France loess-palaeosol sequences from the last interglacial-glacial cycle (Eemian-Weichselian) have been intensively studied during the last 20 years (about 100 individual sequences). Despite thickness variations of the different stratigraphic units, the sequences from the last interglacial-glacial cycle exhibit a particularly constant pedosedimentary pattern, including well-identified pedological and periglacial marker horizons that can be followed north- and eastward into Belgium and Germany. Within this system, new field investigations and optically stimulated luminescence (OSL) dating provide evidence of at least four generations of large ice-wedge networks (10-14 m) preserved by loess deposits between ca. 50 and 20 ka. The best and most systematically preserved network is presently dated to about 31-32 ka according to the OSL ages from its loess infilling. This main ice-wedge cast horizon systematically occurs at the boundary between the Middle Pleniglacial brown soil complexes and the base of the Upper Pleniglacial typical loess cover. Consequently, it represents a major stratigraphic marker for correlations in Western Europe. According to recent OSL dating results, the first thick typical loess unit of the Upper Pleniglacial, covering the main ice-wedge cast horizon, was deposited shortly after the GIS-5 interstadial and could be contemporaneous with the H3 event in deep-sea cores. In addition, it is shown that all the large ice-wedge casts developed from the surface of a tundra gley horizon (0.3 to 0.5 m in thickness). As it has previously been demonstrated that tundra gley layers formed mainly during short interstadial events (malacology, sedimentology), a model linking tundra gley horizons and ice-wedge networks to DO stadial-interstadial cycles during the last glacial is proposed.

  18. Improving short-term forecasting during ramp events by means of Regime-Switching Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Gallego, C.; Costa, A.; Cuerva, A.

    2010-09-01

    Since wind energy can currently be neither scheduled nor stored at large scale, wind power forecasting has been useful to minimize the impact of wind fluctuations. In particular, short-term forecasting (characterised by prediction horizons from minutes to a few days) is currently required by energy producers (in a daily electricity market context) and by TSOs (in order to keep the stability/balance of an electrical system). Within the short-term context, time-series-based models (i.e., statistical models) have shown better performance than NWP models for horizons up to a few hours. These models try to learn and replicate the dynamics shown by the time series of a certain variable. When considering the power output of wind farms, ramp events are usually observed, characterized by a large positive gradient in the time series (ramp-up) or a negative one (ramp-down) during relatively short time periods (a few hours). Ramp events may have many different causes, generally involving several spatial scales, from the large scale (fronts, low-pressure systems) down to the local scale (wind turbine shut-down due to high wind speed, yaw misalignment due to fast changes of wind direction). Hence, the output power may show unexpected dynamics during ramp events depending on the underlying processes; consequently, traditional statistical models considering only one dynamic for the whole power time series may be inappropriate. This work proposes a Regime-Switching (RS) model based on Artificial Neural Networks (ANNs). The RS-ANN model gathers as many ANNs as dynamics considered (called regimes); a certain ANN is selected to predict the output power, depending on the current regime. The current regime is updated on-line based on a gradient criterion applied to the past two values of the output power. Three regimes are established, concerning ramp events: ramp-up, ramp-down, and no-ramp.
In order to assess the skill of the proposed RS-ANN model, a single
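
    The regime-selection logic described above can be sketched as follows; the threshold, the per-regime predictors, and the numbers are illustrative stand-ins, not the authors' trained ANNs.

```python
RAMP_UP, RAMP_DOWN, NO_RAMP = "ramp-up", "ramp-down", "no-ramp"

def classify_regime(p_prev, p_curr, threshold=0.1):
    """Gradient criterion on the past two (normalized) power values."""
    gradient = p_curr - p_prev
    if gradient > threshold:
        return RAMP_UP
    if gradient < -threshold:
        return RAMP_DOWN
    return NO_RAMP

# Stand-ins for the per-regime ANNs: one simple rule per regime.
predictors = {
    RAMP_UP:   lambda w: w[-1] + 0.8 * (w[-1] - w[-2]),  # extrapolate the ramp
    RAMP_DOWN: lambda w: w[-1] + 0.8 * (w[-1] - w[-2]),
    NO_RAMP:   lambda w: sum(w) / len(w),                # persistence-like
}

def predict_next(window):
    """Select the regime from the last two values, then dispatch."""
    regime = classify_regime(window[-2], window[-1])
    return regime, predictors[regime](window)

regime, forecast = predict_next([0.30, 0.32, 0.55])  # classified as ramp-up
```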

  19. Adaptive Wireless Ad-hoc Sensor Networks for Long-term and Event-oriented Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Bumberger, Jan; Mollenhauer, Hannes; Remmler, Paul; Chirila, Andrei Marian; Mollenhauer, Olaf; Hutschenreuther, Tino; Toepfer, Hannes; Dietrich, Peter

    2016-04-01

    Ecosystems are often characterized by high heterogeneity, complexity, and dynamics. Hence, single-point measurements are often not sufficient for their complete representation. The application of wireless sensor networks in terrestrial and aquatic environmental systems offers significant benefits, such as better adaptation to local test conditions through simple adjustment of the sensor distribution, the sensor types, and the sample rate. Another advantage of wireless ad-hoc sensor networks is their self-organizing behavior, resulting in a major reduction in installation and operation costs and time. In addition, individual point measurements with a sensor are significantly improved by measuring continuously at several points. In this work, a concept and realization for long-term ecosystem research is given for the field monitoring of micrometeorology and soil parameters and the interaction of biotic and abiotic processes. These long-term analyses are part of the Global Change Experimental Facility (GCEF), a large field-based experimental platform to assess the effects of climate change on ecosystem functions and processes under different land-use scenarios. Owing to the adaptive behavior of the network, a mobile version was also developed to overcome the lack of information from temporally and spatially fixed measurements for the detection and recording of highly dynamic or time-limited processes. First results of different field campaigns are given to present the potentials and limitations of this application in environmental science, especially for the monitoring of the interaction of biotic and abiotic processes, soil-atmosphere interaction, and the validation of remote sensing data.

  20. Adaptive Wireless Ad-hoc Sensor Networks for Long-term and Event-oriented Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Bumberger, Jan; Mollenhauer, Hannes; Remmler, Paul; Schaedler, Martin; Schima, Robert; Mollenhauer, Olaf; Hutschenreuther, Tino; Toepfer, Hannes; Dietrich, Peter

    2015-04-01

    Ecosystems are often characterized by high heterogeneity, complexity, and dynamics. Hence, single-point measurements are often not sufficient for their complete representation. The application of wireless sensor networks in terrestrial and aquatic environmental systems offers significant benefits, such as better adaptation to local test conditions through simple adjustment of the sensor distribution, the sensor types, and the sample rate. Another advantage of wireless ad-hoc sensor networks is their self-organizing behavior, resulting in a major reduction in installation and operation costs and time. In addition, individual point measurements with a sensor are significantly improved by measuring continuously at several points. In this work, a concept and realization for long-term ecosystem research is given for the field monitoring of micrometeorology and soil parameters and the interaction of biotic and abiotic processes. These long-term analyses are part of the Global Change Experimental Facility (GCEF), a large field-based experimental platform to assess the effects of climate change on ecosystem functions and processes under different land-use scenarios. Owing to the adaptive behavior of the network, a mobile version was also developed to overcome the lack of information from temporally and spatially fixed measurements for the detection and recording of highly dynamic or time-limited processes. First results of different field campaigns are given to present the potentials and limitations of this application in environmental science, especially for the monitoring of the interaction of biotic and abiotic processes, soil-atmosphere interaction, and the validation of remote sensing data.

  1. Advanced Decentralized Water/Energy Network Design for Sustainable Infrastructure presentation at the 2012 National Association of Home Builders (NAHB) International Builders'Show

    EPA Science Inventory

    A university/industry panel will report on the progress and findings of a five-year project funded by the US Environmental Protection Agency. The project's end product will be a Web-based, 3D computer-simulated residential environment with a decision support system to assist...

  2. Multiple Horizontal Gene Transfer Events and Domain Fusions Have Created Novel Regulatory and Metabolic Networks in the Oomycete Genome

    PubMed Central

    Morris, Paul Francis; Schlosser, Laura Rose; Onasch, Katherine Diane; Wittenschlaeger, Tom; Austin, Ryan; Provart, Nicholas

    2009-01-01

    Complex enzymes with multiple catalytic activities are hypothesized to have evolved from more primitive precursors. Global analysis of the Phytophthora sojae genome using conservative criteria for evaluation of complex proteins identified 273 novel multifunctional proteins that were also conserved in P. ramorum. Each of these proteins contains combinations of protein motifs that are not present in bacterial, plant, animal, or fungal genomes. A subset of these proteins was also identified in the two diatom genomes, but the majority of these proteins formed after the split between diatoms and oomycetes. Documentation of multiple cases of domain fusions that are common to both oomycete and diatom genomes lends additional support to the hypothesis that oomycetes and diatoms are monophyletic. Bifunctional proteins that catalyze two steps in a metabolic pathway can be used to infer the interaction of orthologous proteins that exist as separate entities in other genomes. We postulated that the novel multifunctional proteins of oomycetes could function as potential Rosetta Stones to identify interacting proteins of conserved metabolic and regulatory networks in other eukaryotic genomes. However, ortholog analysis of each domain within our set of 273 multifunctional proteins against 39 sequenced bacterial and eukaryotic genomes identified only 18 candidate Rosetta Stone proteins. Thus, the majority of multifunctional proteins are not Rosetta Stones, but they may nonetheless be useful in identifying novel metabolic and regulatory networks in oomycetes. Phylogenetic analysis of all the enzymes in three pathways with one or more novel multifunctional proteins was conducted to determine the probable origins of individual enzymes. These analyses revealed multiple examples of horizontal transfer from both bacterial genomes and the photosynthetic endosymbiont in the ancestral genome of Stramenopiles. 
The complexity of the phylogenetic origins of these metabolic pathways and

  3. Automated Builder and Database of Protein/Membrane Complexes for Molecular Dynamics Simulations

    PubMed Central

    Jo, Sunhwan; Kim, Taehoon; Im, Wonpil

    2007-01-01

    Molecular dynamics simulations of membrane proteins have provided deeper insights into their functions and interactions with surrounding environments at the atomic level. However, compared to solvation of globular proteins, building a realistic protein/membrane complex is still challenging and requires considerable experience with simulation software. Membrane Builder on the CHARMM-GUI website (http://www.charmm-gui.org) helps users build such a complex system in a web browser with a graphical user interface. Through a generalized and automated building process including system size determination as well as generation of the lipid bilayer, pore water, bulk water, and ions, a realistic membrane system with virtually any kind and shape of membrane protein can be generated in 5 minutes to 2 hours depending on the system size. Default values that were elaborated and tested extensively are given at each step to provide reasonable options and starting points for both non-expert and expert users. The efficacy of Membrane Builder is illustrated by its application to 12 transmembrane and 3 interfacial membrane proteins, whose fully equilibrated systems with three different types of lipid molecules (DMPC, DPPC, and POPC) and two types of system shapes (rectangular and hexagonal) are freely available on the CHARMM-GUI website. One of the most significant advantages of the web environment is that, if a problem is found, users can go back and re-generate the whole system before quitting the browser. Membrane Builder therefore provides an intuitive and easy way to build and simulate biologically important membrane systems. PMID:17849009

  4. Derived stimulus relations, semantic priming, and event-related potentials: testing a behavioral theory of semantic networks.

    PubMed

    Barnes-Holmes, Dermot; Staunton, Carmel; Whelan, Robert; Barnes-Holmes, Yvonne; Commins, Sean; Walsh, Derek; Stewart, Ian; Smeets, Paul M; Dymond, Simon

    2005-11-01

    Derived equivalence relations, it has been argued, provide a behavioral model of semantic or symbolic meaning in natural language, and thus equivalence relations should possess properties that are typically associated with semantic relations. The present study sought to test this basic postulate using semantic priming. Across three experiments, participants were trained and tested in two 4-member equivalence relations using word-like nonsense words. Participants were also exposed to a single- or two-word lexical decision task, and both direct (Experiment 1) and mediated (Experiments 2 and 3) priming effects for reaction times and event-related potentials were observed within but not across equivalence relations. The findings support the argument that derived equivalence relations provide a useful preliminary model of semantic relations.

  5. Building America Solution Center Shows Builders How to Save Materials Costs While Saving Energy

    SciTech Connect

    Gilbride, Theresa L.

    2015-06-15

    This short article was prepared for the U.S. Department of Energy's Building America Update newsletter. The article identifies energy and cost-saving benefits of using advanced framing techniques in new construction identified by research teams working with the DOE's Building America program. The article also provides links to guides in the Building America Solution Center that give how-to instructions for builders who want to implement advanced framing construction. The newsletter is issued monthly and can be accessed at http://energy.gov/eere/buildings/building-america-update-newsletter

  6. New Whole-House Solutions Case Study: A Production Builder's Passive House - Denver, Colorado

    SciTech Connect

    2015-05-01

    Brookfield Home’s first project is in a community called Midtown in Denver, Colorado, in which the builder took on the challenge of increased energy efficiency by creating a Passive House (PH)-certified model home. Brookfield worked with the U.S. Department of Energy’s Building America research team IBACOS to create the home, evaluate advanced building technologies, and use the home as a marketing tool for potential homebuyers. Brookfield also worked with KGA studio architects to create a new floor plan that would be constructed to the PH standard as an upgrade option.

  7. Design of a wireless sensor network with nanosecond time resolution for mapping of high-energy cosmic ray shower events

    NASA Astrophysics Data System (ADS)

    Frank, Michael P.; Junnarkar, Sachin S.; Fagan, Triesha; O'Neal, Ray H.; Takai, Helio

    2010-04-01

    We describe a low-cost, low-power wireless sensor network we are developing for high time-resolution (ns-scale) characterization of particle showers produced by ultra-high-energy (UHE) cosmic rays, to infer shower direction at sites where hard-wired data connections may be inconvenient to install. The front-end particle detector is a scintillator block monitored by a photomultiplier tube (PMT). We keep the sensor nodes synchronized to within 1 ns using periodic high-intensity optical pulses from a light-emitting diode (LED) overdriven at very high current (~30 A) in short (4 ns) bursts. With minimal optics, this signal is resolvable under free-space transmission in ambient light conditions at multi-meter distances using a high-speed avalanche photodiode (APD) receiver at each node. PMT pulse waveforms are digitized relative to this precise time reference on a Field Programmable Gate Array (FPGA) using a Time-over-Threshold (ToT)/Time-to-Digital Converter (TDC) digitizer developed at BNL. A central server receives timestamped, digitized PMT pulse waveforms from the sensor nodes via Wi-Fi and performs real-time data visualization and analysis. Total cost per sensor node is a few thousand dollars, with total power consumption per sensor node under 1 Watt, suitable for, e.g., solar-powered installations at remote field locations.
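
    The Time-over-Threshold measurement mentioned above can be illustrated with a minimal software sketch; the pulse samples, threshold, and 1 ns sample spacing are invented, and the actual digitizer performs this with hardware comparators rather than sample arrays.

```python
# Minimal Time-over-Threshold extraction from a sampled pulse: record when
# the waveform first exceeds the threshold and how long it stays above it.
def time_over_threshold(samples, threshold, dt_ns=1.0):
    above = [i for i, v in enumerate(samples) if v > threshold]
    if not above:
        return None  # pulse never crossed the threshold
    t_start = above[0] * dt_ns                   # leading-edge time (ns)
    width = (above[-1] - above[0] + 1) * dt_ns   # time over threshold (ns)
    return t_start, width

pulse = [0, 1, 3, 8, 12, 9, 4, 2, 0]  # hypothetical digitized PMT pulse
t0, tot = time_over_threshold(pulse, threshold=5)  # -> (3.0, 3.0)
```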

  8. Regional transportation network blocked by snowdrifts: assessment of risk reduction strategies by the example of the wind event of February 2015 in the Canton of Vaud, Switzerland

    NASA Astrophysics Data System (ADS)

    Voumard, Jérémie; Jaboyedoff, Michel; Derron, Marc-Henri

    2016-04-01

    where maintaining accessibility is crucial. We then analyze the road network to highlight the vulnerability of roads to snowdrifts using topographic and meteorological indicators. We also assess the cost/benefit ratio of different measures limiting snowdrifts. We finally discuss strategies to reduce the risk of this winter meteorological event.

  9. Analysis of Modern Techniques for Nuclear-test Yield Determination of NTS Events Using Data From the Leo Brady Seismic Network

    NASA Astrophysics Data System (ADS)

    Schramm, K. A.; Bilek, S. L.; Abbott, R. E.

    2007-12-01

    Nuclear test detection is a challenging but important task for treaty verification. Many techniques have been developed to discriminate between an explosion and an earthquake and, if an explosion is detected, to determine its yield. Sandia National Laboratories (SNL) has maintained the Leo Brady Seismic Network (LBSN) since 1960 to record nuclear tests at the Nevada Test Site (NTS), providing a unique data set for yield determination. The LBSN comprises five permanent stations surrounding the NTS at regional distances, and data (in digital form post-1983) exist for almost all tests. Modern seismic data processing techniques can be applied to these data to better determine the seismic yield. Using mb(Lg), we found that, when compared to published yields, our estimates were low for events over 100 kilotons (kt) and near the published values for events under 40 kt. We are currently measuring seismic-phase amplitudes, examining body- and surface-wave spectra, and using seismic waveform modeling techniques to determine the seismic yield of NTS explosions from the LBSN waveforms.
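
    Magnitude-yield estimation of the kind described above commonly assumes a relation of the form mb(Lg) = a + b·log10(Y); the sketch below simply inverts that form. The coefficients are placeholders chosen for illustration, not the calibrated NTS values used in the study.

```python
# Invert mb(Lg) = a + b * log10(Y) for yield Y in kilotons.
# a and b are illustrative placeholders, not calibrated constants.
def yield_from_mblg(mb_lg, a=4.45, b=0.75):
    return 10.0 ** ((mb_lg - a) / b)

y = yield_from_mblg(5.95)  # -> 100.0 kt with these placeholder coefficients
```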

  10. Master Builder.

    ERIC Educational Resources Information Center

    DeBray, Bernard J.

    1993-01-01

    A Missouri school district found the most cost-effective means of designing, bidding, and constructing new facilities to be hiring a construction management firm. With a construction manager, the school district's interests come first, and the district can tailor project-delivery strategies to specific needs. Outlines how the process works. (MLF)

  11. University Builders.

    ERIC Educational Resources Information Center

    Pearce, Martin

    This publication explores a diverse collection of new university buildings. Ranging from the design of vast new campuses, such as that by Wilford and Stirling at Temasek, Singapore, through to the relatively modest yet strategically important, such as the intervention by Allies and Morrison at Southampton, this book examines the new higher…

  12. Energy Builders.

    ERIC Educational Resources Information Center

    Instructor, 1982

    1982-01-01

    Due to increasing energy demands and decreasing supplies, it is important for teachers to provide students with a solid foundation for energy decision making. Activities are presented which offer hands-on experiences with four sources of energy: wind, water, sun, and fossil fuels. (JN)

  13. Career Builders

    ERIC Educational Resources Information Center

    Weinstein, Margery

    2012-01-01

    While a main goal for corporate trainers traditionally has been to train employees to reach organizational goals, many trainers may find their roles expanding. With companies cutting back on staffing and consolidating multiple job roles into single positions, career development has taken on a much larger significance. At forward-thinking…

  14. Bio-logic builder: a non-technical tool for building dynamical, qualitative models.

    PubMed

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized "bio-logic" modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanism as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, an influenza A replication cycle with 127 species and 200+ interactions, and mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool.
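
    The Boolean expressions/rules mentioned above drive a discrete dynamical model; a toy synchronous update illustrates the idea. The three-node network and its rules are invented for illustration and are not output of Bio-Logic Builder.

```python
# Toy three-node Boolean network: each rule maps the current state to the
# next value of one node (node names and rules are invented).
rules = {
    "A": lambda s: s["C"],                  # A is activated by C
    "B": lambda s: s["A"] and not s["C"],   # B requires A, inhibited by C
    "C": lambda s: s["A"] or s["B"],        # C is activated by A or B
}

def step(state):
    """One synchronous update: every rule is evaluated on the current state."""
    return {node: rule(state) for node, rule in rules.items()}

state = {"A": True, "B": False, "C": False}
state = step(state)  # -> {"A": False, "B": True, "C": True}
```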

  15. Bio-Logic Builder: A Non-Technical Tool for Building Dynamical, Qualitative Models

    PubMed Central

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A.

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized “bio-logic” modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanism as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, an influenza A replication cycle with 127 species and 200+ interactions, and mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool. PMID:23082121

  16. BioPartsBuilder: a synthetic biology tool for combinatorial assembly of biological parts

    PubMed Central

    Yang, Kun; Stracquadanio, Giovanni; Luo, Jingchuan; Boeke, Jef D.; Bader, Joel S.

    2016-01-01

    Summary: Combinatorial assembly of DNA elements is an efficient method for building large-scale synthetic pathways from standardized, reusable components. These methods are particularly useful because they enable assembly of multiple DNA fragments in one reaction, at the cost of requiring that each fragment satisfies design constraints. We developed BioPartsBuilder as a biologist-friendly web tool to design biological parts that are compatible with DNA combinatorial assembly methods, such as Golden Gate and related methods. It retrieves biological sequences, enforces compliance with assembly design standards and provides a fabrication plan for each fragment. Availability and implementation: BioPartsBuilder is accessible at http://public.biopartsbuilder.org and an Amazon Web Services image is available from the AWS Market Place (AMI ID: ami-508acf38). Source code is released under the MIT license, and available for download at https://github.com/baderzone/biopartsbuilder. Contact: joel.bader@jhu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26568632
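
    The design space targeted by combinatorial assembly grows as the product of the number of variants per slot; a toy enumeration makes this concrete. The slot and part names below are invented for illustration and are unrelated to BioPartsBuilder's actual part registry.

```python
from itertools import product

# Hypothetical part library: each slot lists interchangeable parts.
slots = {
    "promoter":   ["pGAL1", "pTEF1"],
    "orf":        ["gene1", "gene2", "gene3"],
    "terminator": ["tADH1"],
}

# One combinatorial assembly reaction can in principle yield every ordered
# combination of one part per slot: 2 * 3 * 1 = 6 distinct constructs.
assemblies = ["-".join(combo) for combo in product(*slots.values())]
```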

  17. Networks.

    ERIC Educational Resources Information Center

    Cerf, Vinton G.

    1991-01-01

    The demands placed on the networks transporting the information and knowledge generated by the increased diversity and sophistication of computational machinery are described. What is needed to support this increased flow, the structures already in place, and what must be built are topics of discussion. (KR)

  18. Car Builder: Design, Construct and Test Your Own Cars. School Version with Lesson Plans. [CD-ROM].

    ERIC Educational Resources Information Center

    Highsmith, Joni Bitman

    Car Builder is a scientific CD-ROM-based simulation program that lets students design, construct, modify, test, and compare their own cars. Students can design sedans, four-wheel-drive vehicles, vans, sport cars, and hot rods. They may select for aerodynamics, power, and racing ability, or economic and fuel efficiency. It is a program that teaches…

  19. Strategies for fitting nonlinear ecological models in R, AD Model Builder, and BUGS

    USGS Publications Warehouse

    Bolker, Benjamin M.; Gardner, Beth; Maunder, Mark; Berg, Casper W.; Brooks, Mollie; Comita, Liza; Crone, Elizabeth; Cubaynes, Sarah; Davies, Trevor; de Valpine, Perry; Ford, Jessica; Gimenez, Olivier; Kéry, Marc; Kim, Eun Jung; Lennert-Cody, Cleridy; Magunsson, Arni; Martell, Steve; Nash, John; Nielson, Anders; Regentz, Jim; Skaug, Hans; Zipkin, Elise

    2013-01-01

    1. Ecologists often use nonlinear fitting techniques to estimate the parameters of complex ecological models, with attendant frustration. This paper compares three open-source model fitting tools and discusses general strategies for defining and fitting models. 2. R is convenient and (relatively) easy to learn, AD Model Builder is fast and robust but comes with a steep learning curve, while BUGS provides the greatest flexibility at the price of speed. 3. Our model-fitting suggestions range from general cultural advice (where possible, use the tools and models that are most common in your subfield) to specific suggestions about how to change the mathematical description of models to make them more amenable to parameter estimation. 4. A companion web site (https://groups.nceas.ucsb.edu/nonlinear-modeling/projects) presents detailed examples of application of the three tools to a variety of typical ecological estimation problems; each example links both to a detailed project report and to full source code and data.
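
    Suggestion 3 above, changing a model's mathematical description to ease estimation, can be illustrated by the classic log-linearization of an exponential decay model: instead of fitting N(t) = N0·exp(-r·t) with a nonlinear optimizer, fit a straight line to log N. A minimal pure-Python sketch with toy data (not drawn from the paper's companion site):

```python
import math

# Fit N(t) = N0 * exp(-r * t) by ordinary least squares on
# log N(t) = log N0 - r * t (hypothetical noiseless example).
def fit_exponential_decay(ts, ns):
    ys = [math.log(n) for n in ns]
    n = len(ts)
    t_mean = sum(ts) / n
    y_mean = sum(ys) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, ys))
             / sum((t - t_mean) ** 2 for t in ts))
    intercept = y_mean - slope * t_mean
    return math.exp(intercept), -slope  # (N0, r)

ts = [0, 1, 2, 3, 4]
ns = [100.0 * math.exp(-0.5 * t) for t in ts]
n0, r = fit_exponential_decay(ts, ns)  # recovers N0 = 100, r = 0.5
```

    With observation noise the log transform also changes the error structure, which is part of why such reparameterizations are a judgment call rather than a free lunch.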

  20. City of Austin: Green habitat learning project. A green builder model home project

    SciTech Connect

    1995-12-01

    The purpose of the Year 14 UCETF project was to design and construct a residential structure that could serve as a demonstration facility, training site, and testing and monitoring laboratory for issues related to the implementation of sustainable building practices and materials. The Model Home Project builds on the previous and existing efforts, partially funded by the UCETF, of the City of Austin Green Builder Program to incorporate sustainable building practices into mainstream building activities. The Green Builder Program uses the term "green" as a synonym for sustainability. In the research and analysis that was completed for our earlier reports in Years 12 and 13, we characterized specific elements that we associate with sustainability and, thus, green building. In general, we refer to a modified life cycle assessment to ascertain if "green" building options reflect similar positive cyclical patterns found in nature (i.e., recyclability, recycled content, renewable resources, etc.). We additionally consider economic, human health, and synergistic ecological impacts associated with our building choices and characterize the best choices as "green." Our ultimate goal is to identify and use those "green" materials and processes that provide well for us now and do not compromise similar benefits for future generations. The original partnership developed for this project shifted during the year from a project stressing advanced (many prototypical) "green" building materials and techniques in a research and demonstration context, to off-the-shelf but underutilized "green" materials in the practical social context of using "green" technologies for low-income housing. That project, discussed in this report, is called the Green Habitat Learning Project.

  1. Ocean acidification causes bleaching and productivity loss in coral reef builders.

    PubMed

    Anthony, K R N; Kline, D I; Diaz-Pulido, G; Dove, S; Hoegh-Guldberg, O

    2008-11-11

    Ocean acidification represents a key threat to coral reefs by reducing the calcification rate of framework builders. In addition, acidification is likely to affect the relationship between corals and their symbiotic dinoflagellates and the productivity of this association. However, little is known about how acidification impacts the physiology of reef builders and how acidification interacts with warming. Here, we report on an 8-week study that compared bleaching, productivity, and calcification responses of crustose coralline algae (CCA) and branching (Acropora) and massive (Porites) coral species in response to acidification and warming. Using a 30-tank experimental system, we manipulated CO(2) levels to simulate doubling and three- to fourfold increases [Intergovernmental Panel on Climate Change (IPCC) projection categories IV and VI] relative to present-day levels under cool and warm scenarios. Results indicated that high CO(2) is a bleaching agent for corals and CCA under high irradiance, acting synergistically with warming to lower thermal bleaching thresholds. We propose that CO(2) induces bleaching via its impact on photoprotective mechanisms of the photosystems. Overall, acidification impacted more strongly on bleaching and productivity than on calcification. Interestingly, the intermediate, warm CO(2) scenario led to a 30% increase in productivity in Acropora, whereas high CO(2) led to zero productivity in both corals. CCA were most sensitive to acidification, with high CO(2) leading to negative productivity and high rates of net dissolution. Our findings suggest that sensitive reef-building species such as CCA may be pushed beyond their thresholds for growth and survival within the next few decades, whereas corals will show delayed and mixed responses.

  2. Advance Liquid Metal Reactor Discrete Dynamic Event Tree/Bayesian Network Analysis and Incident Management Guidelines (Risk Management for Sodium Fast Reactors)

    SciTech Connect

    Denman, Matthew R.; Groth, Katrina M.; Cardoni, Jeffrey N.; Wheeler, Timothy A.

    2015-04-01

    Accident management is an important component to maintaining risk at acceptable levels for all complex systems, such as nuclear power plants. With the introduction of self-correcting, or inherently safe, reactor designs, the focus has shifted from management by operators to allowing the system's design to manage the accident. Inherently and passively safe designs are laudable, but nonetheless extreme boundary conditions can interfere with the design attributes which facilitate inherent safety, thus resulting in unanticipated and undesirable end states. This report examines an inherently safe and small sodium fast reactor experiencing a beyond-design-basis seismic event with the intent of exploring two issues: (1) can human intervention either improve or worsen the potential end states and (2) can a Bayesian Network be constructed to infer the state of the reactor to inform (1). ACKNOWLEDGEMENTS The authors would like to acknowledge the U.S. Department of Energy's Office of Nuclear Energy for funding this research through Work Package SR-14SN100303 under the Advanced Reactor Concepts program. The authors also acknowledge the PRA teams at Argonne National Laboratory, Oak Ridge National Laboratory, and Idaho National Laboratory for their continued contributions to the advanced reactor PRA mission area.

  3. Single and combinatorial chromatin coupling events underlie the function of transcription factor krüppel-like factor 11 in the regulation of gene networks

    PubMed Central

    2014-01-01

    Background Krüppel-like factors (KLFs) are a group of master regulators of gene expression conserved from flies to human. However, scant information is available on either the mechanisms or functional impact of the coupling of KLF proteins to chromatin remodeling machines, a deterministic step in transcriptional regulation. Results and discussion In the current study, we use genome-wide analyses of chromatin immunoprecipitation (ChIP-on-Chip) and Affymetrix-based expression profiling to gain insight into how KLF11, a human transcription factor involved in tumor suppression and metabolic diseases, works by coupling to three co-factor groups: the Sin3-histone deacetylase system, WD40-domain containing proteins, and the HP1-histone methyltransferase system. Our results reveal that KLF11 regulates distinct gene networks involved in metabolism and growth by using single or combinatorial coupling events. Conclusion This study, the first of its type for any KLF protein, reveals that interactions with multiple chromatin systems are required for the full gene regulatory function of these proteins. PMID:24885560

  4. Seafloor spreading event in western Gulf of Aden during the November 2010-March 2011 period captured by regional seismic networks: evidence for diking events and interactions with a nascent transform zone

    NASA Astrophysics Data System (ADS)

    Ahmed, Abdulhakim; Doubre, Cécile; Leroy, Sylvie; Kassim, Mohamed; Keir, Derek; Abayazid, Ahmadine; Perrot, Julie; Audin, Laurence; Vergne, Jérome; Nercessian, Alexandre; Jacques, Eric; Khanbari, Khaled; Sholan, Jamal; Rolandone, Frédérique; Al-Ganad, Ismael

    2016-05-01

    In November 2010, intense seismic activity, including 29 events with a magnitude above 5.0, started in the western part of the Gulf of Aden, where the structure of the oceanic spreading ridge is characterized by a series of N115°-trending slow-spreading segments set within an EW-trending rift. Using signals recorded by permanent and temporary networks in Djibouti and Yemen, we located 1122 earthquakes, with magnitudes ranging from 2.1 to 5.6, from 2010 November 1 to 2011 March 31. By looking in detail at the space-time distribution of the overall seismicity, and both the frequency and the moment tensor of large earthquakes, we re-examine the chronology of this episode. In addition, we also interpret the origin of the activity using high-resolution bathymetric data, as well as observations of seafloor cable damage caused by high temperatures and lava flows. The analysis allows us to identify distinct active areas. First, we interpret that this episode is mainly related to a diking event along a specific ridge segment, located at E044°. In light of previous diking episodes in nearby subaerial rift segments, for which field constraints and both seismic and geodetic data exist, we interpret the space-time evolution of the seismicity of the first few days. Migration of earthquakes suggests initial magma ascent below the segment centre, followed by southeastward dike propagation below the rift and then northwestward propagation ending below the northern ridge wall. The cumulative seismic moment associated with this sequence reaches 9.1 × 10^17 Nm, and taking into account a very low seismic-to-geodetic moment ratio, we estimate a horizontal opening of ~0.58-2.9 m. The seismic activity that followed occurred through several bursts of earthquakes aligned along the segment axis, which are interpreted as short dike intrusions implying fast replenishment of the crustal magma reservoir feeding the dikes. Over the whole period
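The opening estimate quoted above follows from the standard moment relation M0 = μ·A·u, converting the cumulative seismic moment to a geodetic moment with an assumed seismic-to-geodetic ratio. A minimal sketch, where the shear modulus, dike dimensions, and moment ratio are illustrative assumptions and not values from the study:

```python
def dike_opening(seismic_moment, shear_modulus, dike_length, dike_height,
                 seismic_fraction=1.0):
    """Horizontal dike opening u from M0 = mu * A * u.

    seismic_fraction is the (often small) ratio of seismic to geodetic
    moment; dividing by it converts the observed seismic moment to a
    geodetic one. All example values are illustrative assumptions.
    """
    geodetic_moment = seismic_moment / seismic_fraction
    area = dike_length * dike_height  # dike plane area A
    return geodetic_moment / (shear_modulus * area)

# example: M0 = 9.1e17 Nm, mu = 30 GPa, a hypothetical 10 km x 5 km dike
u = dike_opening(9.1e17, 3e10, 10e3, 5e3, seismic_fraction=0.5)
```

A smaller seismic fraction (more of the deformation released aseismically) proportionally increases the inferred opening, which is why the abstract quotes a range rather than a single value.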

  5. Building America Best Practices Series: Volume 4; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Mixed-Humid Climate

    SciTech Connect

    Baechler, M. C.; Taylor, Z. T.; Bartlett, R.; Gilbride, T.; Hefty, M.; Steward, H.; Love, P. M.; Palmer, J. A.

    2005-09-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the mixed-humid climate region. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team, from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  6. Building America Best Practices Series: Volume 3; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in Cold and Very Cold Climates

    SciTech Connect

    Not Available

    2005-08-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the cold and very cold climates. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team, from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  7. Building America Best Practices Series: Volume 5; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Marine Climate

    SciTech Connect

    Baechler, M. C.; Taylor, Z. T.; Bartlett, R.; Gilbride, T.; Hefty, M.; Steward, H.; Love, P. M.; Palmer, J. A.

    2006-10-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the Marine climate region. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team, from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  8. Seismicity Along the Endeavour Segment of the Juan de Fuca Ridge: Automated Event Locations for an Ocean-Bottom Seismometer Network

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Wilcock, W. S.; Hooft, E. E.; Toomey, D. R.; McGill, P. R.

    2007-12-01

    From 2003-2006, the W.M. Keck Foundation supported the operation of a network of eight ocean-bottom seismometers (OBSs) that were deployed with a remotely operated vehicle along the central portion of the Endeavour Segment of the Juan de Fuca mid-ocean ridge as part of a multidisciplinary prototype NEPTUNE experiment. Data from 2003-2004 were initially analyzed during a research apprenticeship class at the University of Washington's Friday Harbor Laboratories. Eight student analysts located ~13,000 earthquakes along the Endeavour Segment. Analysis of data from 2004-2005 has to date been limited to locating ~6,000 earthquakes associated with a swarm in February-March 2005 near the northern end of the Endeavour Segment. The remaining data includes several significant swarms and it is anticipated that tens of thousands of earthquakes still need to be located. In order to efficiently obtain a complete catalog of high-quality locations for the 3-year experiment, we are developing an automatic method for earthquake location. We first apply a 5-Hz high-pass filter and identify triggers when the ratio of the root-mean-square (RMS) amplitudes in short- and long-term windows exceeds a specified threshold. We search for events that are characterized by triggers within a short time interval on the majority of stations and use the signal spectra to eliminate events that are the result of 20-Hz Fin and Blue whale vocalizations. An autoregressive technique is applied to a short time window centered on the trigger time to pick P-wave times on each station's vertical channel. We locate the earthquake with these picks and either attempt to repick or eliminate arrivals with unacceptable residuals. Preliminary S-wave picks are then made on the horizontal channels by applying a 5-12 Hz bandpass filter, identifying the peak RMS amplitude for a short running window, and making a pick at the time the RMS amplitude rises above 50% of this value. The picks are refined using the
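The trigger scheme described above, a ratio of RMS amplitudes in short- and long-term windows tested against a threshold, is the classic STA/LTA detector. A minimal sketch follows; the window lengths and threshold are illustrative defaults, not the values used in this study:

```python
import numpy as np

def sta_lta_triggers(signal, fs, sta_win=0.5, lta_win=10.0, threshold=3.0):
    """Indices (in samples) where the STA/LTA ratio first exceeds threshold.

    sta_win and lta_win are window lengths in seconds; both windows end at
    the sample being tested. Values here are illustrative assumptions.
    """
    sta_n = int(sta_win * fs)
    lta_n = int(lta_win * fs)
    sq = np.asarray(signal, float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(sq)))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n  # mean square, short window
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n  # mean square, long window
    # ratio at sample t compares windows both ending at t (t >= lta_n)
    ratio = sta[lta_n - sta_n:] / np.maximum(lta, 1e-20)
    above = ratio > threshold
    prev = np.concatenate(([False], above[:-1]))
    onsets = np.flatnonzero(above & ~prev)  # first sample of each excursion
    return onsets + lta_n
```

Coincidence logic across stations (requiring triggers on a majority within a short interval) would then be applied to these per-station onset times.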

  10. The effects of high-frequency oscillations in hippocampal electrical activities on the classification of epileptiform events using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Chiu, Alan W. L.; Jahromi, Shokrollah S.; Khosravani, Houman; Carlen, Peter L.; Bardakjian, Berj L.

    2006-03-01

    The existence of hippocampal high-frequency electrical activities (greater than 100 Hz) during the progression of seizure episodes in both human and animal experimental models of epilepsy has been well documented (Bragin A, Engel J, Wilson C L, Fried I and Buzsáki G 1999 Hippocampus 9 137-42; Khosravani H, Pinnegar C R, Mitchell J R, Bardakjian B L, Federico P and Carlen P L 2005 Epilepsia 46 1-10). However, this information has not been studied between successive seizure episodes or utilized in the application of seizure classification. In this study, we examine the dynamical changes of an in vitro low-Mg2+ rat hippocampal slice model of epilepsy at different frequency bands using wavelet transforms and artificial neural networks. By dividing the time-frequency spectrum of each seizure-like event (SLE) into frequency bins, we can analyze their burst-to-burst variations within individual SLEs as well as between successive SLE episodes. Wavelet energy and wavelet entropy are estimated for intracellular and extracellular electrical recordings using sufficiently high sampling rates (10 kHz). We demonstrate that the activities of high-frequency oscillations in the 100-400 Hz range increase as the slice approaches SLE onsets and in later episodes of SLEs. Utilizing the time-dependent relationship between different frequency bands, we can achieve frequency-dependent state classification. We demonstrate that activities in the frequency range 100-400 Hz are critical for the accurate classification of the different states of electrographic seizure-like episodes (containing interictal, preictal and ictal states) in brain slices undergoing recurrent spontaneous SLEs. While preictal activities can be classified with an average accuracy of 77.4 ± 6.7% utilizing the frequency spectrum in the range 0-400 Hz, we can also achieve a similar level of accuracy by using a nonlinear relationship between 100-400 Hz and <4 Hz frequency bands only.
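Band-wise wavelet energy and wavelet entropy of the kind estimated here can be illustrated with a plain multilevel Haar decomposition; the abstract does not state which wavelet was used, so Haar is chosen only to keep the sketch self-contained:

```python
import numpy as np

def haar_dwt_energies(x, levels=6):
    """Energy in each detail band of a multilevel Haar DWT.

    Level j details correspond roughly to the band [fs/2**(j+1), fs/2**j]
    for a sampling rate fs.
    """
    approx = np.asarray(x, float)
    energies = []
    for _ in range(levels):
        n = len(approx) // 2 * 2
        a = (approx[0:n:2] + approx[1:n:2]) / np.sqrt(2)  # approximation
        d = (approx[0:n:2] - approx[1:n:2]) / np.sqrt(2)  # detail
        energies.append(float(np.sum(d ** 2)))
        approx = a
    return np.array(energies)

def wavelet_entropy(energies):
    """Shannon entropy of the normalized band-energy distribution."""
    p = energies / energies.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))
```

Tracking these band energies (and their entropy) window by window gives the frequency-band features that the classifier in the study operates on.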

  11. Staged Event Architecture

    SciTech Connect

    Hoschek, Wolfgang; Berket, Karlo

    2005-05-30

    Sea is a framework for a Staged Event Architecture, designed around non-blocking asynchronous communication facilities that are decoupled from the threading model chosen by any given application. Components for IP networking and in-memory communication are provided. The Sea Java library encapsulates these concepts. Sea is used to easily build efficient and flexible low-level network clients and servers, and in particular serves as a basic communication substrate for Peer-to-Peer applications.
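The staged, non-blocking design described above can be sketched as a queue per stage with its own worker thread, so producers never block on a consumer's threading model. This illustrates the general pattern only; it is not Sea's actual Java API, and the class and method names are invented:

```python
import queue
import threading

class Stage:
    """One stage of a staged event architecture: an event queue plus a handler.

    The handler runs on the stage's own worker thread; post() is
    non-blocking for the producer, decoupling the two threading models.
    """
    def __init__(self, handler, downstream=None):
        self.events = queue.Queue()
        self.handler = handler
        self.downstream = downstream
        threading.Thread(target=self._loop, daemon=True).start()

    def post(self, event):
        self.events.put(event)  # returns immediately

    def _loop(self):
        while True:
            result = self.handler(self.events.get())
            if self.downstream is not None:
                self.downstream.post(result)
```

Chaining stages (parse, process, emit) then gives an event pipeline in which each stage's backlog is absorbed by its queue rather than by blocked callers.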

  12. Building America Best Practices Series: Volume 4; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Mixed-Humid Climate

    SciTech Connect

    2005-09-01

    This guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the mixed-humid climate region.

  13. Building America Best Practices Series: Builders Challenge Guide to 40% Whole-House Energy Savings in the Marine Climate (Volume 11)

    SciTech Connect

    Pacific Northwest National Laboratory

    2010-09-01

    With the measures described in this guide, builders in the marine climate can build homes that have whole-house energy savings of 40% over the Building America benchmark with no added overall costs for consumers.

  14. Building America Best Practices Series: Volume 3; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Cold and Very Cold Climates

    SciTech Connect

    2005-08-01

    The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the cold and very cold climates.

  15. electronic Ligand Builder and Optimisation Workbench (eLBOW): A tool for ligand coordinate and restraint generation

    SciTech Connect

    Moriarty, Nigel; Grosse-Kunstleve, Ralf; Adams, Paul

    2009-07-01

    The electronic Ligand Builder and Optimisation Workbench (eLBOW) is a program module of the PHENIX suite of computational crystallographic software. It is designed to be a flexible procedure using simple and fast quantum chemical techniques to provide chemically accurate information for novel and known ligands alike. A variety of input formats and options allow for the attainment of a number of diverse goals including geometry optimisation and generation of restraints.

  16. New Whole-House Solutions Case Study: HVAC Design Strategy for a Hot-Humid Production Builder, Houston, Texas

    SciTech Connect

    2014-03-01

    Building Science Corporation (BSC) worked directly with the David Weekley Homes - Houston division to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses in preparation for the upcoming code changes in 2015. The following research questions were addressed by this research project: 1. What is the most cost-effective, best-performing, and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one- and two-story single-family detached residences? 2. What is a cost-effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole-house cost estimates compared to confirmed post-construction actual costs? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic to conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.

  17. Quantifying the spatio-temporal pattern of the ground impact of space weather events using dynamical networks formed from the SuperMAG database of ground based magnetometer stations.

    NASA Astrophysics Data System (ADS)

    Dods, Joe; Chapman, Sandra; Gjerloev, Jesper

    2016-04-01

    Quantitative understanding of the full spatial-temporal pattern of space weather is important in order to estimate the ground impact. Geomagnetic indices such as AE track the peak of a geomagnetic storm or substorm, but cannot capture the full spatial-temporal pattern. Observations by the ~100 ground-based magnetometers in the northern hemisphere have the potential to capture the detailed evolution of a given space weather event. We present the first analysis of the full available set of ground-based magnetometer observations of substorms using dynamical networks. SuperMAG offers a database containing ground station magnetometer data at a cadence of 1 min from 100s of stations situated across the globe. We use this data to form dynamic networks which capture spatial dynamics on timescales from the fast reconfiguration seen in the aurora, to that of the substorm cycle. Windowed linear cross-correlation between pairs of magnetometer time series along with a threshold is used to determine which stations are correlated and hence connected in the network. Variations in ground conductivity and differences in the response functions of magnetometers at individual stations are overcome by normalizing to long term averages of the cross-correlation. These results are tested against surrogate data in which phases have been randomised. The network is then a collection of connected points (ground stations); the structure of the network and its variation as a function of time quantify the detailed dynamical processes of the substorm. The network properties can be captured quantitatively in time dependent dimensionless network parameters and we will discuss their behaviour for examples of 'typical' substorms and storms. The network parameters provide a detailed benchmark to compare data with models of substorm dynamics, and can provide new insights on the similarities and differences between substorms and how they correlate with external driving and the internal state of the
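The core construction, thresholded windowed cross-correlation turned into an adjacency matrix, can be sketched as follows. This simplified version uses zero-lag Pearson correlation within one window and a single illustrative threshold; the study's normalization against long-term averages and surrogate testing are omitted:

```python
import numpy as np

def correlation_network(window, threshold=0.7):
    """Adjacency matrix for one time window of multi-station data.

    window: array of shape (n_stations, n_samples). Stations i and j are
    connected when the magnitude of the zero-lag correlation of their
    normalized series exceeds the threshold (value is illustrative).
    """
    x = window - window.mean(axis=1, keepdims=True)
    x = x / np.maximum(np.linalg.norm(x, axis=1, keepdims=True), 1e-12)
    c = x @ x.T                   # pairwise Pearson correlations
    adj = np.abs(c) > threshold
    np.fill_diagonal(adj, False)  # no self-connections
    return adj

def mean_degree(adj):
    """A simple dimensionless network parameter: mean node degree."""
    return adj.sum() / adj.shape[0]
```

Sliding the window through the event and recording parameters such as the mean degree then yields the time-dependent network measures discussed in the abstract.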

  18. Automated Geometric Model Builder Using Range Image Sensor Data: Final Acquisition

    SciTech Connect

    Diegert, C.; Sackos, J.

    1999-02-01

    This report documents a data collection where we recorded redundant range image data from multiple views of a simple scene, and recorded accurate survey measurements of the same scene. Collecting these data was a focus of the research project Automated Geometric Model Builder Using Range Image Sensor Data (96-0384), supported by Sandia's Laboratory-Directed Research and Development (LDRD) Program during fiscal years 1996, 1997, and 1998. The data described here are available from the authors on CDROM, or electronically over the Internet. Included in this data distribution are Computer-Aided Design (CAD) models we constructed from the survey measurements. The CAD models are compatible with the SolidWorks 98 Plus system, the modern Computer-Aided Design software system that is central to Sandia's DeskTop Engineering Project (DTEP). Integration of our measurements (as built) with the constructive geometry process of the CAD system (as designed) delivers on a vision of the research project. This report on our final data collection will also serve as a final report on the project.

  19. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    SciTech Connect

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang; Lo, Chaomei; Gorton, Ian; Liu, Yan

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
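REVEAL's sample-simulate-regress-validate loop can be illustrated in miniature: draw samples over the input box, run the expensive simulator on them, fit a cheap regression surrogate, and quantify its accuracy on held-out samples. Random sampling and per-variable polynomial regression stand in here for REVEAL's pluggable methods, and all names are ours, not REVEAL's API:

```python
import numpy as np

def build_rom(simulator, bounds, n_samples=200, degree=2, seed=0):
    """Build and validate a polynomial reduced-order model of a simulator.

    bounds is a list of (lo, hi) pairs, one per input variable. Returns
    the ROM as a callable plus its RMS error on fresh validation samples.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    X = lo + (hi - lo) * rng.random((n_samples, len(lo)))
    y = np.array([simulator(x) for x in X])
    # design matrix: constant plus powers of each input variable
    cols = [np.ones(n_samples)]
    for d in range(1, degree + 1):
        cols.extend(X[:, j] ** d for j in range(X.shape[1]))
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)

    def rom(x):
        x = np.asarray(x, float)
        feats = [1.0] + [x[j] ** d for d in range(1, degree + 1)
                         for j in range(len(x))]
        return float(np.dot(feats, coef))

    # quantify ROM accuracy on fresh samples, as REVEAL does automatically
    Xv = lo + (hi - lo) * rng.random((50, len(lo)))
    yv = np.array([simulator(x) for x in Xv])
    rmse = float(np.sqrt(np.mean((yv - np.array([rom(x) for x in Xv])) ** 2)))
    return rom, rmse
```

An iterative workflow would inspect the returned error and, if too large, add samples or switch regression technique, mirroring REVEAL's mix-and-match design.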

  20. Building energy analysis of Electrical Engineering Building from DesignBuilder tool: calibration and simulations

    NASA Astrophysics Data System (ADS)

    Cárdenas, J.; Osma, G.; Caicedo, C.; Torres, A.; Sánchez, S.; Ordóñez, G.

    2016-07-01

    This research shows the energy analysis of the Electrical Engineering Building, located on the campus of the Industrial University of Santander in Bucaramanga - Colombia. This building is a green pilot for analysing energy saving strategies such as solar pipes, green roof, daylighting, and automation, among others. Energy analysis was performed by means of DesignBuilder software from a virtual model of the building. Several variables were analysed such as air temperature, relative humidity, air velocity, daylighting, and energy consumption. According to two criteria, thermal load and energy consumption, critical areas were defined. The calibration and validation process of the virtual model was done obtaining error below 5% in comparison with measured values. The simulations show that the average indoor temperature in the critical areas of the building was 27°C, whilst relative humidity reached values near 70% over the year. The most critical discomfort conditions were found in the area of the greatest concentration of people, which has an average annual temperature of 30°C. Solar pipes can increase daylight levels by 33% in the areas located on the upper floors of the building. In the case of the green roofs, the simulated results show that these reduce internal heat gains through the roof by nearly 31% and decrease air-conditioning energy consumption by 5% for some areas on the fourth and fifth floors. The estimated energy consumption of the building was 69 283 kWh per year.
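The calibration criterion used above, model error below 5% against measured values, amounts to a simple percent-error acceptance test. A minimal sketch (the 5% tolerance is the abstract's; the function names are ours):

```python
def calibration_error(simulated, measured):
    """Absolute percent error of a simulated quantity against a measurement."""
    return abs(simulated - measured) / abs(measured) * 100.0

def is_calibrated(simulated, measured, tolerance_pct=5.0):
    """Accept the virtual model when its percent error is within tolerance."""
    return calibration_error(simulated, measured) <= tolerance_pct
```

In practice this check would be applied per variable (temperature, humidity, monthly energy use) before trusting the model for the what-if simulations described.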

  1. Technological process supervising using vision systems cooperating with the LabVIEW vision builder

    NASA Astrophysics Data System (ADS)

    Hryniewicz, P.; Banaś, W.; Gwiazda, A.; Foit, K.; Sękala, A.; Kost, G.

    2015-11-01

    One of the most important tasks in the production process is to supervise its proper functioning. Lack of required supervision over the production process can lead to incorrect manufacturing of the final element, to production line downtime, and hence to financial losses. The worst outcome is damage to the equipment involved in the manufacturing process. Engineers supervising the correctness of the production flow use a wide range of sensors to support the monitoring of a manufactured element. Vision systems are one such family of sensors. In recent years, thanks to the accelerated development of electronics as well as easier access to electronic products at attractive prices, they have become a cheap and universal type of sensor. These sensors detect practically all objects, regardless of their shape or even state of matter. The only problems arise with transparent or mirrored objects, or objects viewed from the wrong angle. By integrating the vision system with LabVIEW Vision and the LabVIEW Vision Builder, it is possible to determine not only the position of a given element but also its orientation relative to any point in the analyzed space. The paper presents an example of automated inspection of the manufacturing process in a production workcell using the vision supervising system. The aim of the work is to elaborate a vision system that could integrate different applications and devices used in different production systems to control the manufacturing process.

  2. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.
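The preprocessing pipeline described above, a time-frequency distribution, its binary representation, and the magnitude of its 2-D FFT, can be sketched compactly. A plain STFT power spectrogram stands in for whatever time-frequency distribution the patented system used, and the median threshold for binarization is our assumption:

```python
import numpy as np

def shift_invariant_features(signal, nfft=64, hop=32):
    """Binary time-frequency distribution and its 2-D FFT magnitude.

    The |FFT2| of the binary image is invariant to circular shifts of the
    pattern in both time and frequency, which is what lets the SONN group
    similar events regardless of their alignment.
    """
    frames = np.array([signal[i:i + nfft] * np.hanning(nfft)
                       for i in range(0, len(signal) - nfft + 1, hop)])
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2   # power spectrogram
    binary = (spec > np.median(spec)).astype(float)   # binary representation
    return np.abs(np.fft.fft2(binary))                # shift-invariant magnitude
```

The resulting feature image would then be flattened and fed to the self-organizing network for clustering.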

  3. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.

  4. Closed Crawl Space Performance: Proof of Concept in the Production Builder Marketplace

    SciTech Connect

    Malkin-Weber, Melissa; Dastur, Cyrus; Mauceri, Maria; Hannas, Benjamin

    2008-10-30

    characteristics with regard to infiltration and duct leakage. Researchers settled on two closed crawl space designs, one with insulation located in the framed floor structure above the crawl space and one with insulation on the crawl space perimeter wall, as the designs with the most widespread potential for application. Researchers based this assessment not only on the field performance, but also on input from residential builders, pest control professionals, code officials, installers, and building scientists active in the region. The key findings from the field demonstration were that (1) closed crawl spaces stay substantially drier than traditional wall-vented crawl spaces during humid climate conditions, and (2) the houses built on the closed crawl space foundations saved, on average, 15% or more on annual energy used for space heating and cooling. A comparison of the actual energy performance of the homes versus the performance predicted by a popular HERS software application showed that the software was unable to predict the demonstrated savings, in some cases predicting an energy penalty. Findings from the 2005 project were summarized in a publication titled Closed Crawl Spaces: An Introduction to Design, Construction and Performance. Since its release, the publication has received widespread use by builders, homeowners, installers and code officials concerned about crawl space construction. The findings were also used to create major revisions to the NC Residential Code, which were adopted in 2004 and immediately began to reduce the regulatory barriers to widespread commercialization of the technology in NC, particularly in new residential construction. Full project details are located at www.crawlspaces.org.

  5. An Automated Force Field Topology Builder (ATB) and Repository: Version 1.0.

    PubMed

    Malde, Alpeshkumar K; Zuo, Le; Breeze, Matthew; Stroet, Martin; Poger, David; Nair, Pramod C; Oostenbrink, Chris; Mark, Alan E

    2011-12-13

    The Automated force field Topology Builder (ATB, http://compbio.biosci.uq.edu.au/atb ) is a Web-accessible server that can provide topologies and parameters for a wide range of molecules appropriate for use in molecular simulations, computational drug design, and X-ray refinement. The ATB has three primary functions: (1) to act as a repository for molecules that have been parametrized as part of the GROMOS family of force fields, (2) to act as a repository for pre-equilibrated systems for use as starting configurations in molecular dynamics simulations (solvent mixtures, lipid systems pre-equilibrated to adopt a specific phase, etc.), and (3) to generate force field descriptions of novel molecules compatible with the GROMOS family of force fields in a variety of formats (GROMOS, GROMACS, and CNS). Force field descriptions of novel molecules are derived using a multistep process in which results from quantum mechanical (QM) calculations are combined with a knowledge-based approach to ensure compatibility (as far as possible) with a specific parameter set of the GROMOS force field. The ATB has several unique features: (1) It requires that the user stipulate the protonation and tautomeric states of the molecule. (2) The symmetry of the molecule is analyzed to ensure that equivalent atoms are assigned identical parameters. (3) Charge groups are assigned automatically. (4) Where the assignment of a given parameter is ambiguous, a range of possible alternatives is provided. The ATB also provides several validation tools to assist the user to assess the degree to which the topology generated may be appropriate for a given task. In addition to detailing the steps involved in generating a force field topology compatible with a specific GROMOS parameter set (GROMOS 53A6), the challenges involved in the automatic generation of force field parameters for atomic simulations in general are discussed.

  6. Henry Daniel Cogswell, DDS (1819-1900): a temperance advocate, philanthropist and builder of ice-water fountains.

    PubMed

    Christen, A G; Theobald, M S; Christen, J A

    1999-07-01

    Henry Daniel Cogswell (Fig. 1), the second of five children, was born in Tolland, Connecticut on March 3, 1819. His father, George Washington Cogswell, was a general carpenter, architect and builder of moderate circumstances. In 1827, when Henry was eight, his mother died. The following year, Henry's father moved to Orwell (Oswego County), New York, in hopes of improving his financial condition. Henry was left behind in the care of his paternal grandfather, who died several months later, leaving the 10-year-old boy stranded and forced to rely upon his own resources. (In those times, when families were separated, individual members had limited means of locating one another.)

  7. A study on the use of imputation methods for experimentation with Radial Basis Function Network classifiers handling missing attribute values: the good synergy between RBFNs and EventCovering method.

    PubMed

    Luengo, Julián; García, Salvador; Herrera, Francisco

    2010-04-01

    The presence of missing values in a data set can affect the performance of a classifier constructed using that data set as a training sample. Several methods have been proposed to treat missing data, and the one used most frequently is the imputation of the missing values of an instance. In this paper, we analyze the improvement of performance on Radial Basis Function Networks by means of the use of several imputation methods in the classification task with missing values. The study has been conducted using data sets with real missing values, and data sets with artificial missing values. The results obtained show that EventCovering offers a very good synergy with Radial Basis Function Networks, allowing us to overcome the negative impact of the presence of missing values to a certain degree.
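
    The simplest imputation baseline that studies of this kind compare against is column-mean imputation, sketched below. Note this is only an illustrative stand-in; it is not the EventCovering method, whose details are not given in this abstract.

    ```python
    import numpy as np

    def impute_mean(X):
        """Replace each NaN with the mean of its column (computed over
        the observed values). A common baseline imputation method."""
        X = np.array(X, dtype=float)
        col_means = np.nanmean(X, axis=0)       # per-feature means, ignoring NaNs
        rows, cols = np.nonzero(np.isnan(X))    # positions of missing entries
        X[rows, cols] = col_means[cols]         # fill each gap with its column mean
        return X
    ```

    The imputed matrix can then be passed to any classifier (e.g., an RBF network) that cannot handle missing attribute values directly.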

  8. Skeletal muscle hypertrophy and structure and function of skeletal muscle fibres in male body builders

    PubMed Central

    D'Antona, Giuseppe; Lanfranconi, Francesca; Pellegrino, Maria Antonietta; Brocca, Lorenza; Adami, Raffaella; Rossi, Rosetta; Moro, Giorgio; Miotti, Danilo; Canepari, Monica; Bottinelli, Roberto

    2006-01-01

    Needle biopsy samples were taken from vastus lateralis muscle (VL) of five male body builders (BB, age 27.4 ± 0.93 years; mean ± s.e.m.), who had been performing hypertrophic heavy resistance exercise (HHRE) for at least 2 years, and from five male active, but untrained control subjects (CTRL, age 29.9 ± 2.01 years). The following determinations were performed: anatomical cross-sectional area and volume of the quadriceps and VL muscles in vivo by magnetic resonance imaging (MRI); myosin heavy chain isoform (MHC) distribution of the whole biopsy samples by SDS-PAGE; cross-sectional area (CSA), force (Po), specific force (Po/CSA) and maximum shortening velocity (Vo) of a large population (n = 524) of single skinned muscle fibres classified on the basis of MHC isoform composition by SDS-PAGE; actin sliding velocity (Vf) on pure myosin isoforms by in vitro motility assays. In BB a preferential hypertrophy of fast and especially type 2X fibres was observed. The very large hypertrophy of VL in vivo could not be fully accounted for by single muscle fibre hypertrophy. CSA of VL in vivo was, in fact, 54% larger in BB than in CTRL, whereas mean fibre area was only 14% larger in BB than in CTRL. MHC isoform distribution was shifted towards 2X fibres in BB. Po/CSA was significantly lower in type 1 fibres from BB than in type 1 fibres from CTRL, whereas both type 2A and type 2X fibres were significantly stronger in BB than in CTRL. Vo of type 1 fibres and Vf of myosin 1 were significantly lower in BB than in CTRL, whereas no difference was observed among fast fibres and myosin 2A. The findings indicate that skeletal muscle of BB was markedly adapted to HHRE through extreme hypertrophy, a shift towards the stronger and more powerful fibre types and an increase in specific force of muscle fibres. Such adaptations could not be fully accounted for by well known mechanisms of muscle plasticity, i.e. by the hypertrophy of single muscle fibre (quantitative mechanism) and by a

  9. Operation of a digital seismic network on Mount St. Helens volcano and observations of long period seismic events that originate under the volcano

    SciTech Connect

    Fehler, M.; Chouet, B.

    1982-09-01

    A 9-station digital seismic array was operated on Mount St. Helens volcano in Washington State during 1981. One of the stations was placed inside the crater of the volcano, six were located on the flanks of the volcano within two km of the crater and two were approximately ten km from the crater. Four of the instruments recorded three components of motion and the remaining five recorded only the vertical component. A one-day experiment was carried out during which the crater-monitoring seismometer was complemented by the addition of two ink-recording instruments. During the one-day experiment, six observers recorded times of rockfall, felt-earthquake occurrences, and changes in steam emissions from the dome in the crater. Using information obtained during the one-day experiment, seismic events recorded by the digital instruments were classified as earthquakes, rockfalls, helicopter noise, and a type of event unique to volcanoes called long-period events. Waveforms of these long-period events have a duration of up to 30 seconds and a spectrum that is peaked at approximately 2 Hz. The frequency at which the peak in the spectrum occurs is nearly the same at all stations, which means that the unique waveform of long-period events is due to a source effect, not a path effect. The peak frequency is fairly insensitive to the amplitude of the signal, which means that the size of the source region is constant, independent of the signal amplitude. Long-period events were not felt and were accompanied by no visible changes inside the crater, which leads to the conclusion that they are some sort of seismic disturbance generated inside the volcano.
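
    The spectral-peak characterization used above (a spectrum peaked near 2 Hz regardless of station or amplitude) amounts to picking the dominant frequency of a record. A minimal sketch, with an illustrative sampling rate and synthetic signal rather than the study's actual data:

    ```python
    import numpy as np

    def dominant_frequency(x, fs):
        """Frequency (Hz) of the peak in the amplitude spectrum of x,
        sampled at fs Hz. The mean is removed so the DC bin cannot win."""
        spec = np.abs(np.fft.rfft(x - np.mean(x)))   # one-sided amplitude spectrum
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)  # matching frequency axis
        return freqs[np.argmax(spec)]
    ```

    Comparing this peak frequency across stations, and across events of different amplitudes, is what distinguishes a source effect (same peak everywhere) from a path effect.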

  10. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configured to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions, irregular distributions, as well as functions associated with continuous or discrete variables.
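
    The core ideas here (score events by how improbable they are, and expose a user-tunable knob for the false-alert rate) can be illustrated with a toy Gaussian scorer. The patent's actual algorithm is not spelled out in this abstract; the class name and its methods below are purely illustrative.

    ```python
    import numpy as np

    class GaussianAnomalyScorer:
        """Scores events by negative log-likelihood under a fitted normal
        distribution: higher score means lower probability (more anomalous)."""

        def fit(self, x):
            self.mu = np.mean(x)
            self.sigma = np.std(x) + 1e-12  # guard against zero variance
            return self

        def score(self, x):
            # Negative log density of N(mu, sigma^2); comparable across
            # sources once each source has its own fitted model.
            z = (np.asarray(x) - self.mu) / self.sigma
            return 0.5 * z**2 + np.log(self.sigma) + 0.5 * np.log(2 * np.pi)

        def threshold_for_alert_rate(self, x, alert_rate):
            # The "regulatability" knob: choose a cutoff so that roughly
            # `alert_rate` of baseline events would be flagged.
            return np.quantile(self.score(x), 1.0 - alert_rate)
    ```

    Scoring in negative-log-probability units is what makes disparate sources comparable: a score of, say, 10 means "about as surprising" whether the event came from flow data or firewall logs.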

  11. Saharan-dust events characterization as example of Operational Oceanography product from a multidisciplinary real-time monitoring network in the Macaronesian region (Red ACOMAR)

    NASA Astrophysics Data System (ADS)

    Barrera, C.; Gelado, M. D.; Rueda, M. J.; Moran, R.; Llerandi, C.; Cardona, L.; Llinas, O.

    2009-04-01

    Detecting and predicting changes in coastal and open-ocean ecosystems requires monitoring their baseline physical, geological and chemical properties in detail and in real time. In this regard, and following the trends and general objectives established by GOOS (Global Ocean Observing System) through its coastal ecosystems module COOP (Coastal Ocean Observations Panel), the present paper describes the design, first development stages and some derived results of a monitoring network named Red ACOMAR Canarias (Red de Alerta, Control y Observación MARina de Canarias; in English: Network for Marine Surveillance, Control and Observation in the Canaries) developed in the Macaronesia region. Since 1999, the Red ACOMAR has been based on a core project, supported through several parallel proposals, carried out in the coastal and open-ocean areas around the Canary Islands archipelago. The network integrates a wide group of devices and monitoring systems (moored and drifting buoys, gliders, remote sensing, turtles, land-based meteorological stations, research vessels, …) working in real time. The network has a control centre that manages communications and data processing, and provides real-time information in a functional form to end-users from socio-economically important sectors that make exhaustive use of the coastal area in the region. Users access this information through a web site. The Red ACOMAR is now directly linked with similar initiatives in the area, mainly from scientific groups in the Azores and Madeira archipelagos as well as from other European countries, all working together with the aim of providing a regional contribution in Operational Oceanography that meets end-user requirements.

  12. ChemSkill Builder 2000, Version 6.1 [CD-ROM] (by James D. Spain and Harold J. Peters)

    NASA Astrophysics Data System (ADS)

    Keeney-Kennicutt, Reviewed By Wendy L.

    2000-07-01

    One of the major challenges for faculty teaching general chemistry is how to encourage students to practice solving problems. We know that for students to develop chemical intuition and problem-solving skills, they must "get their hands dirty" as they decipher and unravel problems inherent to our discipline. One tool that I've used since its release in 1996 is the ChemSkill Builder, an electronic homework package. The latest version, ChemSkill Builder (CSB) 2000, version 6.1, is an excellent, effective integration of teaching and testing most quantitative and conceptual learning objectives in an interactive way. It is inexpensive and easy to use for both students and faculty. The CSB 2000 package of personalized problem sets, specifically designed to complement most general chemistry courses, is a program on CD-ROM for PC Windows users (3.1, 95, or 98), with more than 1500 questions and a 3 1/2-in. record-management disk. There is a separate grade-management disk for the instructor. It has 24 gradable chapters, each with 5 or 6 sections, plus two new chapters that are not graded: Polymer Chemistry and an Appendix of Chemical Skills. Each section begins with a short review of the topic and many have interactive explanations. If students miss an answer, they are given a second chance for 70% credit. If they still miss, the worked-out solution is presented in detail. Students can work each section as many times as they wish to improve their scores. Periodically, the students download their data directly into a PC set up by the instructor. The data can be easily converted into an ASCII file and merged with a spreadsheet. The use of CD-ROM solves the sporadic problems associated with previous versions on 3 1/2-in. disks: software glitches, failed disks, and system incompatibilities. The quality and number of graphics and interactive exercises are much improved in this latest version. I particularly enjoyed the interactive explanations of significant figures and

  13. Extensin network formation in Vitis vinifera callus cells is an essential and causal event in rapid and H2O2-induced reduction in primary cell wall hydration

    PubMed Central

    2011-01-01

    Background Extensin deposition is considered important for the correct assembly and biophysical properties of primary cell walls, with consequences for plant resistance to pathogens, tissue morphology, cell adhesion and extension growth. However, evidence for a direct and causal role for the extensin network formation in changes to cell wall properties has been lacking. Results Hydrogen peroxide treatment of grapevine (Vitis vinifera cv. Touriga) callus cell walls was seen to induce a marked reduction in their hydration and thickness. An analysis of matrix proteins demonstrated this occurs with the insolubilisation of an abundant protein, GvP1, which displays a primary structure and post-translational modifications typical of dicotyledon extensins. The hydration of callus cell walls free from saline-soluble proteins did not change in response to H2O2, but fully regained this capacity after addition of extensin-rich saline extracts. To assay the specific contribution of GvP1 cross-linking and other wall matrix proteins to the reduction in hydration, GvP1 levels in cell walls were manipulated in vitro by binding selected fractions of extracellular proteins, and their effect on wall hydration during H2O2 incubation was assayed. Conclusions This approach allowed us to conclude that a peroxidase-mediated formation of a covalently linked network of GvP1 is essential and causal in the reduction of grapevine callus wall hydration in response to H2O2. Importantly, this approach also indicated that the extensin network's effect on hydration was only partially irreversible and remained sensitive to changes in matrix charge. We discuss this mechanism and the importance of these changes to primary wall properties in the light of extensin distribution in dicotyledons. PMID:21672244

  14. Transformational Events

    ERIC Educational Resources Information Center

    Denning, Peter J.; Hiles, John E.

    2006-01-01

    Transformational Events is a new pedagogic pattern that explains how innovations (and other transformations) happened. The pattern is three temporal stages: an interval of increasingly unsatisfactory ad hoc solutions to a persistent problem (the "mess"), an offer of an invention or of a new way of thinking, and a period of widespread adoption and…

  15. Maintenance Event

    Atmospheric Science Data Center

    2014-07-22

    Time: 08:00 am - 08:30 am EDT. Event Impact: Science Directorate websites will undergo an outage Thursday morning, 7/24, to perform upgrades to the web environment and are expected to be down for about 30 minutes.

  16. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve efficiency of the Earth Science data fusion, big data processing and analytics. The key components of the NAIADS include: Service Oriented Architecture (SOA) multi-lingual framework, multi-sensor coincident data Predictor, fast into-memory data Staging, multi-sensor data-Event Builder, complete data-Event streaming (a work flow with minimized IO), on-line data processing control and analytics services. The NAIADS project is leveraging the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for the NAIADS demonstration and performance tests in compute Cloud and Cluster environments.
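
    The idea behind a multi-sensor data-Event Builder (grouping observations from different instruments that refer to the same moment) can be illustrated with a toy timestamp matcher. The function and its window parameter are illustrative assumptions, not the NAIADS API:

    ```python
    from bisect import bisect_left

    def build_coincident_events(times_a, times_b, window):
        """Pair each observation time in times_a with the nearest time in
        times_b (which must be sorted) if they fall within `window` of
        each other. A toy stand-in for coincident event building."""
        events = []
        for ta in times_a:
            i = bisect_left(times_b, ta)
            # Only the neighbors around the insertion point can be nearest
            candidates = times_b[max(0, i - 1):i + 1]
            if candidates:
                tb = min(candidates, key=lambda t: abs(t - ta))
                if abs(tb - ta) <= window:
                    events.append((ta, tb))
        return events
    ```

    A real event builder would additionally match on geolocation and handle many sensors at once, but the window-matching step above is the essential kernel.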

  17. Quantitative Microbial Risk Assessment Tutorial – SDMProjectBuilder: Import Local Data Files to Identify and Modify Contamination Sources and Input Parameters

    EPA Science Inventory

    Twelve example local data support files are automatically downloaded when the SDMProjectBuilder is installed on a computer. They allow the user to modify values to parameters that impact the release, migration, fate, and transport of microbes within a watershed, and control delin...

  18. Building America Best Practices Series, Volume 9: Builders Challenge Guide to 40% Whole-House Energy Savings in the Hot-Dry and Mixed-Dry Climates

    SciTech Connect

    Baechler, Michael C.; Gilbride, Theresa L.; Hefty, Marye G.; Williamson, Jennifer L.; Ruiz, Kathleen A.; Bartlett, Rosemarie; Love, Pat M.

    2009-10-23

    This best practices guide is the ninth in a series of guides for builders produced by the U.S. Department of Energy’s Building America Program. This guide book is a resource to help builders design and construct homes that are among the most energy-efficient available, while addressing issues such as building durability, indoor air quality, and occupant health, safety, and comfort. With the measures described in this guide, builders in the hot-dry and mixed-dry climates can achieve homes that have whole house energy savings of 40% over the Building America benchmark (a home built to mid-1990s building practices roughly equivalent to the 1993 Model Energy Code) with no added overall costs for consumers. These best practices are based on the results of research and demonstration projects conducted by Building America’s research teams. The guide includes information for managers, designers, marketers, site supervisors, and subcontractors, as well as case studies of builders who are successfully building homes that cut energy use by 40% in the hot-dry and mixed-dry climates.

  19. Building America Best Practices Series: Volume 1; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Hot and Humid Climate

    SciTech Connect

    2004-12-01

    This Building America Best Practices guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the hot and humid climate.

  20. Building America Best Practices Series Volume 11. Builders Challenge Guide to 40% Whole-House Energy Savings in the Marine Climate

    SciTech Connect

    Baechler, Michael C.; Gilbride, Theresa L.; Hefty, Marye G.; Cole, Pamala C.; Williamson, Jennifer L.; Love, Pat M.

    2010-09-01

    This best practices guide is the eleventh in a series of guides for builders produced by the U.S. Department of Energy’s Building America Program. This guide book is a resource to help builders design and construct homes that are among the most energy-efficient available, while addressing issues such as building durability, indoor air quality, and occupant health, safety, and comfort. With the measures described in this guide, builders in the marine climate (portions of Washington, Oregon, and California) can achieve homes that have whole house energy savings of 40% over the Building America benchmark (a home built to mid-1990s building practices roughly equivalent to the 1993 Model Energy Code) with no added overall costs for consumers. These best practices are based on the results of research and demonstration projects conducted by Building America’s research teams. The guide includes information for managers, designers, marketers, site supervisors, and subcontractors, as well as case studies of builders who are successfully building homes that cut energy use by 40% in the marine climate. This document is available on the web at www.buildingamerica.gov. This report was originally cleared 06-29-2010. This version is Rev 1, cleared in Nov 2010. The only change is that the Energy Star Windows criteria shown on pg 8.25 were updated to match the criteria of Version 5.0, 04/07/2009, effective 01/04/2010.

  1. Building America Best Practices Series: Volume 5; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Marine Climate

    SciTech Connect

    2006-10-01

    This best practices guide is part of a series produced by Building America. The guide book is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in the Marine climate region.

  2. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.

  3. Building America Best Practices Series Volume 12: Builders Challenge Guide to 40% Whole-House Energy Savings in the Cold and Very Cold Climates

    SciTech Connect

    Baechler, Michael C.; Gilbride, Theresa L.; Hefty, Marye G.; Cole, Pamala C.; Love, Pat M.

    2011-02-01

    This best practices guide is the twelfth in a series of guides for builders produced by PNNL for the U.S. Department of Energy’s Building America program. This guide book is a resource to help builders design and construct homes that are among the most energy-efficient available, while addressing issues such as building durability, indoor air quality, and occupant health, safety, and comfort. With the measures described in this guide, builders in the cold and very cold climates can build homes that have whole-house energy savings of 40% over the Building America benchmark with no added overall costs for consumers. The best practices described in this document are based on the results of research and demonstration projects conducted by Building America’s research teams. Building America brings together the nation’s leading building scientists with over 300 production builders to develop, test, and apply innovative, energy-efficient construction practices. Building America builders have found they can build homes that meet these aggressive energy-efficiency goals at no net increased costs to the homeowners. Currently, Building America homes achieve energy savings of 40% greater than the Building America benchmark home (a home built to mid-1990s building practices roughly equivalent to the 1993 Model Energy Code). The recommendations in this document meet or exceed the requirements of the 2009 IECC and 2009 IRC, and those requirements are highlighted in the text. This document will be distributed via the DOE Building America website: www.buildingamerica.gov.

  4. The Misuse of Anabolic-Androgenic Steroids among Iranian Recreational Male Body-Builders and Their Related Psycho-Socio-Demographic factors

    PubMed Central

    ANGOORANI, Hooman; HALABCHI, Farzin

    2015-01-01

    Background: The high prevalence and potential side effects of anabolic-androgenic steroid (AAS) misuse by athletes has made it a major public health concern. Epidemiological studies on the abuse of such drugs are mandatory for developing effective preventive drug control programs in the sports community. This study aimed to investigate the prevalence of AAS abuse and its association with some psycho-socio-demographic factors in Iranian male recreational body-builders. Methods: Between March and October 2011, 906 recreational male body-builders from 103 randomly selected bodybuilding clubs in Tehran, Iran participated in this study. Some psycho-socio-demographic factors including age, job, average family income, family size, sport experience (months), weekly duration of the sporting activity (h), purpose of participation in sporting activity, mental health as well as body image (via the General Health Questionnaire and Multidimensional Body-Self Relations Questionnaire, respectively), and history of AAS use were obtained by interviews using questionnaires. Results: Participants were all recreational male body-builders [mean age (SD): 25.7 (7.1), ranging 14–56 yr]. Self-report of AAS abuse was registered in 150 body-builders (16.6%). Among the different psycho-socio-demographic factors, only family income and sport experience were inversely associated with AAS abuse. Conclusion: Lifetime prevalence of AAS abuse is relatively high among recreational body-builders based on their self-report. Some psycho-socio-demographic factors including family income and sport experience may influence the prevalence of AAS abuse. PMID:26811817

  5. Charter Schools as Nation Builders: Democracy Prep and Civic Education. Policy Brief 4

    ERIC Educational Resources Information Center

    Lautzenheiser, Daniel; Kelly, Andrew P.

    2013-01-01

    This policy brief is the first in a series of in-depth case studies exploring how top-performing charter schools have incorporated civic learning in their school curriculum and school culture. This paper introduces Democracy Prep, a network of seven public charter schools with a civic mission at its core. Democracy Prep's founder and…

  6. Mind Builders: Multidisciplinary Challenges for Cooperative Team-Building and Competition

    ERIC Educational Resources Information Center

    Fleisher, Paul; Ziegler, Donald

    2006-01-01

    For more than twenty years, the Richmond, Virginia Public Schools' program for gifted students has conducted an interscholastic competition similar to the nationally known competition, Destination Imagination. In the featured contest of this yearly event, teams of five students present solutions to engineering problems that they have worked on for…

  7. Healthcare-associated infections studies project: an American Journal of Infection Control and National Healthcare Safety Network data quality collaboration-LabID Clostridium Difficile event 2013.

    PubMed

    Hebden, Joan N; Anttila, Angela; Allen-Bridson, Kathy; Morrell, Gloria C; Wright, Marc-Oliver; Horan, Teresa

    2013-10-01

    This is the first in a series of case studies that will be published in American Journal of Infection Control following the Centers for Disease Control and Prevention/National Healthcare Safety Network (NHSN) surveillance definition update of 2013. These cases reflect some of the complex patient scenarios infection professionals encounter during daily surveillance of health care-associated infections using NHSN definitions. Answers to the questions posed and immediate feedback in the form of answers and explanations are available at: http://www.surveymonkey.com/s/AJIC-NHSN-LbId2013. All individual participant answers will remain confidential, although it is the authors' hope to share a summary of the findings at a later date. Cases, answers, and explanations have been reviewed and approved by NHSN staff. Active participation is encouraged and recommended. Review/reference Chapter 12, the Multidrug-resistant organism and C. difficile infection module protocol, of the NHSN Patient Safety Component Manual (http://www.cdc.gov/nhsn/PDFs/pscManual/12pscMDRO_CDADcurrent.pdf), for information you may need to answer the case study questions.

  8. [Adverse events prevention ability].

    PubMed

    Aparo, Ugo Luigi; Aparo, Andrea

    2007-03-01

    The issue of how to address medical errors is the key to improve the health care system performances. Operational evidence collected in the last five years shows that the solution is only partially linked to future technological developments. Cultural and organisational changes are mandatory to help to manage and drastically reduce the adverse events in health care organisations. Classical management, merely based on coordination and control, is inadequate. Proactive, self-organising network based structures must be put in place and managed using adaptive, fast evolving management tools. PMID:17484160

  10. Responding to the Event Deluge

    NASA Technical Reports Server (NTRS)

    Williams, Roy D.; Barthelmy, Scott D.; Denny, Robert B.; Graham, Matthew J.; Swinbank, John

    2012-01-01

    We present the VOEventNet infrastructure for large-scale rapid follow-up of astronomical events, including selection, annotation, machine intelligence, and coordination of observations. The VOEvent standard is central to this vision, with distributed and replicated services rather than centralized facilities. We also describe some of the event brokers, services, and software that are connected to the network. These technologies will become more important in the coming years, with new event streams from Gaia, LOFAR, LIGO, LSST, and many others.

  11. Network Views

    ERIC Educational Resources Information Center

    Alexander, Louis

    2010-01-01

    The world changed in 2008. The financial crisis brought with it a deepening sense of insecurity, and the desire to be connected to a network increased. Throughout the summer and fall of 2008, events were unfolding with alarming rapidity. The Massachusetts Institute of Technology (MIT) Alumni Association wanted to respond to this change in the…

  12. Building America Best Practices Series: Volume 2; Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Hot-Dry and Mixed-Dry Climates

    SciTech Connect

    2005-09-01

    This guidebook is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the hot-dry and mixed-dry climates.

  13. Events diary

    NASA Astrophysics Data System (ADS)

    2000-01-01

    as Imperial College, the Royal Albert Hall, the Royal College of Art, the Natural History and Science Museums and the Royal Geographical Society. Under the heading `Shaping the future together' BA2000 will explore science, engineering and technology in their wider cultural context. Further information about this event on 6 - 12 September may be obtained from Sandra Koura, BA2000 Festival Manager, British Association for the Advancement of Science, 23 Savile Row, London W1X 2NB (tel: 0171 973 3075, e-mail: sandra.koura@britassoc.org.uk ). Details of the creating SPARKS events may be obtained from creating.sparks@britassoc.org.uk or from the website www.britassoc.org.uk . Other events 3 - 7 July, Porto Alegre, Brazil VII Interamerican conference on physics education: The preparation of physicists and physics teachers in contemporary society. Info: IACPE7@if.ufrgs.br or cabbat1.cnea.gov.ar/iacpe/iacpei.htm 27 August - 1 September, Barcelona, Spain GIREP conference: Physics teacher education beyond 2000. Info: www.blues.uab.es/phyteb/index.html

  14. Robotic Follow-up of Microlensing Events

    NASA Astrophysics Data System (ADS)

    Street, Rachel; Microlensing Project, RoboNet

    2009-05-01

    Several hundred galactic microlensing events are now routinely discovered every year, of which a few exhibit anomalous behavior due to the presence of an exoplanet orbiting the lensing body. Ground-based follow-up of these events requires a coordinated observing program using a network of telescopes observing around the clock. The RoboNet microlensing project is taking advantage of the robotic scheduling capabilities of LCOGT and the Liverpool Telescope to provide responsive photometric follow-up of carefully selected events. Currently LCOGT has two 2 m telescopes available via our network and is in the process of building and deploying networks of 1 m and 0.4 m telescopes. Once online, these facilities will provide 24 hr coverage of microlensing events. Here we highlight results from the RoboNet Project to date and describe the software we have developed to optimize our response to planetary events.

  15. HBP Builder: A Tool to Generate Hyperbranched Polymers and Hyperbranched Multi-Arm Copolymers for Coarse-grained and Fully Atomistic Molecular Simulations

    PubMed Central

    Yu, Chunyang; Ma, Li; Li, Shanlong; Tan, Haina; Zhou, Yongfeng; Yan, Deyue

    2016-01-01

    Computer simulation has become a versatile tool that can investigate detailed information from the microscopic to the mesoscopic scale. However, the crucial first step of molecular simulation is model building, particularly for hyperbranched polymers (HBPs) and hyperbranched multi-arm copolymers (HBMCs) with complex and varied topological structures. Unlike well-defined polymers, HBPs/HBMCs not only have polydisperse molar weights; HBPs/HBMCs with the same degree of polymerization (DP) and degree of branching (DB) also admit many possible topological structures, which makes model building difficult for users of molecular simulation. To build a bridge between model building and molecular simulation of HBPs and HBMCs, we developed HBP Builder, an open-source HBP/HBMC building toolkit written in C. HBP Builder implements an automated protocol to build various coarse-grained and fully atomistic structures of HBPs/HBMCs according to the user's specific requirements. Coarse-grained and fully atomistic output structures can be directly employed in popular simulation packages, including HOOMD, Tinker, and Gromacs. Moreover, HBP Builder has an easy-to-use graphical user interface and a modular architecture, making it easy to extend and reuse as part of other programs. PMID:27188541
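
    The topology problem this record describes can be made concrete with a toy sketch. This is not HBP Builder's actual algorithm, just an illustrative stand-in: it grows a random AB2 hyperbranched topology of a given DP and computes the Fréchet degree of branching, showing why a single (DP, DB) pair admits many distinct structures.

```python
import random

def grow_hbp(dp, seed=0):
    """Randomly grow an AB2 hyperbranched polymer topology.

    Each monomer carries one A group and two B groups; growth attaches a
    new monomer's A group to a randomly chosen free B site. Returns a list
    of children per monomer (0, 1, or 2 children each).
    """
    rng = random.Random(seed)
    children = [[]]          # monomer 0 is the core
    free_b = [0, 0]          # two free B sites on the core
    for i in range(1, dp):
        site = free_b.pop(rng.randrange(len(free_b)))
        children[site].append(i)
        children.append([])
        free_b.extend([i, i])  # the new monomer exposes two B sites
    return children

def degree_of_branching(children):
    """Frechet DB = (D + T) / N: dendritic (2 children) plus terminal
    (0 children) units over all units."""
    d = sum(1 for c in children if len(c) == 2)
    t = sum(1 for c in children if len(c) == 0)
    return (d + t) / len(children)

topo = grow_hbp(dp=200)
print(round(degree_of_branching(topo), 2))
```

    Running this with different seeds at the same DP yields different topologies with similar DB, which is exactly the ambiguity a model-building toolkit has to resolve.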

  16. HBP Builder: A Tool to Generate Hyperbranched Polymers and Hyperbranched Multi-Arm Copolymers for Coarse-grained and Fully Atomistic Molecular Simulations

    NASA Astrophysics Data System (ADS)

    Yu, Chunyang; Ma, Li; Li, Shanlong; Tan, Haina; Zhou, Yongfeng; Yan, Deyue

    2016-05-01

    Computer simulation has become a versatile tool that can investigate detailed information from the microscopic to the mesoscopic scale. However, the crucial first step of molecular simulation is model building, particularly for hyperbranched polymers (HBPs) and hyperbranched multi-arm copolymers (HBMCs) with complex and varied topological structures. Unlike well-defined polymers, HBPs/HBMCs not only have polydisperse molar weights; HBPs/HBMCs with the same degree of polymerization (DP) and degree of branching (DB) also admit many possible topological structures, which makes model building difficult for users of molecular simulation. To build a bridge between model building and molecular simulation of HBPs and HBMCs, we developed HBP Builder, an open-source HBP/HBMC building toolkit written in C. HBP Builder implements an automated protocol to build various coarse-grained and fully atomistic structures of HBPs/HBMCs according to the user's specific requirements. Coarse-grained and fully atomistic output structures can be directly employed in popular simulation packages, including HOOMD, Tinker, and Gromacs. Moreover, HBP Builder has an easy-to-use graphical user interface and a modular architecture, making it easy to extend and reuse as part of other programs.

  17. AIRID: an application of the KAS/Prospector expert system builder to airplane identification

    SciTech Connect

    Aldridge, J.P.

    1984-01-01

    The Knowledge Acquisition System/Prospector expert system building tool developed by SRI International has been used to construct an expert system to identify aircraft on the basis of observables such as wing shape, engine number/location, fuselage shape, and tail assembly shape. Additional detailed features are allowed to influence the identification as other favorable features. Constraints on the observations imposed by bad weather and distant observations have been included as contexts in the models. Models for Soviet and US fighter aircraft have been included; inclusion of other types of aircraft, such as bombers, transports, and reconnaissance craft, is straightforward. Two models permit exploration of the interaction of semantic and taxonomic networks with the models. A full set of text data for fluid communication with the user has been included. The use of demons as triggered output responses to enhance utility to the user has been explored. This paper discusses the ease of building the expert system using this powerful tool and the problems encountered in the construction process.

  18. Solar project description for Greenmoss Builders Incorporated single-family residence, Waitsfield, Vermont

    NASA Astrophysics Data System (ADS)

    1981-10-01

    A solar system installed in a two-story single-family residence in Vermont is discussed. Passive solar space heating and active solar domestic hot water preheating are the outputs. The space heating system includes a Trombe wall with 278 square feet of window, gas-fired auxiliary units, and air distribution. The heating modes are: solar only; solar plus supplemental heat sources (gas-fired furnace and/or wood stoves) with thermal curtain open; residual (stored) solar plus supplemental sources with thermal curtain closed; and heating by furnace and/or stove only. An array of 88 sq. ft. of flat plate collectors with water as the working fluid preheats the potable hot water supply before it is fully heated by a gas-fired water heater. Freeze protection is by drain down. Modes of operation are collector to preheat, preheat to domestic hot water, and draindown. The instrumentation for the National Solar Data Network is described. Original cost estimates for provisioning and installation of the solar system are given.

  19. Synchronous changes in the seismicity rate and ocean-bottom hydrostatic pressures along the Nankai trough: A possible slow slip event detected by the Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET)

    NASA Astrophysics Data System (ADS)

    Suzuki, Kensuke; Nakano, Masaru; Takahashi, Narumi; Hori, Takane; Kamiya, Shinichiro; Araki, Eiichiro; Nakata, Ryoko; Kaneda, Yoshiyuki

    2016-06-01

    We detected long-term hydrostatic pressure changes at ocean-bottom stations of the Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET) along the Nankai trough, off southwestern Japan. We detected these changes after removing the contributions of ocean mass variations and sensor drift from the records. In addition, we detected a decrease in the background seismicity rate of a nearby earthquake cluster that was synchronous with the hydrostatic pressure changes. We interpreted these observed hydrostatic pressure changes to reflect vertical deformation of the ocean floor of 3-8 cm, and we consider the cause of the seafloor crustal deformation to be a slow slip event (SSE) beneath the stations. Because the pressure changes were observed at stations with distances less than 20 km to each other, we inferred that the SSE occurred in the shallow part of the sedimentary wedge, such as on a splay fault system. The synchronous observation of an SSE and a seismicity rate change suggests that both were triggered by a change in the regional stress that may be associated with stress accumulation and release processes occurring along the Nankai trough. These data show that continuous and careful monitoring of crustal activities by DONET stations provides an effective way to detect seismic and geodetic signals related to the occurrence of megathrust or other types of large earthquakes.
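
    The preprocessing this record describes, removing a common ocean mass signal and sensor drift to expose centimeter-scale seafloor deformation, can be illustrated with synthetic data. All signals, magnitudes, and the reference-record approach below are invented for the sketch, not the authors' actual correction procedure.

```python
import numpy as np

t = np.arange(0.0, 365.0)                        # days
ocean = 5.0 * np.sin(2 * np.pi * t / 14.0)       # shared oceanographic signal
drift = 0.02 * t                                 # slow linear sensor drift
step = np.where(t > 200, 3.0, 0.0)               # hypothetical SSE uplift signal
station = ocean + drift + step
reference = ocean                                # reference record: no drift, no step

residual = station - reference                   # remove common ocean mass variation
# fit the sensor drift on a pre-event window, then subtract it everywhere
coef = np.polyfit(t[t < 150], residual[t < 150], 1)
cleaned = residual - np.polyval(coef, t)
offset = cleaned[t > 250].mean() - cleaned[t < 150].mean()
print(round(offset, 1))                          # recovers the 3.0 step
```

    The recovered step is the analogue of the few-centimeter vertical deformation inferred from the DONET pressure records.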

  20. Functional Connectivity in MRI Is Driven by Spontaneous BOLD Events

    PubMed Central

    Allan, Thomas W.; Francis, Susan T.; Caballero-Gaudes, Cesar; Morris, Peter G.; Liddle, Elizabeth B.; Liddle, Peter F.; Brookes, Matthew J.; Gowland, Penny A.

    2015-01-01

    Functional brain signals are frequently decomposed into a relatively small set of large scale, distributed cortical networks that are associated with different cognitive functions. It is generally assumed that the connectivity of these networks is static in time and constant over the whole network, although there is increasing evidence that this view is too simplistic. This work proposes novel techniques to investigate the contribution of spontaneous BOLD events to the temporal dynamics of functional connectivity as assessed by ultra-high field functional magnetic resonance imaging (fMRI). The results show that: 1) spontaneous events in recognised brain networks contribute significantly to network connectivity estimates; 2) these spontaneous events do not necessarily involve whole networks or nodes, but clusters of voxels which act in concert, forming transiently synchronising sub-networks and 3) a task can significantly alter the number of localised spontaneous events that are detected within a single network. These findings support the notion that spontaneous events are the main driver of the large scale networks that are commonly detected by seed-based correlation and ICA. Furthermore, we found that large scale networks are manifestations of smaller, transiently synchronising sub-networks acting dynamically in concert, corresponding to spontaneous events, and which do not necessarily involve all voxels within the network nodes oscillating in unison. PMID:25922945

  1. Caregiving as a Family Network Event.

    ERIC Educational Resources Information Center

    Mellins, Claude A.; And Others

    Caregiving studies often focus on the impaired elder who is the care receiver and the one family member who is perceived as the primary caregiver. Such studies fail to consider all members of the family, whether or not they are involved in providing care. This study was conducted to explore the effects of an elder's health-related dependency on…

  2. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  3. Tectonic events in Greenland

    NASA Astrophysics Data System (ADS)

    Dahl-Jensen, T.; Voss, P.; Larsen, T.; Pinna, L.

    2012-12-01

    In Greenland, a station separation of around 400 km means that many earthquakes are detected on only one or two stations. Seismic monitoring has developed from only three seismic stations in Greenland up to the late 1990s to 18 permanent stations today. All stations are equipped with broadband sensors, and all of the permanent stations transmit data in real time. The most recent major improvement in seismic monitoring comes from the Greenland ice sheet monitoring network (GLISN, http://glisn.info). The primary goal of GLISN is to provide broadband seismic data for the detection of glacial earthquakes. GLISN is now fully implemented, with Iridium real-time data transfer in operation at five stations. In the Ammassalik region in Southeast Greenland, where small earthquakes are often felt, data from a temporary additional station have been utilized for a study covering 9 months in 2008/9. In this period 62 local earthquakes were analyzed and re-located. Some of the events had formerly been located from distant stations using a universal earth model, which resulted in a scattered distribution of the events in the region. The locations have now been improved by using a local earth model along with phase readings from two local stations not previously included: ANG in Tasiilaq and ISOG in Isortoq. Relocating the events reveals two zones with a higher degree of seismicity than the rest of the region. The first zone is located near felsic intrusions. The second zone is at the boundary between the Archaean craton and the Ammassalik region, where reworked Archaean gneisses dominate the geology. During the analysis it was observed that the additional information from the local stations is of great importance for the result. Active broadband stations in Greenland

  4. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    DOE PAGES

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.

  5. Pickless event detection and location: The waveform correlation event detection system (WCEDS) revisited

    SciTech Connect

    Arrowsmith, Stephen John; Young, Christopher J.; Ballard, Sanford; Slinkard, Megan Elizabeth

    2016-01-01

    The standard paradigm for seismic event monitoring breaks the event detection problem down into a series of processing stages that can be categorized at the highest level into station-level processing and network-level processing algorithms (e.g., Le Bras and Wuster (2002)). At the station-level, waveforms are typically processed to detect signals and identify phases, which may subsequently be updated based on network processing. At the network-level, phase picks are associated to form events, which are subsequently located. Furthermore, waveforms are typically directly exploited only at the station-level, while network-level operations rely on earth models to associate and locate the events that generated the phase picks.
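
    The station-level stage of the standard paradigm described above is commonly an STA/LTA energy-ratio detector. A minimal sketch follows; it is illustrative only, not WCEDS code, and the window lengths, threshold, and synthetic event are assumptions.

```python
import numpy as np

def sta_lta_picks(trace, fs, sta=1.0, lta=10.0, thresh=3.0):
    """Station-level processing: STA/LTA ratio picks on a 1-D trace (seconds)."""
    e = trace ** 2
    smooth = lambda n: np.convolve(e, np.ones(n) / n, mode="same")
    ratio = smooth(int(sta * fs)) / (smooth(int(lta * fs)) + 1e-12)
    above = ratio > thresh
    # pick the first sample of each contiguous above-threshold run
    starts = np.flatnonzero(above & ~np.roll(above, 1))
    return starts / fs

fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(t.size)
# bury a 1-second, 5 Hz "event" at t = 30 s
trace[int(30 * fs):int(31 * fs)] += np.sin(2 * np.pi * 5 * t[:int(fs)])
picks = sta_lta_picks(trace, fs)
print(picks)
```

    In the full pipeline, picks like these would then be passed to network-level association and location; the pickless approach of the paper avoids this stage entirely.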

  6. Initial Results on Soil Moisture in Relation to Timing of Snowpack, Temperature, and Heavy Vs. Moderate Rain Events from a New Soil Monitoring Network in the Southern Rocky Mountains

    NASA Astrophysics Data System (ADS)

    Osenga, E. C.; Schnissel, J.; Katzenberger, J.

    2014-12-01

    The Roaring Fork Valley (RFV) in the Rocky Mountains of Colorado is comprised of a diversity of ecosystems occurring within a single watershed. From one end of the valley to the other, the landscape undergoes an over 1500m gain in elevation, creating a unique opportunity for comparison of conditions in different habitats of close geographic proximity. Interested in comparing the ecological responses of these different habitats in the context of rising global temperatures, the Aspen Global Change Institute (AGCI) partnered with the City of Aspen, Pitkin County Open Space and Trails, and the Aspen Center for Environmental Studies to install soil monitoring stations at multiple elevations within the watershed. Soil moisture was identified as the primary indicator for monitoring because there is a dearth of local soil moisture data, and soil moisture plays a vital role in plant survival and correlates closely with precipitation and air temperature. Additionally, as precipitation regimes shift in the future, there is a need to better understand the interplay between vegetative water availability during the critical early growing season and the timing, areal extent, and depth of snowpack. Two initial soil monitoring stations were installed in undeveloped, montane ecosystems of the Roaring Fork Watershed in 2012. Each station measures air temperature; relative humidity; rainfall; and soil moisture at 5, 20, and 52 cm depths. Two additional soil monitoring stations are being established over the summer of 2014, and additional stations within the Roaring Fork soil moisture network are planned for future years. Early data from the existing sites indicate the importance of timing of snowmelt in maintaining soil moisture through the early dry months of summer, and dissimilarity between the impact of moderate and heavy rain events on soil moisture at different depths. These data have implications for restoration, management, and planning for local ecosystems and have significance for

  7. A framework for incorporating DTI Atlas Builder registration into tract-based spatial statistics and a simulated comparison to standard TBSS

    NASA Astrophysics Data System (ADS)

    Leming, Matthew; Steiner, Rachel; Styner, Martin

    2016-03-01

    Tract-based spatial statistics (TBSS) is a software pipeline widely employed in comparative analysis of white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different DTI measurements (fractional anisotropy, FA; axial diffusivity, AD; radial diffusivity, RD; and mean diffusivity, MD). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) by using the advanced registration offered in DTI Atlas Builder. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences in diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in the data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using a functional analysis of diffusion tensor tract statistics (FADTTS). Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in other DTI measurements. Within the simulated changes investigated here, this study suggests that the use of DTI Atlas Builder's registration enhances TBSS group-based studies.
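
    The eigenvalue manipulations this record describes are easy to sketch: scaling the principal eigenvalue of a diffusion tensor raises AD (and MD) while leaving RD untouched. Below is a minimal illustration with assumed healthy white-matter eigenvalues; it is not the paper's simulation code.

```python
import numpy as np

def dti_scalars(evals):
    """FA, MD, AD, RD from the three tensor eigenvalues (l1 >= l2 >= l3)."""
    l1, l2, l3 = evals
    md = (l1 + l2 + l3) / 3.0
    fa = np.sqrt(1.5 * ((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
                 / (l1 ** 2 + l2 ** 2 + l3 ** 2))
    return fa, md, l1, (l2 + l3) / 2.0

# assumed healthy-looking white-matter eigenvalues (units: 1e-3 mm^2/s)
evals = np.array([1.6, 0.4, 0.3])
fa0, md0, ad0, rd0 = dti_scalars(evals)

# simulate a group effect: inflate the principal eigenvalue by 10% (raises AD)
fa1, md1, ad1, rd1 = dti_scalars(evals * np.array([1.1, 1.0, 1.0]))
print(ad1 > ad0, rd1 == rd0, md1 > md0)
```

    Scaling the second and third eigenvalues instead would raise RD, and scaling all three would raise MD, matching the three simulations in the study.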

  8. Cross comparison of four DPRK events

    NASA Astrophysics Data System (ADS)

    Bobrov, Dmitry; Kitov, Ivan; Rozhkov, Mikhail

    2016-04-01

    Seismic signals were detected by the IMS seismic network from four announced underground tests conducted by the DPRK in 2006, 2009, 2013, and 2016. These data allow a thorough comparison of relative locations, including depth estimates, and magnitudes using several techniques based on waveform cross correlation. Seismic signals from these events also provide waveform templates for the detection of possible aftershocks with magnitudes two to three units lower than the events themselves. We processed one month of continuous data after each of the four events and detected no aftershocks. Independent Component Analysis based Blind Source Separation was conducted for all events at different stations to compare the robustness of the source function recovery.
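
    The template-matching idea behind such aftershock searches can be sketched as follows: slide a master waveform over the continuous record and flag offsets where the normalized cross-correlation is high. The synthetic waveform, noise level, and threshold here are invented for illustration; this is not the authors' pipeline.

```python
import numpy as np

def cc_detect(cont, template, thresh=0.7):
    """Scan continuous data with a waveform template; return sample offsets
    where the normalized cross-correlation exceeds thresh."""
    n = template.size
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(cont.size - n + 1)
    for i in range(cc.size):
        w = cont[i:i + n]
        s = w.std()
        cc[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
    return np.flatnonzero(cc > thresh)

rng = np.random.default_rng(1)
master = np.sin(2 * np.pi * 2 * np.linspace(0, 1, 100)) * np.hanning(100)
cont = 0.2 * rng.standard_normal(5000)
cont[3000:3100] += 0.8 * master        # buried "aftershock" replica
hits = cc_detect(cont, master)
print(hits)
```

    Because the correlation statistic is amplitude-normalized, replicas well below the noise floor of a bandpass detector can still be found, which is what allows detection thresholds two to three magnitude units below the master events.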

  9. A discrete event method for wave simulation

    SciTech Connect

    Nutaro, James J

    2006-01-01

    This article describes a discrete event interpretation of the finite difference time domain (FDTD) and digital wave guide network (DWN) wave simulation schemes. The discrete event method is formalized using the discrete event system specification (DEVS). The scheme is shown to have errors that are proportional to the resolution of the spatial grid. A numerical example demonstrates the relative efficiency of the scheme with respect to FDTD and DWN schemes. The potential for the discrete event scheme to reduce numerical dispersion and attenuation errors is discussed.
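
    For context, the conventional time-stepped FDTD scheme that the discrete event method is benchmarked against can be sketched in a few lines. This is a generic 1-D wave-equation update under an assumed Courant number, not the article's code.

```python
import numpy as np

# Minimal 1-D FDTD update for the scalar wave equation u_tt = c^2 u_xx.
nx, nt = 200, 400
c, dx = 1.0, 1.0
dt = 0.5 * dx / c                      # Courant number 0.5: stable
r2 = (c * dt / dx) ** 2
u = np.zeros(nx)
u[nx // 2] = 1.0                       # initial impulse at the midpoint
u_prev = u.copy()                      # zero initial velocity
for _ in range(nt):
    u_next = np.zeros(nx)              # fixed (u = 0) boundaries
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next
print(float(np.max(np.abs(u))) < 10)   # bounded: no instability at CFL 0.5
```

    A discrete event variant would replace the global fixed-step loop with per-cell updates scheduled only when a cell's state changes by more than a quantum, which is where the efficiency gains discussed in the article come from.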

  10. Assessing Special Events.

    ERIC Educational Resources Information Center

    Neff, Bonita Dostal

    Special events defined as being "newsworthy events" are becoming a way of American life. They are also a means for making a lot of money. Examples of special events that are cited most frequently are often the most minor of events; e.g., the open house, the new business opening day gala, or a celebration of some event in an organization. Little…

  11. Event Segmentation Ability Uniquely Predicts Event Memory

    PubMed Central

    Sargent, Jesse Q.; Zacks, Jeffrey M.; Hambrick, David Z.; Zacks, Rose T.; Kurby, Christopher A.; Bailey, Heather R.; Eisenberg, Michelle L.; Beck, Taylor M.

    2013-01-01

    Memory for everyday events plays a central role in tasks of daily living, autobiographical memory, and planning. Event memory depends in part on segmenting ongoing activity into meaningful units. This study examined the relationship between event segmentation and memory in a lifespan sample to answer the following question: Is the ability to segment activity into meaningful events a unique predictor of subsequent memory, or is the relationship between event perception and memory accounted for by general cognitive abilities? Two hundred and eight adults ranging from 20 to 79 years old segmented movies of everyday events and attempted to remember the events afterwards. They also completed psychometric ability tests and tests measuring script knowledge for everyday events. Event segmentation and script knowledge both explained unique variance in event memory above and beyond the psychometric measures, and did so as strongly in older as in younger adults. These results suggest that event segmentation is a basic cognitive mechanism, important for memory across the lifespan. PMID:23942350

  12. CAISSON: Interconnect Network Simulator

    NASA Technical Reports Server (NTRS)

    Springer, Paul L.

    2006-01-01

    Cray response to HPCS initiative. Model future petaflop computer interconnect. Parallel discrete event simulation techniques for large scale network simulation. Built on WarpIV engine. Run on laptop and Altix 3000. Can be sized up to 1000 simulated nodes per host node. Good parallel scaling characteristics. Flexible: multiple injectors, arbitration strategies, queue iterators, network topologies.

  13. Paul Fritts and company, organ builders: The evolution of the mechanical-action organ in the United States during the 20th century with historical emphasis on the instruments of Paul Fritts

    NASA Astrophysics Data System (ADS)

    Still, Tamara G.

    Paul Fritts is known internationally as a builder of mechanical-action pipe organs based on historical models of the 17th and 18th centuries. He is one of a number of contemporary builders who have researched renaissance and baroque organs as a point of departure for their designs. As builders have returned to antique models to inform their craft, they have debated the importance of certain aspects of classical design. Included here is a discussion of some aspects of design with regard to Fritts and other important mechanical-action organ builders. Topics of discussion include the significance of pipe metals, temperaments and tunings, key action, and case design. Some of the professional debate is scientific; some is subjective and very emotional. This dissertation explores the work of Paul Fritts and chronicles his contribution. Biographical information is included.

  14. Building America Best Practices Series: Volume 2. Builders and Buyers Handbook for Improving New Home Efficiency, Comfort, and Durability in the Hot-Dry and Mixed-Dry Climates

    SciTech Connect

    Baechler, M. C.; Taylor, Z. T.; Bartlett, R.; Gilbride, T.; Hefty, M.; Love, P. M.

    2005-09-01

    This best practices guide is part of a series produced by Building America. The guidebook is a resource to help builders large and small build high-quality, energy-efficient homes that achieve 30% energy savings in space conditioning and water heating in the hot-dry and mixed-dry climates. The savings are in comparison with the 1993 Model Energy Code. The guide contains chapters for every member of the builder's team—from the manager to the site planner to the designers, site supervisors, the trades, and marketers. There is also a chapter for homeowners on how to use the book to provide help in selecting a new home or builder.

  15. Event-Based Science.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    1992-01-01

    Suggests that an event-based science curriculum can provide the framework for deciding what to retain in an overloaded science curriculum. Provides examples of current events and the science concepts explored related to the event. (MDH)

  16. A wireless time synchronized event control system

    NASA Astrophysics Data System (ADS)

    Klug, Robert; Williams, Jonathan; Scheffel, Peter

    2014-05-01

    McQ has developed a wireless, time-synchronized, event control system to control, monitor, and record events with precise timing over large test sites for applications such as high speed rocket sled payload testing. Events of interest may include firing rocket motors and launch sleds, initiating flares, ejecting bombs, ejecting seats, triggering high speed cameras, measuring sled velocity, and triggering events based on a velocity window or other criteria. The system consists of Event Controllers, a Launch Controller, and a wireless network. The Event Controllers can be easily deployed at areas of interest within the test site and maintain sub-microsecond timing accuracy for monitoring sensors, electronically triggering other equipment and events, and providing timing signals to other test equipment. Recorded data and status information is reported over the wireless network to a server and user interface. Over the wireless network, the user interface configures the system based on a user specified mission plan and provides real time command, control, and monitoring of the devices and data. An overview of the system, its features, performance, and potential uses is presented.

  17. Automated beam builder

    NASA Technical Reports Server (NTRS)

    Muench, W. K.

    1980-01-01

    Requirements for the space fabrication of large space structures are considered with emphasis on the design, development, manufacture, and testing of a machine which automatically produces a basic building block aluminum beam. Particular problems discussed include those associated with beam cap forming; brace storage, dispensing, and transporting; beam component fastening; and beam cut-off. Various critical process tests conducted to develop technology for a machine to produce composite beams are also discussed.

  18. Tracking Solar Events through Iterative Refinement

    NASA Astrophysics Data System (ADS)

    Kempton, D. J.; Angryk, R. A.

    2015-11-01

    In this paper, we combine two approaches to multiple-target tracking: the first is a hierarchical approach that iteratively grows track fragments across gaps in detections; the second is a network flow based optimization method for data association. The network flow based optimization method is utilized for data association in an iteratively growing manner. This process is applied to solar data retrieved from the Heliophysics Event Knowledgebase (HEK) and utilizes precomputed image parameter values. These precomputed image parameter values are used to compare the visual similarity of detected events and to determine the best matching track fragment associations, which leads to a globally optimal track fragment association hypothesis.
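
    A greedy, cheapest-first version of the track fragment association step can be sketched as follows. The feature vectors, cost function, and thresholds are invented for illustration, and the paper's method solves the association jointly as a network flow problem rather than greedily.

```python
def associate_fragments(fragments, max_gap=3, max_cost=0.5):
    """Greedily stitch track fragments across detection gaps.

    Each fragment is a dict with start/end frames and a feature vector
    summarizing visual appearance (a stand-in for the HEK image
    parameters). Candidate pairs are linked cheapest-first.
    """
    def cost(a, b):
        gap = b["start"] - a["end"]
        if not (0 < gap <= max_gap):
            return None                  # temporally incompatible
        feat = sum((x - y) ** 2 for x, y in zip(a["feat"], b["feat"])) ** 0.5
        return feat + 0.1 * gap          # appearance distance plus gap penalty

    pairs = []
    for i, a in enumerate(fragments):
        for j, b in enumerate(fragments):
            c = cost(a, b)
            if c is not None and c <= max_cost:
                pairs.append((c, i, j))
    used_src, used_dst, links = set(), set(), {}
    for c, i, j in sorted(pairs):        # cheapest first, one link per end
        if i not in used_src and j not in used_dst:
            links[i] = j
            used_src.add(i); used_dst.add(j)
    return links

frags = [
    {"start": 0, "end": 4, "feat": (0.9, 0.1)},
    {"start": 6, "end": 9, "feat": (0.88, 0.12)},   # same event, small gap
    {"start": 6, "end": 9, "feat": (0.2, 0.7)},     # different appearance
]
print(associate_fragments(frags))
```

    Here the first two fragments are stitched because their appearance is similar and the gap is short, while the third is left unlinked; a min-cost network flow formulation generalizes this by optimizing all links simultaneously.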

  19. LHCb Online event processing and filtering

    NASA Astrophysics Data System (ADS)

    Alessio, F.; Barandela, C.; Brarda, L.; Frank, M.; Franek, B.; Galli, D.; Gaspar, C.; Herwijnen, E. v.; Jacobsson, R.; Jost, B.; Köstner, S.; Moine, G.; Neufeld, N.; Somogyi, P.; Stoica, R.; Suman, S.

    2008-07-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel, files are used by various monitoring and calibration processes running within the LHCb Online system. The entire data-flow is controlled and configured by means of a SCADA system and several databases. After an overview of the LHCb data acquisition and its design principles, this paper will emphasize the LHCb event filter system, which is now implemented using the final hardware and will be ready for data-taking at the LHC startup. Control, configuration and security aspects will also be discussed.

  20. Episodes, events, and models

    PubMed Central

    Khemlani, Sangeet S.; Harrison, Anthony M.; Trafton, J. Gregory

    2015-01-01

    We describe a novel computational theory of how individuals segment perceptual information into representations of events. The theory is inspired by recent findings in the cognitive science and cognitive neuroscience of event segmentation. In line with recent theories, it holds that online event segmentation is automatic, and that event segmentation yields mental simulations of events. But it posits two novel principles as well: first, discrete episodic markers track perceptual and conceptual changes, and can be retrieved to construct event models; second, the process of retrieving and reconstructing those episodic markers is constrained and prioritized. We describe a computational implementation of the theory, as well as a robotic extension of the theory that demonstrates the processes of online event segmentation and event model construction. The theory is the first unified computational account of event segmentation and temporal inference. We conclude by demonstrating how neuroimaging data can constrain and inspire the construction of process-level theories of human reasoning. PMID:26578934

  1. VAFLE: visual analytics of firewall log events

    NASA Astrophysics Data System (ADS)

    Ghoniem, Mohammad; Shurkhovetskyy, Georgiy; Bahey, Ahmed; Otjacques, Benoît.

    2013-12-01

    In this work, we present VAFLE, an interactive network security visualization prototype for the analysis of firewall log events. Keeping it simple yet effective for analysts, we provide multiple coordinated interactive visualizations augmented with clustering capabilities customized to support anomaly detection and cyber situation awareness. We evaluate the usefulness of the prototype in a use case with network traffic datasets from previous VAST Challenges, illustrating its effectiveness at promoting fast and well-informed decisions. We explain how a security analyst may spot suspicious traffic using VAFLE. We further assess its usefulness through a qualitative evaluation involving network security experts, whose feedback is reported and discussed.

  2. Using Bayesian Belief Networks and event trees for volcanic hazard assessment and decision support : reconstruction of past eruptions of La Soufrière volcano, Guadeloupe and retrospective analysis of 1975-77 unrest.

    NASA Astrophysics Data System (ADS)

    Komorowski, Jean-Christophe; Hincks, Thea; Sparks, Steve; Aspinall, Willy; Legendre, Yoann; Boudon, Georges

    2013-04-01

    Since 1992, mild but persistent seismic and fumarolic unrest at La Soufrière de Guadeloupe volcano has prompted renewed concern about hazards, risks, and crisis response planning, and has rejuvenated interest in geological studies. Scientists monitoring active volcanoes frequently have to provide science-based decision support to civil authorities during such periods of unrest. In these circumstances, the Bayesian Belief Network (BBN) offers a formalized evidence analysis tool for making inferences about the state of the volcano from different strands of data, allowing associated uncertainties to be treated in a rational and auditable manner, to the extent warranted by the strength of the evidence. To illustrate the principles of the BBN approach, a retrospective analysis is undertaken of the 1975-77 crisis, providing an inferential assessment of the evolving state of the magmatic system and the probability of subsequent eruption. Conditional dependencies and parameters in the BBN are characterized quantitatively by structured expert elicitation. Revisiting data available in 1976 suggests the probability of magmatic intrusion would have been evaluated as high at the time, in accord with subsequent thinking about the volcanological nature of the episode. The corresponding probability of a magmatic eruption therefore would have been elevated in July and August 1976; however, collective uncertainty about the future course of the crisis was great at the time, even if some individual opinions were certain. From this BBN analysis, while the more likely appraised outcome - based on observational trends at 31 August 1976 - might have been 'no eruption' (mean probability 0.5; 5-95 percentile range 0.8), an imminent magmatic eruption (or blast) could have had a probability of about 0.4, almost as substantial. Thus, there was no real scientific basis to assert one scenario was more likely than the other. This retrospective evaluation adds objective probabilistic expression to
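
    The BBN reasoning described above can be illustrated with a miniature two-node example: an unobserved state (magmatic intrusion) with observable evidence (seismic unrest) and an outcome of interest (eruption). All the probabilities below are invented placeholders, not the elicited values from the study; the point is only the mechanics of updating on evidence and marginalizing over the hidden state.

```python
# Invented placeholder probabilities (NOT the study's elicited values).
p_intrusion = 0.3                                  # prior P(intrusion)
p_unrest = {True: 0.9, False: 0.2}                 # P(unrest | intrusion)
p_eruption = {True: 0.4, False: 0.01}              # P(eruption | intrusion)

def p_eruption_given_unrest():
    """P(eruption | unrest observed), by enumeration over the hidden node."""
    # Bayes update: posterior weight on each intrusion state.
    joint = {s: (p_intrusion if s else 1 - p_intrusion) * p_unrest[s]
             for s in (True, False)}
    z = sum(joint.values())
    # Marginalize the eruption probability over the posterior.
    return sum(joint[s] / z * p_eruption[s] for s in (True, False))

print(round(p_eruption_given_unrest(), 3))
```

    Observing unrest raises the eruption probability above its prior value of 0.3 * 0.4 + 0.7 * 0.01 = 0.127, which is the qualitative behaviour the retrospective analysis formalizes.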

  3. The global event system

    SciTech Connect

    Winans, J.

    1994-03-02

    The support for the global event system has been designed to allow an application developer to control the APS event generator and receiver boards. This is done by the use of four new record types. These records are customized and are only supported by the device support modules for the APS event generator and receiver boards. The use of the global event system and its associated records should not be confused with the vanilla EPICS events and the associated event records. They are very different.

  4. Seismicity in Pennsylvania: Evidence for Anthropogenic Events?

    NASA Astrophysics Data System (ADS)

    Homman, K.; Nyblade, A.

    2015-12-01

    The deployment and operation of the USArray Transportable Array (TA) and the PASEIS (XY) seismic networks in Pennsylvania during 2013 and 2014 provide a unique opportunity for investigating the seismicity of Pennsylvania. These networks, along with several permanent stations in Pennsylvania, resulted in a total of 104 seismometers in and around Pennsylvania that have been used in this study. Event locations were first obtained with Antelope Environmental Monitoring Software using P-wave arrival times. Arrival times were hand picked using a 1-5 Hz bandpass filter to within 0.1 seconds. Events were then relocated using a velocity model developed for Pennsylvania and the HYPOELLIPSE location code. In this study, 1593 seismic events occurred between February 2013 and December 2014 in Pennsylvania. These events ranged between magnitude (ML) 1.04 and 2.89 with an average ML of 1.90. Locations of the events occur across the state in many areas where no seismicity has been previously reported. Preliminary results indicate that most of these events are related to mining activity. Additional work using cross-correlation techniques is underway to examine a number of event clusters for evidence of hydraulic fracturing or wastewater injection sources.

  5. FTA Basic Event & Cut Set Ranking.

    1999-05-04

    Version 00 IMPORTANCE computes various measures of probabilistic importance of basic events and minimal cut sets to a fault tree or reliability network diagram. The minimal cut sets, the failure rates and the fault duration times (i.e., the repair times) of all basic events contained in the minimal cut sets are supplied as input data. The failure and repair distributions are assumed to be exponential. IMPORTANCE, a quantitative evaluation code, then determines the probability of the top event and computes the importance of minimal cut sets and basic events by a numerical ranking. Two measures are computed. The first describes system behavior at one point in time; the second describes sequences of failures that cause the system to fail in time. All measures are computed assuming statistical independence of basic events. In addition, system unavailability and expected number of system failures are computed by the code.
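
    The quantitative evaluation performed by a code like IMPORTANCE can be sketched in miniature: given minimal cut sets and per-event unavailabilities, the rare-event approximation sums cut-set probabilities to estimate the top-event probability, and a Fussell-Vesely-style ratio ranks basic events. The cut sets and numbers below are invented, and the rare-event sum is only one of several measures such a code computes.

```python
from math import prod

def top_event_prob(cut_sets, q):
    """Rare-event approximation: sum of minimal-cut-set probabilities
    (each cut set fails when all of its basic events fail)."""
    return sum(prod(q[e] for e in cs) for cs in cut_sets)

def fussell_vesely(event, cut_sets, q):
    """Fussell-Vesely importance: share of the top-event probability
    carried by cut sets containing the given basic event."""
    contrib = sum(prod(q[e] for e in cs) for cs in cut_sets if event in cs)
    return contrib / top_event_prob(cut_sets, q)

# Invented example: the top event occurs if A fails, or if B and C both fail.
cut_sets = [{"A"}, {"B", "C"}]
q = {"A": 0.01, "B": 0.1, "C": 0.05}   # made-up basic-event unavailabilities
```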

  6. News Education: Physics Education Networks meeting has global scale Competition: Competition seeks the next Brian Cox Experiment: New measurement of neutrino time-of-flight consistent with the speed of light Event: A day for all those who teach physics Conference: Students attend first Anglo-Japanese international science conference Celebration: Will 2015 be the 'Year of Light'? Teachers: Challenging our intuition in spectacular fashion: the fascinating world of quantum physics awaits Research: Science sharpens up sport Learning: Kittinger and Baumgartner: on a mission to the edge of space International: London International Youth Science Forum calls for leading young scientists Competition: Physics paralympian challenge needs inquisitive, analytical, artistic and eloquent pupils Forthcoming events

    NASA Astrophysics Data System (ADS)

    2012-05-01

  7. Vaccine Adverse Events

    MedlinePlus

  8. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and complex robotic systems are modelled and controlled using extended Petri nets. Such a system is structured, controlled, and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use and produces representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementing the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple to apply on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved with Petri nets. Discrete-event systems are a pragmatic tool for modelling industrial systems, and Petri nets suit systems driven by discrete events. To capture auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulating the proposed robotic system with timed Petri nets makes it possible to view the robot's timing; by measuring transport and transmission times on the spot, graphics are obtained that show the average transport-activity time for individual sets of finished products.
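
    The token-game semantics underlying such Petri-net models can be sketched directly: a marking assigns tokens to places, and a transition fires by consuming its pre-set tokens and producing its post-set tokens. The pick-and-place net below is invented for illustration; real models of this kind would be timed and hierarchical.

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume pre-set tokens, produce post-set tokens."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Invented pick-and-place net: a robot picks a part, then places it.
m0 = {"robot_idle": 1, "part_waiting": 1, "part_done": 0}
pick = ({"robot_idle": 1, "part_waiting": 1}, {"robot_busy": 1})
place = ({"robot_busy": 1}, {"robot_idle": 1, "part_done": 1})

m1 = fire(m0, *pick)
m2 = fire(m1, *place)
```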

  9. Pleasant events, unpleasant events, and depression.

    PubMed

    Sweeney, P D; Shaeffer, D E; Golin, S

    1982-07-01

    A review of previous research on Lewinsohn's model of depression shows that the causal link between a lack of response-contingent positive reinforcement and subsequent depression remains unsubstantiated. The present study was designed to explicitly test this causal relationship through the use of cross-lagged panel correlation. Measures of depression and pleasant events were taken at two different points in time separated by 1 month. The results revealed that the null hypothesis of spuriousness could not be rejected, indicating the relation often found between a lack of pleasant events and depression is probably due to some unmeasured third variable. The results also indicated that there is no causal relation between unpleasant events and depression. In summary, the causal assumptions in Lewinsohn's theory of depression were not supported by the data. Possible third-variable explanations of the data and their implications are discussed.
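
    Cross-lagged panel correlation of the kind used in this study compares the correlation of variable A at time 1 with variable B at time 2 against the reverse lag, looking for an asymmetry that a causal story would predict. A minimal sketch with invented panel data (the study's actual measures and values are not reproduced here):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Invented scores: pleasant-event frequency and depression at two waves.
pleasant_t1 = [12, 30, 25, 8, 20, 15]
depress_t1  = [18, 5, 9, 22, 11, 14]
pleasant_t2 = [14, 28, 24, 10, 19, 16]
depress_t2  = [17, 6, 8, 21, 12, 13]

# The two cross-lags; a causal interpretation would predict an asymmetry.
lag_pe_to_dep = pearson(pleasant_t1, depress_t2)
lag_dep_to_pe = pearson(depress_t1, pleasant_t2)
```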

  10. NASA Integrated Network COOP

    NASA Technical Reports Server (NTRS)

    Anderson, Michael L.; Wright, Nathaniel; Tai, Wallace

    2012-01-01

    Natural disasters, terrorist attacks, civil unrest, and other events have the potential of disrupting mission-essential operations in any space communications network. NASA's Space Communications and Navigation office (SCaN) is in the process of studying options for integrating the three existing NASA network elements, the Deep Space Network, the Near Earth Network, and the Space Network, into a single integrated network with common services and interfaces. The need to maintain Continuity of Operations (COOP) after a disastrous event has a direct impact on the future network design and operations concepts. The SCaN Integrated Network will provide support to a variety of user missions. The missions have diverse requirements and include anything from Earth-based platforms to planetary missions and rovers. It is presumed that an integrated network, with common interfaces and processes, provides an inherent advantage to COOP in that multiple elements and networks can provide cross-support in a seamless manner. The results of trade studies support this assumption but also show that centralization as a means of achieving integration can result in single points of failure that must be mitigated. The cost to provide this mitigation can be substantial. In support of this effort, the team evaluated the current approaches to COOP, developed multiple potential approaches to COOP in a future integrated network, evaluated the interdependencies of the various approaches to the various network control and operations options, and performed a best-value assessment of the options. The paper will describe the trade space, the study methods, and results of the study.

  11. VOEventNet: Event Messaging for Astronomy

    NASA Astrophysics Data System (ADS)

    Drake, Andrew J.; Djorgovski, G.; Graham, M.; Williams, R.; Mahabal, A.; Donalek, C.; Glikman, E.; Bloom, J.; Vastrand, T.; White, R.; Rabinowitz, D.; Baltay, C.

    2006-12-01

    The time domain remains one of the least explored areas in modern astronomy. In the near future the next generation of large synoptic sky surveys (Pan-STARRS, Skymapper, LSST) will probe the time dependent nature of the sky by detecting hundreds of thousands of astronomical transients (variable stars, asteroids, GRBs, lensing events). A global event distribution and follow-up network is required to characterize the nature of these transients. For over a year the VOEventNet project has been in the process of implementing a transient event follow-up network which distributes crafted structured data packets called VOEvents. These packets have been designed to be general enough to contain metadata for transients seen at all wavelengths, yet interpretable by robotic telescope systems (which are already automatically responding with follow-up observations). The VOEventNet project currently has transient event follow-up with the Palomar 60 and 200in (Caltech), RAPTOR (LANL), PAIRITEL and KAIT (UCB) as well as UK telescopes. VOEventNet transient event streams are publicly available. The subscription, publication and reception of VOEvents is implemented with a number of open source software clients. The software and details of how to receive streams of events are available from http://www.voeventnet.org. Current event streams include OGLE microlensing events, SDSS Supernovae, GCN GRBs, Raptor and Palomar-Quest optical transients. In the near future, many additional streams of VOEvents will be available, including optical transients from the ESSENCE, Planet and MOA projects, as well as those from UKIRT and JCMT telescopes. We also expect that transient event alerts will be available from Solar, X-ray and Radio telescopes.

  12. Quartets and unrooted phylogenetic networks.

    PubMed

    Gambette, Philippe; Berry, Vincent; Paul, Christophe

    2012-08-01

    Phylogenetic networks were introduced to describe evolution in the presence of exchanges of genetic material between coexisting species or individuals. Split networks in particular were introduced as a special kind of abstract network to visualize conflicts between phylogenetic trees which may correspond to such exchanges. More recently, methods were designed to reconstruct explicit phylogenetic networks (whose vertices can be interpreted as biological events) from triplet data. In this article, we link abstract and explicit networks through their combinatorial properties, by introducing the unrooted analog of level-k networks. In particular, we give an equivalence theorem between circular split systems and unrooted level-1 networks. We also show how to adapt to quartets some existing results on triplets, in order to reconstruct unrooted level-k phylogenetic networks. These results give an interesting perspective on the combinatorics of phylogenetic networks and also raise algorithmic and combinatorial questions.

  14. Dialogue on private events

    PubMed Central

    Palmer, David C.; Eshleman, John; Brandon, Paul; Layng, T. V. Joe; McDonough, Christopher; Michael, Jack; Schoneberger, Ted; Stemmer, Nathan; Weitzman, Ray; Normand, Matthew

    2004-01-01

    In the fall of 2003, the authors corresponded on the topic of private events on the listserv of the Verbal Behavior Special Interest Group. Extracts from that correspondence raised questions about the role of response amplitude in determining units of analysis, whether private events can be investigated directly, and whether covert behavior differs from other behavior except in amplitude. Most participants took a cautious stance, noting not only conceptual pitfalls and empirical difficulties in the study of private events, but doubting the value of interpretive exercises about them. Others argued that despite such obstacles, in domains where experimental analyses cannot be done, interpretation of private events in the light of laboratory principles is the best that science can offer. One participant suggested that the notion that private events can be behavioral in nature be abandoned entirely; as an alternative, the phenomena should be reinterpreted only as physiological events. PMID:22477293

  15. Networks model of the East Turkistan terrorism

    NASA Astrophysics Data System (ADS)

    Li, Ben-xian; Zhu, Jun-fang; Wang, Shun-guo

    2015-02-01

    The presence of the East Turkistan terrorist network in China can be traced back to the rebellions in the BAREN region in Xinjiang in April 1990. This article examines the East Turkistan networks in China and offers a panoramic view. The events, terrorists and their relationships are described using matrices. Social network analysis is then adopted to reveal the network type and its structural characteristics, and to identify the crucial terrorist leader. The results show that the East Turkistan network has strong hub nodes and short path lengths, and that it follows a small-world pattern with hierarchical structure.
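
    The matrix description above (events, people, and their relationships) can be sketched as a two-mode incidence matrix projected onto a person-to-person co-participation network, whose highest-degree node is the hub. The incidence matrix below is invented for illustration and has nothing to do with the study's data.

```python
def co_participation(incidence):
    """Project a person-by-event incidence matrix (1 = attended) onto a
    person-to-person network weighted by the number of co-attended events."""
    n = len(incidence)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                adj[i][j] = sum(a * b for a, b in zip(incidence[i], incidence[j]))
    return adj

def hub(adj):
    """Index of the node with the most distinct neighbours (highest degree)."""
    degrees = [sum(1 for w in row if w) for row in adj]
    return degrees.index(max(degrees))

# Invented 4-person x 3-event incidence matrix.
incidence = [
    [1, 1, 1],   # person 0 attends every event: the expected hub
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
]
adj = co_participation(incidence)
```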

  16. Global Seismic Event Detection Using Surface Waves: 15 Possible Antarctic Glacial Sliding Events

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.; Walker, K. T.; Fricker, H. A.

    2008-12-01

    To identify overlooked or anomalous seismic events not listed in standard catalogs, we have developed an algorithm to detect and locate global seismic events using intermediate-period (35-70s) surface waves. We apply our method to continuous vertical-component seismograms from the global seismic networks as archived in the IRIS UV FARM database from 1997 to 2007. We first bandpass filter the seismograms, apply automatic gain control, and compute envelope functions. We then examine 1654 target event locations defined at 5 degree intervals and stack the seismogram envelopes along the predicted Rayleigh-wave travel times. The resulting function has spatial and temporal peaks that indicate possible seismic events. We visually check these peaks using a graphical user interface to eliminate artifacts and assign an overall reliability grade (A, B or C) to the new events. We detect 78% of events in the Global Centroid Moment Tensor (CMT) catalog. However, we also find 840 new events not listed in the PDE, ISC and REB catalogs. Many of these new events were previously identified by Ekstrom (2006) using a different Rayleigh-wave detection scheme. Most of these new events are located along oceanic ridges and transform faults. Some new events can be associated with volcanic eruptions such as the 2000 Miyakejima sequence near Japan and others with apparent glacial sliding events in Greenland (Ekstrom et al., 2003). We focus our attention on 15 events detected from near the Antarctic coastline and relocate them using a cross-correlation approach. The events occur in 3 groups which are well-separated from areas of cataloged earthquake activity. We speculate that these are iceberg calving and/or glacial sliding events, and hope to test this by inverting for their source mechanisms and examining remote sensing data from their source regions.
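
    The core of the detection scheme above is shift-and-stack: station envelopes are delayed by the predicted Rayleigh-wave travel times to a candidate source and summed, so a real event produces a coherent peak in the stack. The sketch below skips the bandpass filtering and envelope computation and uses synthetic spike "envelopes" with made-up delays.

```python
def shift_and_stack(traces, delays):
    """Align each trace on its predicted travel time (in samples) and sum.
    A coherent arrival across stations stacks into a single large peak."""
    n = len(traces[0])
    stack = [0.0] * n
    for tr, d in zip(traces, delays):
        for t in range(n):
            if 0 <= t + d < n:
                stack[t] += tr[t + d]
    return stack

# Synthetic envelopes: each station sees a unit spike 'delay' samples
# after a common origin-time peak at sample 5.
delays = [0, 2, 4]
traces = []
for d in delays:
    tr = [0.0] * 16
    tr[5 + d] = 1.0
    traces.append(tr)

stack = shift_and_stack(traces, delays)
```

    In the full method this stack is evaluated over a grid of 1654 candidate locations, and spatial-temporal peaks of the stacked envelope flag possible events.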

  17. Features, Events, and Processes: Disruptive Events

    SciTech Connect

    J. King

    2004-03-31

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  18. Features, Events, and Processes: Disruptive Events

    SciTech Connect

    P. Sanchez

    2004-11-08

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either "Included" or "Excluded," is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA).

  19. Committed Sport Event Volunteers

    ERIC Educational Resources Information Center

    Han, Keunsu; Quarterman, Jerome; Strigas, Ethan; Ha, Jaehyun; Lee, Seungbum

    2013-01-01

    The purpose of this study was to investigate the relationships among selected demographic characteristics (income, education and age), motivation and commitment of volunteers at a sporting event. Three-hundred and five questionnaires were collected from volunteers in a marathon event and analyzed using structural equation modeling (SEM). Based on…

  20. Activating Event Knowledge

    ERIC Educational Resources Information Center

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or…

  1. Traumatic events and children

    MedlinePlus

    ... a one-time traumatic event or a repeated trauma that happens over and over again. Examples of one-time traumatic events are: Natural disasters, such as a tornado, hurricane, fire, or flood Rape Witness shooting or stabbing of a person Sudden ...

  2. Contrasting Large Solar Events

    NASA Astrophysics Data System (ADS)

    Lanzerotti, Louis J.

    2010-10-01

    After an unusually long solar minimum, solar cycle 24 is slowly beginning. A large coronal mass ejection (CME) from sunspot 1092 occurred on 1 August 2010, with effects reaching Earth on 3 August and 4 August, nearly 38 years to the day after the huge solar event of 4 August 1972. The prior event, which those of us engaged in space research at the time remember well, recorded some of the highest intensities of solar particles and rapid changes of the geomagnetic field measured to date. What can we learn from the comparisons of these two events, other than their essentially coincident dates? One lesson I took away from reading press coverage and Web reports of the August 2010 event is that the scientific community and the press are much more aware than they were nearly 4 decades ago that solar events can wreak havoc on space-based technologies.

  3. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  4. The ATLAS Event Service: A new approach to event processing

    NASA Astrophysics Data System (ADS)

    Calafiura, P.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.

    2015-12-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabilities, its architecture and the highly scalable tools and technologies employed in its implementation, and its applications in ATLAS processing on HPCs, commercial cloud resources, volunteer computing, and grid resources. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.

  5. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early with a quantifiable risk associated with decision making when responding to suspicious activity.
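
    The paper's approach fits an LDA topic model to network-log "documents" and flags events the model finds improbable. The sketch below swaps LDA for a much simpler smoothed unigram likelihood, which is not the paper's method but shows the same train-score-rank workflow on invented log lines: train on normal traffic, then rank new events by how surprising the model finds them.

```python
from collections import Counter
from math import log

def train_unigram(baseline_events):
    """Fit a Laplace-smoothed unigram model over tokens of normal events.
    (A deliberately simple stand-in for the paper's LDA topic mixtures.)"""
    counts = Counter(tok for ev in baseline_events for tok in ev.split())
    total = sum(counts.values())
    vocab = len(counts)
    def prob(tok):
        # Smoothing keeps unseen tokens at a small nonzero probability.
        return (counts[tok] + 1) / (total + vocab + 1)
    return prob

def surprise(model, event):
    """Average negative log-likelihood per token: higher = more anomalous."""
    toks = event.split()
    return -sum(log(model(t)) for t in toks) / len(toks)

# Invented baseline of benign-looking log events.
baseline = ["GET 80 ok", "GET 443 ok", "POST 443 ok", "GET 80 ok"]
model = train_unigram(baseline)
```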

  6. Event shape sorting

    NASA Astrophysics Data System (ADS)

    Kopečná, Renata; Tomášik, Boris

    2016-04-01

    We propose a novel method for sorting events of multiparticle production according to the azimuthal anisotropy of their momentum distribution. Although the method is quite general, we advocate its use in the analysis of ultra-relativistic heavy-ion collisions, where a large number of hadrons is produced. The advantage of our method is that it can automatically sort out samples of events with histograms that indicate similar distributions of hadrons. It takes into account the whole measured histograms with all orders of anisotropy instead of a specific observable (e.g., v_2, v_3, q_2). It can be used for more exclusive experimental studies of flow anisotropies, which are then more easily compared to theoretical calculations. It may also be useful in the construction of mixed-event background for correlation studies as it allows one to select events with similar momentum distribution.
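
    The sorting idea can be illustrated with a crude stand-in: bin each event's particle azimuths into a histogram and greedily chain events so that neighbours have similar histograms. The real method is an iterative Bayesian sorting over the full histograms; the greedy nearest-neighbour chain and the toy events below are invented simplifications.

```python
from math import pi

def azimuthal_hist(phis, nbins=8):
    """Normalized histogram of particle azimuthal angles (radians)."""
    h = [0] * nbins
    for phi in phis:
        h[int((phi % (2 * pi)) / (2 * pi) * nbins)] += 1
    total = sum(h)
    return [c / total for c in h]

def distance(h1, h2):
    """Squared Euclidean distance between two histograms."""
    return sum((a - b) ** 2 for a, b in zip(h1, h2))

def sort_events(events):
    """Greedy chain: repeatedly append the remaining event whose
    azimuthal histogram is closest to the previous one's."""
    hs = [azimuthal_hist(e) for e in events]
    order = [0]
    remaining = set(range(1, len(events)))
    while remaining:
        nxt = min(remaining, key=lambda i: distance(hs[i], hs[order[-1]]))
        order.append(nxt)
        remaining.remove(nxt)
    return order

# Toy events: 0 and 2 are azimuthally clustered, 1 is roughly uniform,
# so the chain should place 0 and 2 next to each other.
events = [
    [0.1, 0.2, 6.2, 0.05],
    [0.4, 2.0, 3.5, 5.0],
    [0.15, 0.12, 6.1, 0.3],
]
```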

  7. Special Event Production.

    ERIC Educational Resources Information Center

    Currents, 2002

    2002-01-01

    Offers a descriptive table of software that helps higher education institutions orchestrate events. Information includes vendor, contact, software, price, database engine/server platform, specific features, and client type. (EV)

  8. CCG - News & Events

    Cancer.gov

    NCI's Center for Cancer Genomics (CCG) has been widely recognized for its research efforts to facilitate advances in cancer genomic research and improve patient outcomes. Find the latest news about and events featuring CCG.

  9. Holter and Event Monitors

    MedlinePlus

    ... Holter and event monitors are similar to an EKG (electrocardiogram). An EKG is a simple test that detects and records ... for diagnosing heart rhythm problems. However, a standard EKG only records the heartbeat for a few seconds. ...

  10. RAS Initiative - Events

    Cancer.gov

    The NCI RAS Initiative has organized multiple events with outside experts to discuss how the latest scientific and technological breakthroughs can be applied to discover vulnerabilities in RAS-driven cancers.

  11. "Universe" event at AIMS

    NASA Astrophysics Data System (ADS)

    2008-06-01

    Report of event of 11 May 2008 held at the African Institute of Mathematical Sciences (Muizenberg, Cape), with speakers Michael Griffin (Administrator of NASA), Stephen Hawking (Cambridge), David Gross (Kavli Institute, Santa Barbara) and George Smoot (Berkeley).

  12. QCD (&) event generators

    SciTech Connect

    Skands, Peter Z.; /Fermilab

    2005-07-01

    Recent developments in QCD phenomenology have spurred on several improved approaches to Monte Carlo event generation, relative to the post-LEP state of the art. In this brief review, the emphasis is placed on approaches for (1) consistently merging fixed-order matrix element calculations with parton shower descriptions of QCD radiation, (2) improving the parton shower algorithms themselves, and (3) improving the description of the underlying event in hadron collisions.

  13. Infrasound Event Analysis into the IDC Operations

    NASA Astrophysics Data System (ADS)

    Mialle, Pierrick; Bittner, Paulina; Brachet, Nicolas; Brown, David; Given, Jeffrey; Le Bras, Ronan; Coyne, John

    2010-05-01

    The first atmospheric event built only from infrasound arrivals was reported in the Reviewed Event Bulletin (REB) of the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO) in 2003. In the last decade, 42 infrasound stations from the International Monitoring System (IMS) have been installed and are transmitting data to the IDC. The growing amount of infrasound data and detections produced by the automatic system challenged the station and network processing at the IDC, which required the Organization to redesign the way infrasound data are processed. Each infrasound array is processed separately for signal detection using a progressive multi-channel correlation method (DFX-PMCC). For each detection, signal features - onset time, amplitude, frequency, duration, azimuth, phase velocity, F-statistics - are measured and used to identify a detection as infrasonic, seismic, or noise (including clutter). Infrasonic signals along with seismic and hydroacoustic signals are subsequently associated with Global Association software (GA) between stations to locate events. During detection and association phases, criteria are applied to eliminate clutter, identify signals of interest, and keep the number of automatic events containing infrasound detections to a manageable level for analyst review. The IDC has developed analysis and visualization tools specifically for infrasound review (e.g. Geotool-PMCC). The IDC has continued to build the Infrasound Reference Event Database (IRED) from observations on the IMS network. This database assists both the routine IDC infrasound analysis and analyst training as it reflects the global detection capability of the network, illustrates the spatial and temporal variability of the observed phenomena, and demonstrates the various origins of infragenic sources. Since 2007, the IDC has introduced new analyst procedures to review and add selected infrasound events to the REB. 
In early 2010, the IDC

  14. Collaboration in the School Social Network

    ERIC Educational Resources Information Center

    Schultz-Jones, Barbara

    2009-01-01

    Social networks are fundamental to all people. Their social network describes how they are connected to others: close relationships, peripheral relationships, and those relationships that help connect them to other people, events, or things. As information specialists, school librarians develop a multidimensional social network that enables them…

  15. Activating Event Knowledge

    PubMed Central

    Hare, Mary; Jones, Michael; Thomson, Caroline; Kelly, Sarah; McRae, Ken

    2009-01-01

    An increasing number of results in sentence and discourse processing demonstrate that comprehension relies on rich pragmatic knowledge about real-world events, and that incoming words incrementally activate such knowledge. If so, then even outside of any larger context, nouns should activate knowledge of the generalized events that they denote or typically play a role in. We used short stimulus onset asynchrony priming to demonstrate that (1) event nouns prime people (sale-shopper) and objects (trip-luggage) commonly found at those events; (2) location nouns prime people/animals (hospital-doctor) and objects (barn-hay) commonly found at those locations; and (3) instrument nouns prime things on which those instruments are commonly used (key-door), but not the types of people who tend to use them (hose-gardener). The priming effects are not due to normative word association. On our account, facilitation results from event knowledge relating primes and targets. This has much in common with computational models like LSA or BEAGLE in which one word primes another if they frequently occur in similar contexts. LSA predicts priming for all six experiments, whereas BEAGLE correctly predicted that priming should not occur for the instrument-people relation but should occur for the other five. We conclude that event-based relations are encoded in semantic memory and computed as part of word meaning, and have a strong influence on language comprehension. PMID:19298961

  16. Complex Event Recognition Architecture

    NASA Technical Reports Server (NTRS)

    Fitzgerald, William A.; Firby, R. James

    2009-01-01

    Complex Event Recognition Architecture (CERA) is the name of a computational architecture, and software that implements the architecture, for recognizing complex event patterns that may be spread across multiple streams of input data. One of the main components of CERA is an intuitive event pattern language that simplifies what would otherwise be the complex, difficult tasks of creating logical descriptions of combinations of temporal events and defining rules for combining information from different sources over time. In this language, recognition patterns are defined in simple, declarative statements that combine point events from given input streams with those from other streams, using conjunction, disjunction, and negation. Patterns can be built on one another recursively to describe very rich, temporally extended combinations of events. Thereafter, a run-time matching algorithm in CERA efficiently matches these patterns against input data and signals when patterns are recognized. CERA can be used to monitor complex systems and to signal operators or initiate corrective actions when anomalous conditions are recognized. CERA can be run as a stand-alone monitoring system, or it can be integrated into a larger system to automatically trigger responses to changing environments or problematic situations.
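    The flavor of such a declarative pattern language can be conveyed with a toy matcher. This is a sketch only: CERA's actual language also handles temporal ordering, recursion, and incremental stream matching, and the event fields and pattern names below are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Event:
    stream: str   # input stream the point event arrived on
    name: str     # event label
    t: float      # timestamp

Pattern = Callable[[List[Event]], bool]

def point(stream: str, name: str) -> Pattern:
    """Matches if the named point event appears on the given stream."""
    return lambda evs: any(e.stream == stream and e.name == name for e in evs)

def conj(*ps: Pattern) -> Pattern:   # all sub-patterns must match
    return lambda evs: all(p(evs) for p in ps)

def disj(*ps: Pattern) -> Pattern:   # any sub-pattern may match
    return lambda evs: any(p(evs) for p in ps)

def neg(p: Pattern) -> Pattern:      # sub-pattern must not match
    return lambda evs: not p(evs)

# "Overheat while the fan has failed, and no shutdown has been seen."
anomaly = conj(point("temp", "overheat"),
               point("fan", "fail"),
               neg(point("ctrl", "shutdown")))

window = [Event("temp", "overheat", 1.0), Event("fan", "fail", 1.2)]
print(anomaly(window))  # True: both events present, no shutdown
```

    Because patterns are ordinary composable values, richer patterns can be built recursively from simpler ones, mirroring the recursive pattern construction described above.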

  17. Search for the Higgs boson using neural networks in events with missing energy and b-quark jets in pp collisions at square root(s) = 1.96 TeV.

    PubMed

    Aaltonen, T; Adelman, J; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Asaadi, J; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Barria, P; Bartos, P; Bauer, G; Beauchemin, P-H; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Camarda, S; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Chung, K; Chung, W H; Chung, Y S; Chwalek, T; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Corbo, M; Cordelli, M; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'Orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; d'Errico, M; Di Canto, A; di Giovanni, G P; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Dorigo, T; Dube, S; Ebina, K; Elagin, A; Erbacher, R; Errede, D; Errede, S; Ershaidat, N; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; Garfinkel, A F; Garosi, P; Gerberich, H; Gerdes, D; Gessler, A; 
Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Group, R C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harr, R F; Hartz, M; Hatakeyama, K; Hays, C; Heck, M; Heinrich, J; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Hughes, R E; Hurwitz, M; Husemann, U; Hussein, M; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jang, D; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Ketchum, W; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kuhr, T; Kulkarni, N P; Kurata, M; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; LeCompte, T; Lee, E; Lee, H S; Lee, J S; Lee, S W; Leone, S; Lewis, J D; Lin, C-J; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Lovas, L; Lucchesi, D; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lys, J; Lysak, R; MacQueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, F; Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Mastrandrea, P; Mathis, M; Mattson, M E; Mazzanti, P; McFarland, K S; 
McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Mesropian, C; Miao, T; Mietlicki, D; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moed, S; Moggi, N; Mondragon, M N; Moon, C S; Moore, R; Morello, M J; Morlock, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Nett, J; Neu, C; Neubauer, M S; Neubauer, S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Osterberg, K; Pagan Griso, S; Pagliarone, C; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramanov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Peiffer, T; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Potamianos, K; Poukhov, O; Prokoshin, F; Pronko, A; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Renton, P; Renz, M; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Rutherford, B; Saarikko, H; Safonov, A; Sakumoto, W K; Santi, L; Sartori, L; Sato, K; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Simonenko, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; Somalwar, S; Sorin, V; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Suh, J S; Sukhanov, A; Suslov, I; Taffard, A; Takashima, R; 
Takeuchi, Y; Tanaka, R; Tang, J; Tecchio, M; Teng, P K; Thom, J; Thome, J; Thompson, G A; Thomson, E; Tipton, P; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Trovato, M; Tsai, S-Y; Tu, Y; Turini, N; Ukegawa, F; Uozumi, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Vidal, M; Vila, I; Vilar, R; Vogel, M; Volobouev, I; Volpi, G; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wagner-Kuhr, J; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Weinelt, J; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wolfe, H; Wright, T; Wu, X; Würthwein, F; Yagil, A; Yamamoto, K; Yamaoka, J; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yi, K; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanetti, A; Zeng, Y; Zhang, X; Zheng, Y; Zucchelli, S

    2010-04-01

    We report on a search for the standard model Higgs boson produced in association with a W or Z boson in pp collisions at square root(s)=1.96 TeV recorded by the CDF II experiment at the Tevatron in a data sample corresponding to an integrated luminosity of 2.1 fb(-1). We consider events which have no identified charged leptons, an imbalance in transverse momentum, and two or three jets where at least one jet is consistent with originating from the decay of a b hadron. We find good agreement between data and background predictions. We place 95% confidence level upper limits on the production cross section for several Higgs boson masses ranging from 110 GeV/c(2) to 150 GeV/c(2). For a mass of 115 GeV/c(2) the observed (expected) limit is 6.9 (5.6) times the standard model prediction.

  18. News Teaching Support: New schools network launched Competition: Observatory throws open doors to a select few Festival: Granada to host 10th Ciencia en Acción Centenary: Science Museum celebrates 100 years Award: Queen's birthday honour for science communicator Teacher Training: Training goes where it's needed Conference: Physics gets creative in Christchurch Conference: Conference is packed with ideas Poster Campaign: Bus passengers learn about universe Forthcoming events

    NASA Astrophysics Data System (ADS)

    2009-09-01

    Teaching Support: New schools network launched Competition: Observatory throws open doors to a select few Festival: Granada to host 10th Ciencia en Acción Centenary: Science Museum celebrates 100 years Award: Queen's birthday honour for science communicator Teacher Training: Training goes where it's needed Conference: Physics gets creative in Christchurch Conference: Conference is packed with ideas Poster Campaign: Bus passengers learn about universe Forthcoming events

  19. Event tunnel: exploring event-driven business processes.

    PubMed

    Suntinger, Martin; Obweger, Hannes; Schiefer, Josef; Gröller, M Eduard

    2008-01-01

    Event-based systems monitor business processes in real time. The event-tunnel visualization sees the stream of events captured from such systems as a cylindrical tunnel. The tunnel allows for back-tracing business incidents and exploring event patterns' root causes. The authors couple this visualization with tools that let users search for relevant events within a data repository.

  20. Concepts of event-by-event analysis

    SciTech Connect

    Stroebele, H.

    1995-07-15

    The particles observed in the final state of nuclear collisions can be divided into two classes: those which are susceptible to strong interactions and those which are not, like leptons and the photon. The bulk properties of the "matter" in the reaction zone may be read off from the kinematical characteristics of the particles observable in the final state. These characteristics are strongly dependent on the last interaction these particles have undergone. In a densely populated reaction zone, strongly interacting particles will experience many collisions after they have been formed and before they emerge into the asymptotic final state. For the particles which are not sensitive to strong interactions, their formation is also their last interaction. Thus photons and leptons probe the period during which they are produced, whereas hadrons reflect the so-called freeze-out processes, which occur during the late stage in the evolution of the reaction, when the population density becomes small and the mean free paths long. The disadvantage of the leptons and photons is their small production cross section; they cannot be used in an analysis of the characteristics of individual collision events, because the number of particles produced per event is too small. The hadrons, on the other hand, stem from the freeze-out period. Information from earlier periods requires multiparticle observables in the most general sense. It is one of the challenges of present-day high-energy nuclear physics to establish and understand global observables which differentiate between mere hadronic scenarios, i.e., superpositions of hadronic interactions, and the formation of a partonic (short-duration) steady state which can be considered a new state of matter, the Quark-Gluon Plasma.

  1. Contribution of Infrasound to IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif Mohamed; Medinskaya, Tatiana; Mialle, Pierrick

    2016-04-01

    Until 2003, two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003, but infrasound detections could not be used by the Global Association (GA) software due to the unmanageably high number of spurious associations. Offline improvements of the automatic processing took place to reduce the number of false detections to a reasonable level. In February 2010 the infrasound technology was reintroduced to IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which will significantly improve event location. Examples of REB events which were detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile 2015) and large surface explosions (e.g. Tianjin, China 2015). Quarry blasts and large earthquakes belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. The presence of an infrasound detection associated with an event from a mining area indicates a surface explosion. Satellite imaging and a database of active mines can be used to confirm the origin of such events. This presentation will summarize the contribution of 6 years of infrasound data to IDC bulletins and provide examples of events recorded at the IMS infrasound network. Results of this study may help to improve the location of small events with observations on infrasound stations.

  2. Patient stratification and identification of adverse event correlations in the space of 1190 drug related adverse events

    PubMed Central

    Roitmann, Eva; Eriksson, Robert; Brunak, Søren

    2014-01-01

    Purpose: New pharmacovigilance methods are needed as a consequence of the morbidity caused by drugs. We exploit fine-grained drug related adverse event information extracted by text mining from electronic medical records (EMRs) to stratify patients based on their adverse events and to determine adverse event co-occurrences. Methods: We analyzed the similarity of adverse event profiles of 2347 patients extracted from EMRs from a mental health center in Denmark. The patients were clustered based on their adverse event profiles and the similarities were presented as a network. The set of adverse events in each main patient cluster was evaluated. Co-occurrences of adverse events in patients (p-value < 0.01) were identified and presented as well. Results: We found that each cluster of patients typically had a most distinguishing adverse event. Examination of the co-occurrences of adverse events in patients led to the identification of potentially interesting adverse event correlations that may be further investigated as well as provide further patient stratification opportunities. Conclusions: We have demonstrated the feasibility of a novel approach in pharmacovigilance to stratify patients based on fine-grained adverse event profiles, which also makes it possible to identify adverse event correlations. Used on larger data sets, this data-driven method has the potential to reveal unknown patterns concerning adverse event occurrences. PMID:25249979
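    One plausible realization of the stratification and co-occurrence steps can be sketched as follows. The binary profiles are synthetic, and the specific choices of Jaccard distance, average-linkage clustering, and Fisher's exact test are assumptions for illustration; the record does not name the similarity or significance measures used.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import fisher_exact

# Synthetic binary profiles: rows = patients, columns = adverse-event terms.
rng = np.random.default_rng(0)
profiles = (rng.random((30, 12)) < 0.2).astype(int)
profiles[profiles.sum(axis=1) == 0, 0] = 1  # ensure non-empty profiles

# Stratify patients by profile similarity (Jaccard distance, average linkage).
d = pdist(profiles.astype(bool), metric="jaccard")
clusters = fcluster(linkage(d, method="average"), t=4, criterion="maxclust")

# Test co-occurrence of two adverse events with Fisher's exact test.
a, b = profiles[:, 0] == 1, profiles[:, 1] == 1
table = [[int((a & b).sum()), int((a & ~b).sum())],
         [int((~a & b).sum()), int((~a & ~b).sum())]]
odds_ratio, p_value = fisher_exact(table)
```

    On real EMR-derived profiles, the pairwise similarities could then be rendered as a patient network, as the study describes.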

  3. Detection of extreme climate events in semi-arid biomes using a combination of near-field and satellite based remote sensing across the New Mexico Elevation Gradient network of flux towers

    NASA Astrophysics Data System (ADS)

    Litvak, M. E.; Krofcheck, D. J.; Maurer, G.

    2015-12-01

    Semi-arid biomes in the Southwestern U.S. over the past decade have experienced high inter- and intra-annual variability in precipitation and vapor-pressure deficit (VPD), and from recent observations, are particularly vulnerable to both VPD and drought. Given the large land area occupied by semi-arid biomes in the U.S., the ability to quantify how climate extremes alter ecosystem function, in addition to being able to use satellites to remotely detect when these climate extremes occur, is crucial to scaling the impact of these events on regional carbon dynamics. In an effort to understand how well commonly employed remote sensing platforms capture the impact of extreme events on semi-arid biomes, we coupled a 9-year record of eddy-covariance measurements made across an elevation/aridity gradient in NM with remote sensing data sets from tower-based phenocams, MODIS and Landsat 7 ETM+. We compared anomalies in air temperature, vapor pressure deficit, and precipitation to the degree of variability in remote sensing vegetation indices (e.g., NDVI, EVI, 2G-Rbi, LST, etc.) and tower-derived gross primary productivity (GPP), across a range of temporal lags, to quantify: (1) how sensitive vegetation indices from various platforms, LST, and carbon uptake are to climate disturbances, and to the extremity of the disturbance; (2) how well correlated vegetation indices and tower fluxes are on monthly, seasonal and annual time scales, and whether the degree to which they are correlated is related to the extent of climate anomalies during that period; and (3) the lags in the response of both GPP and vegetation indices to climate anomalies, and how well correlated these were on various time scales. Our initial results show differential sensitivities across a range of semi-arid ecosystems to drought and vapor pressure deficit. 
We see the strongest sensitivity of vegetation indices, and correlations between vegetation indices and tower GPP in the low and high elevation biomes that have a more

  4. RETRIEVAL EVENTS EVALUATION

    SciTech Connect

    T. Wilson

    1999-11-12

    The purpose of this analysis is to evaluate impacts to the retrieval concept presented in the Design Analysis ''Retrieval Equipment and Strategy'' (Reference 6) from abnormal events based on Design Basis Events (DBE) and Beyond Design Basis Events (BDBE) as defined in two recent analyses: (1) DBE/Scenario Analysis for Preclosure Repository Subsurface Facilities (Reference 4); and (2) Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository (Reference 5). The objective of this task is to determine what impacts the DBEs and BDBEs have on the equipment developed for retrieval. The analysis lists potential impacts and recommends changes to be analyzed in subsequent design analyses for developed equipment, or recommends where additional equipment may be needed, to allow retrieval to be performed in all DBE or BDBE situations. This analysis supports License Application design and therefore complies with the requirements of the Systems Description Document input criteria comparison as presented in Section 7, Conclusions. In addition, the analysis discusses the impacts associated with not using concrete inverts in the emplacement drifts. The ''Retrieval Equipment and Strategy'' analysis was based on a concrete invert configuration in the emplacement drift. The scope of the analysis, as presented in ''Development Plan for Retrieval Events Evaluation'' (Reference 3), includes evaluation and criteria of the following: impacts to retrieval from the emplacement drift based on DBE/BDBEs, and changes to the invert configuration, for the preclosure period; and impacts to retrieval from the main drifts based on DBE/BDBEs for the preclosure period.

  5. Solar extreme events

    NASA Astrophysics Data System (ADS)

    Hudson, Hugh S.

    2015-08-01

    Solar flares and CMEs have a broad range of magnitudes. This review discusses the possibility of “extreme events,” defined as those with magnitudes greater than have been seen in the existing historical record. For most quantitative measures, this direct information does not extend more than a century and a half into the recent past. The magnitude distributions (occurrence frequencies) of solar events (flares/CMEs) typically decrease with the parameter measured or inferred (peak flux, mass, energy, etc.). Flare radiation fluxes tend to follow a power law slightly flatter than S^-2, where S represents a peak flux; solar particle events (SPEs) follow a still flatter power law up to a limiting magnitude, and then appear to roll over to a steeper distribution, which may take an exponential form or follow a broken power law. This inference comes from the terrestrial 14C record and from the depth dependence of various radioisotope proxies in the lunar regolith and in meteorites. Recently, major new observational results have impacted our use of the relatively limited historical record in new ways: the detection of actual events in the 14C tree-ring records, and the systematic observations of flares and “superflares” by the Kepler spacecraft. I discuss how these new findings may affect our understanding of the distribution function expected for extreme solar events.
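    The power-law occurrence statistics discussed above can be illustrated numerically. This is a sketch with an assumed index and cutoff (the values below are hypothetical, not taken from the review): fluxes are drawn from dN/dS ∝ S^-alpha above S_min by inverse-transform sampling, then the index is recovered by maximum likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, s_min = 1.8, 1.0  # hypothetical power-law index and lower cutoff

# Inverse-transform sampling for a Pareto-type law:
# CDF F(S) = 1 - (S / s_min)^-(alpha - 1), so S = s_min * (1 - u)^(-1/(alpha-1)).
u = rng.random(100_000)
S = s_min * (1 - u) ** (-1 / (alpha - 1))

# Maximum-likelihood (Hill) estimate of the index from the sample.
alpha_hat = 1 + len(S) / np.log(S / s_min).sum()
```

    A roll-over to a steeper or exponential tail, as inferred for SPEs, would show up as a systematic deficit of the largest samples relative to this pure power law.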

  6. United States National seismograph network

    USGS Publications Warehouse

    Masse, R.P.; Filson, J.R.; Murphy, A.

    1989-01-01

    The USGS National Earthquake Information Center (NEIC) has planned and is developing a broadband digital seismograph network for the United States. The network will consist of approximately 150 seismograph stations distributed across the contiguous 48 states and across Alaska, Hawaii, Puerto Rico and the Virgin Islands. Data transmission will be via two-way satellite telemetry from the network sites to a central recording facility at the NEIC in Golden, Colorado. The design goal for the network is the on-scale recording by at least five well-distributed stations of any seismic event of magnitude 2.5 or greater in all areas of the United States except possibly part of Alaska. All event data from the network will be distributed to the scientific community on compact disc with read-only memory (CD-ROM). © 1989.

  7. A space-based radio frequency transient event classifier

    SciTech Connect

    Moore, K.R.; Blain, P.C.; Caffrey, M.P.; Franz, R.C.; Henneke, K.M.; Jones, R.G.

    1996-12-31

    The FORTE (Fast On-Orbit Recording of Transient Events) satellite will record RF transients in space. These transients will be classified onboard the spacecraft with an Event Classifier--specialized hardware that performs signal preprocessing and neural network classification. The authors describe the Event Classifier, future directions, and implications for telecommunications satellites. Telecommunication satellites are susceptible to damage from environmental factors such as deep dielectric charging and surface discharges. The event classifier technology the authors are developing is capable of sensing the surface discharges and could be useful for mitigating their effects. In addition, the techniques they are using for processing weak signals in noisy environments are relevant to telecommunications.

  8. Frequent epigenetic inactivation of KIBRA, an upstream member of the Salvador/Warts/Hippo (SWH) tumor suppressor network, is associated with specific genetic event in B-cell acute lymphocytic leukemia.

    PubMed

    Hill, Victoria K; Dunwell, Thomas L; Catchpoole, Daniel; Krex, Dietmar; Brini, Anna T; Griffiths, Mike; Craddock, Charles; Maher, Eamonn R; Latif, Farida

    2011-03-01

    The WW-domain-containing protein KIBRA has recently been identified as a new member of the Salvador/Warts/Hippo (SWH) pathway in Drosophila, where it has been shown to act as a tumor suppressor gene. This pathway is conserved in humans, and members of the pathway have been shown to act as tumor suppressor genes in mammalian systems. We determined the methylation status of the 5' CpG island associated with the KIBRA gene in human cancers. In a large panel of cancer cell lines representing common epithelial cancers, KIBRA was unmethylated. But in pediatric acute lymphocytic leukemia (ALL) cell lines, KIBRA showed frequent hypermethylation and silencing of gene expression, which could be reversed by treatment with 5-aza-2'-deoxycytidine. In ALL patient samples KIBRA was methylated in 70% of B-ALL but in <20% of T-ALL leukemia (p = 0.0019). In B-ALL, KIBRA methylation was associated with the ETV6/RUNX1 [t(12;21)(p13;q22)] chromosomal translocation phenotype (p = 0.0082), suggesting that KIBRA may play an important role in t(12;21) leukemogenesis. In ALL paired samples at diagnosis and remission, KIBRA methylation was seen in diagnostic but not in any of the remission samples, accompanied by loss of KIBRA expression in the disease state compared to patients in remission. Hence KIBRA methylation occurs frequently in B-cell acute lymphocytic leukemia but not in epithelial cancers, and is linked to a specific genetic event in B-ALL.

  9. A Search for the Higgs Boson Using Neural Networks in Events with Missing Energy and b-quark Jets in p p̄ Collisions at √s = 1.96 TeV

    SciTech Connect

    Aaltonen, T.; Adelman, J.; Alvarez Gonzalez, B.; Amerio, S.; Amidei, D.; Anastassov, A.; Annovi, A.; Antos, J.; Apollinari, G.; Apresyan, A.; Arisawa, T.; /Waseda U. /Dubna, JINR

    2009-11-01

    We report on a search for the standard model Higgs boson produced in association with a W or Z boson in p p̄ collisions at √s = 1.96 TeV recorded by the CDF II experiment at the Tevatron in a data sample corresponding to an integrated luminosity of 2.1 fb⁻¹. We consider events which have no identified charged leptons, an imbalance in transverse momentum, and two or three jets where at least one jet is consistent with originating from the decay of a b hadron. We find good agreement between data and predictions. We place 95% confidence level upper limits on the production cross section for several Higgs boson masses ranging from 110 GeV/c² to 150 GeV/c². For a mass of 115 GeV/c² the observed (expected) limit is 6.9 (5.6) times the standard model prediction.

  10. Pharmacogenomics of suicidal events

    PubMed Central

    Brent, David; Melhem, Nadine; Turecki, Gustavo

    2010-01-01

    Pharmacogenomic studies of antidepressant treatment-emergent suicidal events in depressed patients report associations with polymorphisms in genes involved in transcription (CREB1), neuroprotection (BDNF and NTRK2), glutamatergic and noradrenergic neurotransmission (GRIA3, GRIK2 and ADRA2A), the stress and inflammatory responses (FKBP5 and IL28RA), and the synthesis of glycoproteins (PAPLN). Nearly all of the reported events in these studies were modest one-time increases in suicidal ideation. In 3231 unique subjects across six studies, 424 (13.1%) patients showed increases in suicidal ideation, eight (0.25%) attempted suicide and four (0.12%) completed suicide. Systems related to most of these genes have also been implicated in studies of suicidal behavior irrespective of treatment. Future pharmacogenomic studies should target events that are clinically significant, related clinical phenotypes of response and medication side effects, and biological pathways that are involved in these outcomes in order to improve treatment approaches. PMID:20504254

  11. The response of the high-latitude ionosphere to the coronal mass ejection event of April 6, 2000: A practical demonstration of space weather nowcasting with the Super Dual Auroral Radar Network HF radars

    NASA Astrophysics Data System (ADS)

    Ruohoniemi, J. M.; Barnes, R. J.; Greenwald, R. A.; Shepherd, S. G.

    2001-12-01

    The ionosphere at high latitudes is the site of important effects in space weather. These include strong electrical currents that may disrupt power systems through induced currents and density irregularities that can degrade HF and satellite communication links. With the impetus provided by the National Space Weather Program, the radars of the Super Dual Auroral Radar Network have been applied to the real-time specification (``nowcasting'') of conditions in the high-latitude ionosphere. A map of the plasma convection in the northern high-latitude ionosphere is continually generated at the Johns Hopkins University Applied Physics Laboratory (JHU/APL) SuperDARN web site using data downloaded in real time from the radars via Internet connections. Other nowcast items include information on the conditions of HF propagation, the spatial extent of auroral effects, and the total cross polar cap potential variation. Time series of various parameters and an animated replay of the last 2 hours of convection patterns are also available for review. By comparing with simultaneous measurements from an upstream satellite, it is possible to infer the effective delay from the detection of changes in the solar wind at the satellite to the arrival of related effects in the high-latitude ionosphere. We discuss the space weather products available from the JHU/APL SuperDARN web site and their uses by simulating a nowcast of the ionosphere on April 6, 2000, during the arrival of a coronal mass ejection (CME) -related shock. The nowcast convection pattern in particular satisfies a critical need for timely, comprehensive information on ionospheric electric fields.

  12. LAN attack detection using Discrete Event Systems.

    PubMed

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairing static, modify the existing ARP, or require patching the operating systems of all hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraint such as static IP-MAC pairing or changes to the ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations, based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, which would yield the same DES models for both cases; to create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic to the LAN. A DES detector is then built to determine, from the observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme minimizes the extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine or spoofed by the detector. Spoofed IP-MAC pairs determined by the detector are also stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM) and denial-of-service attacks. The scheme is successfully validated in a test bed. PMID:20804980
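    The probe-and-classify idea above can be sketched in a few lines of Python. This is a minimal stand-in, not the paper's DES model: `send_probe`, the verdict labels, and the table layout are hypothetical, and the real detector reasons over full sequences of ARP events rather than a single probe round.

```python
def classify_binding(probe_replies):
    """probe_replies: MAC addresses received in response to an ARP
    probe for one IP. A genuine host answers with one consistent MAC;
    conflicting MACs indicate spoofing; no reply is inconclusive."""
    macs = set(probe_replies)
    if not macs:
        return "inconclusive"
    return "genuine" if len(macs) == 1 else "spoofed"


class ArpDetector:
    """Classifies observed IP-MAC bindings, probing only pairs not
    yet verified so that the extra ARP traffic stays small, as in
    the paper's scheme. send_probe(ip) stands in for the active
    probe; it returns the list of MACs that answered."""

    def __init__(self, send_probe):
        self.send_probe = send_probe
        self.verified = {}       # ip -> MAC judged genuine
        self.spoofed = set()     # IPs seen with conflicting bindings

    def on_arp(self, ip, mac):
        if self.verified.get(ip) == mac:
            return "genuine"     # already verified; no probe needed
        replies = self.send_probe(ip)
        verdict = classify_binding(replies)
        if verdict == "genuine":
            true_mac = replies[0]
            self.verified[ip] = true_mac
            if mac != true_mac:
                self.spoofed.add(ip)   # observed pair contradicts probe
                return "spoofed"
        elif verdict == "spoofed":
            self.spoofed.add(ip)
        return verdict
```

    A consistent probe reply verifies the pair once, and later ARP events for that pair are accepted without further probing; conflicting replies flag the IP for the spoofed-pair table used to catch follow-on MiTM or denial-of-service attacks.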

  14. Network Cosmology

    PubMed Central

    Krioukov, Dmitri; Kitsak, Maksim; Sinkovits, Robert S.; Rideout, David; Meyer, David; Boguñá, Marián

    2012-01-01

    Prediction and control of the dynamics of complex networks is a central problem in network science. Structural and dynamical similarities of different real networks suggest that some universal laws might accurately describe the dynamics of these networks, although the nature and common origin of such laws remain elusive. Here we show that the causal network representing the large-scale structure of spacetime in our accelerating universe is a power-law graph with strong clustering, similar to many complex networks such as the Internet, social, or biological networks. We prove that this structural similarity is a consequence of the asymptotic equivalence between the large-scale growth dynamics of complex networks and causal networks. This equivalence suggests that unexpectedly similar laws govern the dynamics of complex networks and spacetime in the universe, with implications for network science and cosmology. PMID:23162688

  16. Teaching with Current Events

    ERIC Educational Resources Information Center

    Peralta, Andrew

    2005-01-01

    This article describes how a teacher set aside all of his plans in order to teach about Hurricane Katrina. When the hurricane hit the Gulf Coast, children became naturally curious and sought answers about an event that big. The author suggests using such tragedies to help students grow both as learners and as citizens.

  17. Language As Social Event.

    ERIC Educational Resources Information Center

    Harste, Jerome C.

    A taxonomy developed for the study of the growth and development of written language from the perspective of social event was tested with a group of 68 children, aged three to six years. The subjects were presented with a wide variety of environmental print messages (road signs, toys, fast food signs, and household products) and were questioned…

  18. Interictal networks in magnetoencephalography.

    PubMed

    Malinowska, Urszula; Badier, Jean-Michel; Gavaret, Martine; Bartolomei, Fabrice; Chauvel, Patrick; Bénar, Christian-George

    2014-06-01

    Epileptic networks involve complex relationships across several brain areas. Such networks have been shown on intracerebral EEG (stereotaxic EEG, SEEG), an invasive technique. Magnetoencephalography (MEG) is a noninvasive tool which was recently proven efficient for localizing the generators of epileptiform discharges. However, despite the importance of characterizing network aspects of partial epilepsies noninvasively, only a few studies have attempted to retrieve the fine spatiotemporal dynamics of interictal discharges with MEG. Our goal was to assess the relevance of magnetoencephalography for detecting and characterizing the brain networks involved in interictal epileptic discharges. We propose here a semi-automatic method based on independent component analysis (ICA) and on co-occurrence of events across components. The method was evaluated in a series of seven patients by comparing its results with networks identified in SEEG. On both MEG and SEEG, we found that interictal discharges can involve remote regions which act in synchrony. More regions were identified in SEEG (38 in total) than in MEG (20). All MEG regions were confirmed by SEEG when an electrode was present in the vicinity. In all patients, at least one region could be identified as leading according to our criteria. A majority (71%) of MEG leaders were confirmed by SEEG. We have therefore shown that MEG measurements can extract a significant proportion of the networks visible in SEEG. This suggests that MEG can be a useful tool for noninvasively defining interictal epileptic networks, in terms of regions and patterns of connectivity, in the search for a "primary irritative zone".

  19. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    SciTech Connect

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error agrees better with the physical basis: it correctly partitions error so that model error is included as a component of variance, and it correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
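    Fisher's and Tippett's combination tests named above are standard and can be sketched in pure Python; for Fisher's statistic the chi-square survival function with even degrees of freedom has a closed form, so no statistics library is needed. This is a generic illustration of the combination step, not the authors' screening code.

```python
import math

def fisher_combined_p(pvalues):
    """Fisher's method: X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under H0. For even df
    the survival function is exp(-x/2) * sum_{i<k} (x/2)^i / i!."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):       # accumulate (x/2)^i / i! incrementally
        term *= half / i
        total += term
    return math.exp(-half) * total

def tippett_combined_p(pvalues):
    """Tippett's method: reject when the smallest p-value is small;
    the combined p-value is 1 - (1 - min p)^k."""
    k = len(pvalues)
    return 1.0 - (1.0 - min(pvalues)) ** k
```

    For example, two independent single-phenomenology p-values of 0.05 combine under Fisher's method to about 0.017, so evidence that is marginal in each phenomenology alone can screen an event jointly.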

  20. Network Solutions.

    ERIC Educational Resources Information Center

    Vietzke, Robert; And Others

    1996-01-01

    This special section explains the latest developments in networking technologies, profiles school districts benefiting from successful implementations, and reviews new products for building networks. Highlights include ATM (asynchronous transfer mode), cable modems, networking switches, Internet screening software, file servers, network management…

  1. Performance testing open source products for the TMT event service

    NASA Astrophysics Data System (ADS)

    Gillies, K.; Bhate, Yogesh

    2014-07-01

    The software system for TMT is a distributed system with many components on many computers. Each component integrates with the overall system using a set of software services. The Event Service is a publish-subscribe message system that allows the distribution of demands and other events. The performance requirements for the Event Service are demanding with a goal of over 60 thousand events/second. This service is critical to the success of the TMT software architecture; therefore, a project was started to survey the open source and commercial market for viable software products. A trade study led to the selection of five products for thorough testing using a specially constructed computer/network configuration and test suite. The best performing product was chosen as the basis of a prototype Event Service implementation. This paper describes the process and performance tests conducted by Persistent Systems that led to the selection of the product for the prototype Event Service.
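    The kind of throughput measurement described can be illustrated with a minimal in-process publish-subscribe loop. This is a hypothetical stand-in for the real broker products under test (an in-memory queue with one subscriber thread), useful only to show the measurement pattern, not to reproduce the reported 60 thousand events/second goal.

```python
import queue
import threading
import time

def measure_throughput(n_events=100_000):
    """Publish n_events through a minimal in-process pub-sub
    pipeline and return (events received, events per second)."""
    q = queue.Queue()
    received = []

    def subscriber():
        while True:
            evt = q.get()
            if evt is None:          # sentinel: publisher is done
                break
            received.append(evt)

    t = threading.Thread(target=subscriber)
    t.start()
    start = time.perf_counter()
    for i in range(n_events):        # publish demand events
        q.put(("demand", i))
    q.put(None)
    t.join()
    elapsed = time.perf_counter() - start
    return len(received), n_events / elapsed
```

    A real benchmark in the spirit of the paper would replace the queue with each candidate product's publish/subscribe API, run publisher and subscriber on separate hosts over the test network, and also record latency percentiles, not just throughput.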

  2. Biological event composition

    PubMed Central

    2012-01-01

    Background In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed to this recent interest considerably. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts from a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically-grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase and it is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation by shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID). Results We achieved competitive results in GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance across abstracts and article bodies is stable. Coreference resolution results in minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks. Conclusions The results demonstrate the viability of a robust, linguistically-oriented methodology, which clearly distinguishes

  3. Artificial neural-network based feeder reconfiguration for loss reduction in distribution systems

    SciTech Connect

    Hoyong Kim; Yunseok Ko; Kyunghee Jung (Dept. of Distribution System)

    1993-07-01

    Neural networks have the capability to map the complex and extremely non-linear relationship between zone load levels and system topologies, which is required for feeder reconfiguration in distribution systems. This study proposes strategies for feeder reconfiguration using artificial neural networks with this mapping ability. The artificial neural networks determine the appropriate system topology that reduces the power loss according to the variation of the load pattern, and the control strategy can be easily obtained from the system topology they provide. The artificial neural networks are organized in two groups: the first group estimates the proper load level from the load data of each zone, and the second determines the appropriate system topology from the input load level. In addition, several programs with a training set builder are developed for the design, training, and accuracy testing of the artificial neural networks. The authors also evaluate the performance of the neural networks designed here on a test distribution system. The neural networks are implemented in FORTRAN and trained on a COMPAQ 386 personal computer.
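    The two-group structure can be sketched with simple stand-ins: stage 1 quantizes each zone's load into a discrete level, and stage 2 picks the stored topology whose training pattern is nearest to the input. A nearest-pattern lookup replaces the trained networks here, and the thresholds and pattern table are illustrative assumptions, not values from the paper.

```python
def quantize_load(zone_load, thresholds=(0.4, 0.8)):
    """Stage 1 stand-in: map a zone's per-unit load to a discrete
    load level (0 = light, 1 = medium, 2 = heavy)."""
    for level, t in enumerate(thresholds):
        if zone_load < t:
            return level
    return len(thresholds)

def select_topology(levels, patterns):
    """Stage 2 stand-in: choose the stored topology whose training
    load-level pattern is closest (L1 distance) to the input levels.
    patterns: {topology_name: load-level pattern}."""
    def dist(pattern):
        return sum(abs(a - b) for a, b in zip(levels, pattern))
    return min(patterns, key=lambda name: dist(patterns[name]))
```

    Chaining the two stages mirrors the paper's pipeline: per-zone loads go in, a loss-reducing switch configuration (here just a topology label) comes out.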

  4. The Colombia Seismological Network

    NASA Astrophysics Data System (ADS)

    Blanco Chia, J. F.; Poveda, E.; Pedraza, P.

    2013-05-01

    The latest seismological equipment and data processing instrumentation installed at the Colombia Seismological Network (RSNC) are described. System configuration, network operation, and data management are discussed, and the data quality and the new seismological products are analyzed. The main purpose of the network is to monitor local seismicity, with special emphasis on seismic activity surrounding the Colombian Pacific and Caribbean oceans, for early warning in case a tsunami is produced by an earthquake. The Colombian territory is located at the northwestern corner of South America, where three tectonic plates converge: the Nazca, the Caribbean, and the South American. The dynamics of these plates, when resulting in earthquakes, is continuously monitored by the network. In 2012, the RSNC registered an average of 67 events per day, of which a mean of 36 earthquakes per day could be located well. In 2010 the network also registered an average of 67 events, but only a mean of 28 earthquakes could be located daily; this difference is due to the expansion of the network. The network is made up of 84 stations equipped with different kinds of instruments: broadband 40 s and 120 s seismometers, accelerometers, and short-period 1 s sensors. The signal is transmitted continuously in real time to the Central Recording Center located in Bogotá, using satellite, telemetry, and Internet links; some other stations require the information to be collected in situ. Data are recorded and processed digitally using two different systems, EARTHWORM and SEISAN, which are able to process and share the information between them. The RSNC has designed and implemented a web system to share the seismological data. This system uses tools like JavaScript and Oracle and programming languages like PHP to allow users to access the seismicity registered by the network almost in real time, as well as to download waveforms and technical details. The coverage

  5. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.

  6. Human Rights Event Detection from Heterogeneous Social Media Graphs.

    PubMed

    Chen, Feng; Neill, Daniel B

    2015-03-01

    Human rights organizations are increasingly monitoring social media for identification, verification, and documentation of human rights violations. Since manual extraction of events from the massive amount of online social network data is difficult and time-consuming, we propose an approach for automated, large-scale discovery and analysis of human rights-related events. We apply our recently developed Non-Parametric Heterogeneous Graph Scan (NPHGS), which models social media data such as Twitter as a heterogeneous network (with multiple different node types, features, and relationships) and detects emerging patterns in the network, to identify and characterize human rights events. NPHGS efficiently maximizes a nonparametric scan statistic (an aggregate measure of anomalousness) over connected subgraphs of the heterogeneous network to identify the most anomalous network clusters. It summarizes each event with information such as type of event, geographical locations, time, and participants, and provides documentation such as links to videos and news reports. Building on our previous work that demonstrates the utility of NPHGS for civil unrest prediction and rare disease outbreak detection, we present an analysis of human rights events detected by NPHGS using two years of Twitter data from Mexico. NPHGS was able to accurately detect relevant clusters of human rights-related tweets prior to international news sources, and in some cases, prior to local news reports. Analysis of social media using NPHGS could enhance the information-gathering missions of human rights organizations by pinpointing specific abuses, revealing events and details that may be blocked from traditional media sources, and providing evidence of emerging patterns of human rights violations. This could lead to more timely, targeted, and effective advocacy, as well as other potential interventions. PMID:27442843
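    The subgraph-scanning idea can be illustrated with a greedy sketch. This stand-in grows a connected cluster from a seed node under a much simpler score (count of significant nodes minus its expectation under the null) than the nonparametric statistic NPHGS actually maximizes, and it makes no optimality claim; node names and p-values are invented for illustration.

```python
def greedy_anomalous_cluster(adj, pvalues, seed, alpha=0.05):
    """Greedy stand-in for heterogeneous graph scanning: starting
    from a seed, repeatedly add the neighboring node that most
    improves the score of the connected cluster, stopping when no
    addition helps.

    adj:      {node: iterable of neighbors}
    pvalues:  {node: empirical p-value (anomalousness)}"""
    def score(nodes):
        n_sig = sum(1 for v in nodes if pvalues[v] < alpha)
        return n_sig - alpha * len(nodes)   # observed minus expected

    cluster = {seed}
    while True:
        frontier = {u for v in cluster for u in adj[v]} - cluster
        if not frontier:
            break
        best = max(frontier, key=lambda u: score(cluster | {u}))
        if score(cluster | {best}) <= score(cluster):
            break                            # no improving neighbor
        cluster.add(best)
    return cluster
```

    In the real system the nodes are heterogeneous (tweets, users, locations, terms), the score is a nonparametric scan statistic over the nodes' empirical p-values, and the highest-scoring clusters are then summarized into event reports.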

  8. Tidal disruption events

    NASA Astrophysics Data System (ADS)

    Levan, A.

    2014-07-01

    Tidal disruption events (TDEs) provide a powerful probe of many astrophysical processes. They occur when the powerful tidal field around a black hole disrupts a passing star, which is subsequently accreted. The resulting signal is a powerful X-ray, UV/optical and possibly even radio source that provides us with a view of accretion around supermassive black holes from switch-on to switch-off over a timescale of years. TDEs probe accretion physics, the ubiquity of black holes in galactic nuclei, and the dynamics in their cores, offering a novel route to addressing these issues. I will review observations of TDEs over the past decade, outlining how samples of candidates have been gradually building and how they can be identified against other, more common transient events. I will also discuss the implications of the discovery of a population of TDEs apparently launching relativistic jets, and how these powerful transients may be detected in upcoming X-ray to radio surveys.

  9. Single event mass spectrometry

    DOEpatents

    Conzemius, Robert J.

    1990-01-16

    A means and method for single event time of flight mass spectrometry for analysis of specimen materials. The method of the invention includes pulsing an ion source, imposing at least one pulsed ion onto the specimen to produce a corresponding emission of at least one electrically charged particle. The emitted particle is then dissociated into a charged ion component and an uncharged neutral component. The ion and neutral components are then detected. The time of flight of the components is recorded and can be used to analyze the predecessor of the components, and therefore the specimen material. When more than one ion particle is emitted from the specimen per single ion impact, the single event time of flight mass spectrometer described here furnis… This invention was made with Government support under Contract No. W-7405-ENG82 awarded by the Department of Energy. The Government has certain rights in the invention.

  10. Some Aviation Growth Events

    NASA Technical Reports Server (NTRS)

    Spearman, M. Leroy

    2002-01-01

    The growth of aviation since the first flight of a heavier-than-air powered manned vehicle in 1903 has been somewhat remarkable. Some of the events that have influenced this growth are reviewed in this paper. This review will include some events prior to World War I; the influence of the war itself; the events during the post-war years including the establishment of aeronautical research laboratories; and the influence of World War II which, among other things, introduced new technologies that included rocket and jet propulsion and supersonic aerodynamics. The subsequent era of aeronautical research and the attendant growth in aviation over the past half century will be reviewed from the viewpoint of the author who, since 1944, has been involved in the NACA/NASA aeronautical research effort at what is now the Langley Research Center in Hampton, Virginia. The review will discuss some of the research programs related to the development of some experimental aircraft, the Century series of fighter aircraft, multi-mission aircraft, advanced military aircraft and missiles, advanced civil aircraft, supersonic transports, spacecraft and others.

  11. Semantic Networks and Social Networks

    ERIC Educational Resources Information Center

    Downes, Stephen

    2005-01-01

    Purpose: To illustrate the need for social network metadata within semantic metadata. Design/methodology/approach: Surveys properties of social networks and the semantic web, suggests that social network analysis applies to semantic content, argues that semantic content is more searchable if social network metadata is merged with semantic web…

  12. Development of Memory for Events.

    ERIC Educational Resources Information Center

    Ratner, Hilary Horn; And Others

    1986-01-01

    Examines development of event memory by determining how personally experienced events with two types of structure were reported by kindergartners and adults. Events in making and playing with clay were organized causally and temporally. Results show that adults and children used a goal-based hierarchical structure to remember events, although use…

  13. Postdocs Attend Special Events during Postdoc Appreciation Week | Poster

    Cancer.gov

    NCI at Frederick postdocs were treated to special events by the Fellows and Young Investigators Committee during National Postdoc Appreciation Week, September 15–19. At the first Frederick fellows seminar of the fall on September 17, postdocs were invited to hear their colleagues present highlights of their research and stay for pizza and ice cream, compliments of the committee. Postdocs are also invited to a special networking event at Barley and Hops on September 24.

  14. Event selection services in ATLAS

    NASA Astrophysics Data System (ADS)

    Cranshaw, J.; Cuhadar-Donszelmann, T.; Gallas, E.; Hrivnac, J.; Kenyon, M.; McGlone, H.; Malon, D.; Mambelli, M.; Nowak, M.; Viegas, F.; Vinek, E.; Zhang, Q.

    2010-04-01

    ATLAS has developed and deployed event-level selection services based upon event metadata records ("TAGS") and supporting file and database technology. These services allow physicists to extract events that satisfy their selection predicates from any stage of data processing and use them as input to later analyses. One component of these services is a web-based Event-Level Selection Service Interface (ELSSI). ELSSI supports event selection by integrating run-level metadata, luminosity-block-level metadata (e.g., detector status and quality information), and event-by-event information (e.g., triggers passed and physics content). The list of events that survive after some selection criterion is returned in a form that can be used directly as input to local or distributed analysis; indeed, it is possible to submit a skimming job directly from the ELSSI interface using grid proxy credential delegation. ELSSI allows physicists to explore ATLAS event metadata as a means to understand, qualitatively and quantitatively, the distributional characteristics of ATLAS data. In fact, the ELSSI service provides an easy interface to see the highest missing ET events or the events with the most leptons, to count how many events passed a given set of triggers, or to find events that failed a given trigger but nonetheless look relevant to an analysis based upon the results of offline reconstruction, and more. This work provides an overview of ATLAS event-level selection services, with an emphasis upon the interactive Event-Level Selection Service Interface.
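The predicate-based selection ELSSI performs over TAG records can be sketched in a few lines. The field names, trigger labels, and values below are hypothetical illustrations, not the actual ATLAS TAG schema:

```python
# Hypothetical TAG-like event metadata records (illustrative fields only).
tags = [
    {"run": 1, "event": 101, "missing_et": 72.5, "n_leptons": 3, "triggers": {"trigA", "trigB"}},
    {"run": 1, "event": 102, "missing_et": 18.0, "n_leptons": 1, "triggers": {"trigA"}},
    {"run": 2, "event": 201, "missing_et": 55.0, "n_leptons": 2, "triggers": {"trigB"}},
]

def select(records, predicate):
    """Return (run, event) pairs for records passing the predicate --
    the form of event list a skimming job would take as input."""
    return [(r["run"], r["event"]) for r in records if predicate(r)]

# e.g. high missing-ET events that also passed a given trigger
picked = select(tags, lambda r: r["missing_et"] > 50 and "trigB" in r["triggers"])
```

The point of the sketch is the separation of concerns the abstract describes: the metadata store answers the predicate, and only the surviving event identifiers flow on to downstream analysis.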

  15. Changes in Caribbean coral disease prevalence after the 2005 bleaching event.

    PubMed

    Cróquer, Aldo; Weil, Ernesto

    2009-11-16

    Bleaching events and disease epizootics have increased during the past decades, suggesting a positive link between these 2 causes in producing coral mortality. However, studies to test this hypothesis, integrating a broad range of hierarchical spatial scales from habitats to distant localities, have not been conducted in the Caribbean. In this study, we examined links between bleaching intensity and disease prevalence collected from 6 countries, 2 reef sites for each country, and 3 habitats within each reef site (N = 6 x 2 x 3 = 36 site-habitat combinations) during the peak of bleaching in 2005 and a year after, in 2006. Patterns of disease prevalence and bleaching were significantly correlated (Rho = 0.58, p = 0.04). Higher variability in disease prevalence after bleaching occurred among habitats at each particular reef site, with a significant increase in prevalence recorded in 4 of the 10 site-habitats where bleaching was intense and a non-significant increase in disease prevalence in 18 out of the 26 site-habitats where bleaching was low to moderate. A significant linear correlation was found (r = 0.89, p = 0.008) between bleaching and the prevalence of 2 virulent diseases (yellow band disease and white plague) affecting the Montastraea species complex. Results of this study suggest that if bleaching events become more intense and frequent, disease-related mortality of Caribbean coral reef builders could increase, with uncertain effects on coral reef resilience.

  16. Small world picture of worldwide seismic events

    NASA Astrophysics Data System (ADS)

    Ferreira, Douglas S. R.; Papa, Andrés R. R.; Menezes, Ronaldo

    2014-08-01

    The understanding of long-distance relations between seismic activities has long been of interest to seismologists and geologists. In this paper we have used data from the worldwide earthquake catalog for the period between 1972 and 2011 to generate a network of sites around the world for earthquakes with magnitude m≥4.5 on the Richter scale. After the network construction, we have analyzed the results under two viewpoints. First, in contrast to previous works, which have considered just small areas, we showed that the best fitting for networks of seismic events is not a pure power law, but a power law with exponential cutoff; we also have found that the global network presents small-world properties. Second, we have found that the time intervals between successive earthquakes have a cumulative probability distribution well fitted by nontraditional functional forms. The implications of our results are significant because they seem to indicate that seisms around the world are not independent. In this paper we provide evidence to support this argument.
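The distinction the abstract draws between a pure power law and a power law with exponential cutoff can be sketched numerically. The exponent and cutoff scale below are illustrative assumptions, not the paper's fitted parameters:

```python
import math

def power_law_cutoff(k, gamma, k_c, c=1.0):
    """P(k) ~ c * k^(-gamma) * exp(-k / k_c): power-law body, exponential tail."""
    return c * k ** (-gamma) * math.exp(-k / k_c)

# Relative to a pure power law with the same exponent, the cutoff leaves
# small degrees nearly untouched but strongly suppresses the far tail.
gamma, k_c = 2.0, 20.0
ratio = lambda k: power_law_cutoff(k, gamma, k_c) / k ** (-gamma)  # equals exp(-k/k_c)
head, tail = ratio(1), ratio(100)  # head near 1, tail heavily damped
```

In a log-log plot the two forms coincide at small degree and diverge beyond k_c, which is how such a fit is typically distinguished from a pure power law.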

  17. Networks of prospective thoughts: The organisational role of emotion and its impact on well-being.

    PubMed

    Demblon, Julie; D'Argembeau, Arnaud

    2016-01-01

    Recent research has shown that many prospective thoughts are organised in networks of related events, but the relational dimensions that contribute to the formation of such networks are not fully understood. Here, we investigated the organisational role of emotion by using cues of different valence for eliciting event networks. We found that manipulating the emotional valence of cues influenced the characteristics of events within networks, and that members of a network were more similar to each other on affective components than they were to members of other networks. Furthermore, a substantial proportion of events within networks were part of thematic clusters and cluster membership significantly modulated the impact of represented events on current well-being, in part through an intensification of the emotion felt when thinking about these events. These findings demonstrate that emotion contributes to the organisation of future thoughts in networks that can affect people's well-being. PMID:25787140

  18. Event mapping meeting

    SciTech Connect

    Eaton, L.; Mason, D.

    1997-02-20

    A one-day meeting was held by the authors to evaluate how the strategic lab workshops would tie to this year's tactical planning exercise. In particular, they wanted to find recent events that would support the tactical goal decisions of the Lab, and they wanted to find events that verify the Lab's present course. The events which are each briefly discussed are: Galvin Commission recommends consolidating DOE defense labs (1995); Congressional subcommittee staff force budget cuts and consolidation (1995); 28% of DOE/DP budget held back pending completion of a clear 5-yr plan for nukes (1995); DOD and DOE focus on dual use (1995); LANL work includes weapons rebuilds (1995); LANL chosen by DOE to develop and test advanced remediation techniques (1995); AGEX/DARHT Project is stopped by suits from environmental activists (1996); Non-proliferation treaty renewed (1996); US complies with Comprehensive Test Ban Treaty (1996); Capability-based deterrence policy put into place (1998); Stockpile shrinks to approximately 2000 weapons (2005); DOE weapons labs re-chartered as true national labs (1996); DOE terminates all nuclear weapons testing support (1996); Industrial projects at LANL up 20% from previous year (1997); NIST-ATP Program becomes an interagency process (1997); DOE warns that spent commercial reactor fuel is a major proliferation threat (1998); Non-lethal weapons work helps to reshape LANL image (1998); Global warming theory proven (2005); Overall US spending on science has been flat or decreasing for three years (1998); and Economic role of LANL in northern New Mexico declines (2005).

  19. On the prediction of GLE events

    NASA Astrophysics Data System (ADS)

    Nunez, Marlon; Reyes, Pedro

    2016-04-01

    A model for predicting the occurrence of GLE events is presented. This model uses the UMASEP scheme based on the lag-correlation between the time derivatives of soft X-ray flux (SXR) and near-earth proton fluxes (Núñez, 2011, 2015). We extended this approach with the correlation between SXR and ground-level neutron measurements. This model was calibrated with X-ray, proton and neutron data obtained during the period 1989-2015 from the GOES/HEPAD instrument, and neutron data from the Neutron Monitor Data Base (NMDB). During this period, 32 GLE events were detected by neutron monitor stations. We consider that a GLE prediction is successful when it is triggered before the first GLE alert is issued by any neutron station of the NMDB network. For the most recent 16 years (2000-2015), the model was able to issue successful predictions for 53.8% (7 of 13 GLE events), obtaining a false alarm ratio (FAR) of 36.4% (4/11), and an average warning time (AWT) of 10 min. For the first years of the evaluation period (1989-1999), the model was able to issue successful predictions for 31.6% (6 of 19 GLE events), obtaining a FAR of 33.3% (3/9), and an AWT of 17 min. A preliminary conclusion is that the model is able to predict the more gradual events but not the promptest ones. The final goal of this project, which is now halfway through its planned two-year duration, is the prediction of >500 MeV events. This project has received funding from the European Union's Horizon 2020 research and innovation programme under agreement No 637324.
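The lag-correlation idea at the core of the UMASEP scheme can be sketched as follows. The synthetic series, sampling, and lag range are illustrative only, not the calibrated scheme of Núñez (2011, 2015):

```python
import math

def derivative(series):
    """First differences, a stand-in for the time derivative of a sampled flux."""
    return [b - a for a, b in zip(series, series[1:])]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def best_lag(sxr, protons, max_lag=10):
    """Lag (in samples) at which d(SXR)/dt correlates best with d(protons)/dt."""
    dx, dy = derivative(sxr), derivative(protons)
    scores = {lag: pearson(dx[:len(dx) - lag], dy[lag:])
              for lag in range(1, max_lag + 1)}
    return max(scores, key=scores.get), scores

# Synthetic check: a proton series that echoes the SXR series 4 samples later.
sxr = [math.sin(t / 3.0) for t in range(40)]
protons = [0.0] * 4 + sxr[:-4]
lag, scores = best_lag(sxr, protons)
```

A real implementation would threshold the correlation at the best lag to raise a warning; here the recovered lag simply confirms the causal echo the scheme exploits.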

  20. Video Event Trigger

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.; Lichter, Michael J.

    1994-01-01

    Video event trigger (VET) processes video image data to generate trigger signal when image shows significant change such as motion, appearance, disappearance, change in color, change in brightness, or dilation of object. System aids in efficient utilization of image-data-storage and image-data-processing equipment in applications in which many video frames show no changes, where it is wasteful to record and analyze all frames when only relatively few show changes of interest. Applications include video recording of automobile crash tests and automated video monitoring of entrances, exits, parking lots, and secure areas.
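A VET-style change detector can be sketched as a simple frame-difference test over pixel intensities. The thresholds and frame representation below are illustrative assumptions, not the parameters of the described system:

```python
def frame_changed(prev, curr, pixel_thresh=10, frac_thresh=0.01):
    """Trigger when the fraction of pixels whose intensity moved by more than
    pixel_thresh exceeds frac_thresh. Motion, appearance/disappearance, and
    brightness changes all show up as large per-pixel differences."""
    changed = sum(1 for p, c in zip(prev, curr) if abs(p - c) > pixel_thresh)
    return changed / len(prev) >= frac_thresh

# Static scene: no trigger; an object appears in 5% of the frame: trigger.
baseline = [100] * 1000                                # flat grayscale frame
moved = baseline[:50] + [200] * 50 + baseline[100:]    # 50 of 1000 pixels change
```

Gating the recorder on this boolean is what lets storage and processing be spent only on the relatively few frames that show changes of interest.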