Sample records for integrated computing network

  1. GaAs Optoelectronic Integrated-Circuit Neurons

    NASA Technical Reports Server (NTRS)

    Lin, Steven H.; Kim, Jae H.; Psaltis, Demetri

    1992-01-01

    Monolithic GaAs optoelectronic integrated circuits developed for use as artificial neurons. Neural-network computer contains planar arrays of optoelectronic neurons, and variable synaptic connections between neurons effected by diffraction of light from volume hologram in photorefractive material. Basic principles of neural-network computers explained more fully in "Optoelectronic Integrated Circuits For Neural Networks" (NPO-17652). In present circuits, devices replaced by metal/semiconductor field effect transistors (MESFET's), which consume less power.

  2. Local and Long Distance Computer Networking for Science Classrooms. Technical Report No. 43.

    ERIC Educational Resources Information Center

    Newman, Denis

    This report describes Earth Lab, a project which is demonstrating new ways of using computers for upper-elementary and middle-school science instruction, and finding ways to integrate local-area and telecommunications networks. The discussion covers software, classroom activities, formative research on communications networks, and integration of…

  3. The NASA Science Internet: An integrated approach to networking

    NASA Technical Reports Server (NTRS)

    Rounds, Fred

    1991-01-01

An integrated approach to building a networking infrastructure is an absolute necessity for meeting the multidisciplinary science networking requirements of the Office of Space Science and Applications (OSSA) science community. These networking requirements include communication connectivity between computational resources, databases, and library systems, as well as to other scientists and researchers around the world. A consolidated networking approach allows strategic use of the existing science networking within the Federal government, and it provides networking capability that takes into consideration national and international trends towards multivendor and multiprotocol service. It also offers a practical vehicle for optimizing costs and maximizing performance. Finally, and perhaps most important to the development of high-speed computing, an integrated network constitutes a focus for phasing into the National Research and Education Network (NREN). The NASA Science Internet (NSI) program, established in mid-1988, is structured to provide just such an integrated network. A description of the NSI is presented.

  4. FORCEnet Net Centric Architecture - A Standards View

    DTIC Science & Technology

    2006-06-01

[Architecture diagram residue: a layered service stack showing USER-FACING SERVICES and SHARED SERVICES over NETWORKING/COMMUNICATIONS, STORAGE, COMPUTING PLATFORM, DATA INTERCHANGE/INTEGRATION, and DATA MANAGEMENT layers within a SERVICE PLATFORM / SERVICE FRAMEWORK.]

  5. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects that bring the latest computational techniques to bear directly on systems-biology questions in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.

  6. Enterprise networks. Strategies for integrated delivery systems.

    PubMed

    Siwicki, B

    1997-02-01

More integrated delivery systems are making progress toward building computer networks that link all their care delivery sites so they can efficiently and economically coordinate care. A growing number of these systems are turning to intranets--private computer networks that use Internet-derived protocols and technologies--to move information that's essential to managing scarce health care resources.

  7. Network-based drug discovery by integrating systems biology and computational technologies

    PubMed Central

    Leung, Elaine L.; Cao, Zhi-Wei; Jiang, Zhi-Hong; Zhou, Hua

    2013-01-01

Network-based intervention has become a trend in treating systemic diseases, but it relies on regimen optimization and valid multi-target actions of the drugs. The complex multi-component nature of medicinal herbs may make them a valuable resource for network-based multi-target drug discovery, owing to their potential synergistic treatment effects. Recently, robust systems biology platforms have proven powerful for uncovering molecular mechanisms and the connections between drugs and the dynamic networks they target. However, methods for optimizing drug combinations remain insufficient, owing to the lack of tight integration across multiple ‘-omics’ databases. Newly developed algorithm- and network-based computational models can tightly integrate ‘-omics’ databases and optimize combinational regimens, encouraging the development of medicinal herbs into a new wave of network-based multi-target drugs. Challenges remain, however, in further integrating herbal-medicine databases with systems biology platforms for multi-target drug optimization: the reliability of individual data sets is uncertain, and herbal medicine varies in breadth, depth and degree of standardization. Standardizing the methodology and terminology of systems biology and herbal databases would facilitate this integration, as would enhancing publicly accessible databases and increasing the number of studies applying systems biology platforms to herbal medicine. Further integration across various ‘-omics’ platforms and computational tools would accelerate the development of network-based drug discovery and network medicine. PMID:22877768

  8. Simplicity and efficiency of integrate-and-fire neuron models.

    PubMed

    Plesser, Hans E; Diesmann, Markus

    2009-02-01

Lovelace and Cios (2008) recently proposed a very simple spiking neuron (VSSN) model for simulations of large neuronal networks as an efficient replacement for the integrate-and-fire neuron model. We argue that the VSSN model falls behind key advances in neuronal network modeling over the past 20 years, in particular, techniques that permit simulators to compute the state of the neuron without repeated summation over the history of input spikes and to integrate the subthreshold dynamics exactly. State-of-the-art solvers for networks of integrate-and-fire model neurons are substantially more efficient than the VSSN simulator and allow routine simulations of networks of some 10^5 neurons and 10^9 connections on moderate computer clusters.
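The contrast drawn here between history-summation schemes and exact subthreshold integration can be illustrated with a minimal sketch of a leaky integrate-and-fire membrane update (function and parameter names are illustrative, not taken from the paper):

```python
import math

def lif_exact_step(v, dt, tau_m=10.0, v_rest=0.0, i_syn=0.0, r_m=1.0):
    """Advance a leaky integrate-and-fire membrane by one time step.

    Uses the exact solution of tau_m * dv/dt = (v_rest - v) + r_m * i_syn
    over a step of length dt (input held constant over the step), so no
    summation over the history of input spikes is ever needed.
    """
    v_inf = v_rest + r_m * i_syn            # steady-state voltage for this input
    return v_inf + (v - v_inf) * math.exp(-dt / tau_m)

# The exactness shows up as step-size invariance for constant input:
# one step of 1.0 ms equals ten steps of 0.1 ms.
v_coarse = lif_exact_step(0.0, dt=1.0, i_syn=1.0)
v_fine = 0.0
for _ in range(10):
    v_fine = lif_exact_step(v_fine, dt=0.1, i_syn=1.0)
```

An explicit Euler update, by contrast, would accumulate step-size-dependent error, which is one reason exact propagators dominate in modern simulators.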

  9. Computer network security for the radiology enterprise.

    PubMed

    Eng, J

    2001-08-01

    As computer networks become an integral part of the radiology practice, it is appropriate to raise concerns regarding their security. The purpose of this article is to present an overview of computer network security risks and preventive strategies as they pertain to the radiology enterprise. A number of technologies are available that provide strong deterrence against attacks on networks and networked computer systems in the radiology enterprise. While effective, these technologies must be supplemented with vigilant user and system management.

  10. Integrated Engineering Information Technology, FY93 accomplishments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.

    1994-03-01

    The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  11. Performance evaluation of the NASA/KSC CAD/CAE and office automation LAN's

    NASA Technical Reports Server (NTRS)

    Zobrist, George W.

    1994-01-01

    This study's objective is the performance evaluation of the existing CAD/CAE (Computer Aided Design/Computer Aided Engineering) network at NASA/KSC. This evaluation also includes a similar study of the Office Automation network, since it is being planned to integrate this network into the CAD/CAE network. The Microsoft mail facility which is presently on the CAD/CAE network was monitored to determine its present usage. This performance evaluation of the various networks will aid the NASA/KSC network managers in planning for the integration of future workload requirements into the CAD/CAE network and determining the effectiveness of the planned FDDI (Fiber Distributed Data Interface) migration.

  12. Specificity of V1-V2 Orientation Networks in the Primate Visual Cortex

    PubMed Central

    Roe, Anna W.; Ts'o, Daniel Y.

    2015-01-01

    The computation of texture and shape involves integration of features of various orientations. Orientation networks within V1 tend to involve cells which share similar orientation selectivity. However, emergent properties in V2 require the integration of multiple orientations. We now show that, unlike interactions within V1, V1-V2 orientation interactions are much less synchronized and are not necessarily orientation dependent. We find V1-V2 orientation networks are of two types: a more tightly synchronized, orientation-preserving network and a less synchronized orientation-diverse network. We suggest that such diversity of V1-V2 interactions underlies the spatial and functional integration required for computation of higher order contour and shape in V2. PMID:26314798

  13. The Georgetown University Library Information System (LIS): a minicomputer-based integrated library system.

    PubMed Central

    Broering, N C

    1983-01-01

    Georgetown University's Library Information System (LIS), an integrated library system designed and implemented at the Dahlgren Memorial Library, is broadly described from an administrative point of view. LIS' functional components consist of eight "user-friendly" modules: catalog, circulation, serials, bibliographic management (including Mini-MEDLINE), acquisitions, accounting, networking, and computer-assisted instruction. This article touches on emerging library services, user education, and computer information services, which are also changing the role of staff librarians. The computer's networking capability brings the library directly to users through personal or institutional computers at remote sites. The proposed Integrated Medical Center Information System at Georgetown University will include interface with LIS through a network mechanism. LIS is being replicated at other libraries, and a microcomputer version is being tested for use in a hospital setting. PMID:6688749

  14. Analysis of Flow Behavior Within An Integrated Computer-Communication Network,

    DTIC Science & Technology

    1979-05-01

    Howard. Plan today for tomorrow's data/voice nets. Data Communications 7, 9 (Sep. 1978), 51-62. 24. Frank, Howard, and Gitman, Israel. Integrated DoD...computer networks. NTC-74, San Diego, CA, (Dec. 2-4, 1974), 1032-1037. 31. Gitman, I., Frank, H., Occhiogrosso, B., and Hsieh, W. Issues in integrated...switched networks agree on standard interface. Data Communications, (May/June 1978), 25-39. 36. Hsieh, W., Gitman, I., and Occhiogrosso, B. Design of

  15. Integration of hybrid wireless networks in cloud services oriented enterprise information systems

    NASA Astrophysics Data System (ADS)

    Li, Shancang; Xu, Lida; Wang, Xinheng; Wang, Jue

    2012-05-01

    This article presents a hybrid wireless network integration scheme for cloud services-based enterprise information systems (EISs). With emerging hybrid wireless networks and cloud computing technologies, it is necessary to develop a scheme that can seamlessly integrate these new technologies into existing EISs. By combining hybrid wireless networks and cloud computing in EISs, a new framework is proposed that includes a frontend layer, a middle layer and backend layers connected to IP EISs. Based on a collaborative architecture, a cloud services management framework and process diagram are presented. As a key feature, the proposed approach integrates access control functionalities within the hybrid framework, providing users with filtered views of available cloud services based on cloud service access requirements and user security credentials. In future work, we will implement the proposed framework on the SwanMesh platform by integrating the UPnP standard into an enterprise information system.

  16. Recurrent network dynamics reconciles visual motion segmentation and integration.

    PubMed

    Medathati, N V Kartheek; Rankin, James; Meso, Andrew I; Kornprobst, Pierre; Masson, Guillaume S

    2017-09-12

    In sensory systems, a range of computational rules are presumed to be implemented by neuronal subpopulations with different tuning functions. For instance, in primate cortical area MT, different classes of direction-selective cells have been identified and related either to motion integration, segmentation or transparency. Still, how such different tuning properties are constructed is unclear. The dominant theoretical viewpoint based on a linear-nonlinear feed-forward cascade does not account for their complex temporal dynamics and their versatility when facing different input statistics. Here, we demonstrate that a recurrent network model of visual motion processing can reconcile these different properties. Using a ring network, we show how excitatory and inhibitory interactions can implement different computational rules such as vector averaging, winner-take-all or superposition. The model also captures ordered temporal transitions between these behaviors. In particular, depending on the inhibition regime the network can switch from motion integration to segmentation, thus being able to compute either a single pattern motion or to superpose multiple inputs as in motion transparency. We thus demonstrate that recurrent architectures can adaptively give rise to different cortical computational regimes depending upon the input statistics, from sensory flow integration to segmentation.
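The inhibition-dependent switch between regimes described above can be caricatured with a two-unit mutual-inhibition rate model. This is a deliberately minimal sketch, not the paper's ring network; all names and parameter values are illustrative:

```python
def compete(e1, e2, b, steps=500, dt=0.05):
    """Two rate units with mutual inhibition of strength b.

    Dynamics (tau = 1):  dr_i/dt = -r_i + [e_i - b * r_other]_+
    Strong b yields winner-take-all (a segmentation-like choice);
    weak b lets both responses coexist (superposition/transparency).
    """
    relu = lambda x: max(x, 0.0)
    r1 = r2 = 0.0
    for _ in range(steps):                 # explicit Euler integration
        d1 = -r1 + relu(e1 - b * r2)
        d2 = -r2 + relu(e2 - b * r1)
        r1, r2 = r1 + dt * d1, r2 + dt * d2
    return r1, r2
```

With inputs 1.0 and 0.9, strong inhibition (b = 2.0) silences the weaker unit, while weak inhibition (b = 0.3) leaves both units active; the same circuit implements two computational rules depending on the inhibition regime.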

  17. Salient regions detection using convolutional neural networks and color volume

    NASA Astrophysics Data System (ADS)

    Liu, Guang-Hai; Hou, Yingkun

    2018-03-01

    Convolutional neural networks are an important technique in machine learning, pattern recognition and image processing. To reduce the computational burden and extend the classical LeNet-5 model to the field of saliency detection, we propose a simple and novel computing model based on the LeNet-5 network. In the proposed model, hue, saturation and intensity are utilized to extract depth cues, and then we integrate depth cues and color volume into saliency detection following the basic structure of the feature integration theory. Experimental results show that the proposed computing model outperforms some existing state-of-the-art methods on the MSRA1000 and ECSSD datasets.

  18. Network reliability maximization for stochastic-flow network subject to correlated failures using genetic algorithm and tabu search

    NASA Astrophysics Data System (ADS)

    Yeh, Cheng-Ta; Lin, Yi-Kuei; Yang, Jo-Yun

    2018-07-01

    Network reliability is an important performance index for many real-life systems, such as electric power systems, computer systems and transportation systems. These systems can be modelled as stochastic-flow networks (SFNs) composed of arcs and nodes. Most system supervisors pursue network reliability maximization by finding the optimal multi-state resource assignment, in which one resource is assigned to each arc. However, a disaster may cause correlated failures among the assigned resources, affecting the network reliability. This article focuses on determining the optimal resource assignment with maximal network reliability for SFNs. To solve the problem, this study proposes a hybrid algorithm integrating the genetic algorithm and tabu search, called the hybrid GA-TS algorithm (HGTA), to determine the optimal assignment, and integrates minimal paths, the recursive sum of disjoint products and the correlated binomial distribution to calculate network reliability. Several practical numerical experiments demonstrate that HGTA has better computational quality than several popular soft computing algorithms.
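The genetic-algorithm half of such a hybrid can be sketched as follows. The reliability evaluation is replaced here by a toy per-arc score (the paper's evaluation uses minimal paths and sums of disjoint products, which is far more involved); all function names and parameters are illustrative assumptions:

```python
import random

def toy_reliability(assignment, weights):
    # Stand-in for the minimal-path / sum-of-disjoint-products evaluation:
    # here, simply the product of per-arc "survival" scores.
    p = 1.0
    for arc, res in enumerate(assignment):
        p *= weights[arc][res]
    return p

def ga_assignment(n_arcs, n_resources, weights, pop=30, gens=60, seed=1):
    """Evolve resource-to-arc assignments toward maximal toy reliability."""
    rng = random.Random(seed)
    popu = [[rng.randrange(n_resources) for _ in range(n_arcs)]
            for _ in range(pop)]
    for _ in range(gens):
        popu.sort(key=lambda a: toy_reliability(a, weights), reverse=True)
        elite = popu[:pop // 2]            # keep the better half
        children = []
        while len(elite) + len(children) < pop:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n_arcs)
            child = p1[:cut] + p2[cut:]    # one-point crossover
            if rng.random() < 0.2:         # point mutation
                child[rng.randrange(n_arcs)] = rng.randrange(n_resources)
            children.append(child)
        popu = elite + children
    return max(popu, key=lambda a: toy_reliability(a, weights))
```

A tabu-search stage, as in the hybrid the authors propose, would then locally perturb the GA's best assignment while forbidding recently visited moves.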

  19. NIF ICCS network design and loading analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tietbohl, G; Bryant, R

    The National Ignition Facility (NIF) is housed within a large facility about the size of two football fields. The Integrated Computer Control System (ICCS) is distributed throughout this facility and requires the integration of about 40,000 control points and over 500 video sources. This integration is provided by approximately 700 control computers distributed throughout the NIF facility and a network that provides the communication infrastructure. A main control room houses a set of seven computer consoles providing operator access and control of the various distributed front-end processors (FEPs). There are also remote workstations distributed within the facility that provide operator console functions while personnel are testing and troubleshooting throughout the facility. The operator workstations communicate with the FEPs, which implement the localized control and monitoring functions. There are different types of FEPs for the various subsystems being controlled. This report describes the design of the NIF ICCS network and how it meets the expected traffic loads and the requirements of the Sub-System Design Requirements (SSDRs). This document supersedes the earlier reports entitled Analysis of the National Ignition Facility Network, dated November 6, 1996, and The National Ignition Facility Digital Video and Control Network, dated July 9, 1996. For an overview of the ICCS, refer to the document NIF Integrated Computer Controls System Description (NIF-3738).

  20. Optimization of an interactive distributive computer network

    NASA Technical Reports Server (NTRS)

    Frederick, V.

    1985-01-01

    The activities under a cooperative agreement for the development of a computer network are briefly summarized. Research activities covered are: computer operating systems optimization and integration; software development and implementation of the IRIS (Infrared Imaging of Shuttle) Experiment; and software design, development, and implementation of the APS (Aerosol Particle System) Experiment.

  1. Adaptive Topological Configuration of an Integrated Circuit/Packet-Switched Computer Network.

    DTIC Science & Technology

    1984-01-01

    Gitman et al. [45] state that there are basically two approaches to the integrated network design problem: (1) solve the link/capacity problem for...1972), 1385-1397. 33. Frank, H., and Gitman, I. Economic analysis of integrated voice and data networks: a case study. Proc. of IEEE 66, 11 (Nov. 1978...1974), 1074-1079. 45. Gitman, I., Hsieh, W., and Occhiogrosso, B. J. Analysis and design of hybrid switching networks. IEEE Trans. on Comm. Com-29

  2. The Role of Wireless Computing Technology in the Design of Schools.

    ERIC Educational Resources Information Center

    Nair, Prakash

    This document discusses integrating computers logically and affordably into a school building's infrastructure through the use of wireless technology. It begins by discussing why wireless networks using mobile computers are preferable to desktop machines in each classroom. It then explains the features of a wireless local area network (WLAN) and…

  3. Network neuroscience

    PubMed Central

    Bassett, Danielle S; Sporns, Olaf

    2017-01-01

    Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844

  4. Integrating Computer-Assisted Language Learning in Saudi Schools: A Change Model

    ERIC Educational Resources Information Center

    Alresheed, Saleh; Leask, Marilyn; Raiker, Andrea

    2015-01-01

    Computer-assisted language learning (CALL) technology and pedagogy have gained recognition globally for their success in supporting second language acquisition (SLA). In Saudi Arabia, the government aims to provide most educational institutions with computers and networking for integrating CALL into classrooms. However, the recognition of CALL's…

  5. Network survivability performance (computer diskette)

    NASA Astrophysics Data System (ADS)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks, including services. It responds to the need for a common understanding of, and assessment techniques for, network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to meet user expectations for network survivability.

  6. Explicit integration with GPU acceleration for large kinetic networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brock, Benjamin (Computer Science and Mathematics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37830); Belt, Andrew

    2015-12-01

    We demonstrate the first implementation of recently-developed fast explicit kinetic integration algorithms on modern graphics processing unit (GPU) accelerators. Taking as a generic test case a Type Ia supernova explosion with an extremely stiff thermonuclear network having 150 isotopic species and 1604 reactions coupled to hydrodynamics using operator splitting, we demonstrate the capability to solve of order 100 realistic kinetic networks in parallel in the same time that standard implicit methods can solve a single such network on a CPU. This orders-of-magnitude decrease in computation time for solving systems of realistic kinetic networks implies that important coupled, multiphysics problems in various scientific and technical fields that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible.
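The flavor of explicit kinetic integration can be conveyed with a deliberately simple sketch: a two-reaction decay chain advanced by explicit Euler steps. This is not the paper's stabilized algorithm, and the species names and rate constants are illustrative:

```python
def explicit_euler_chain(y0, k1, k2, dt, steps):
    """Integrate the toy decay chain A --k1--> B --k2--> C explicitly.

    dA/dt = -k1*A ;  dB/dt = k1*A - k2*B ;  dC/dt = k2*B
    Realistic reaction networks stack thousands of such terms; the point of
    fast explicit schemes is that each term is cheap to evaluate and the
    per-species updates parallelize naturally across GPU threads.
    """
    a, b, c = y0
    for _ in range(steps):
        da = -k1 * a
        db = k1 * a - k2 * b
        dc = k2 * b
        a, b, c = a + dt * da, b + dt * db, c + dt * dc
    return a, b, c

# Integrate to t = 1 with abundances starting all in species A.
abund = explicit_euler_chain((1.0, 0.0, 0.0), k1=1.0, k2=0.5,
                             dt=0.001, steps=1000)
```

Because the rate terms cancel pairwise, total abundance is conserved exactly at every step; for genuinely stiff networks, plain explicit Euler would need prohibitively small dt, which is the problem the paper's algebraically stabilized explicit methods address.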

  7. Integrating a Microcomputer for Agriculture Data and Telecommunications into the Agriculture Classroom. Final Report.

    ERIC Educational Resources Information Center

    Northeast Arkansas Educational Cooperative, Strawberry.

    A program was conducted to integrate computer technology into agriculture classrooms in 25 secondary schools and 4 universities in Arkansas. State-of-the-art technology from the AgriData Network of Milwaukee was used. The AgriData Network includes the Ag Ed Network, which offers more than 800 "live textbook" lessons with current…

  8. Computer Network Security- The Challenges of Securing a Computer Network

    NASA Technical Reports Server (NTRS)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  9. Optoelectronic Integrated Circuits For Neural Networks

    NASA Technical Reports Server (NTRS)

    Psaltis, D.; Katz, J.; Kim, Jae-Hoon; Lin, S. H.; Nouhi, A.

    1990-01-01

    Many threshold devices placed on single substrate. Integrated circuits containing optoelectronic threshold elements developed for use as planar arrays of artificial neurons in research on neural-network computers. Mounted with volume holograms recorded in photorefractive crystals serving as dense arrays of variable interconnections between neurons.

  10. A Dynamic Three-Dimensional Network Visualization Program for Integration into CyberCIEGE and Other Network Visualization Scenarios

    DTIC Science & Technology

    2007-06-01

    information flow involved in network attacks. This kind of information can be invaluable in learning how to best setup and defend computer networks...administrators, and those interested in learning about securing networks a way to conceptualize this complex system of computing. NTAV3D will provide a three...teaching with visual and other components can make learning more effective” (Baxley et al, 2006). A hyperbox (Alpern and Carter, 1991) is

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Qishi; Zhu, Mengxia; Rao, Nageswara S

    We propose an intelligent decision support system based on sensor and computer networks that incorporates various component techniques for sensor deployment, data routing, distributed computing, and information fusion. The integrated system is deployed in a distributed environment composed of both wireless sensor networks for data collection and wired computer networks for data processing in support of homeland security defense. We present the system framework, formulate the analytical problems, and develop approximate or exact solutions for the subtasks: (i) a sensor deployment strategy based on a two-dimensional genetic algorithm to achieve maximum coverage with cost constraints; (ii) a data routing scheme to achieve maximum signal strength with minimum path loss, high energy efficiency, and effective fault tolerance; (iii) a network mapping method to assign computing modules to network nodes for high-performance distributed data processing; and (iv) a binary decision fusion rule that derives threshold bounds to improve the system hit rate and false alarm rate. These component solutions are implemented and evaluated through either experiments or simulations in various application scenarios. The extensive results demonstrate that these component solutions imbue the integrated system with the desirable and useful quality of intelligence in decision making.
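The binary decision fusion idea in subtask (iv) can be sketched with a k-out-of-n voting rule over independent sensors. The parameter values are illustrative, and the paper's actual threshold-bound derivation is more involved:

```python
from math import comb

def fused_rate(p, n, k):
    """P(at least k of n independent sensors fire), each firing with prob p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def best_threshold(p_detect, p_false, n, max_false=0.01):
    """Pick the smallest vote threshold k whose fused false-alarm rate stays
    below max_false, and report the fused hit and false-alarm rates at k."""
    for k in range(1, n + 1):
        if fused_rate(p_false, n, k) <= max_false:
            return k, fused_rate(p_detect, n, k), fused_rate(p_false, n, k)
    return n, fused_rate(p_detect, n, n), fused_rate(p_false, n, n)

# Ten sensors, each with an 80% hit rate but a 10% false-alarm rate:
# fusing their votes trades a small hit-rate loss for a large false-alarm gain.
k, hit, fa = best_threshold(p_detect=0.8, p_false=0.1, n=10)
```

With these numbers a threshold of 5 votes drives the fused false-alarm rate below 1% while keeping the fused hit rate above 99%, which is the kind of system-level improvement the fusion rule is after.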

  12. The Network Classroom.

    ERIC Educational Resources Information Center

    Maule, R. William

    1993-01-01

    Discussion of the role of new computer communications technologies in education focuses on modern networking systems, including fiber distributed data interface and Integrated Services Digital Network; strategies for implementing networked-based communication; and public online information resources for the classroom, including Bitnet, Internet,…

  13. Explicit integration with GPU acceleration for large kinetic networks

    DOE PAGES

    Brock, Benjamin; Belt, Andrew; Billings, Jay Jay; ...

    2015-09-15

    In this study, we demonstrate the first implementation of recently-developed fast explicit kinetic integration algorithms on modern graphics processing unit (GPU) accelerators. Taking as a generic test case a Type Ia supernova explosion with an extremely stiff thermonuclear network having 150 isotopic species and 1604 reactions coupled to hydrodynamics using operator splitting, we demonstrate the capability to solve of order 100 realistic kinetic networks in parallel in the same time that standard implicit methods can solve a single such network on a CPU. In addition, this orders-of-magnitude decrease in computation time for solving systems of realistic kinetic networks implies that important coupled, multiphysics problems in various scientific and technical fields that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible.

  14. Integrating Emerging Topics through Online Team Design in a Hybrid Communication Networks Course: Interaction Patterns and Impact of Prior Knowledge

    ERIC Educational Resources Information Center

    Reisslein, Jana; Seeling, Patrick; Reisslein, Martin

    2005-01-01

    An important challenge in the introductory communication networks course in electrical and computer engineering curricula is to integrate emerging topics, such as wireless Internet access and network security, into the already content-intensive course. At the same time it is essential to provide students with experiences in online collaboration,…

  15. Integrated-optics heralded controlled-NOT gate for polarization-encoded qubits

    NASA Astrophysics Data System (ADS)

    Zeuner, Jonas; Sharma, Aditya N.; Tillmann, Max; Heilmann, René; Gräfe, Markus; Moqanaki, Amir; Szameit, Alexander; Walther, Philip

    2018-03-01

    Recent progress in integrated-optics technology has made photonics a promising platform for quantum networks and quantum computation protocols. Integrated optical circuits are characterized by small device footprints and unrivalled intrinsic interferometric stability. Here, we take advantage of femtosecond-laser-written waveguides' ability to process polarization-encoded qubits and present an implementation of a heralded controlled-NOT gate on chip. We evaluate the gate performance in the computational basis and a superposition basis, showing that the gate can create polarization entanglement between two photons. Transmission through the integrated device is optimized using thermally expanded core fibers and adiabatically reduced mode-field diameters at the waveguide facets. This demonstration underlines the feasibility of integrated quantum gates for all-optical quantum networks and quantum repeaters.

  16. Preparing for the Integration of Emerging Technologies.

    ERIC Educational Resources Information Center

    Dyrli, Odvard Egil; Kinnaman, Daniel E.

    1994-01-01

    Discussion of the process of integrating new technologies into schools considers the evolution of technology, including personal computers, CD-ROMs, hypermedia, and networking/communications; the transition from Industrial-Age to Information-Age schools; and the logical steps of transition. Sidebars discuss a networked multimedia pilot project and…

  17. Design and implementation of a UNIX based distributed computing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Love, J.S.; Michael, M.W.

    1994-12-31

    We have designed, implemented, and are running a corporate-wide distributed-processing batch queue on a large number of networked workstations using the UNIX{reg_sign} operating system. Atlas Wireline researchers and scientists have used the system for over a year. The large increase in available computer power has greatly reduced the time required for nuclear and electromagnetic tool modeling. Use of remote distributed computing has simultaneously reduced computation costs and increased usable computer time. The system integrates equipment from different manufacturers, using various CPU architectures, distinct operating system revisions, and even multiple processors per machine. Various differences between the machines have to be accounted for in the master scheduler, including shells, command sets, swap spaces, memory sizes, CPU sizes, and OS revision levels. Remote processing across a network must be performed in a manner that is seamless from the users' perspective. The system currently uses IBM RISC System/6000{reg_sign}, SPARCstation{sup TM}, HP9000s700, HP9000s800, and DEC Alpha AXP{sup TM} machines. Each CPU in the network has its own speed rating, allowed working hours, and workload parameters. The system is designed so that all of the computers in the network can be optimally scheduled without adversely impacting the primary users of the machines. The increase in total usable computational capacity by means of distributed batch computing can change corporate computing strategy. The integration of disparate computer platforms eliminates the need to buy one type of computer for computations, another for graphics, and yet another for day-to-day operations. It might be possible, for example, to meet all research and engineering computing needs with existing networked computers.

  18. Applications of computational models to better understand microvascular remodelling: a focus on biomechanical integration across scales

    PubMed Central

    Murfee, Walter L.; Sweat, Richard S.; Tsubota, Ken-ichi; Gabhann, Feilim Mac; Khismatullin, Damir; Peirce, Shayn M.

    2015-01-01

    Microvascular network remodelling is a common denominator for multiple pathologies and involves both angiogenesis, defined as the sprouting of new capillaries, and network patterning associated with the organization and connectivity of existing vessels. Much of what we know about microvascular remodelling at the network, cellular and molecular scales has been derived from reductionist biological experiments, yet what happens when the experiments provide incomplete (or only qualitative) information? This review will emphasize the value of applying computational approaches to advance our understanding of the underlying mechanisms and effects of microvascular remodelling. Examples of individual computational models applied to each of the scales will highlight the potential of answering specific questions that cannot be answered using typical biological experimentation alone. Looking into the future, we will also identify the needs and challenges associated with integrating computational models across scales. PMID:25844149

  19. Distributed sensor networks: a cellular nonlinear network perspective.

    PubMed

    Haenggi, Martin

    2003-12-01

    Large-scale networks of integrated wireless sensors are becoming increasingly tractable. Advances in hardware technology and engineering design have led to dramatic reductions in size, power consumption, and cost for digital circuitry and wireless communications. Networking, self-organization, and distributed operation are crucial ingredients to harness the sensing, computing, and communication capabilities of the nodes into a complete system. This article shows that such networks can be considered as cellular nonlinear networks (CNNs), and that their analysis and design may greatly benefit from the rich theoretical results available for CNNs.

  20. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    NASA Technical Reports Server (NTRS)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  1. Data Integration in Computer Distributed Systems

    NASA Astrophysics Data System (ADS)

    Kwiecień, Błażej

    In this article the author analyzes the problem of data integration in computer distributed systems. Exchange of information between the different levels of the integrated pyramid of enterprise processes is fundamental to efficient enterprise operation. Communication and data exchange between levels are not always uniform, owing to the use of different network protocols, communication media, system response times, etc.

  2. "TIS": An Intelligent Gateway Computer for Information and Modeling Networks. Overview.

    ERIC Educational Resources Information Center

    Hampel, Viktor E.; And Others

    TIS (Technology Information System) is being used at the Lawrence Livermore National Laboratory (LLNL) to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or…

  3. Integrated Model of Chemical Perturbations of a Biological PathwayUsing 18 In Vitro High Throughput Screening Assays for the Estrogen Receptor

    EPA Science Inventory

    We demonstrate a computational network model that integrates 18 in vitro, high-throughput screening assays measuring estrogen receptor (ER) binding, dimerization, chromatin binding, transcriptional activation and ER-dependent cell proliferation. The network model uses activity pa...

  4. A collaborative computing framework of cloud network and WBSN applied to fall detection and 3-D motion reconstruction.

    PubMed

    Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh

    2014-03-01

    As cloud computing and wireless body sensor network technologies gradually mature, ubiquitous healthcare services can prevent accidents instantly and effectively, and can provide relevant information to reduce related processing time and cost. This study proposes a co-processing intermediary framework that integrates cloud and wireless body sensor networks, applied mainly to fall detection and 3-D motion reconstruction. The main focuses include distributed computing and resource allocation for processing sensing data over the computing architecture, network conditions, and performance evaluation. Through this framework, the transmission and computing time for sensing data are reduced to enhance overall performance of the fall-event detection and 3-D motion reconstruction services.

  5. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Such systems are generically called Large-Scale Distributed Data-Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network, and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  6. Integrated Speech and Language Technology for Intelligence, Surveillance, and Reconnaissance (ISR)

    DTIC Science & Technology

    2017-07-01

    applying submodularity techniques to address computing challenges posed by large datasets in speech and language processing. MT and speech tools were...aforementioned research-oriented activities, the IT system administration team provided necessary support to laboratory computing and network operations...operations of SCREAM Lab computer systems and networks. Other miscellaneous activities in relation to Task Order 29 are presented in an additional fourth

  7. HeNCE: A Heterogeneous Network Computing Environment

    DOE PAGES

    Beguelin, Adam; Dongarra, Jack J.; Geist, George Al; ...

    1994-01-01

    Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower-level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.

  8. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    NASA Astrophysics Data System (ADS)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. An integrated Bayesian network and big data analytics approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification, and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, a Bayesian network of factors affecting quality is built from prior probability distributions and updated with posterior probability distributions. A case study on hull-segment manufacturing precision management for ship and offshore-platform building shows that computing speed accelerates almost in direct proportion to the number of computing nodes. The proposed model is also shown to be feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the Bayesian network method offers a whole new perspective on manufacturing quality control.
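    The map/reduce split this record relies on can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the shard data, factor names, and counting scheme are hypothetical, showing only how per-shard counts merged in a reduce step yield the frequencies from which a Bayesian network's conditional probability tables are estimated.

```python
from collections import Counter
from functools import reduce

# Each record pairs a parent-variable setting with a quality outcome,
# e.g. ("temp=high", "defect"). Shards stand in for HDFS data splits.
def map_phase(records):
    """Map step: emit (parent_state, outcome) counts for one data shard."""
    return Counter((parents, outcome) for parents, outcome in records)

def reduce_phase(partials):
    """Reduce step: merge per-shard counters. From these counts,
    P(outcome | parents) is estimated as count / total for that parent state."""
    return reduce(lambda a, b: a + b, partials, Counter())

shard1 = [("temp=high", "defect"), ("temp=high", "ok")]
shard2 = [("temp=high", "defect"), ("temp=low", "ok")]
counts = reduce_phase([map_phase(shard1), map_phase(shard2)])
```

    Because the map step is independent per shard and the reduce step is an associative merge, adding computing nodes scales throughput almost linearly, consistent with the near-proportional speedup the case study reports.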

  9. Slow Computing Simulation of Bio-plausible Control

    DTIC Science & Technology

    2012-03-01

    information networks, neuromorphic chips would become necessary. Small unstable flying platforms currently require RTK, GPS, or Vicon closed-circuit… [Remainder of record is index-term and figure residue; recoverable topics: visual and IR sensing, FPGA/ASIC neuromorphic chip simulation, quad-rotor and robotic-insect platforms, single-modality neural networks, neuromorphic processing across parallel computational elements, integrated circuits.]

  10. A systems approach to integrative biology: an overview of statistical methods to elucidate association and architecture.

    PubMed

    Ciaccio, Mark F; Finkle, Justin D; Xue, Albert Y; Bagheri, Neda

    2014-07-01

    An organism's ability to maintain a desired physiological response relies extensively on how cellular and molecular signaling networks interpret and react to environmental cues. The capacity to quantitatively predict how networks respond to a changing environment by modifying signaling regulation and phenotypic responses will help inform and predict the impact of a changing global environment on organisms and ecosystems. Many computational strategies have been developed to resolve cue-signal-response networks. However, selecting a strategy that answers a specific biological question requires knowledge both of the type of data being collected and of the strengths and weaknesses of different computational regimes. We broadly explore several computational approaches, and we evaluate their accuracy in predicting a given response. Specifically, we describe how statistical algorithms can be used in the context of integrative and comparative biology to elucidate the genomic, proteomic, and/or cellular networks responsible for robust physiological response. As a case study, we apply this strategy to a dataset of quantitative levels of protein abundance from the mussel, Mytilus galloprovincialis, to uncover the temperature-dependent signaling network.

  11. Effect of Network-Assisted Language Teaching Model on Undergraduate English Skills

    ERIC Educational Resources Information Center

    He, Chunyan

    2013-01-01

    With the coming of the information age, computer-based teaching model has had an important impact on English teaching. Since 2004, the trial instruction on Network-assisted Language Teaching (NALT) Model integrating the English instruction and computer technology has been launched at some universities in China, including China university of…

  12. Modeling integrated cellular machinery using hybrid Petri-Boolean networks.

    PubMed

    Berestovsky, Natalie; Zhou, Wanding; Nagrath, Deepak; Nakhleh, Luay

    2013-01-01

    The behavior and phenotypic changes of cells are governed by a cellular circuitry that represents a set of biochemical reactions. Based on biological functions, this circuitry is divided into three types of networks, each encoding for a major biological process: signal transduction, transcription regulation, and metabolism. This division has generally enabled taming the computational complexity of dealing with the entire system, allowed for using modeling techniques that are specific to each of the components, and achieved separation of the different time scales at which reactions in each of the three networks occur. Nonetheless, with this division comes loss of information and power needed to elucidate certain cellular phenomena. Within the cell, these three types of networks work in tandem, and each produces signals and/or substances that are used by the others to process information and operate normally. Therefore, computational techniques for modeling integrated cellular machinery are needed. In this work, we propose an integrated hybrid model (IHM) that combines Petri nets and Boolean networks to model integrated cellular networks. Coupled with a stochastic simulation mechanism, the model simulates the dynamics of the integrated network, and can be perturbed to generate testable hypotheses. Our model is qualitative, is mostly built upon knowledge from the literature, and requires fine-tuning of very few parameters. We validated our model on two systems: the transcriptional regulation of glucose metabolism in human cells, and cellular osmoregulation in S. cerevisiae. The model produced results that are in very good agreement with experimental data, and produces valid hypotheses. The abstract nature of our model and the ease of its construction make it a very good candidate for modeling integrated networks from qualitative data. The results it produces can guide the practitioner to zoom into components and interconnections and investigate them using more detailed mathematical models.

  14. The combination of circle topology and leaky integrator neurons remarkably improves the performance of echo state network on time series prediction.

    PubMed

    Xue, Fangzheng; Li, Qian; Li, Xiumin

    2017-01-01

    Recently, the echo state network (ESN) has attracted a great deal of attention due to its high accuracy and efficient learning performance. Compared with the traditional random structure and classical sigmoid units, a simple circle topology and leaky-integrator neurons offer advantages for ESN reservoir computing. In this paper, we propose a new ESN model with both a circle reservoir structure and leaky-integrator units. By comparing the prediction capability on the Mackey-Glass chaotic time series of four ESN models (classical ESN, circle ESN, traditional leaky-integrator ESN, and circle leaky-integrator ESN), we find that our circle leaky-integrator ESN performs significantly better than the other ESNs, with roughly a two-order-of-magnitude reduction in predictive error. Moreover, this model has a stronger ability to approximate nonlinear dynamics and resist noise than conventional ESNs and ESNs with only a simple circle structure or leaky-integrator neurons. Our results show that the combination of circle topology and leaky-integrator neurons can remarkably increase dynamical diversity while decreasing the correlation of reservoir states, which contributes to the significant improvement in the computational performance of the echo state network on time series prediction.
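    A minimal sketch of the model class this record compares: an echo state network whose reservoir is a ring (circle topology) of leaky-integrator units. All parameter values and function names below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def circle_reservoir(n, r=0.9):
    """Circle topology: each unit feeds only its successor on a ring, weight r."""
    W = np.zeros((n, n))
    for i in range(n):
        W[(i + 1) % n, i] = r
    return W

def esn_states(u, n=50, a=0.3, seed=0):
    """Leaky-integrator reservoir update:
         x <- (1 - a) * x + a * tanh(W x + W_in u),  with leak rate a.
    Returns the sequence of reservoir states for input sequence u."""
    rng = np.random.default_rng(seed)
    W = circle_reservoir(n)
    W_in = rng.uniform(-0.5, 0.5, size=n)  # random input weights
    x = np.zeros(n)
    states = []
    for u_t in u:
        x = (1 - a) * x + a * np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Drive the reservoir with a sine input; a linear readout trained on these
# states (not shown) would complete the ESN.
X = esn_states(np.sin(np.linspace(0, 8 * np.pi, 200)))
```

    In a full ESN only the readout is trained (e.g. by ridge regression from the collected states to the prediction targets); the ring weights and leak rate stay fixed, which is what makes the comparison of reservoir structures in the paper meaningful.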

  15. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  16. A FORCEnet Framework for Analysis of Existing Naval C4I Architectures

    DTIC Science & Technology

    2003-06-01

    best qualities of humans and computers. f. Information Weapons Information weapons integrate the use of military deception, psychological ...operations, to include electronic warfare, psychological operations, computer network attack, computer network defense, operations security, and military...F/A-18 (ATARS/SHARP), S-3B (SSU), SH-60 LAMPS (HAWKLINK) and P-3C (AIP, Special Projects). CDL-N consists of two antennas (one meter diameter

  17. Social network extraction based on Web: 3. the integrated superficial method

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.

    2018-03-01

    The Web as a source of information has become part of social behavior information. Although limited to the information disclosed by search engines, in the form of hit counts, snippets, and URL addresses of web pages, the integrated extraction method produces a social network that is not only trusted but enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information, or social networks laden with surmise and consequently unrepresentative social structures. The integrated superficial method, in addition to generating the core social network, also generates an expanded network so as to reach the scope of relation clues, with a number of edges computationally close to n(n - 1)/2 for n social actors.

  18. IntNetDB v1.0: an integrated protein-protein interaction network database generated by a probabilistic model

    PubMed Central

    Xia, Kai; Dong, Dong; Han, Jing-Dong J

    2006-01-01

    Background Although protein-protein interaction (PPI) networks have been explored by various experimental methods, the maps so built are still limited in coverage and accuracy. To further expand the PPI network and to extract more accurate information from existing maps, studies have been carried out to integrate various types of functional relationship data. A frequently updated database of computationally analyzed potential PPIs to provide biological researchers with rapid and easy access to analyze original data as a biological network is still lacking. Results By applying a probabilistic model, we integrated 27 heterogeneous genomic, proteomic and functional annotation datasets to predict PPI networks in human. In addition to previously studied data types, we show that phenotypic distances and genetic interactions can also be integrated to predict PPIs. We further built an easy-to-use, updatable integrated PPI database, the Integrated Network Database (IntNetDB) online, to provide automatic prediction and visualization of PPI network among genes of interest. The networks can be visualized in SVG (Scalable Vector Graphics) format for zooming in or out. IntNetDB also provides a tool to extract topologically highly connected network neighborhoods from a specific network for further exploration and research. Using the MCODE (Molecular Complex Detections) algorithm, 190 such neighborhoods were detected among all the predicted interactions. The predicted PPIs can also be mapped to worm, fly and mouse interologs. Conclusion IntNetDB includes 180,010 predicted protein-protein interactions among 9,901 human proteins and represents a useful resource for the research community. Our study has increased prediction coverage by five-fold. IntNetDB also provides easy-to-use network visualization and analysis tools that allow biological researchers unfamiliar with computational biology to access and analyze data over the internet. 
The web interface of IntNetDB is freely accessible at . Visualization requires Mozilla version 1.8 (or higher) or Internet Explorer with installation of SVGviewer. PMID:17112386

  19. How well do mean field theories of spiking quadratic-integrate-and-fire networks work in realistic parameter regimes?

    PubMed

    Grabska-Barwińska, Agnieszka; Latham, Peter E

    2014-06-01

    We use mean field techniques to compute the distribution of excitatory and inhibitory firing rates in large networks of randomly connected spiking quadratic integrate and fire neurons. These techniques are based on the assumption that activity is asynchronous and Poisson. For most parameter settings these assumptions are strongly violated; nevertheless, so long as the networks are not too synchronous, we find good agreement between mean field prediction and network simulations. Thus, much of the intuition developed for randomly connected networks in the asynchronous regime applies to mildly synchronous networks.
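    The network units analyzed in this record are quadratic integrate-and-fire neurons. A single-neuron sketch with hypothetical parameters follows (the paper itself treats large randomly connected networks under mean-field assumptions, not isolated cells):

```python
def qif_spike_times(I, t_end=1.0, dt=1e-4, v_reset=-10.0, v_peak=10.0):
    """Quadratic integrate-and-fire neuron:  dv/dt = v**2 + I.
    When v reaches v_peak the neuron emits a spike and resets to v_reset.
    Returns the list of spike times over [0, t_end], via forward Euler."""
    v = v_reset
    spikes = []
    t = 0.0
    while t < t_end:
        v += dt * (v * v + I)
        if v >= v_peak:
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

# Suprathreshold constant drive (I > 0) yields tonic firing;
# sufficiently negative drive leaves v at a stable fixed point (no spikes).
spikes = qif_spike_times(I=100.0)
```

    Mean-field analyses of the kind the paper tests replace the per-neuron synaptic input with a self-consistent rate, assuming asynchronous Poisson activity; the record's point is that the resulting rate predictions remain accurate even when that assumption is only mildly violated.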

  20. A study of structural properties of gene network graphs for mathematical modeling of integrated mosaic gene networks.

    PubMed

    Petrovskaya, Olga V; Petrovskiy, Evgeny D; Lavrik, Inna N; Ivanisenko, Vladimir A

    2017-04-01

    Gene network modeling is one of the widely used approaches in systems biology. It allows for the study of the function of complex genetic systems, including so-called mosaic gene networks, which consist of functionally interacting subnetworks. We conducted a study of a method for modeling mosaic gene networks based on integration of models of gene subnetworks by linear control functionals. Automatic modeling of 10,000 synthetic mosaic gene regulatory networks was carried out using computer experiments on gene knockdowns/knockouts. Structural analysis of the graphs of the generated mosaic gene regulatory networks revealed that the most important factor for building accurate integrated mathematical models, among those analyzed in the study, is data on the expression of genes corresponding to vertices with high centrality.

  1. Deep learning with coherent nanophotonic circuits

    NASA Astrophysics Data System (ADS)

    Shen, Yichen; Harris, Nicholas C.; Skirlo, Scott; Prabhu, Mihika; Baehr-Jones, Tom; Hochberg, Michael; Sun, Xin; Zhao, Shijie; Larochelle, Hugo; Englund, Dirk; Soljačić, Marin

    2017-07-01

    Artificial neural networks are computational network models inspired by signal processing in the brain. These models have dramatically improved performance for many machine-learning tasks, including speech and image recognition. However, today's computing hardware is inefficient at implementing neural networks, in large part because much of it was designed for von Neumann computing schemes. Significant effort has been made towards developing electronic architectures tuned to implement artificial neural networks that exhibit improved computational speed and accuracy. Here, we propose a new architecture for a fully optical neural network that, in principle, could offer an enhancement in computational speed and power efficiency over state-of-the-art electronics for conventional inference tasks. We experimentally demonstrate the essential part of the concept using a programmable nanophotonic processor featuring a cascaded array of 56 programmable Mach-Zehnder interferometers in a silicon photonic integrated circuit and show its utility for vowel recognition.

  2. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate, and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  3. Development of Integrated Programs for Aerospace-vehicle Design (IPAD). IPAD user requirements: Implementation (first-level IPAD)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The requirements implementation strategy for first-level development of the Integrated Programs for Aerospace-Vehicle Design (IPAD) computing system is presented. The capabilities of first-level IPAD are sufficient to demonstrate management of engineering data on two computers (CDC CYBER 170/720 and DEC VAX 11/780) using the IPAD system in a distributed network environment.

  4. Using Reputation Based Trust to Overcome Malfunctions and Malicious Failures in Electric Power Protection Systems

    DTIC Science & Technology

    2011-09-01

    concert with a physical attack. Additionally, the importance of preventive measures implemented by a social human network to counteract a cyber attack...integrity of the data stored on specific computers. This coordinated cyber attack would have been successful if not for the trusted social network...established by Mr. Hillar Aarelaid, head of the Estonian computer emergency response team (CERT). This social network consisted of Mr. Hillar Aarelaid

  5. Computational properties of networks of synchronous groups of spiking neurons.

    PubMed

    Dayhoff, Judith E

    2007-09-01

    We demonstrate a model in which synchronously firing ensembles of neurons are networked to produce computational results. Each ensemble is a group of biological integrate-and-fire spiking neurons, with probabilistic interconnections between groups. An analogy is drawn in which each individual processing unit of an artificial neural network corresponds to a neuronal group in a biological model. The activation value of a unit in the artificial neural network corresponds to the fraction of active neurons, synchronously firing, in a biological neuronal group. Weights of the artificial neural network correspond to the product of the interconnection density between groups, the group size of the presynaptic group, and the postsynaptic potential heights in the synchronous group model. All three of these parameters can modulate connection strengths between neuronal groups in the synchronous group models. We give an example of nonlinear classification (XOR) and a function approximation example in which the capability of the artificial neural network can be captured by a neural network model with biological integrate-and-fire neurons configured as a network of synchronously firing ensembles of such neurons. We point out that the general function approximation capability proven for feedforward artificial neural networks appears to be approximated by networks of neuronal groups that fire in synchrony, where the groups comprise integrate-and-fire neurons. We discuss the advantages of this type of model for biological systems, its possible learning mechanisms, and the associated timing relationships.
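The weight correspondence and XOR demonstration described above can be sketched in a few lines. The numbers below are illustrative assumptions, not values from the paper: the effective weight is the stated product of interconnection density, presynaptic group size, and postsynaptic potential height, and the activation is the fraction of synchronously firing neurons in a group, squashed to (0, 1).

```python
import math

def effective_weight(density, group_size, psp_height):
    """Effective ANN weight as the product described in the abstract:
    interconnection density x presynaptic group size x PSP height."""
    return density * group_size * psp_height

def unit(x):
    """Activation ~ fraction of synchronously firing neurons, squashed to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def xor_net(a, b):
    """Classic 2-2-1 feedforward net computing XOR; weights are illustrative."""
    h1 = unit(20 * a + 20 * b - 10)      # OR-like hidden unit
    h2 = unit(-20 * a - 20 * b + 30)     # NAND-like hidden unit
    return unit(20 * h1 + 20 * h2 - 30)  # AND of OR and NAND

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(xor_net(a, b)))
```

In the synchronous-group reading, each of the hand-picked weights above would be realized by choosing a density, group size, and PSP height whose product equals that weight.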

  6. Data systems and computer science programs: Overview

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.; Hunter, Paul

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  7. A Study of Quality of Service Communication for High-Speed Packet-Switching Computer Sub-Networks

    NASA Technical Reports Server (NTRS)

    Cui, Zhenqian

    1999-01-01

    With the development of high-speed networking technology, computer networks, including local-area networks (LANs), wide-area networks (WANs) and the Internet, are extending their traditional roles of carrying computer data. They are being used for Internet telephony, multimedia applications such as conferencing and video on demand, distributed simulations, and other real-time applications. LANs are even used for distributed real-time process control and computing as a cost-effective approach. Differing from traditional data transfer, these new classes of high-speed network applications (video, audio, real-time process control, and others) are delay sensitive. The usefulness of data depends not only on the correctness of received data, but also on the time that data are received. In other words, these new classes of applications require networks to provide guaranteed services or quality of service (QoS). Quality of service can be defined by a set of parameters and reflects a user's expectation about the underlying network's behavior. Traditionally, distinct services are provided by different kinds of networks. Voice services are provided by telephone networks, video services are provided by cable networks, and data transfer services are provided by computer networks. A single network providing different services is called an integrated-services network.
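The notion of QoS as "a set of parameters" reflecting user expectations can be made concrete with a minimal sketch. The parameter names and the sample values below are assumptions for illustration, not taken from the study:

```python
from dataclasses import dataclass

@dataclass
class QosSpec:
    """Illustrative QoS parameter set; field names are assumptions, not a standard."""
    min_bandwidth_kbps: int   # guaranteed throughput
    max_delay_ms: float       # end-to-end delay bound
    max_jitter_ms: float      # delay-variation bound
    max_loss_rate: float      # tolerable packet loss fraction

def satisfies(offer: QosSpec, request: QosSpec) -> bool:
    """True if the network's offer meets or exceeds every requested bound."""
    return (offer.min_bandwidth_kbps >= request.min_bandwidth_kbps
            and offer.max_delay_ms <= request.max_delay_ms
            and offer.max_jitter_ms <= request.max_jitter_ms
            and offer.max_loss_rate <= request.max_loss_rate)

voice = QosSpec(64, 150.0, 30.0, 0.01)   # telephony-like request
offer = QosSpec(1000, 50.0, 5.0, 0.001)  # what the network can guarantee
print(satisfies(offer, voice))
```

An integrated-services network, in these terms, is one that can evaluate such checks for voice-, video-, and data-shaped requests over the same infrastructure.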

  8. An infrastructure with a unified control plane to integrate IP into optical metro networks to provide flexible and intelligent bandwidth on demand for cloud computing

    NASA Astrophysics Data System (ADS)

    Yang, Wei; Hall, Trevor

    2012-12-01

    The Internet is entering an era of cloud computing to provide more cost-effective, eco-friendly and reliable services to consumer and business users, and the nature of Internet traffic will undergo a fundamental transformation. Consequently, the current Internet will no longer suffice for serving cloud traffic in metro areas. This work proposes an infrastructure with a unified control plane that integrates simple packet aggregation technology with optical express through the interoperation between IP routers and electrical traffic controllers in optical metro networks. The proposed infrastructure provides flexible, intelligent, and eco-friendly bandwidth on demand for cloud computing in metro areas.

  9. Facebook Usage as a Predictor of Retention at a Private 4-Year Institution

    ERIC Educational Resources Information Center

    Morris, Jason; Reese, Jeff; Beck, Richard; Mattis, Charles

    2010-01-01

    Computer-based social networking has become ubiquitous on college and university campuses. However, little is known about how this form of networking reflects social integration which is considered to be an integral component of student persistence. To address this topic, a random sample of 375 entering freshman were used to evaluate the…

  10. Challenges in Integrating a Complex Systems Computer Simulation in Class: An Educational Design Research

    ERIC Educational Resources Information Center

    Loke, Swee-Kin; Al-Sallami, Hesham S.; Wright, Daniel F. B.; McDonald, Jenny; Jadhav, Sheetal; Duffull, Stephen B.

    2012-01-01

    Complex systems are typically difficult for students to understand and computer simulations offer a promising way forward. However, integrating such simulations into conventional classes presents numerous challenges. Framed within an educational design research, we studied the use of an in-house built simulation of the coagulation network in four…

  11. Integrated network analysis and effective tools in plant systems biology

    PubMed Central

    Fukushima, Atsushi; Kanaya, Shigehiko; Nishida, Kozo

    2014-01-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Integrating multi-omics data that span the genome, transcriptome, proteome, and metabolome with mathematical models is expected to expand our knowledge of complex plant metabolism. PMID:25408696

  12. Mass Storage Systems.

    ERIC Educational Resources Information Center

    Ranade, Sanjay; Schraeder, Jeff

    1991-01-01

    Presents an overview of the mass storage market and discusses mass storage systems as part of computer networks. Systems for personal computers, workstations, minicomputers, and mainframe computers are described; file servers are explained; system integration issues are raised; and future possibilities are suggested. (LRW)

  13. Multimedia and the Future of Distance Learning Technology.

    ERIC Educational Resources Information Center

    Barnard, John

    1992-01-01

    Describes recent innovations in distance learning technology, including the use of video technology; personal computers, including computer conferencing, computer-mediated communication, and workstations; multimedia, including hypermedia; Integrated Services Digital Networks (ISDN); and fiber optics. Research implications for multimedia and…

  14. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU- and memory-intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  15. Resource Aware Intelligent Network Services (RAINS) Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, Tom; Yang, Xi

    The Resource Aware Intelligent Network Services (RAINS) project conducted research and developed technologies in the area of cyber infrastructure resource modeling and computation. The goal of this work was to provide a foundation to enable intelligent, software defined services which spanned the network AND the resources which connect to the network. A Multi-Resource Service Plane (MRSP) was defined, which allows resource owners/managers to locate and place themselves from a topology and service availability perspective within the dynamic networked cyberinfrastructure ecosystem. The MRSP enables the presentation of integrated topology views and computation results which can include resources across the spectrum of compute, storage, and networks. The MRSP developed by the RAINS project includes the following key components: i) Multi-Resource Service (MRS) Ontology/Multi-Resource Markup Language (MRML), ii) Resource Computation Engine (RCE), iii) Modular Driver Framework (to allow integration of a variety of external resources). The MRS/MRML is a general and extensible modeling framework that allows resource owners to model, or describe, a wide variety of resource types. All resources are described using three categories of elements: Resources, Services, and Relationships between the elements. This modeling framework defines a common method for the transformation of cyber infrastructure resources into data in the form of MRML models. In order to realize this infrastructure datification, the RAINS project developed a model-based computation system, the RAINS Computation Engine (RCE). The RCE has the ability to ingest, process, integrate, and compute based on automatically generated MRML models. The RCE interacts with the resources through system drivers which are specific to the type of external network or resource controller.
The RAINS project developed a modular and pluggable driver system which enables a variety of resource controllers to automatically generate, maintain, and distribute MRML-based resource descriptions. Once all of the resource topologies are absorbed by the RCE, a connected graph of the full distributed system topology is constructed, which forms the basis for computation and workflow processing. The RCE includes a Modular Computation Element (MCE) framework which allows for tailoring of the computation process to the specific set of resources under control, and the services desired. The input and output of an MCE are both model data based on the MRS/MRML ontology and schema. Some of the RAINS project accomplishments include: development of a general and extensible multi-resource modeling framework; design of a Resource Computation Engine (RCE) system with the following key capabilities: absorbing a variety of multi-resource model types and building integrated models, a novel architecture which uses model-based communications across the full stack, flexible provision of abstract or intent-based user-facing interfaces, and workflow processing based on model descriptions; release of the RCE as open source software; deployment of the RCE in the University of Maryland/Mid-Atlantic Crossroad ScienceDMZ in prototype mode with a plan under way to transition to production; deployment at the Argonne National Laboratory DTN Facility in prototype mode; and selection of the RCE by the DOE SENSE (SDN for End-to-end Networked Science at the Exascale) project as the basis for their orchestration service.
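The Resources/Services/Relationships modeling and the RCE's construction of a connected topology graph can be sketched as follows. The element names are hypothetical stand-ins for illustration, not actual MRML:

```python
from collections import defaultdict

# Hypothetical in-memory stand-in for MRML models: every element is a
# Resource, a Service attached to a resource, or a Relationship between elements.
resources = ["dtn-host", "switch-A", "storage-pool"]
services = {"dtn-host": "data-transfer", "storage-pool": "object-store"}
relationships = [("dtn-host", "switch-A"), ("switch-A", "storage-pool")]

def build_topology(rels):
    """Absorb pairwise relationships into one connected adjacency graph."""
    graph = defaultdict(set)
    for a, b in rels:
        graph[a].add(b)
        graph[b].add(a)
    return graph

def reachable(graph, start):
    """Simple traversal: which resources a computation engine could compose."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node] - seen)
    return seen

topo = build_topology(relationships)
avail = {services[r] for r in reachable(topo, "dtn-host") if r in services}
print(sorted(avail))
```

The real RCE computes over far richer models, but the shape is the same: drivers emit per-resource descriptions, the engine merges them into one graph, and computations (path finding, service composition) run over that graph.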

  16. Performance management of multiple access communication networks

    NASA Astrophysics Data System (ADS)

    Lee, Suk; Ray, Asok

    1993-12-01

    This paper focuses on conceptual design, development, and implementation of a performance management tool for computer communication networks to serve large-scale integrated systems. The objective is to improve the network performance in handling various types of messages by on-line adjustment of protocol parameters. The techniques of perturbation analysis of Discrete Event Dynamic Systems (DEDS), stochastic approximation (SA), and learning automata have been used in formulating the algorithm of performance management. The efficacy of the performance management tool has been demonstrated on a network testbed. The conceptual design presented in this paper offers a step forward to bridging the gap between management standards and users' demands for efficient network operations since most standards such as ISO (International Standards Organization) and IEEE address only the architecture, services, and interfaces for network management. The proposed concept of performance management can also be used as a general framework to assist design, operation, and management of various DEDS such as computer integrated manufacturing and battlefield C3 (Command, Control, and Communications).
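The stochastic approximation technique mentioned above can be illustrated with a toy on-line parameter-tuning loop in the Kiefer-Wolfowitz style (finite-difference gradients from noisy measurements). The delay model, gain sequences, and optimum are invented for illustration; the paper's actual algorithm is more elaborate:

```python
import random

random.seed(1)

def observed_delay(timeout):
    """Stand-in for a noisy network measurement, convex in the parameter.
    The true optimum of this toy model is timeout = 5."""
    return (timeout - 5.0) ** 2 + random.gauss(0, 0.1)

def sa_tune(theta, steps=2000):
    """Kiefer-Wolfowitz stochastic approximation: descend a finite-difference
    gradient estimate with diminishing gains a_n and perturbations c_n."""
    for n in range(1, steps + 1):
        a_n = 1.0 / n          # diminishing step size
        c_n = 1.0 / n ** 0.25  # diminishing perturbation width
        grad = (observed_delay(theta + c_n) - observed_delay(theta - c_n)) / (2 * c_n)
        theta -= a_n * grad
    return theta

print(round(sa_tune(8.0), 1))
```

Despite never seeing a noise-free measurement, the iterate drifts to the neighborhood of the optimal parameter, which is what makes SA attractive for on-line protocol tuning.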

  17. Comfort with Computers in the Library.

    ERIC Educational Resources Information Center

    Agati, Joseph

    2002-01-01

    Sets forth a list of dos and don'ts when integrating aesthetics, functionality, and technology into college library computer workstation furniture. The article discusses workstation access for both portable computer users and for staff, whose needs involve desktop computers that are possibly networked with printers and other peripherals. (GR)

  18. Computational, Integrative, and Comparative Methods for the Elucidation of Genetic Coexpression Networks

    DOE PAGES

    Baldwin, Nicole E.; Chesler, Elissa J.; Kirov, Stefan; ...

    2005-01-01

    Gene expression microarray data can be used for the assembly of genetic coexpression network graphs. Using mRNA samples obtained from recombinant inbred Mus musculus strains, it is possible to integrate allelic variation with molecular and higher-order phenotypes. The depth of quantitative genetic analysis of microarray data can be vastly enhanced utilizing this mouse resource in combination with powerful computational algorithms, platforms, and data repositories. The resulting network graphs transect many levels of biological scale. This approach is illustrated with the extraction of cliques of putatively co-regulated genes and their annotation using gene ontology analysis and cis-regulatory element discovery. The causal basis for co-regulation is detected through the use of quantitative trait locus mapping.
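Clique extraction from a thresholded coexpression graph, as described above, can be sketched with the classic Bron-Kerbosch algorithm. The gene names and correlation values are invented for illustration:

```python
# Toy coexpression matrix for five genes; values are illustrative correlations.
genes = ["g1", "g2", "g3", "g4", "g5"]
corr = {
    ("g1", "g2"): 0.9, ("g1", "g3"): 0.85, ("g2", "g3"): 0.92,  # one tight module
    ("g4", "g5"): 0.88,                                          # a second module
    ("g1", "g4"): 0.1, ("g2", "g5"): 0.05,                       # weak links
}

def adjacency(threshold=0.8):
    """Edges are gene pairs whose coexpression exceeds the threshold."""
    adj = {g: set() for g in genes}
    for (a, b), r in corr.items():
        if r >= threshold:
            adj[a].add(b)
            adj[b].add(a)
    return adj

def bron_kerbosch(r, p, x, adj, out):
    """Classic Bron-Kerbosch enumeration of maximal cliques."""
    if not p and not x:
        out.append(sorted(r))
        return
    for v in list(p):
        bron_kerbosch(r | {v}, p & adj[v], x & adj[v], adj, out)
        p.remove(v)
        x.add(v)

adj = adjacency()
cliques = []
bron_kerbosch(set(), set(genes), set(), adj, cliques)
print(sorted(c for c in cliques if len(c) > 1))
```

The recovered cliques correspond to the "putatively co-regulated" modules that would then be passed to gene ontology annotation and cis-regulatory element discovery.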

  19. Is functional integration of resting state brain networks an unspecific biomarker for working memory performance?

    PubMed

    Alavash, Mohsen; Doebler, Philipp; Holling, Heinz; Thiel, Christiane M; Gießing, Carsten

    2015-03-01

    Is there one optimal topology of functional brain networks at rest from which our cognitive performance would profit? Previous studies suggest that functional integration of resting state brain networks is an important biomarker for cognitive performance. However, it is still unknown whether higher network integration is an unspecific predictor for good cognitive performance or, alternatively, whether specific network organization during rest predicts only specific cognitive abilities. Here, we investigated the relationship between network integration at rest and cognitive performance using two tasks that measured different aspects of working memory; one task assessed visual-spatial and the other numerical working memory. Network clustering, modularity and efficiency were computed to capture network integration on different levels of network organization, and to statistically compare their correlations with the performance in each working memory test. The results revealed that each working memory aspect profits from a different resting state topology, and the tests showed significantly different correlations with each of the measures of network integration. While higher global network integration and modularity predicted significantly better performance in visual-spatial working memory, both measures showed no significant correlation with numerical working memory performance. In contrast, numerical working memory was superior in subjects with highly clustered brain networks, predominantly in the intraparietal sulcus, a core brain region of the working memory network. Our findings suggest that a specific balance between local and global functional integration of resting state brain networks facilitates special aspects of cognitive performance. 
In the context of working memory, while visual-spatial performance is facilitated by globally integrated functional resting state brain networks, numerical working memory profits from increased capacities for local processing, especially in brain regions involved in working memory performance. Copyright © 2014 Elsevier Inc. All rights reserved.
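The network measures named above, clustering and efficiency, have standard graph-theoretic definitions that a short sketch can make concrete; the toy network below is illustrative, not brain data:

```python
from itertools import combinations

def clustering(adj, v):
    """Local clustering coefficient: fraction of a node's neighbor pairs
    that are themselves connected (local processing capacity)."""
    nbrs = adj[v]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

def shortest_paths(adj, src):
    """Breadth-first hop counts from src."""
    dist, frontier = {src: 0}, [src]
    while frontier:
        nxt = []
        for u in frontier:
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    nxt.append(w)
        frontier = nxt
    return dist

def global_efficiency(adj):
    """Mean inverse shortest-path length over all node pairs (global integration)."""
    nodes = list(adj)
    total, pairs = 0.0, 0
    for i, u in enumerate(nodes):
        d = shortest_paths(adj, u)
        for v in nodes[i + 1:]:
            pairs += 1
            if v in d and d[v] > 0:
                total += 1.0 / d[v]
    return total / pairs

# Toy network: a triangle (high local clustering) attached to a chain.
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(clustering(adj, 1), round(global_efficiency(adj), 3))
```

In the study's terms, numerical working memory tracked the first quantity (local clustering, especially around the intraparietal sulcus), while visual-spatial performance tracked the second (global integration).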

  20. Automated selection of computed tomography display parameters using neural networks

    NASA Astrophysics Data System (ADS)

    Zhang, Di; Neu, Scott; Valentino, Daniel J.

    2001-07-01

    A collection of artificial neural networks (ANN's) was trained to identify simple anatomical structures in a set of x-ray computed tomography (CT) images. These neural networks learned to associate a point in an image with the anatomical structure containing the point by using the image pixels located on the horizontal and vertical lines that ran through the point. The neural networks were integrated into a computer software tool whose function is to select an index into a list of CT window/level values from the location of the user's mouse cursor. Based upon the anatomical structure selected by the user, the software tool automatically adjusts the image display to optimally view the structure.
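The pipeline of extracting the row and column pixel lines through the cursor and mapping the predicted structure to display parameters can be sketched as follows. A nearest-centroid classifier stands in for the trained ANNs here, and the presets, intensities, and structure names are illustrative assumptions:

```python
# Hypothetical sketch: classify the anatomical structure at the cursor from
# the pixel values along the row and column through that point, then look up
# a window/level preset for the predicted structure.

WINDOW_LEVEL = {  # illustrative display presets (window width, level), in HU
    "lung": (1500, -600),
    "bone": (2000, 300),
    "soft_tissue": (400, 40),
}

def profile_features(image, row, col):
    """Feature vector = the horizontal and vertical pixel lines through the point."""
    return image[row] + [r[col] for r in image]

def classify(features, centroids):
    """Nearest centroid by squared distance (stand-in for the trained ANN)."""
    def dist(c):
        return sum((f - x) ** 2 for f, x in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Tiny 3x3 "CT image" whose intensities mimic a lung region.
image = [[-700, -650, -680], [-710, -600, -690], [-720, -640, -670]]
centroids = {
    "lung": [-650] * 6,
    "bone": [400] * 6,
    "soft_tissue": [30] * 6,
}
label = classify(profile_features(image, 1, 1), centroids)
print(label, WINDOW_LEVEL[label])
```

The tool described in the abstract does exactly this lookup step continuously as the mouse moves, so the display is always windowed for the structure under the cursor.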

  1. CLIPS on the NeXT computer

    NASA Technical Reports Server (NTRS)

    Charnock, Elizabeth; Eng, Norman

    1990-01-01

    This paper discusses the integration of CLIPS into a hybrid expert system neural network AI tool for the NeXT computer. The main discussion is devoted to the joining of these two AI paradigms in a mutually beneficial relationship. We conclude that expert systems and neural networks should not be considered as competing AI implementation methods, but rather as complementary components of a whole.

  2. Microgrids | Energy Systems Integration Facility | NREL

    Science.gov Websites

    Marine Corps Air Station (MCAS) Miramar; Network Simulator-in-the-Loop Testing: OMNeT++ simulates a network and links with real computers and virtual hosts; Power Hardware-in-the-Loop Simulation.

  3. Identifying essential proteins based on sub-network partition and prioritization by integrating subcellular localization information.

    PubMed

    Li, Min; Li, Wenkai; Wu, Fang-Xiang; Pan, Yi; Wang, Jianxin

    2018-06-14

    Essential proteins are important participants in various life activities and play a vital role in the survival and reproduction of living organisms. Identification of essential proteins from protein-protein interaction (PPI) networks is of great significance for the study of human complex diseases, the design of drugs, and the development of bioinformatics and computational science. Studies have shown that highly connected proteins in a PPI network tend to be essential. A series of computational methods have been proposed to identify essential proteins by analyzing topological structures of PPI networks. However, the high noise in the PPI data can degrade the accuracy of essential protein prediction. Moreover, proteins must be located in the appropriate subcellular localization to perform their functions, and they can interact with each other only when they are located in the same subcellular localization. In this paper, we propose a new network-based essential protein discovery method based on sub-network partition and prioritization by integrating subcellular localization information, named SPP. The proposed method SPP was tested on two different yeast PPI networks obtained from the DIP database and the BioGRID database. The experimental results show that SPP can effectively reduce the effect of false positives in PPI networks and predict essential proteins more accurately compared with other existing computational methods DC, BC, CC, SC, EC, IC, NC. Copyright © 2018 Elsevier Ltd. All rights reserved.
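The abstract's premise that only co-localized proteins can interact suggests a localization-filtered degree centrality, sketched below on an invented toy network. This is an illustration of the premise, not the SPP algorithm itself, and the protein names and localizations are hypothetical:

```python
from collections import Counter

# Toy PPI edge list and hypothetical subcellular localizations.
ppi = [("p1", "p2"), ("p1", "p3"), ("p1", "p4"), ("p2", "p3"), ("p4", "p5")]
localization = {"p1": {"nucleus"}, "p2": {"nucleus"}, "p3": {"nucleus"},
                "p4": {"cytoplasm"}, "p5": {"membrane"}}

def colocalized_degree(edges, loc):
    """Degree centrality counting only interactions whose endpoints share a
    subcellular localization, i.e. interactions that are physically possible."""
    deg = Counter()
    for a, b in edges:
        if loc[a] & loc[b]:  # endpoints share at least one compartment
            deg[a] += 1
            deg[b] += 1
    return deg

ranking = colocalized_degree(ppi, localization).most_common()
print(ranking)
```

Filtering out edges like p1-p4 (nucleus vs. cytoplasm) is one way such methods suppress false-positive interactions before ranking candidate essential proteins.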

  4. A likelihood ratio anomaly detector for identifying within-perimeter computer network attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grana, Justin; Wolpert, David; Neil, Joshua

    The rapid detection of attackers within firewalls of enterprise computer networks is of paramount importance. Anomaly detectors address this problem by quantifying deviations from baseline statistical models of normal network behavior and signaling an intrusion when the observed data deviates significantly from the baseline model. But, many anomaly detectors do not take into account plausible attacker behavior. As a result, anomaly detectors are prone to a large number of false positives due to unusual but benign activity. Our paper first introduces a stochastic model of attacker behavior which is motivated by real world attacker traversal. Then, we develop a likelihood ratio detector that compares the probability of observed network behavior under normal conditions against the case when an attacker has possibly compromised a subset of hosts within the network. Since the likelihood ratio detector requires integrating over the time each host becomes compromised, we illustrate how to use Monte Carlo methods to compute the requisite integral. We then present Receiver Operating Characteristic (ROC) curves for various network parameterizations that show for any rate of true positives, the rate of false positives for the likelihood ratio detector is no higher than that of a simple anomaly detector and is often lower. Finally, we demonstrate the superiority of the proposed likelihood ratio detector when the network topologies and parameterizations are extracted from real-world networks.

  5. A likelihood ratio anomaly detector for identifying within-perimeter computer network attacks

    DOE PAGES

    Grana, Justin; Wolpert, David; Neil, Joshua; ...

    2016-03-11

    The rapid detection of attackers within firewalls of enterprise computer networks is of paramount importance. Anomaly detectors address this problem by quantifying deviations from baseline statistical models of normal network behavior and signaling an intrusion when the observed data deviates significantly from the baseline model. But, many anomaly detectors do not take into account plausible attacker behavior. As a result, anomaly detectors are prone to a large number of false positives due to unusual but benign activity. Our paper first introduces a stochastic model of attacker behavior which is motivated by real world attacker traversal. Then, we develop a likelihood ratio detector that compares the probability of observed network behavior under normal conditions against the case when an attacker has possibly compromised a subset of hosts within the network. Since the likelihood ratio detector requires integrating over the time each host becomes compromised, we illustrate how to use Monte Carlo methods to compute the requisite integral. We then present Receiver Operating Characteristic (ROC) curves for various network parameterizations that show for any rate of true positives, the rate of false positives for the likelihood ratio detector is no higher than that of a simple anomaly detector and is often lower. Finally, we demonstrate the superiority of the proposed likelihood ratio detector when the network topologies and parameterizations are extracted from real-world networks.
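The Monte Carlo integration over compromise times described above can be illustrated with a toy Poisson traffic model. The rates, counts, and uniform prior over the compromise time are invented for illustration; the paper's attacker model is richer:

```python
import math
import random

random.seed(0)

def poisson_logpmf(k, lam):
    """Log probability of k events under a Poisson(lam) model."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def log_lik(counts, rates):
    """Log likelihood of hourly event counts under per-hour rates."""
    return sum(poisson_logpmf(k, r) for k, r in zip(counts, rates))

def likelihood_ratio(counts, rate0=2.0, rate1=6.0, draws=5000):
    """Monte Carlo estimate of P(data | attacker) / P(data | normal).
    The unknown compromise time tau is integrated out by sampling it
    uniformly over the observation window (toy model, not the paper's)."""
    T = len(counts)
    base = log_lik(counts, [rate0] * T)
    total = 0.0
    for _ in range(draws):
        tau = random.randrange(T)  # hour at which the host is compromised
        rates = [rate0] * tau + [rate1] * (T - tau)
        total += math.exp(log_lik(counts, rates) - base)
    return total / draws

normal = [2, 3, 1, 2, 2, 3]    # counts consistent with the baseline rate
attacked = [2, 1, 2, 7, 6, 8]  # activity jumps mid-window
print(likelihood_ratio(normal), likelihood_ratio(attacked))
```

Thresholding this ratio at 1 separates the two traces: the attacker hypothesis only wins when elevated activity is consistent with *some* compromise time, which is what keeps the false-positive rate below that of a plain anomaly detector.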

  6. Modeling biological pathway dynamics with timed automata.

    PubMed

    Schivo, Stefano; Scholma, Jetse; Wanders, Brend; Urquidi Camacho, Ricardo A; van der Vet, Paul E; Karperien, Marcel; Langerak, Rom; van de Pol, Jaco; Post, Janine N

    2014-05-01

    Living cells are constantly subjected to a plethora of environmental stimuli that require integration into an appropriate cellular response. This integration takes place through signal transduction events that form tightly interconnected networks. The understanding of these networks requires capturing their dynamics through computational support and models. ANIMO (analysis of Networks with Interactive Modeling) is a tool that enables the construction and exploration of executable models of biological networks, helping to derive hypotheses and to plan wet-lab experiments. The tool is based on the formalism of Timed Automata, which can be analyzed via the UPPAAL model checker. Thanks to Timed Automata, we can provide a formal semantics for the domain-specific language used to represent signaling networks. This enforces precision and uniformity in the definition of signaling pathways, contributing to the integration of isolated signaling events into complex network models. We propose an approach to discretization of reaction kinetics that allows us to efficiently use UPPAAL as the computational engine to explore the dynamic behavior of the network of interest. A user-friendly interface hides the use of Timed Automata from the user, while keeping the expressive power intact. Abstraction to single-parameter kinetics speeds up construction of models that remain faithful enough to provide meaningful insight. The resulting dynamic behavior of the network components is displayed graphically, allowing for an intuitive and interactive modeling experience.

  7. ICC '86; Proceedings of the International Conference on Communications, Toronto, Canada, June 22-25, 1986, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Papers are presented on ISDN, mobile radio systems and techniques for digital connectivity, centralized and distributed algorithms in computer networks, communications networks, quality assurance and impact on cost, adaptive filters in communications, the spread spectrum, signal processing, video communication techniques, and digital satellite services. Topics discussed include performance evaluation issues for integrated protocols, packet network operations, computer network theory and multiple access, microwave single sideband systems, switching architectures, fiber optic systems, wireless local communications, modulation, coding, and synchronization, remote switching, software quality, transmission, and expert systems in network operations. Consideration is given to wide area networks, image and speech processing, office communications application protocols, multimedia systems, customer-controlled network operations, digital radio systems, channel modeling and signal processing in digital communications, earth station/on-board modems, computer communications system performance evaluation, source encoding, compression, and quantization, and adaptive communications systems.

  8. A design automation framework for computational bioenergetics in biological networks.

    PubMed

    Angione, Claudio; Costanza, Jole; Carapezza, Giovanni; Lió, Pietro; Nicosia, Giuseppe

    2013-10-01

    The bioenergetic activity of mitochondria can be thoroughly investigated by using computational methods. In particular, in our work we focus on ATP and NADH, namely the metabolites representing the production of energy in the cell. We develop a computational framework to perform an exhaustive investigation at the level of species, reactions, genes and metabolic pathways. The framework integrates several methods implementing the state-of-the-art algorithms for many-objective optimization, sensitivity, and identifiability analysis applied to biological systems. We use this computational framework to analyze three case studies related to the human mitochondria and the algal metabolism of Chlamydomonas reinhardtii, formally described with algebraic differential equations or flux balance analysis. Integrating the results of our framework applied to interacting organelles would provide a general-purpose method for assessing the production of energy in a biological network.

  9. Military clouds: utilization of cloud computing systems at the battlefield

    NASA Astrophysics Data System (ADS)

    Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai

    2012-05-01

    Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data saving media, applications and services via Internet with minimum hardware requirements. Use of information systems and technologies at the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means to decision makers and users in order to gain information superiority. These developments in information technologies lead to a new term, which is known as network centric capability. Similar to network centric capable systems, cloud computing systems are operational today. In the near future extensive use of military clouds at the battlefield is predicted. Integrating cloud computing logic to network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are researched. The role of military clouds in future warfare is proposed in this paper. It was concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential of improving network centric capabilities, increasing situational awareness at the battlefield and facilitating the settlement of information superiority.

  10. National research and education network

    NASA Technical Reports Server (NTRS)

    Villasenor, Tony

    1991-01-01

    Some goals of this network are as follows: Extend U.S. technological leadership in high performance computing and computer communications; Provide wide dissemination and application of the technologies both to speed the pace of innovation and to serve the national economy, national security, education, and the global environment; and Spur gains in U.S. productivity and industrial competitiveness by making high performance computing and networking technologies an integral part of the design and production process. Strategies for achieving these goals are as follows: Support solutions to important scientific and technical challenges through a vigorous R and D effort; Reduce the uncertainties to industry for R and D and use of this technology through increased cooperation between government, industry, and universities and by the continued use of government and government-funded facilities as a prototype user for early commercial HPCC products; and Support the underlying research, network, and computational infrastructures on which U.S. high performance computing technology is based.

  11. Teaching Heat Exchanger Network Synthesis Using Interactive Microcomputer Graphics.

    ERIC Educational Resources Information Center

    Dixon, Anthony G.

    1987-01-01

    Describes the Heat Exchanger Network Synthesis (HENS) program used at Worcester Polytechnic Institute (Massachusetts) as an aid to teaching the energy integration step in process design. Focuses on the benefits of the computer graphics used in the program to increase the speed of generating and changing networks. (TW)

  12. A convergent model for distributed processing of Big Sensor Data in urban engineering networks

    NASA Astrophysics Data System (ADS)

    Parygin, D. S.; Finogeev, A. G.; Kamaev, V. A.; Finogeev, A. A.; Gnedkova, E. P.; Tyukov, A. P.

    2017-01-01

    The problems of development and research of a convergent model of grid, cloud, fog and mobile computing for analytical Big Sensor Data processing are reviewed. The model is intended for building monitoring systems of spatially distributed objects of urban engineering networks and processes. The proposed approach is a convergence model for organizing distributed data processing. The fog computing model is used for the processing and aggregation of sensor data at the network nodes and/or industrial controllers. Program agents are loaded onto these nodes to perform the primary processing and data-aggregation tasks. The grid and cloud computing models are used for mining and accumulating integral indicators. The computing cluster has a three-tier architecture, which includes the main server at the first level, a cluster of SCADA system servers at the second level, and an array of GPU cards supporting the Compute Unified Device Architecture (CUDA) at the third level. The mobile computing model is applied to visualize the results of the intelligent analysis with elements of augmented reality and geo-information technologies. The integrated indicators are transferred to the data center for accumulation in a multidimensional storage for the purpose of data mining and knowledge gaining.
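
    As a rough illustration of the fog tier described above, a minimal agent might buffer raw readings at a network node and forward only integral indicators to the grid/cloud tier (the class, field names, and window size are hypothetical, not from the paper):

```python
from statistics import mean

class FogAggregator:
    """Toy fog-node agent: buffers raw sensor readings and emits
    integral indicators for the upper (grid/cloud) tier."""

    def __init__(self, window: int = 4):
        self.window = window
        self.buffer = []

    def ingest(self, reading: float):
        """Accept one raw reading; return an aggregate once the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) < self.window:
            return None
        summary = {
            "min": min(self.buffer),
            "max": max(self.buffer),
            "mean": mean(self.buffer),
            "count": len(self.buffer),
        }
        self.buffer.clear()
        return summary

agg = FogAggregator(window=4)
out = [agg.ingest(x) for x in [1.0, 3.0, 2.0, 6.0]]
# only the fourth call returns a summary; raw readings never leave the node
```

    Only the compact summary would be transmitted upward, which is the bandwidth-saving point of the fog layer.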

  13. Current Issues for Higher Education Information Resources Management.

    ERIC Educational Resources Information Center

    CAUSE/EFFECT, 1996

    1996-01-01

    Issues identified as important to the future of information resources management and use in higher education include information policy in a networked environment, distributed computing, integrating information resources and college planning, benchmarking information technology, integrated digital libraries, technology integration in teaching,…

  14. Simulator for neural networks and action potentials.

    PubMed

    Baxter, Douglas A; Byrne, John H

    2007-01-01

    A key challenge for neuroinformatics is to devise methods for representing, accessing, and integrating vast amounts of diverse and complex data. A useful approach to represent and integrate complex data sets is to develop mathematical models [Arbib (The Handbook of Brain Theory and Neural Networks, pp. 741-745, 2003); Arbib and Grethe (Computing the Brain: A Guide to Neuroinformatics, 2001); Ascoli (Computational Neuroanatomy: Principles and Methods, 2002); Bower and Bolouri (Computational Modeling of Genetic and Biochemical Networks, 2001); Hines et al. (J. Comput. Neurosci. 17, 7-11, 2004); Shepherd et al. (Trends Neurosci. 21, 460-468, 1998); Sivakumaran et al. (Bioinformatics 19, 408-415, 2003); Smolen et al. (Neuron 26, 567-580, 2000); Vadigepalli et al. (OMICS 7, 235-252, 2003)]. Models of neural systems provide quantitative and modifiable frameworks for representing data and analyzing neural function. These models can be developed and solved using neurosimulators. One such neurosimulator is the Simulator for Neural Networks and Action Potentials (SNNAP) [Ziv (J. Neurophysiol. 71, 294-308, 1994)]. SNNAP is a versatile and user-friendly tool for developing and simulating models of neurons and neural networks. SNNAP simulates many features of neuronal function, including ionic currents and their modulation by intracellular ions and/or second messengers, and synaptic transmission and synaptic plasticity. SNNAP is written in Java and runs on most computers. Moreover, SNNAP provides a graphical user interface (GUI) and does not require programming skills. This chapter describes several capabilities of SNNAP and illustrates methods for simulating neurons and neural networks. SNNAP is available at http://snnap.uth.tmc.edu.
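
    SNNAP itself is a Java GUI application, but the kind of membrane dynamics such neurosimulators integrate can be illustrated with a minimal leaky integrate-and-fire model (all parameters are illustrative and unrelated to SNNAP's model files):

```python
def simulate_lif(i_ext=1.5, tau=10.0, v_th=1.0, v_reset=0.0,
                 dt=0.1, t_max=100.0):
    """Leaky integrate-and-fire neuron: tau * dV/dt = -V + I_ext.
    Forward-Euler integration; returns spike times in ms."""
    steps = int(t_max / dt)
    v, spikes = 0.0, []
    for k in range(steps):
        v += dt / tau * (-v + i_ext)   # membrane relaxes toward I_ext
        if v >= v_th:                  # threshold crossing -> spike
            spikes.append(k * dt)
            v = v_reset                # reset after the spike
    return spikes

spike_times = simulate_lif()
# with I_ext = 1.5 and threshold 1.0 the neuron fires regularly,
# roughly every tau*ln(3) ~ 11 ms
```

    Real simulators such as SNNAP solve far richer equations (multiple ionic currents, second-messenger modulation), but the numerical core is the same kind of time-stepped integration.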

  15. Integrative network alignment reveals large regions of global network similarity in yeast and human.

    PubMed

    Kuchaiev, Oleksii; Przulj, Natasa

    2011-05-15

    High-throughput methods for detecting molecular interactions have produced large sets of biological network data, with much more yet to come. Analogous to sequence alignment, efficient and reliable network alignment methods are expected to improve our understanding of biological systems. Unlike sequence alignment, network alignment is computationally intractable. Hence, devising efficient network alignment heuristics is currently a foremost challenge in computational biology. We introduce a novel network alignment algorithm, called Matching-based Integrative GRAph ALigner (MI-GRAAL), which can integrate any number and type of similarity measures between network nodes (e.g. proteins), including, but not limited to, any topological network similarity measure, sequence similarity, functional similarity and structural similarity. Hence, we resolve the ties in similarity measures and find a combination of similarity measures yielding the largest contiguous (i.e. connected) and biologically sound alignments. MI-GRAAL exposes the largest functional, connected regions of protein-protein interaction (PPI) network similarity to date: surprisingly, it reveals that 77.7% of proteins in the baker's yeast high-confidence PPI network participate in such a subnetwork that is fully contained in the human high-confidence PPI network. This is the first demonstration that species as diverse as yeast and human contain such large, continuous regions of global network similarity. We apply MI-GRAAL's alignments to predict functions of un-annotated proteins in yeast, human and bacteria, validating our predictions against the literature. Furthermore, using network alignment scores for PPI networks of different herpes viruses, we reconstruct their phylogenetic relationship. This is the first time that a phylogeny has been exactly reconstructed from purely topological alignments of PPI networks. Supplementary files and MI-GRAAL executables: http://bio-nets.doc.ic.ac.uk/MI-GRAAL/.
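
    MI-GRAAL's actual seed-and-extend solver is more elaborate, but the core idea of combining several node-similarity measures (topological, sequence, functional, ...) into one confidence matrix and then matching nodes can be sketched as follows (the matrices, weights, and greedy matching rule are toy choices, not MI-GRAAL's):

```python
import numpy as np

def greedy_align(sim_matrices, weights=None):
    """Combine node-similarity matrices into a single score matrix,
    then greedily pair the highest-scoring (node_G1, node_G2) pairs,
    using each node at most once. Matching sketch only."""
    S = np.average(np.stack(sim_matrices), axis=0, weights=weights)
    pairs = []
    for _ in range(min(S.shape)):
        i, j = np.unravel_index(np.argmax(S), S.shape)
        pairs.append((int(i), int(j)))
        S[i, :] = -np.inf   # node i of network 1 is now used
        S[:, j] = -np.inf   # node j of network 2 is now used
    return pairs

# toy similarity matrices between two 2-node networks
topo = np.array([[0.9, 0.1], [0.2, 0.8]])   # topological similarity
seq  = np.array([[0.7, 0.3], [0.1, 0.9]])   # sequence similarity
pairs = greedy_align([topo, seq])           # aligns 0<->0 and 1<->1
```

    A real aligner would additionally grow the alignment outward from seed pairs so that the result is a large connected region, which is the property MI-GRAAL optimizes.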

  16. Functional Module Analysis for Gene Coexpression Networks with Network Integration.

    PubMed

    Zhang, Shuqin; Zhao, Hongyu; Ng, Michael K

    2015-01-01

    Networks have been a general tool for studying the complex interactions between different genes, proteins, and other small molecules. The module, as a fundamental property of many biological networks, has been widely studied, and many computational methods have been proposed to identify modules in an individual network. However, in many cases a single network is insufficient for module analysis due to noise in the data or the tuning of parameters when building the biological network. The availability of a large number of biological networks makes network integration studies possible. By integrating such networks, more informative modules for a specific disease can be derived from networks constructed from different tissues, and consistent factors for different diseases can be inferred. In this paper, we have developed an effective method for module identification from multiple networks under different conditions. The problem is formulated as an optimization model, which combines module identification in each individual network with alignment of the modules across the different networks. An approximation algorithm based on eigenvector computation is proposed. In simulation studies, our method outperforms existing methods, especially when the underlying modules in the multiple networks are different. We also applied our method to two groups of gene coexpression networks for humans: one for three different cancers, and one for three tissues from morbidly obese patients. We identified 13 modules with three complete subgraphs, and 11 modules with two complete subgraphs, respectively. The modules were validated through Gene Ontology enrichment and KEGG pathway enrichment analysis. We also showed that the main functions of most modules for the corresponding disease have been addressed by other researchers, which may provide the theoretical basis for further studying the modules experimentally.
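
    As a toy illustration of the eigenvector-based approximation, one can aggregate the adjacency matrices of several networks and read a shared module off the leading eigenvector of the combined matrix (a sketch of the spectral idea only, not the paper's exact optimization model; the cutoff is an illustrative choice):

```python
import numpy as np

def shared_module(adjacency_list, cutoff=0.2):
    """Aggregate adjacency matrices across conditions, take the leading
    eigenvector of the combined symmetric matrix, and keep the nodes
    with large entries as the candidate shared module."""
    A = np.sum(adjacency_list, axis=0).astype(float)
    vals, vecs = np.linalg.eigh(A)          # eigh: symmetric matrices
    v = np.abs(vecs[:, np.argmax(vals)])    # eigenvector sign is arbitrary
    return sorted(np.nonzero(v > cutoff)[0].tolist())

# two toy networks sharing a dense module on nodes {0, 1, 2}
A1 = np.zeros((5, 5)); A2 = np.zeros((5, 5))
for i, j in [(0, 1), (0, 2), (1, 2)]:
    A1[i, j] = A1[j, i] = 1.0
    A2[i, j] = A2[j, i] = 1.0
A2[3, 4] = A2[4, 3] = 1.0                   # noise edge in one network only
module = shared_module([A1, A2])            # recovers [0, 1, 2]
```

    Edges recurring in several networks accumulate weight in the sum, so the leading eigenvector concentrates on the recurrent module rather than on condition-specific noise.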

  17. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks.

    PubMed

    Shen, Yiwen; Hattink, Maarten H N; Samadi, Payman; Cheng, Qixiang; Hu, Ziyi; Gazman, Alexander; Bergman, Keren

    2018-04-16

    Silicon-photonics-based switches offer an effective option for delivering dynamic bandwidth in future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon-photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. We present a scalable software-defined networking control plane to integrate silicon photonic switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. The latencies measured for each step of the switching procedure demonstrate a total control-plane latency of 344 µs for data-center and high-performance computing platforms.

  18. Fusing literature and full network data improves disease similarity computation.

    PubMed

    Li, Ping; Nie, Yaling; Yu, Jingkai

    2016-08-30

    Identifying relatedness among diseases could help deepen understanding of the underlying pathogenic mechanisms of diseases and facilitate drug repositioning projects. A number of methods for computing disease similarity have been developed; however, none of them were designed to utilize information from the entire protein interaction network, using instead only those interactions involving disease-causing genes. Most previously published methods required gene-disease association data; unfortunately, many diseases still have very few or no associated genes, which has impeded broad adoption of those methods. In this study, we propose a new method (MedNetSim) for computing disease similarity by integrating medical literature and the protein interaction network. MedNetSim consists of a network-based method (NetSim), which employs the entire protein interaction network, and a MEDLINE-based method (MedSim), which computes disease similarity by mining the biomedical literature. Among function-based methods, NetSim achieved the best performance. Its average AUC (area under the receiver operating characteristic curve) reached 95.2 %. MedSim, whose performance was even comparable to some function-based methods, acquired the highest average AUC among all semantic-based methods. Integration of MedSim and NetSim (MedNetSim) further improved the average AUC to 96.4 %. We further studied the effectiveness of different data sources. It was found that the quality of protein interaction data was more important than its volume. On the contrary, a higher volume of gene-disease association data was more beneficial, even with lower reliability. Utilizing a higher volume of disease-related gene data further improved the average AUC of MedNetSim and NetSim to 97.5 % and 96.7 %, respectively. Integrating biomedical literature and the protein interaction network can be an effective way to compute disease similarity. 
    Lacking sufficient disease-related gene data, literature-based methods such as MedSim can be a great addition to function-based algorithms. It may be beneficial to steer more resources toward studying gene-disease associations and improving the quality of protein interaction data. Disease similarities can be computed using the proposed methods at http://www.digintelli.com:8000/.
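
    The integration step can be illustrated as a simple blend of the two per-pair scores (the linear combination, the weight alpha, and the disease names are illustrative; the abstract does not specify MedNetSim's exact integration rule):

```python
def integrate_similarity(med_scores, net_scores, alpha=0.5):
    """Blend a literature-based (MedSim-like) and a network-based
    (NetSim-like) similarity score for the same disease pairs.
    alpha weights the literature score; 1 - alpha the network score."""
    assert med_scores.keys() == net_scores.keys()
    return {pair: alpha * med_scores[pair] + (1 - alpha) * net_scores[pair]
            for pair in med_scores}

# hypothetical scores for two disease pairs
med = {("asthma", "copd"): 0.8, ("asthma", "flu"): 0.3}
net = {("asthma", "copd"): 0.6, ("asthma", "flu"): 0.2}
combined = integrate_similarity(med, net)
# ("asthma", "copd") blends to 0.7; ("asthma", "flu") to 0.25
```

    The point of such a blend is exactly the one the abstract makes: when one source (e.g. gene-disease associations) is sparse for a disease, the other source still contributes a usable signal.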

  19. SiGN: large-scale gene network estimation environment for high performance computing.

    PubMed

    Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer", which is planned to achieve 10 petaflops in 2012, and for other high performance computing environments, including the Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and are therefore designed to exploit the speed of 10 petaflops. The software will be freely available for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed with Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/.

  20. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.

    PubMed

    Schwemmer, Michael A; Fairhall, Adrienne L; Denève, Sophie; Shea-Brown, Eric T

    2015-07-15

    While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denève and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denève, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. 
    We then show that our network reproduces a number of key features of cortical networks including irregular, Poisson-like spike times, and a tight balance between excitation and inhibition. These results significantly increase the biological plausibility of the spike-based approach to network computation, and uncover how several components of biological networks may work together to efficiently carry out computation.
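
    The firing rule at the heart of this framework — a neuron spikes only when doing so reduces the network's output error — can be sketched for the simplest case: a homogeneous population with instantaneous synapses tracking a constant one-dimensional signal (all parameters below are illustrative, not taken from the paper):

```python
import numpy as np

def track_signal(x=1.0, w=0.1, lam=10.0, dt=1e-3, t_max=1.0):
    """Greedy spike-coding sketch: the readout x_hat decays leakily and a
    spike adds the decoding weight w. A spike is emitted only when the
    'membrane potential' V = w * (x - x_hat) exceeds w**2 / 2, i.e. only
    when firing strictly reduces the squared readout error."""
    steps = int(t_max / dt)
    x_hat, trace = 0.0, []
    for _ in range(steps):
        x_hat += dt * (-lam * x_hat)   # leaky decay of the readout
        v = w * (x - x_hat)            # shared potential (homogeneous pop.)
        if v > w * w / 2:              # spiking lowers ||x - x_hat||^2
            x_hat += w                 # one spike per time step (greedy)
        trace.append(x_hat)
    return np.array(trace)

trace = track_signal()
# after a brief transient, the readout hovers tightly around the target x = 1.0
```

    Because each spike is emitted only when it reduces the error, spike timing is precise and the readout stays within one synaptic weight of the target, which is the intuition behind the tight excitation/inhibition balance in the full model.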

  1. Extensive cross-talk and global regulators identified from an analysis of the integrated transcriptional and signaling network in Escherichia coli.

    PubMed

    Antiqueira, Lucas; Janga, Sarath Chandra; Costa, Luciano da Fontoura

    2012-11-01

    To understand the regulatory dynamics of transcription factors (TFs) and their interplay with other cellular components, we have integrated the transcriptional, protein-protein, and allosteric or equivalent interactions which mediate the physiological activity of TFs in Escherichia coli. To study this integrated network we computed a set of network measurements, followed by principal component analysis (PCA), investigated the correlations between network structure and dynamics, and carried out a procedure for motif detection. In particular, we show that outliers identified in the integrated network based on their network properties correspond to previously characterized global transcriptional regulators. Furthermore, outliers are highly and widely expressed across conditions, thus supporting their global nature in controlling many genes in the cell. Motifs revealed that TFs not only interact physically with each other but also obtain feedback from signals delivered by signaling proteins, supporting the extensive cross-talk between different types of networks. Our analysis can lead to the development of a general framework for detecting and understanding global regulatory factors in regulatory networks and reinforces the importance of integrating multiple types of interactions in underpinning the interrelationships between them.
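
    The outlier-detection step can be sketched in miniature: compute a few per-node measures, standardize them, and project onto the first principal component, where hub-like nodes stand out (the paper uses a much richer measurement set; the two measures below are illustrative):

```python
import numpy as np

def pca_outliers(A, k=1):
    """Compute per-node degree and local clustering coefficient for a
    symmetric 0/1 adjacency matrix, z-score the measures, project onto
    the first principal component (via SVD), and return the k nodes
    with the largest |score|."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    clust = np.zeros(n)
    for i in range(n):
        nbrs = np.nonzero(A[i])[0]
        d = len(nbrs)
        if d >= 2:  # clustering undefined for degree < 2
            links = A[np.ix_(nbrs, nbrs)].sum() / 2
            clust[i] = links / (d * (d - 1) / 2)
    X = np.column_stack([deg, clust])
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ vt[0]                     # first principal component
    return np.argsort(-np.abs(scores))[:k].tolist()

# star network: node 0 is a hub, analogous to a global regulator
A = np.zeros((6, 6))
for j in range(1, 6):
    A[0, j] = A[j, 0] = 1.0
hub = pca_outliers(A)                      # flags node 0
```

    In the paper the same logic, applied to the integrated E. coli network with many more measures, is what singles out the known global regulators as outliers.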

  2. Association of Small Computer Users in Education (ASCUE) Summer Conference. Proceedings (27th, North Myrtle Beach, South Carolina, June 12-16, 1994).

    ERIC Educational Resources Information Center

    Huston, Rick, Ed.; Armel, Donald, Ed.

    Topics addressed by 40 papers from a conference on microcomputers include: developing a campus wide computer ethics policy; integrating new technologies into professional education; campus computer networks; computer assisted instruction; client/server architecture; competencies for entry-level computing positions; auditing and professional…

  3. Tele-Medicine Applications of an ISDN-Based Tele-Working Platform

    DTIC Science & Technology

    2001-10-25

    developed over the Hellenic Integrated Services Digital Network (ISDN), is based on user terminals (personal computers), networking apparatus, and a...key infrastructure, ready to offer enhanced message switching and translation in response to market trends [8]. Three (3) years ago, the Hellenic PTT...should outcome to both an integrated Tele- Working platform, a main central database (completed with maintenance facilities), and a ready-to-be

  4. DeepX: Deep Learning Accelerator for Restricted Boltzmann Machine Artificial Neural Networks.

    PubMed

    Kim, Lok-Won

    2018-05-01

    Although there have been many decades of research and commercial presence on high performance general purpose processors, there are still many applications that require fully customized hardware architectures for further computational acceleration. Recently, deep learning has been successfully applied in a wide variety of applications, but its heavy computational demand has considerably limited practical deployments. This paper proposes a fully pipelined acceleration architecture to alleviate the high computational demand of restricted Boltzmann machine (RBM) artificial neural networks (ANNs). The implemented RBM ANN accelerator (integrating network size, using 128 input cases per batch, and running at a 303-MHz clock frequency) integrated in a state-of-the-art field-programmable gate array (FPGA) (Xilinx Virtex 7 XC7V-2000T) provides a computational performance of 301-billion connection-updates-per-second and about 193 times higher performance than a software solution running on general purpose processors. Most importantly, the architecture enables over 4 times (12 times in batch learning) higher performance compared with a previous work when both are implemented in an FPGA device (XC2VP70).
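
    For reference, the weight update such an accelerator pipelines — one step of contrastive divergence (CD-1) for a binary RBM — can be sketched in NumPy as a minimal software baseline (biases and the accelerator's fixed-point details are omitted; all sizes and rates are illustrative):

```python
import numpy as np

def cd1_step(W, v0, rng, lr=0.1):
    """One CD-1 update for a binary RBM with weight matrix W
    (visible x hidden). Returns the updated W (modified in place)
    and the batch reconstruction error."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h0 = sigmoid(v0 @ W)                         # hidden probabilities
    h_samp = (rng.random(h0.shape) < h0) * 1.0   # stochastic hidden states
    v1 = sigmoid(h_samp @ W.T)                   # reconstruction
    h1 = sigmoid(v1 @ W)
    W += lr * (v0.T @ h0 - v1.T @ h1) / v0.shape[0]
    return W, float(((v0 - v1) ** 2).mean())

rng = np.random.default_rng(0)
W = 0.01 * rng.standard_normal((6, 4))           # 6 visible, 4 hidden units
batch = np.tile(np.array([1.0, 1, 0, 0, 1, 0]), (8, 1))  # repeated pattern
errs = [cd1_step(W, batch, rng)[1] for _ in range(200)]
# reconstruction error shrinks as the RBM learns the pattern
```

    The accelerator's "connection-updates-per-second" figure counts exactly these per-weight multiply-accumulate operations, which is why a fully pipelined datapath yields such large speedups over a general-purpose CPU.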

  5. ENFIN a network to enhance integrative systems biology.

    PubMed

    Kahlem, Pascal; Birney, Ewan

    2007-12-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives for enabling research at the systems level. The European Network of Excellence ENFIN is engaged in developing both an adapted infrastructure to connect databases and platforms for generating new bioinformatics tools, and the experimental validation of computational predictions. We give an overview of the projects tackled within ENFIN and discuss the challenges associated with integration for systems biology.

  6. Network-based study reveals potential infection pathways of hepatitis-C leading to various diseases.

    PubMed

    Mukhopadhyay, Anirban; Maulik, Ujjwal

    2014-01-01

    Protein-protein interaction network-based study of viral pathogenesis has been gaining popularity among computational biologists in recent years. In the present study, we attempt to investigate the possible pathways of hepatitis-C virus (HCV) infection by integrating the HCV-human interaction network, the human protein interactome and the human genetic disease association network. We have proposed quasi-biclique and quasi-clique mining algorithms to integrate these three networks to identify infection gateway host proteins and possible pathways of HCV pathogenesis leading to various diseases. The integrated study of the three networks, namely the HCV-human interaction network, the human protein interaction network, and the human protein-disease association network, reveals potential pathways of infection by the HCV that lead to various diseases, including cancers. The gateway proteins have been found to be biologically coherent and to have high degrees in the human interactome compared to the other virus-targeted proteins. The analyses done in this study provide possible targets for more effective anti-hepatitis-C therapeutic intervention.
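
    One common definition of a quasi-clique — a node set whose induced edge density reaches a threshold γ — and a naive greedy peeling search for one can be sketched as follows (an illustration of the concept only, not the authors' quasi-clique/quasi-biclique mining algorithms):

```python
import numpy as np

def is_quasi_clique(A, nodes, gamma=0.6):
    """A node set is a gamma-quasi-clique if its induced edge density
    (edges present / edges possible) is at least gamma."""
    k = len(nodes)
    if k < 2:
        return True
    sub = A[np.ix_(nodes, nodes)]
    edges = sub.sum() / 2
    return edges / (k * (k - 1) / 2) >= gamma

def greedy_quasi_clique(A, gamma=0.6):
    """Peel the minimum-degree node until the remainder is dense enough."""
    nodes = list(range(A.shape[0]))
    while len(nodes) > 1 and not is_quasi_clique(A, nodes, gamma):
        sub = A[np.ix_(nodes, nodes)]
        nodes.pop(int(np.argmin(sub.sum(axis=1))))
    return nodes

# triangle {0, 1, 2} plus a weakly attached node 3
A = np.zeros((4, 4))
for i, j in [(0, 1), (0, 2), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
dense = greedy_quasi_clique(A, gamma=0.9)   # peels node 3, keeps the triangle
```

    Relaxing full cliques to quasi-cliques is what makes such mining tolerant to the missing edges typical of experimentally derived interaction networks.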

  7. Network-Based Study Reveals Potential Infection Pathways of Hepatitis-C Leading to Various Diseases

    PubMed Central

    Mukhopadhyay, Anirban; Maulik, Ujjwal

    2014-01-01

    Protein-protein interaction network-based study of viral pathogenesis has been gaining popularity among computational biologists in recent years. In the present study, we attempt to investigate the possible pathways of hepatitis-C virus (HCV) infection by integrating the HCV-human interaction network, the human protein interactome and the human genetic disease association network. We have proposed quasi-biclique and quasi-clique mining algorithms to integrate these three networks to identify infection gateway host proteins and possible pathways of HCV pathogenesis leading to various diseases. The integrated study of the three networks, namely the HCV-human interaction network, the human protein interaction network, and the human protein-disease association network, reveals potential pathways of infection by the HCV that lead to various diseases, including cancers. The gateway proteins have been found to be biologically coherent and to have high degrees in the human interactome compared to the other virus-targeted proteins. The analyses done in this study provide possible targets for more effective anti-hepatitis-C therapeutic intervention. PMID:24743187

  8. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences

    PubMed Central

    Rudd, Michael E.

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4. PMID:25202253
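
    The core edge-integration computation — summing weighted steps in log luminance along a path from the common background to the target — can be sketched directly (with unit weights the sum telescopes to a single log ratio; any non-unit weights standing in for perceptual edge gains are illustrative):

```python
import math

def edge_integrated_lightness(luminances, weights=None):
    """Sum (optionally weighted) steps in log luminance along a path whose
    first element is the common background and whose last element is the
    target. With unit weights this reduces to log(L_target / L_background)."""
    steps = [math.log(b / a) for a, b in zip(luminances, luminances[1:])]
    if weights is None:
        weights = [1.0] * len(steps)
    return sum(w * s for w, s in zip(weights, steps))

# path: background 10 cd/m^2 -> annulus 20 -> target 40 (toy values)
unit = edge_integrated_lightness([10.0, 20.0, 40.0])
# unit equals log(40/10): the unweighted log-ratio steps telescope
```

    The model's interesting behavior comes precisely from the non-unit case: when grouping, border ownership, or attention change the gain on an individual edge, the weighted sum no longer telescopes, so perceived lightness deviates from the simple target/background ratio.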

  9. A cortical edge-integration model of object-based lightness computation that explains effects of spatial context and individual differences.

    PubMed

    Rudd, Michael E

    2014-01-01

    Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4.

  10. Improving the Capture and Re-Use of Data with Wearable Computers

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara; Fating, Curtis C.; Green, Daniel; Powers, Edward I. (Technical Monitor)

    2001-01-01

    At the Goddard Space Flight Center, members of the Real-Time Software Engineering Branch are developing a wearable, wireless, voice-activated computer for use in a wide range of crosscutting space applications that would benefit from having instant Internet, network, and computer access with complete mobility and hands-free operation. These applications span many fields and disciplines, including spacecraft fabrication, integration and testing (including environmental testing), and astronaut on-orbit control and monitoring of experiments with ground-based experimenters. To satisfy the needs of NASA customers, this wearable computer needs to be connected to a wireless network, to transmit and receive real-time video over the network, and to receive updated documents via the Internet or NASA servers. The voice-activated computer, with a unique vocabulary, will allow users to access documentation hands-free and interact in real time with remote users. We will discuss wearable computer development, hardware and software issues, wireless network limitations, video/audio solutions, and difficulties in language development.

  11. Integrative Analysis of Many Weighted Co-Expression Networks Using Tensor Computation

    PubMed Central

    Li, Wenyuan; Liu, Chun-Chi; Zhang, Tong; Li, Haifeng; Waterman, Michael S.; Zhou, Xianghong Jasmine

    2011-01-01

    The rapid accumulation of biological networks poses new challenges and calls for powerful integrative analysis tools. Most existing methods capable of simultaneously analyzing a large number of networks were primarily designed for unweighted networks, and cannot easily be extended to weighted networks. However, it is known that transforming weighted into unweighted networks by dichotomizing the edges of weighted networks with a threshold generally leads to information loss. We have developed a novel, tensor-based computational framework for mining recurrent heavy subgraphs in a large set of massive weighted networks. Specifically, we formulate the recurrent heavy subgraph identification problem as a heavy 3D subtensor discovery problem with sparse constraints. We describe an effective approach to solving this problem by designing a multi-stage, convex relaxation protocol, and a non-uniform edge sampling technique. We applied our method to 130 co-expression networks, and identified 11,394 recurrent heavy subgraphs, grouped into 2,810 families. We demonstrated that the identified subgraphs represent meaningful biological modules by validating against a large set of compiled biological knowledge bases. We also showed that the likelihood for a heavy subgraph to be meaningful increases significantly with its recurrence in multiple networks, highlighting the importance of the integrative approach to biological network analysis. Moreover, our approach based on weighted graphs detects many patterns that would be overlooked using unweighted graphs. In addition, we identified a large number of modules that occur predominantly under specific phenotypes. This analysis resulted in a genome-wide mapping of gene network modules onto the phenome. Finally, by comparing module activities across many datasets, we discovered high-order dynamic cooperativeness in protein complex networks and transcriptional regulatory networks. PMID:21698123
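
    Stacking the weighted networks over a common node set yields the 3D tensor the authors mine; how a candidate (node set, network set) block is scored for "heaviness" can be sketched as follows (scoring only; the paper's convex-relaxation search for such blocks is not reproduced, and all sizes are toy values):

```python
import numpy as np

def block_heaviness(T, nodes, nets):
    """Average off-diagonal edge weight of the subtensor induced by
    `nodes` within the selected `nets` of tensor T (net x node x node):
    a measure of how heavy and recurrent the candidate subgraph is."""
    nodes, nets = list(nodes), list(nets)
    sub = T[np.ix_(nets, nodes, nodes)]       # |nets| x |nodes| x |nodes|
    k = len(nodes)
    off_diag = sub.sum() - np.trace(sub, axis1=1, axis2=2).sum()
    return off_diag / (len(nets) * k * (k - 1))

rng = np.random.default_rng(1)
T = rng.random((3, 6, 6)) * 0.2               # 3 weighted networks, 6 nodes
T[:2, :3, :3] += 0.7                          # heavy module on nodes 0-2 in nets 0-1
score_heavy = block_heaviness(T, [0, 1, 2], [0, 1])
score_rand  = block_heaviness(T, [3, 4, 5], [0, 1])
# the recurrent heavy block scores far above a background block
```

    Working on the weighted tensor directly is what preserves the signal that thresholding to unweighted graphs would destroy.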

  12. Data-driven integration of genome-scale regulatory and metabolic network models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Saheed; Schauble, Sascha; Brooks, Aaron N.

    Microbes are diverse and extremely versatile organisms that play vital roles in all ecological niches. Understanding and harnessing microbial systems will be key to the sustainability of our planet. One approach to improving our knowledge of microbial processes is through data-driven and mechanism-informed computational modeling. Individual models of biological networks (such as metabolism, transcription, and signaling) have played pivotal roles in driving microbial research through the years. These networks, however, are highly interconnected and function in concert, a fact that has led to the development of a variety of approaches aimed at simulating the integrated functions of two or more network types. Though the task of integrating these different models is fraught with new challenges, the large amounts of high-throughput data sets being generated, and algorithms being developed, mean that the time is at hand for concerted efforts to build integrated regulatory-metabolic networks in a data-driven fashion. In this perspective, we review current approaches for constructing integrated regulatory-metabolic models and outline new strategies for future development of these network models for any microbial system.

  13. Data-driven integration of genome-scale regulatory and metabolic network models

    DOE PAGES

    Imam, Saheed; Schauble, Sascha; Brooks, Aaron N.; ...

    2015-05-05

    Microbes are diverse and extremely versatile organisms that play vital roles in all ecological niches. Understanding and harnessing microbial systems will be key to the sustainability of our planet. One approach to improving our knowledge of microbial processes is through data-driven and mechanism-informed computational modeling. Individual models of biological networks (such as metabolism, transcription, and signaling) have played pivotal roles in driving microbial research through the years. These networks, however, are highly interconnected and function in concert, a fact that has led to the development of a variety of approaches aimed at simulating the integrated functions of two or more network types. Though the task of integrating these different models is fraught with new challenges, the large amounts of high-throughput data sets being generated, and algorithms being developed, mean that the time is at hand for concerted efforts to build integrated regulatory-metabolic networks in a data-driven fashion. In this perspective, we review current approaches for constructing integrated regulatory-metabolic models and outline new strategies for future development of these network models for any microbial system.

  14. Understanding principles of integration and segregation using whole-brain computational connectomics: implications for neuropsychiatric disorders

    PubMed Central

    Lord, Louis-David; Stevner, Angus B.; Kringelbach, Morten L.

    2017-01-01

    To survive in an ever-changing environment, the brain must seamlessly integrate a rich stream of incoming information into coherent internal representations that can then be used to efficiently plan for action. The brain must, however, balance its ability to integrate information from various sources with a complementary capacity to segregate information into modules which perform specialized computations in local circuits. Importantly, evidence suggests that imbalances in the brain's ability to bind together and/or segregate information over both space and time are a common feature of several neuropsychiatric disorders. Until recently, however, most studies have attempted to characterize the principles of integration and segregation only in static (i.e. time-invariant) representations of human brain networks, hence disregarding the complex spatio-temporal nature of these processes. In the present Review, we describe how the emerging discipline of whole-brain computational connectomics may be used to study the causal mechanisms of the integration and segregation of information on behaviourally relevant timescales. We emphasize how novel methods from network science and whole-brain computational modelling can expand beyond traditional neuroimaging paradigms and help to uncover the neurobiological determinants of the abnormal integration and segregation of information in neuropsychiatric disorders. This article is part of the themed issue ‘Mathematical methods in medicine: neuroscience, cardiology and pathology’. PMID:28507228

  15. Mergeomics: a web server for identifying pathological pathways, networks, and key regulators via multidimensional data integration.

    PubMed

    Arneson, Douglas; Bhattacharya, Anindya; Shu, Le; Mäkinen, Ville-Petteri; Yang, Xia

    2016-09-09

    Human diseases are commonly the result of multidimensional changes at molecular, cellular, and systemic levels. Recent advances in genomic technologies have enabled an outpouring of omics datasets that capture these changes. However, separate analyses of these various data only provide fragmented understanding and do not capture the holistic view of disease mechanisms. To meet the urgent need for tools that effectively integrate multiple types of omics data to derive biological insights, we have developed Mergeomics, a computational pipeline that integrates multidimensional disease association data with functional genomics and molecular networks to retrieve biological pathways, gene networks, and central regulators critical for disease development. To make the Mergeomics pipeline available to a wider research community, we have implemented an online, user-friendly web server ( http://mergeomics.idre.ucla.edu/ ). The web server features a modular implementation of the Mergeomics pipeline with detailed tutorials. Additionally, it provides curated genomic resources including tissue-specific expression quantitative trait loci, ENCODE functional annotations, biological pathways, and molecular networks, and offers interactive visualization of analytical results. Multiple computational tools including Marker Dependency Filtering (MDF), Marker Set Enrichment Analysis (MSEA), Meta-MSEA, and Weighted Key Driver Analysis (wKDA) can be used separately or in flexible combinations. User-defined summary-level genomic association datasets (e.g., genetic, transcriptomic, epigenomic) related to a particular disease or phenotype can be uploaded and computed in real time to yield biologically interpretable results, which can be viewed online and downloaded for later use. Our Mergeomics web server offers researchers flexible and user-friendly tools to facilitate integration of multidimensional data into holistic views of disease mechanisms in the form of tissue-specific key regulators, biological pathways, and gene networks.

  16. Compact VLSI neural computer integrated with active pixel sensor for real-time ATR applications

    NASA Astrophysics Data System (ADS)

    Fang, Wai-Chi; Udomkesmalee, Gabriel; Alkalai, Leon

    1997-04-01

    A compact VLSI neural computer integrated with an active pixel sensor has been under development to mimic what is inherent in biological vision systems. This electronic eye-brain computer is targeted at real-time machine vision applications which require both high-bandwidth communication and high-performance computing for data sensing, synergy of multiple types of sensory information, feature extraction, target detection, target recognition, and control functions. The neural computer is based on a composite structure which combines the Annealing Cellular Neural Network (ACNN) and the Hierarchical Self-Organization Neural Network (HSONN). The ACNN architecture is a programmable and scalable multi-dimensional array of annealing neurons, each locally connected to its neighboring neurons. Meanwhile, the HSONN adopts a hierarchical structure with nonlinear basis functions. The ACNN+HSONN neural computer is effectively designed to perform programmable functions for machine vision processing at all levels with its embedded host processor. It provides a two order-of-magnitude increase in computation power over state-of-the-art microcomputer and DSP microelectronics. The feasibility of a compact current-mode VLSI design of the ACNN+HSONN neural computer is demonstrated by a 3D 16X8X9-cube neural processor chip design in a 2-micrometer CMOS technology. Integration of this neural computer as one slice of a 4'X4' multichip module into the 3D MCM-based avionics architecture for NASA's New Millennium Program is also described.

  17. Web-Based Learning in the Computer-Aided Design Curriculum.

    ERIC Educational Resources Information Center

    Sung, Wen-Tsai; Ou, S. C.

    2002-01-01

    Applies principles of constructivism and virtual reality (VR) to computer-aided design (CAD) curriculum, particularly engineering, by integrating network, VR and CAD technologies into a Web-based learning environment that expands traditional two-dimensional computer graphics into a three-dimensional real-time simulation that enhances user…

  18. Design and implementation of space physics multi-model application integration based on web

    NASA Astrophysics Data System (ADS)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, building a networked online computing environment for space weather, space environment and space physics models has become increasingly important for the Chinese scientific community in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The C/S mode, which is traditional and stand-alone, demands a team or workshop from many disciplines and specialties to build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models to compute, and it makes accessing the data inconvenient. Therefore, it is necessary to create a shared network resource access environment that helps users quickly reach the computing resources of space physics models from a terminal for conducting space science research and forecasting the space environment. The SPMAIS is developed in B/S mode as a high-performance environment based on first-principles computational models of the space environment, and uses these models to predict space weather, to interpret space mission data and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. To date, the SPMAIS contains dozens of space environment models, including the international AP8/AE8, IGRF and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model and other models developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which provide input data for online high-speed model computation.
    In this paper, the service-oriented architecture (SOA) concept, which divides a system into independent modules according to different business needs, is applied to solve the problem of physical separation between multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the space physics multi-model application integrated system. JSP + servlet + JavaBean technology is used to integrate the web applications of the space physics models; it solves the problem of multiple users requesting the same model-computation job and effectively balances computing tasks across servers. In addition, we complete the following tasks: establishing a standard graphical user interface based on a Java applet; designing the interface between model computation and visualization of model results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve an interactive three-dimensional network scene; and improving the ability to interact with web pages and dynamic execution capabilities, including rendering of three-dimensional graphics and control of fonts and color. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics multi-model applications. Practical application shows that researchers benefit from our system in space physics research and engineering applications.
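
    The multi-user issue mentioned above (several researchers requesting the same model computation) can be sketched as a server-side cache that coalesces identical jobs. All names here are illustrative; the actual SPMAIS uses JSP/servlet components, and this Python sketch only mirrors the idea.

```python
# Illustrative sketch: identical model-run requests share one computation.

class ModelJobCache:
    def __init__(self, run_model):
        self.run_model = run_model   # the expensive model computation
        self.results = {}            # (model, params) -> cached result
        self.runs = 0                # count of real computations performed

    def submit(self, model_name, **params):
        key = (model_name, tuple(sorted(params.items())))
        if key not in self.results:          # only the first request computes
            self.runs += 1
            self.results[key] = self.run_model(model_name, params)
        return self.results[key]

def fake_igrf(name, params):
    """Hypothetical stand-in for a geomagnetic field model run."""
    return f"{name} field at lat={params['lat']}"

cache = ModelJobCache(fake_igrf)
first = cache.submit("IGRF", lat=40.0)
second = cache.submit("IGRF", lat=40.0)   # second user: served from cache
```

    Keying the cache on the full parameter set is what lets concurrent users share results without ever observing each other's requests.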

  19. Social Networks Users: Fear of Missing out in Preservice Teachers

    ERIC Educational Resources Information Center

    Gezgin, Deniz Mertkan; Hamutoglu, Nazire Burcin; Gemikonakli, Orhan; Raman, Ilhan

    2017-01-01

    As mobile computing and smartphones become an integrated part of our lives, the time individuals spend on social networks has significantly increased. Moreover, a link has been established between the uncontrolled use of social networks and the development of undesirable habits and behaviors including addictions. One such behavior, namely, fear of…

  20. Efficient Embedded Decoding of Neural Network Language Models in a Machine Translation System.

    PubMed

    Zamora-Martinez, Francisco; Castro-Bleda, Maria Jose

    2018-02-22

    Neural Network Language Models (NNLMs) are a successful approach to Natural Language Processing tasks, such as Machine Translation. We introduce in this work a Statistical Machine Translation (SMT) system which fully integrates NNLMs in the decoding stage, breaking with the traditional approach based on n-best list rescoring. The neural net models (both language models (LMs) and translation models) are fully coupled in the decoding stage, allowing them to more strongly influence the translation quality. Computational issues were solved by using a novel idea based on memorization and smoothing of the softmax constants to avoid their computation, which introduces a trade-off between LM quality and computational cost. These ideas were studied in a machine translation task with different combinations of neural networks used both as translation models and as target LMs, comparing phrase-based and n-gram-based systems, showing that the integrated approach seems more promising for n-gram-based systems, even with non-full-quality NNLMs.
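
    The memorization-and-smoothing idea can be sketched as follows: the softmax denominator Z(h) is cached per context the first time it is needed, and contexts never seen before fall back to a smoothed default constant instead of triggering a full softmax. This is an illustrative reading of the trade-off, not the authors' implementation.

```python
import math

class MemoizedSoftmaxLM:
    """Toy LM that caches softmax normalization constants per context."""

    def __init__(self, scores):
        self.scores = scores         # scores[context][word] -> raw logit
        self.z_cache = {}            # context -> cached log-denominator
        self.default_log_z = 0.0     # smoothed constant for unseen contexts

    def log_prob(self, context, word):
        if context in self.z_cache:                  # memorized constant
            log_z = self.z_cache[context]
        elif context in self.scores:                 # compute once, then cache
            log_z = math.log(sum(math.exp(s)
                                 for s in self.scores[context].values()))
            self.z_cache[context] = log_z
            # crude smoothing: running mean of the constants seen so far
            self.default_log_z = sum(self.z_cache.values()) / len(self.z_cache)
        else:                                        # unseen: approximate Z
            log_z = self.default_log_z
        return self.scores.get(context, {}).get(word, -5.0) - log_z

lm = MemoizedSoftmaxLM({"the": {"cat": 1.0, "dog": 0.5}})
p_total = math.exp(lm.log_prob("the", "cat")) + math.exp(lm.log_prob("the", "dog"))
```

    Approximating Z for unseen contexts is exactly where the LM-quality/computational-cost trade-off described in the abstract enters.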

  1. Neuronal integration of dynamic sources: Bayesian learning and Bayesian inference.

    PubMed

    Siegelmann, Hava T; Holzman, Lars E

    2010-09-01

    One of the brain's most basic functions is integrating sensory data from diverse sources. This ability causes us to question whether the neural system is computationally capable of intelligently integrating data, not only when sources have known, fixed relative dependencies but also when it must determine such relative weightings based on dynamic conditions, and then use these learned weightings to accurately infer information about the world. We suggest that the brain is, in fact, fully capable of computing this parallel task in a single network and describe a neural inspired circuit with this property. Our implementation suggests the possibility that evidence learning requires a more complex organization of the network than was previously assumed, where neurons have different specialties, whose emergence brings the desired adaptivity seen in human online inference.
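
    A minimal stand-in for this kind of weighted source integration is classic inverse-variance (precision) weighting, in which each source's learned reliability sets its weight; the neural circuit in the paper is considerably richer, but the arithmetic below captures the target computation.

```python
def fuse(estimates):
    """Combine (mean, variance) pairs by inverse-variance weighting."""
    total_precision = sum(1.0 / v for _, v in estimates)
    mean = sum(m / v for m, v in estimates) / total_precision
    return mean, 1.0 / total_precision

# Two sources report the same quantity; the first is four times more reliable.
fused_mean, fused_var = fuse([(10.0, 1.0), (12.0, 4.0)])
```

    The fused estimate lies closer to the more reliable source and its variance is lower than either input's, which is the hallmark of optimal Bayesian cue combination.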

  2. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models.

    PubMed

    Mazzoni, Alberto; Lindén, Henrik; Cuntz, Hermann; Lansner, Anders; Panzeri, Stefano; Einevoll, Gaute T

    2015-12-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best "LFP proxy", we compared LFP predictions from candidate proxies based on LIF network output (e.g., firing rates, membrane potentials, synaptic currents) with "ground-truth" LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo.
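
    The "fixed linear combination of the LIF synaptic currents" can be sketched as a delayed, weighted difference of the population AMPA and GABA currents. The delay and weight used below are placeholders chosen for illustration, not the fitted values reported in the paper.

```python
import numpy as np

def lfp_proxy(ampa, gaba, dt_ms=1.0, delay_ms=6.0, alpha=1.65):
    """Estimate the LFP as delayed excitation minus weighted inhibition."""
    shift = int(round(delay_ms / dt_ms))
    ampa_delayed = np.roll(ampa, shift)
    ampa_delayed[:shift] = ampa[0]      # pad the start instead of wrapping
    return ampa_delayed - alpha * gaba

# Toy population current traces at 1 ms resolution.
t = np.arange(0.0, 200.0, 1.0)
ampa = np.sin(2 * np.pi * t / 50.0)
gaba = 0.5 * np.sin(2 * np.pi * t / 50.0 + 0.5)
lfp = lfp_proxy(ampa, gaba)
```

    Because the combination is fixed, the proxy can be applied to any LIF simulation output without re-running the expensive 3D multi-compartmental model.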

  3. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models

    PubMed Central

    Cuntz, Hermann; Lansner, Anders; Panzeri, Stefano; Einevoll, Gaute T.

    2015-01-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best “LFP proxy”, we compared LFP predictions from candidate proxies based on LIF network output (e.g., firing rates, membrane potentials, synaptic currents) with “ground-truth” LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo. PMID:26657024

  4. CIM for 300-mm semiconductor fab

    NASA Astrophysics Data System (ADS)

    Luk, Arthur

    1997-08-01

    Five years ago, factory automation (F/A) was not prevalent in the fab. Today, facing a drastically changed market and intense competition, management requests that plant-floor data be forwarded to their desktop computers. This increased demand rapidly pushed F/A toward computer-integrated manufacturing (CIM). Through personalization, computer size was successfully reduced so that computers fit on our desktops; the PC initiated a new computing era. With the advent of the network, the network computer (NC) creates fresh problems for us. As we plan to invest more than $3 billion to build a new 300-mm fab, the next-generation technology raises a challenging bar.

  5. Should Secondary Schools Buy Local Area Networks?

    ERIC Educational Resources Information Center

    Hyde, Hartley

    1986-01-01

    The advantages of microcomputer networks include resource sharing, multiple user communications, and integrating data processing and office automation. This article nonetheless favors stand-alone computers for Australian secondary school classrooms because of unreliable hardware, software design, and copyright problems, and individual progress…

  6. Corridor One: An Integrated Distance Visualization Environment for SSI+ASCI Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher R. Johnson, Charles D. Hansen

    2001-10-29

    The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing and high-performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, "Computational Grids" and CAVE technology and to add these to the teams that had developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.

  7. Implementing Internet of Things in a military command and control environment

    NASA Astrophysics Data System (ADS)

    Raglin, Adrienne; Metu, Somiya; Russell, Stephen; Budulas, Peter

    2017-05-01

    While the term Internet of Things (IoT) has been coined relatively recently, it has deep roots in multiple other areas of research including cyber-physical systems, pervasive and ubiquitous computing, embedded systems, mobile ad-hoc networks, wireless sensor networks, cellular networks, wearable computing, cloud computing, big data analytics, and intelligent agents. As the Internet of Things, these technologies have created a landscape of diverse heterogeneous capabilities and protocols that will require adaptive controls to effect linkages and changes that are useful to end users. In the context of military applications, it will be necessary to integrate disparate IoT devices into a common platform that necessarily must interoperate with proprietary military protocols, data structures, and systems. In this environment, IoT devices and data will not be homogeneous and provenance-controlled (i.e. single vendor/source/supplier owned). This paper presents a discussion of the challenges of integrating varied IoT devices and related software in a military environment. A review of contemporary commercial IoT protocols is given and as a practical example, a middleware implementation is proffered that provides transparent interoperability through a proactive message dissemination system. The implementation is described as a framework through which military applications can integrate and utilize commercial IoT in conjunction with existing military sensor networks and command and control (C2) systems.
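
    The proactive message dissemination layer described above can be sketched as a minimal topic-based publish/subscribe broker; all names are illustrative and none of this reflects the authors' actual middleware.

```python
class Broker:
    """Tiny topic-based publish/subscribe hub."""

    def __init__(self):
        self.subscribers = {}   # topic -> list of subscriber callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # proactively push the message to every registered consumer
        for callback in self.subscribers.get(topic, []):
            callback(message)

received = []
broker = Broker()
broker.subscribe("sensor/temperature", received.append)
broker.publish("sensor/temperature", {"value": 21.5, "unit": "C"})
broker.publish("sensor/pressure", {"value": 101.3})   # no subscriber; dropped
```

    In a real C2 integration, the broker boundary is also where commercial IoT payloads would be translated into the military message formats and protocols the abstract mentions.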

  8. Quantification of Interactions between Dynamic Cellular Network Functionalities by Cascaded Layering

    PubMed Central

    Prescott, Thomas P.; Lang, Moritz; Papachristodoulou, Antonis

    2015-01-01

    Large, naturally evolved biomolecular networks typically fulfil multiple functions. When modelling or redesigning such systems, functional subsystems are often analysed independently first, before subsequent integration into larger-scale computational models. In the design and analysis process, it is therefore important to quantitatively analyse and predict the dynamics of the interactions between integrated subsystems; in particular, how the incremental effect of integrating a subsystem into a network depends on the existing dynamics of that network. In this paper we present a framework for simulating the contribution of any given functional subsystem when integrated together with one or more other subsystems. This is achieved through a cascaded layering of a network into functional subsystems, where each layer is defined by an appropriate subset of the reactions. We exploit symmetries in our formulation to exhaustively quantify each subsystem’s incremental effects with minimal computational effort. When combining subsystems, their isolated behaviour may be amplified, attenuated, or be subject to more complicated effects. We propose the concept of mutual dynamics to quantify such nonlinear phenomena, thereby defining the incompatibility and cooperativity between all pairs of subsystems when integrated into any larger network. We exemplify our theoretical framework by analysing diverse behaviours in three dynamic models of signalling and metabolic pathways: the effect of crosstalk mechanisms on the dynamics of parallel signal transduction pathways; reciprocal side-effects between several integral feedback mechanisms and the subsystems they stabilise; and consequences of nonlinear interactions between elementary flux modes in glycolysis for metabolic engineering strategies. 
Our analysis shows that it is not sufficient to just specify subsystems and analyse their pairwise interactions; the environment in which the interaction takes place must also be explicitly defined. Our framework provides a natural representation of nonlinear interaction phenomena, and will therefore be an important tool for modelling large-scale evolved or synthetic biomolecular networks. PMID:25933116

  9. Multisensory integration processing during olfactory-visual stimulation-An fMRI graph theoretical network analysis.

    PubMed

    Ripp, Isabelle; Zur Nieden, Anna-Nora; Blankenagel, Sonja; Franzmeier, Nicolai; Lundström, Johan N; Freiherr, Jessica

    2018-05-07

    In this study, we aimed to understand how whole-brain neural networks compute sensory information integration based on the olfactory and visual system. Task-related functional magnetic resonance imaging (fMRI) data was obtained during unimodal and bimodal sensory stimulation. Based on the identification of multisensory integration processing (MIP) specific hub-like network nodes analyzed with network-based statistics using region-of-interest based connectivity matrices, we conclude the following brain areas to be important for processing the presented bimodal sensory information: right precuneus connected contralaterally to the supramarginal gyrus for memory-related imagery and phonology retrieval, and the left middle occipital gyrus connected ipsilaterally to the inferior frontal gyrus via the inferior fronto-occipital fasciculus including functional aspects of working memory. Applied graph theory for quantification of the resulting complex network topologies indicates a significantly increased global efficiency and clustering coefficient in networks including aspects of MIP, reflecting simultaneously better integration and segregation. Graph theoretical analysis of positive and negative network correlations, allowing for inferences about excitatory and inhibitory network architectures, revealed (not significantly, but very consistently) that MIP-specific neural networks are dominated by inhibitory relationships between brain regions involved in stimulus processing. © 2018 Wiley Periodicals, Inc.
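
    The two graph metrics named above, global efficiency and the (average) clustering coefficient, can be computed directly from their definitions; a minimal sketch for a small unweighted, connected graph:

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS distances from src in an unweighted graph (dict of neighbor sets)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def global_efficiency(adj):
    """Average inverse shortest-path length over all ordered node pairs."""
    n = len(adj)
    total = 0.0
    for u in adj:
        dist = shortest_paths(adj, u)
        total += sum(1.0 / d for v, d in dist.items() if v != u)
    return total / (n * (n - 1))

def avg_clustering(adj):
    """Mean fraction of each node's neighbor pairs that are themselves linked."""
    coeffs = []
    for u, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for v in nbrs for w in nbrs if v < w and w in adj[v])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

# Example: a triangle (nodes 0, 1, 2) plus a pendant node 3 attached to 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
```

    High global efficiency indicates good integration (short paths between regions), while a high clustering coefficient indicates segregation into tightly knit local modules, which is why the two are reported together.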

  10. Integrated Distributed Directory Service for KSC

    NASA Technical Reports Server (NTRS)

    Ghansah, Isaac

    1997-01-01

    This paper describes an integrated distributed directory services (DDS) architecture as a fundamental component of KSC distributed computing systems. Specifically, an architecture for an integrated directory service based on DNS and X.500/LDAP has been suggested. The architecture supports using DNS in its traditional role as a name service and X.500 for other services. Specific designs were made for the integration of X.500 DDS for Public Key Certificates, Kerberos Security Services, Network-wide Login, Electronic Mail, WWW URLs, Servers, and other diverse network objects. Issues involved in incorporating the emerging Microsoft Active Directory Service (MADS) into KSC's X.500 were discussed.

  11. MIR@NT@N: a framework integrating transcription factors, microRNAs and their targets to identify sub-network motifs in a meta-regulation network model

    PubMed Central

    2011-01-01

    Background To understand biological processes and diseases, it is crucial to unravel the concerted interplay of transcription factors (TFs), microRNAs (miRNAs) and their targets within regulatory networks and fundamental sub-networks. An integrative computational resource generating a comprehensive view of these regulatory molecular interactions at a genome-wide scale would be of great interest to biologists, but is not available to date. Results To identify and analyze molecular interaction networks, we developed MIR@NT@N, an integrative approach based on a meta-regulation network model and a large-scale database. MIR@NT@N uses a graph-based approach to predict novel molecular actors across multiple regulatory processes (i.e. TFs acting on protein-coding or miRNA genes, or miRNAs acting on messenger RNAs). Exploiting these predictions, the user can generate networks and further analyze them to identify sub-networks, including motifs such as feedback and feedforward loops (FBL and FFL). In addition, networks can be built from lists of molecular actors with an a priori role in a given biological process to predict novel and unanticipated interactions. Analyses can be contextualized and filtered by integrating additional information such as microarray expression data. All results, including generated graphs, can be visualized, saved and exported into various formats. MIR@NT@N's performance has been evaluated using published data, and the approach was then applied to the regulatory program underlying epithelium-to-mesenchyme transition (EMT), an evolutionarily conserved process implicated in embryonic development and disease. Conclusions MIR@NT@N is an effective computational approach to identify novel molecular regulations and to predict gene regulatory networks and sub-networks including conserved motifs within a given biological context.
Taking advantage of the M@IA environment, MIR@NT@N is a user-friendly web resource freely available at http://mironton.uni.lu which will be updated on a regular basis. PMID:21375730

  12. High-Lift Optimization Design Using Neural Networks on a Multi-Element Airfoil

    NASA Technical Reports Server (NTRS)

    Greenman, Roxana M.; Roth, Karlin R.; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The high-lift performance of a multi-element airfoil was optimized by using predictions from neural nets that were trained on a computational data set. The numerical data was generated using a two-dimensional, incompressible, Navier-Stokes algorithm with the Spalart-Allmaras turbulence model. Because it is difficult to predict maximum lift for high-lift systems, an empirically based maximum-lift criterion was used in this study to determine both the maximum lift and the angle at which it occurs. Multiple-input, single-output networks were trained using the NASA Ames variation of the Levenberg-Marquardt algorithm for each of the aerodynamic coefficients (lift, drag, and moment). The artificial neural networks were integrated with a gradient-based optimizer. Using independent numerical simulations and experimental data for this high-lift configuration, it was shown that this design process successfully optimized flap deflection, gap, overlap, and angle of attack to maximize lift. Once the neural networks were trained and integrated with the optimizer, minimal additional computer resources were required to perform optimization runs with different initial conditions and parameters. Applying the neural networks within the high-lift rigging optimization process reduced the amount of computational time and resources by 83% compared with traditional gradient-based optimization procedures for multiple optimization runs.
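
    The surrogate-plus-optimizer workflow can be sketched in miniature: sample the expensive solver sparsely, fit a cheap surrogate, then optimize on the surrogate. Here a quadratic fit stands in for the trained neural networks and a synthetic lift curve stands in for the Navier-Stokes runs; only the workflow mirrors the paper.

```python
import numpy as np

def expensive_lift(flap_deflection):
    """Synthetic stand-in for a CFD run: peak lift at 25 degrees."""
    return 4.0 - 0.01 * (flap_deflection - 25.0) ** 2

# Sample the expensive model sparsely, as one would to build training data.
x_train = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
y_train = expensive_lift(x_train)

# Cheap surrogate (here a quadratic fit; the paper trains neural nets).
coeffs = np.polyfit(x_train, y_train, 2)   # c2*x^2 + c1*x + c0

# Gradient ascent on the surrogate: each step costs almost nothing
# compared with re-running the solver.
x_opt = 5.0
for _ in range(200):
    grad = 2.0 * coeffs[0] * x_opt + coeffs[1]
    x_opt += 10.0 * grad
```

    This is why the abstract reports large savings for repeated optimization runs: once the surrogate is built, restarting from new initial conditions costs only surrogate evaluations.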

  13. Network portal: a database for storage, analysis and visualization of biological networks

    PubMed Central

    Turkarslan, Serdar; Wurtmann, Elisabeth J.; Wu, Wei-Ju; Jiang, Ning; Bare, J. Christopher; Foley, Karen; Reiss, David J.; Novichkov, Pavel; Baliga, Nitin S.

    2014-01-01

    The ease of generating high-throughput data has enabled investigations into organismal complexity at the systems level through the inference of networks of interactions among the various cellular components (genes, RNAs, proteins and metabolites). The wider scientific community, however, currently has limited access to tools for network inference, visualization and analysis because these tasks often require advanced computational knowledge and expensive computing resources. We have designed the network portal (http://networks.systemsbiology.net) to serve as a modular database for the integration of user uploaded and public data, with inference algorithms and tools for the storage, visualization and analysis of biological networks. The portal is fully integrated into the Gaggle framework to seamlessly exchange data with desktop and web applications and to allow the user to create, save and modify workspaces, and it includes social networking capabilities for collaborative projects. While the current release of the database contains networks for 13 prokaryotic organisms from diverse phylogenetic clades (4678 co-regulated gene modules, 3466 regulators and 9291 cis-regulatory motifs), it will be rapidly populated with prokaryotic and eukaryotic organisms as relevant data become available in public repositories and through user input. The modular architecture, simple data formats and open API support community development of the portal. PMID:24271392

  14. National Geographic Society Kids Network: Report on 1994 teacher participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    In 1994, National Geographic Society Kids Network, a computer/telecommunications-based science curriculum, was presented to elementary and middle school teachers through summer programs sponsored by NGS and US DOE. The network program assists teachers in understanding the process of doing science; understanding the role of computers and telecommunications in the study of science, math, and engineering; and utilizing computers and telecommunications appropriately in the classroom. The program enables teachers to integrate science, math, and technology with other subjects, with the ultimate goal of encouraging students of all abilities to pursue careers in science/math/engineering. This report assesses the impact of the network program on participating teachers.

  15. A comparison of graph- and kernel-based -omics data integration algorithms for classifying complex traits.

    PubMed

    Yan, Kang K; Zhao, Hongyu; Pang, Herbert

    2017-12-06

    High-throughput sequencing data are widely collected and analyzed in the study of complex diseases in quest of improving human health. Well-studied algorithms mostly deal with a single data source and cannot fully utilize the potential of these multi-omics data sources. In order to provide a holistic understanding of human health and diseases, it is necessary to integrate multiple data sources. Several algorithms have been proposed so far; however, a comprehensive comparison of data integration algorithms for classification of binary traits is currently lacking. In this paper, we focus on two common classes of integration algorithms: graph-based algorithms, which represent subjects as nodes and their relationships as edges, and kernel-based algorithms, which build a classifier in feature space. Our paper provides a comprehensive comparison of their performance in terms of various measurements of classification accuracy and computation time. Seven different integration algorithms, including graph-based semi-supervised learning, graph sharpening integration, composite association network, Bayesian network, semi-definite programming-support vector machine (SDP-SVM), relevance vector machine (RVM) and Ada-boost relevance vector machine, are compared and evaluated with hypertension and two cancer data sets in our study. In general, kernel-based algorithms create more complex models and require longer computation time, but they tend to perform better than graph-based algorithms; graph-based algorithms have the advantage of faster computation. The empirical results demonstrate that composite association network, relevance vector machine, and Ada-boost RVM are the better performers. We provide recommendations on how to choose an appropriate algorithm for integrating data from multiple sources.
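
One member of the graph-based class compared above, semi-supervised label propagation, fits in a few lines: labeled subjects are clamped to their class scores and unlabeled subjects repeatedly average their neighbors' scores until the values stabilize. The graph and seed labels are toy values, not taken from the hypertension or cancer data sets.

```python
# Sketch: graph-based semi-supervised learning via label propagation.
def label_propagation(adj, labels, iters=100):
    """adj: {node: [neighbors]}; labels: {node: +1.0/-1.0} for labeled seeds."""
    scores = {n: labels.get(n, 0.0) for n in adj}
    for _ in range(iters):
        new = {}
        for n, nbrs in adj.items():
            if n in labels:          # clamp labeled nodes to their class
                new[n] = labels[n]
            else:                    # unlabeled: average neighbor scores
                new[n] = sum(scores[m] for m in nbrs) / len(nbrs)
        scores = new
    return scores

# Toy chain a - b - c - d with one positive and one negative seed.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
scores = label_propagation(adj, {"a": 1.0, "d": -1.0})
print({n: round(s, 2) for n, s in scores.items()})
```

The sign of each converged score is the predicted class; on this chain the fixpoint is b = 1/3 and c = -1/3, so b is classified with the positive seed and c with the negative one.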

  16. An Artificial Neural Network-Based Decision-Support System for Integrated Network Security

    DTIC Science & Technology

    2014-09-01

    group that they need to know in order to make team-based decisions in real-time environments, (c) Employ secure cloud computing services to host mobile...THESIS Presented to the Faculty Department of Electrical and Computer Engineering Graduate School of Engineering and Management Air Force...out-of-the-loop syndrome and create complexity creep. As a result, full automation efforts can lead to inappropriate decision-making despite a

  17. GLOBECOM '88 - IEEE Global Telecommunications Conference and Exhibition, Hollywood, FL, Nov. 28-Dec. 1, 1988, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Various papers on communications for the information age are presented. Among the general topics considered are: telematic services and terminals, satellite communications, telecommunications management network, control of integrated broadband networks, advances in digital radio systems, the intelligent network, broadband networks and services deployment, future switch architectures, performance analysis of computer networks, advances in spread spectrum, optical high-speed LANs, and broadband switching and networks. Also addressed are: multiple access protocols, video coding techniques, modulation and coding, photonic switching, SONET terminals and applications, standards for video coding, digital switching, progress in MANs, mobile and portable radio, software design for improved maintainability, multipath propagation and advanced countermeasures, data communication, network control and management, fiber in the loop, network algorithms and protocols, and advances in computer communications.

  18. Jargon that Computes: Today's PC Terminology.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1997-01-01

    Discusses PC (personal computer) and telecommunications terminology in context: Integrated Services Digital Network (ISDN); Asymmetric Digital Subscriber Line (ADSL); cable modems; satellite downloads; T1 and T3 lines; magnitudes ("giga-", "nano-"); Central Processing Unit (CPU); Random Access Memory (RAM); Universal Serial Bus…

  19. Six networks on a universal neuromorphic computing substrate.

    PubMed

    Pfeil, Thomas; Grübl, Andreas; Jeltsch, Sebastian; Müller, Eric; Müller, Paul; Petrovici, Mihai A; Schmuker, Michael; Brüderle, Daniel; Schemmel, Johannes; Meier, Karlheinz

    2013-01-01

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality.
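
The calibration idea mentioned above can be sketched abstractly: fixed-pattern noise is a static per-circuit offset, so measuring each neuron's response to a known reference stimulus once lets later measurements be corrected. The numbers below are synthetic, and the real routines tune analog hardware parameters rather than subtracting values in software.

```python
# Sketch: correcting static per-neuron offsets (fixed-pattern noise) by a
# one-time calibration against a known reference response. Values are
# synthetic and purely illustrative.
REFERENCE = 10.0  # expected response of every neuron to the calibration stimulus

def calibrate(raw_responses):
    """One calibration run: each neuron's static offset from the reference."""
    return [r - REFERENCE for r in raw_responses]

def correct(measurement, offsets):
    """Remove the stored per-neuron offsets from a later measurement."""
    return [m - o for m, o in zip(measurement, offsets)]

# Each neuron's analog circuit has its own static offset.
raw = [10.4, 9.7, 10.1, 9.9]          # responses to the reference stimulus
offsets = calibrate(raw)
print([round(v, 2) for v in correct([12.4, 11.7, 12.1, 11.9], offsets)])
```

Because the offsets are static, the same correction applies to every subsequent measurement, which is why a one-off calibration routine suffices.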

  20. Six Networks on a Universal Neuromorphic Computing Substrate

    PubMed Central

    Pfeil, Thomas; Grübl, Andreas; Jeltsch, Sebastian; Müller, Eric; Müller, Paul; Petrovici, Mihai A.; Schmuker, Michael; Brüderle, Daniel; Schemmel, Johannes; Meier, Karlheinz

    2013-01-01

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality. PMID:23423583

  1. Advances in the integration of transcriptional regulatory information into genome-scale metabolic models.

    PubMed

    Vivek-Ananth, R P; Samal, Areejit

    2016-09-01

    A major goal of systems biology is to build predictive computational models of cellular metabolism. Availability of complete genome sequences and wealth of legacy biochemical information has led to the reconstruction of genome-scale metabolic networks in the last 15 years for several organisms across the three domains of life. Due to paucity of information on kinetic parameters associated with metabolic reactions, the constraint-based modelling approach, flux balance analysis (FBA), has proved to be a vital alternative to investigate the capabilities of reconstructed metabolic networks. In parallel, advent of high-throughput technologies has led to the generation of massive amounts of omics data on transcriptional regulation comprising mRNA transcript levels and genome-wide binding profile of transcriptional regulators. A frontier area in metabolic systems biology has been the development of methods to integrate the available transcriptional regulatory information into constraint-based models of reconstructed metabolic networks in order to increase the predictive capabilities of computational models and understand the regulation of cellular metabolism. Here, we review the existing methods to integrate transcriptional regulatory information into constraint-based models of metabolic networks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.

    PubMed

    Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi

    2017-08-01

    With the growing interest in advanced image-guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as the robot operating system (ROS) in robotics and 3D Slicer in medical image computing, could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between the ROS-based devices and the medical image computing platforms. Performance tests demonstrated that the bridge could stream transforms, strings, points, and images at 30 fps in both directions successfully. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D Slicer and Lego Mindstorms with ROS as a prototyping and educational platform for IGT research; and the smart tissue autonomous robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of advanced image-based planning/navigation offered by medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
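
The OpenIGTLink framing the bridge relies on can be sketched as follows: each message begins with a fixed-size binary header (version, type name, device name, timestamp, body size, CRC) followed by the body. The field layout below follows the published 58-byte OpenIGTLink base header, but the CRC is left at zero rather than computed and the body format is not modeled, so this is illustrative only.

```python
# Sketch: packing an OpenIGTLink-style base header (58 bytes, big-endian).
# The CRC64 field is left at zero here; a real implementation computes it
# over the body.
import struct

HEADER_FMT = ">H12s20sQQQ"  # version, type name, device name, timestamp, body size, crc

def pack_header(msg_type, device, body):
    return struct.pack(HEADER_FMT,
                       1,                                 # protocol version
                       msg_type.encode().ljust(12, b"\0"),
                       device.encode().ljust(20, b"\0"),
                       0,                                 # timestamp (unset)
                       len(body),                         # body size in bytes
                       0)                                 # CRC64 (omitted in sketch)

body = b"Hello IGT"
header = pack_header("STRING", "ROS-IGTL-Bridge", body)
print(len(header), len(header) + len(body))  # 58-byte header per the spec
```

A receiver reads exactly 58 bytes, unpacks the header, and then reads `body size` further bytes, which is what makes streaming mixed message types over one TCP connection straightforward.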

  3. A web-based system for neural network based classification in temporomandibular joint osteoarthritis.

    PubMed

    de Dumast, Priscille; Mirabel, Clément; Cevidanes, Lucia; Ruellas, Antonio; Yatabe, Marilia; Ioshida, Marcos; Ribera, Nina Tubau; Michoud, Loic; Gomes, Liliane; Huang, Chao; Zhu, Hongtu; Muniz, Luciana; Shoukri, Brandon; Paniagua, Beatriz; Styner, Martin; Pieper, Steve; Budin, Francois; Vimort, Jean-Baptiste; Pascal, Laura; Prieto, Juan Carlos

    2018-07-01

    The purpose of this study is to describe the methodological innovations of a web-based system for storage, integration and computation of biomedical data, using a training imaging dataset to remotely compute a deep neural network classifier of temporomandibular joint osteoarthritis (TMJOA). This study imaging dataset consisted of three-dimensional (3D) surface meshes of mandibular condyles constructed from cone beam computed tomography (CBCT) scans. The training dataset consisted of 259 condyles, 105 from control subjects and 154 from patients with a diagnosis of TMJ OA. For the image analysis classification, 34 right and left condyles from 17 patients (39.9 ± 11.7 years), who experienced signs and symptoms of the disease for less than 5 years, were included as the testing dataset. For the integrative statistical model of clinical, biological and imaging markers, the sample consisted of the same 17 test OA subjects and 17 age and sex matched control subjects (39.4 ± 15.4 years), who did not show any sign or symptom of OA. For these 34 subjects, a standardized clinical questionnaire, blood and saliva samples were also collected. The technological methodologies in this study include a deep neural network classifier of 3D condylar morphology (ShapeVariationAnalyzer, SVA), and a flexible web-based system for data storage, computation and integration (DSCI) of high dimensional imaging, clinical, and biological data. The DSCI system trained and tested the neural network, indicating 5 stages of structural degenerative changes in condylar morphology in the TMJ with 91% agreement between the clinician consensus and the SVA classifier. The DSCI system also remotely ran a novel statistical analysis, Multivariate Functional Shape Data Analysis, which computed high dimensional correlations between 3D shape coordinates, clinical pain levels and levels of biological markers, and then graphically displayed the computation results.
The findings of this study demonstrate a comprehensive phenotypic characterization of TMJ health and disease at clinical, imaging and biological levels, using novel flexible and versatile open-source tools for a web-based system that provides advanced shape statistical analysis and a neural network based classification of temporomandibular joint osteoarthritis. Published by Elsevier Ltd.

  4. The COPD Knowledge Base: enabling data analysis and computational simulation in translational COPD research.

    PubMed

    Cano, Isaac; Tényi, Ákos; Schueller, Christine; Wolff, Martin; Huertas Migueláñez, M Mercedes; Gomez-Cabrero, David; Antczak, Philipp; Roca, Josep; Cascante, Marta; Falciani, Francesco; Maier, Dieter

    2014-11-28

    Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition we integrated COPD-specific expression and co-morbidity networks connecting over 6,000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide HTML and web browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models, while integration with a Clinical Decision Support System allows delivery into clinical practice. The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD, combining genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews about clinical trends and connections, while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http://www.copdknowledgebase.eu.

  5. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    NASA Astrophysics Data System (ADS)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in parallel and distributed computing pay particular attention to the scalability of applications for problem solving, yet effective management of a scalable application in a heterogeneous distributed computing environment remains a non-trivial issue, especially for control systems that operate in networks. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. Advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for this problem include automated multi-agent control of the systems in a parallel mode with various degrees of detail.

  6. The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).

    ERIC Educational Resources Information Center

    Library Software Review, 1984

    1984-01-01

    Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…

  7. Organizing for intelligent transportation systems : case study of emergency operations in San Francisco Bay area

    DOT National Transportation Integrated Search

    1997-01-01

    Computer-integrated transportation (CIT) is envisioned as an integrated network of public and private transportation organizations, each with unique responsibilities but working toward a common mission of facilitating travel across all modes of trans...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trujillo, Angelina Michelle

    Strategy, Planning, Acquiring: very large scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by 3 years; procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL and connected to scalable storage via large-scale storage networking, assuring correct and secure operations. Management and Utilization: ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale are required.

  9. Heterogeneity in Health Care Computing Environments

    PubMed Central

    Sengupta, Soumitra

    1989-01-01

    This paper discusses issues of heterogeneity in computer systems, networks, databases, and presentation techniques, and the problems it creates in developing integrated medical information systems. The need for institutional, comprehensive goals is emphasized. Using the Columbia-Presbyterian Medical Center's computing environment as the case study, various steps to solve the heterogeneity problem are presented.

  10. Cyber-physical approach to the network-centric robotics control task

    NASA Astrophysics Data System (ADS)

    Muliukha, Vladimir; Ilyashenko, Alexander; Zaborovsky, Vladimir; Lukashin, Alexey

    2016-10-01

    Complex engineering tasks concerning the control of groups of mobile robots remain poorly developed. To formalize them, we use a cyber-physical approach, which extends the range of engineering and physical methods for the design of complex technical objects by considering the informational aspects of communication and interaction between objects and with an external environment [1]. The paper analyzes network-centric methods for the control of cyber-physical objects. Robots, or cyber-physical objects, interact with each other by transmitting information via computer networks using a preemptive queueing system with a randomized push-out mechanism [2],[3]. The main field of application for the results of our work is space robotics. The selection of cyber-physical systems as a special class of designed objects is due to the necessity of integrating various components responsible for computing, communication and control processes. Network-centric solutions allow universal means of information exchange to be used to integrate different technologies into the control system.
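
The randomized push-out mechanism cited above can be sketched for a finite buffer shared by two traffic classes: when the buffer is full, an arriving high-priority packet displaces a queued low-priority packet with probability alpha and is otherwise lost. The class names, capacity, and probability values below are illustrative, not parameters of the cited model.

```python
# Sketch: finite buffer with a randomized push-out mechanism for
# high-priority traffic. Parameters are illustrative.
import random

class PushOutBuffer:
    def __init__(self, capacity, alpha, rng=None):
        self.capacity = capacity
        self.alpha = alpha            # push-out probability
        self.rng = rng or random.Random()
        self.packets = []             # list of ("hi" | "lo", payload)

    def arrive(self, prio, payload):
        """Admit a packet if possible; returns True if it was admitted."""
        if len(self.packets) < self.capacity:
            self.packets.append((prio, payload))
            return True
        if prio == "hi" and self.rng.random() < self.alpha:
            for i, (p, _) in enumerate(self.packets):
                if p == "lo":         # push out the oldest low-priority packet
                    self.packets[i] = ("hi", payload)
                    return True
        return False                  # buffer full, packet lost

buf = PushOutBuffer(capacity=2, alpha=1.0)
buf.arrive("lo", 1)
buf.arrive("lo", 2)
print(buf.arrive("hi", 3), buf.packets)  # the "hi" packet displaces a "lo"
```

Tuning alpha between 0 and 1 trades loss probability between the two classes, which is the control knob the randomized variant of push-out provides over the purely preemptive one.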

  11. Cloud Computing Services for Seismic Networks

    NASA Astrophysics Data System (ADS)

    Olson, Michael

    This thesis describes a compositional framework for developing situation awareness applications: applications that provide ongoing information about a user's changing environment. The thesis describes how the framework is used to develop a situation awareness application for earthquakes. The applications are implemented as Cloud computing services connected to sensors and actuators. The architecture and design of the Cloud services are described and measurements of performance metrics are provided. The thesis includes results of experiments on earthquake monitoring conducted over a year. The applications developed by the framework are (1) the CSN (the Community Seismic Network), which uses relatively low-cost sensors deployed by members of the community, and (2) SAF (the Situation Awareness Framework), which integrates data from multiple sources, including the CSN; the CISN (the California Integrated Seismic Network), a network of high-quality seismometers deployed carefully by professionals in the CISN organization and spread across Southern California; and prototypes of multi-sensor platforms that include carbon monoxide, methane, dust and radiation sensors.

  12. Next Generation Space Telescope Integrated Science Module Data System

    NASA Technical Reports Server (NTRS)

    Schnurr, Richard G.; Greenhouse, Matthew A.; Jurotich, Matthew M.; Whitley, Raymond; Kalinowski, Keith J.; Love, Bruce W.; Travis, Jeffrey W.; Long, Knox S.

    1999-01-01

    The Data system for the Next Generation Space Telescope (NGST) Integrated Science Module (ISIM) is the primary data interface between the spacecraft, telescope, and science instrument systems. This poster includes block diagrams of the ISIM data system and its components derived during the pre-phase A Yardstick feasibility study. The poster details the hardware and software components used to acquire and process science data for the Yardstick instrument complement, and depicts the baseline external interfaces to science instruments and other systems. This baseline data system is a fully redundant, high performance computing system. Each redundant computer contains three 150 MHz PowerPC processors. All processors execute a commercially available real time multi-tasking operating system supporting preemptive multi-tasking, file management, and network interfaces. The six processors in the system are networked together. The spacecraft interface baseline is an extension of the network that links the six processors. The final selection of processor buses, processor chips, network interfaces, and high-speed data interfaces will be made during mid 2002.

  13. Semantic integration of data on transcriptional regulation

    PubMed Central

    Baitaluk, Michael; Ponomarenko, Julia

    2010-01-01

    Motivation: Experimental and predicted data concerning gene transcriptional regulation are distributed among many heterogeneous sources. However, there are no resources to integrate these data automatically or to provide a ‘one-stop shop’ experience for users seeking information essential for deciphering and modeling gene regulatory networks. Results: IntegromeDB, a semantic graph-based ‘deep-web’ data integration system that automatically captures, integrates and manages publicly available data concerning transcriptional regulation, as well as other relevant biological information, is proposed in this article. The problems associated with data integration are addressed by ontology-driven data mapping, multiple data annotation and heterogeneous data querying, also enabling integration of the user's data. IntegromeDB integrates over 100 experimental and computational data sources relating to genomics, transcriptomics, genetics, and functional and interaction data concerning gene transcriptional regulation in eukaryotes and prokaryotes. Availability: IntegromeDB is accessible through the integrated research environment BiologicalNetworks at http://www.BiologicalNetworks.org Contact: baitaluk@sdsc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20427517

  14. Semantic integration of data on transcriptional regulation.

    PubMed

    Baitaluk, Michael; Ponomarenko, Julia

    2010-07-01

    Experimental and predicted data concerning gene transcriptional regulation are distributed among many heterogeneous sources. However, there are no resources to integrate these data automatically or to provide a 'one-stop shop' experience for users seeking information essential for deciphering and modeling gene regulatory networks. IntegromeDB, a semantic graph-based 'deep-web' data integration system that automatically captures, integrates and manages publicly available data concerning transcriptional regulation, as well as other relevant biological information, is proposed in this article. The problems associated with data integration are addressed by ontology-driven data mapping, multiple data annotation and heterogeneous data querying, also enabling integration of the user's data. IntegromeDB integrates over 100 experimental and computational data sources relating to genomics, transcriptomics, genetics, and functional and interaction data concerning gene transcriptional regulation in eukaryotes and prokaryotes. IntegromeDB is accessible through the integrated research environment BiologicalNetworks at http://www.BiologicalNetworks.org. Contact: baitaluk@sdsc.edu. Supplementary data are available at Bioinformatics online.

  15. Air Force highly integrated photonics program: development and demonstration of an optically transparent fiber optic network for avionics applications

    NASA Astrophysics Data System (ADS)

    Whaley, Gregory J.; Karnopp, Roger J.

    2010-04-01

    The goal of the Air Force Highly Integrated Photonics (HIP) program is to develop and demonstrate single photonic chip components which support a single mode fiber network architecture for use on mobile military platforms. We propose an optically transparent, broadcast and select fiber optic network as the next generation interconnect on avionics platforms. In support of this network, we have developed three principal, single-chip photonic components: a tunable laser transmitter, a 32x32 port star coupler, and a 32 port multi-channel receiver which are all compatible with demanding avionics environmental and size requirements. The performance of the developed components will be presented as well as the results of a demonstration system which integrates the components into a functional network representative of the form factor used in advanced avionics computing and signal processing applications.

  16. Integrating Computers into the Accounting Curriculum Using an IBM PC Network. Final Report.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    Noting the increased use of microcomputers in commerce and the accounting profession, the Department of Accounting and Finance at the University of Manchester recognized the importance of integrating microcomputers into the accounting curriculum and requested and received a grant to develop an integrated study environment in which students would…

  17. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    PubMed

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

To avoid the tedious task of reconstructing gene networks by experimentally testing every possible interaction between genes, it has become common to adopt automated reverse-engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former and to enhance the performance of traditional evolutionary algorithms, parallel-model evolutionary algorithms are advisable. To overcome the latter and to speed up the computation, cloud computing is a promising solution; the most popular mechanism is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms that infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments show that our parallel approach can successfully infer networks with the desired behaviors and that computation time can be greatly reduced. Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation.
By coupling the parallel-model population-based optimization method with the parallel computational framework, high-quality solutions can be obtained within a relatively short time. This integrated approach is a promising way to infer large networks.
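The parallel scheme in this record can be pictured as an island model with an in-process map/reduce standing in for Hadoop MapReduce. The sketch below is a minimal illustration under that assumption; the toy fitness function (distance to a known parameter vector) and all constants are illustrative, not the authors' implementation, which scores simulated network behaviour against gene profiles.

```python
import random

def fitness(params, target=(0.5, -1.2, 2.0)):
    # Toy stand-in for network-inference error: squared distance
    # to a known parameter vector.
    return sum((p - t) ** 2 for p, t in zip(params, target))

def evolve_island(seed, pop_size=30, dims=3, iters=200):
    # One "map" task: PSO-style moves plus GA-style mutation on a subpopulation.
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    vel = [[0.0] * dims for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(iters):
        gbest = min(pop, key=fitness)
        for i, x in enumerate(pop):
            for d in range(dims):
                # PSO-style velocity update toward the island's best.
                vel[i][d] = 0.7 * vel[i][d] + 1.4 * rng.random() * (gbest[d] - x[d])
                x[d] += vel[i][d]
            if rng.random() < 0.1:                       # GA-style mutation
                x[rng.randrange(dims)] += rng.gauss(0, 0.5)
        best = min(best, min(pop, key=fitness), key=fitness)
    return best

def reduce_islands(results):
    # The "reduce" step: keep the best candidate across islands.
    return min(results, key=fitness)

best = reduce_islands([evolve_island(seed) for seed in range(4)])
print(fitness(best))
```

In the paper's setting each island would run as a MapReduce task on Hadoop; here the islands simply run sequentially in one process.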

  18. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment

    PubMed Central

    2014-01-01

Background To avoid the tedious task of reconstructing gene networks by experimentally testing every possible interaction between genes, it has become common to adopt automated reverse-engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former and to enhance the performance of traditional evolutionary algorithms, parallel-model evolutionary algorithms are advisable. To overcome the latter and to speed up the computation, cloud computing is a promising solution; the most popular mechanism is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms that infer large gene networks. Results This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments show that our parallel approach can successfully infer networks with the desired behaviors and that computation time can be greatly reduced. Conclusions Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation.
By coupling the parallel-model population-based optimization method with the parallel computational framework, high-quality solutions can be obtained within a relatively short time. This integrated approach is a promising way to infer large networks. PMID:24428926

  19. Computer-aided decision making.

    Treesearch

    Keith M. Reynolds; Daniel L. Schmoldt

    2006-01-01

Several major classes of software technologies have been used in decision making for forest management applications over the past few decades. These computer-based technologies include mathematical programming, expert systems, network models, multi-criteria decision making, and integrated systems. Each technology possesses unique advantages and disadvantages, and has...

  20. Application-oriented integrated control center (AICC) for heterogeneous optical networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie; Cao, Xuping; Wang, Dajiang; Wu, Koubo; Cai, Yinxiang; Gu, Wanyi

    2011-12-01

Various broadband services, such as data center applications and cloud computing, have been consuming the bandwidth resources of optical networks. Although the available bandwidth keeps increasing with the development of transmission technologies, future optical networks still face challenges. The relationship between the upper application layer and the lower network-resource layer requires further study. To improve the efficiency of network resources and the capability of service provisioning, heterogeneous optical network resources can be abstracted as unified Application Programming Interfaces (APIs), which can be opened to various upper applications through the Application-oriented Integrated Control Center (AICC) proposed in this paper. A novel OpenFlow-based unified control architecture is proposed for the optimization of cross-layer resources. Simulation results show that the AICC performs well.

  1. Fast numerical methods for simulating large-scale integrate-and-fire neuronal networks.

    PubMed

    Rangan, Aaditya V; Cai, David

    2007-02-01

    We discuss numerical methods for simulating large-scale, integrate-and-fire (I&F) neuronal networks. Important elements in our numerical methods are (i) a neurophysiologically inspired integrating factor which casts the solution as a numerically tractable integral equation, and allows us to obtain stable and accurate individual neuronal trajectories (i.e., voltage and conductance time-courses) even when the I&F neuronal equations are stiff, such as in strongly fluctuating, high-conductance states; (ii) an iterated process of spike-spike corrections within groups of strongly coupled neurons to account for spike-spike interactions within a single large numerical time-step; and (iii) a clustering procedure of firing events in the network to take advantage of localized architectures, such as spatial scales of strong local interactions, which are often present in large-scale computational models-for example, those of the primary visual cortex. (We note that the spike-spike corrections in our methods are more involved than the correction of single neuron spike-time via a polynomial interpolation as in the modified Runge-Kutta methods commonly used in simulations of I&F neuronal networks.) Our methods can evolve networks with relatively strong local interactions in an asymptotically optimal way such that each neuron fires approximately once in [Formula: see text] operations, where N is the number of neurons in the system. We note that quantifications used in computational modeling are often statistical, since measurements in a real experiment to characterize physiological systems are typically statistical, such as firing rate, interspike interval distributions, and spike-triggered voltage distributions. We emphasize that it takes much less computational effort to resolve statistical properties of certain I&F neuronal networks than to fully resolve trajectories of each and every neuron within the system. 
For networks operating in realistic dynamical regimes, such as strongly fluctuating, high-conductance states, our methods are designed to achieve statistical accuracy when very large time-steps are used. Moreover, our methods can also achieve trajectory-wise accuracy when small time-steps are used.
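A minimal sketch of the integrating-factor idea in this record: over a step in which the conductances are approximately constant, the voltage equation dV/dt = -g_tot*(V - V_inf) is solved exactly, so the update stays stable even in stiff, high-conductance states. All constants below are illustrative; the paper's method additionally handles time-varying conductances and spike-spike corrections.

```python
import math

def step(v, g_e, dt, g_l=0.05, e_l=-70.0, e_e=0.0):
    # One time-step of a conductance-based integrate-and-fire neuron.
    g_tot = g_l + g_e                         # total membrane conductance
    v_inf = (g_l * e_l + g_e * e_e) / g_tot   # steady-state voltage
    # Exact relaxation toward v_inf over dt (the integrating factor):
    return v_inf + (v - v_inf) * math.exp(-g_tot * dt)

v = -70.0
for _ in range(100):
    v = step(v, g_e=0.2, dt=1.0)   # a large time-step remains stable
print(round(v, 2))   # → -14.0
```

A naive forward-Euler step with the same dt would be unstable once g_tot*dt grows large, which is exactly the stiff regime the abstract describes.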

  2. Instruction of Computer Supported Collaborative Learning Environment and Students' Contribution Quality

    ERIC Educational Resources Information Center

    Akgün, Ergün; Akkoyunlu, Buket

    2013-01-01

With the integration of network and communication innovations into education, these technology-enriched learning environments have gained importance both qualitatively and operationally. Using network and communication innovations in education enables the diffusion of information and global accessibility, and also allows physically…

  3. Electronic Networks: Crossing Boundaries/Creating Communities.

    ERIC Educational Resources Information Center

    Howard, Tharon, Ed.; Benson, Chris, Ed.; Gooch, Rocky; Goswami, Dixie

    Written by practicing teachers about actual instructional computing projects, this book provides information teachers need to integrate instructional technologies into their classrooms. The book is divided into three parts. Part 1, "New Tools for the Classroom: An Introduction to Networked Learning," includes chapters: (1) "Getting Started in a…

  4. The Many-Headed Hydra: Information Networking at LAA.

    ERIC Educational Resources Information Center

    Winzenried, Arthur P.

    1997-01-01

Describes an integrated computer library system installed at Lilydale Adventist Academy (LAA) in Melbourne (Australia) in response to a limited budget, increased demand, and greater user expectations. Topics include student workstations, cost effectiveness, CD-ROMs on local area networks, and student input regarding their needs. (Author/LRW)

  5. Apollo Ring Optical Switch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maestas, J.H.

    1987-03-01

    An optical switch was designed, built, and installed at Sandia National Laboratories in Albuquerque, New Mexico, to facilitate the integration of two Apollo computer networks into a single network. This report presents an overview of the optical switch as well as its layout, switch testing procedure and test data, and installation.

  6. Modeling of Receptor Tyrosine Kinase Signaling: Computational and Experimental Protocols.

    PubMed

    Fey, Dirk; Aksamitiene, Edita; Kiyatkin, Anatoly; Kholodenko, Boris N

    2017-01-01

The advent of systems biology has convincingly demonstrated that the integration of experiments and dynamic modelling is a powerful approach to understanding cellular network biology. Here we present experimental and computational protocols that are necessary for applying this integrative approach to quantitative studies of receptor tyrosine kinase (RTK) signaling networks. Signaling by RTKs controls multiple cellular processes, including the regulation of cell survival, motility, proliferation, differentiation, glucose metabolism, and apoptosis. We describe methods of model building and training on experimentally obtained quantitative datasets, as well as experimental methods of obtaining quantitative dose-response and temporal dependencies of protein phosphorylation and activities. The presented methods make possible (1) both fine-grained modeling of complex signaling dynamics and identification of salient, coarse-grained network structures (such as feedback loops) that bring about intricate dynamics, and (2) experimental validation of dynamic models.
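The modelling half of such a protocol can be sketched with a one-state ODE for receptor phosphorylation (ligand-driven activation, first-order dephosphorylation), integrated with forward Euler to produce a dose-response curve. The rate constants below are illustrative assumptions, not fitted values from the protocol.

```python
def simulate(ligand, k_act=0.5, k_deph=0.2, dt=0.01, t_end=50.0):
    # Integrate d(Rp)/dt = k_act*L*(1 - Rp) - k_deph*Rp by forward Euler.
    rp = 0.0                      # phosphorylated receptor fraction
    for _ in range(int(t_end / dt)):
        rp += dt * (k_act * ligand * (1.0 - rp) - k_deph * rp)
    return rp

# Steady state is k_act*L / (k_act*L + k_deph): rises and saturates with dose.
doses = [0.1, 0.5, 2.0]
response = [simulate(L) for L in doses]
print([round(r, 3) for r in response])   # → [0.2, 0.556, 0.833]
```

Model training in the protocol then amounts to adjusting k_act and k_deph so such simulated dose-response curves match measured phosphorylation data.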

  7. Prototyping an institutional IAIMS/UMLS information environment for an academic medical center.

    PubMed

    Miller, P L; Paton, J A; Clyman, J I; Powsner, S M

    1992-07-01

    The paper describes a prototype information environment designed to link network-based information resources in an integrated fashion and thus enhance the information capabilities of an academic medical center. The prototype was implemented on a single Macintosh computer to permit exploration of the overall "information architecture" and to demonstrate the various desired capabilities prior to full-scale network-based implementation. At the heart of the prototype are two components: a diverse set of information resources available over an institutional computer network and an information sources map designed to assist users in finding and accessing information resources relevant to their needs. The paper describes these and other components of the prototype and presents a scenario illustrating its use. The prototype illustrates the link between the goals of two National Library of Medicine initiatives, the Integrated Academic Information Management System (IAIMS) and the Unified Medical Language System (UMLS).

  8. Methods for biological data integration: perspectives and challenges

    PubMed Central

    Gligorijević, Vladimir; Pržulj, Nataša

    2015-01-01

    Rapid technological advances have led to the production of different types of biological data and enabled construction of complex networks with various types of interactions between diverse biological entities. Standard network data analysis methods were shown to be limited in dealing with such heterogeneous networked data and consequently, new methods for integrative data analyses have been proposed. The integrative methods can collectively mine multiple types of biological data and produce more holistic, systems-level biological insights. We survey recent methods for collective mining (integration) of various types of networked biological data. We compare different state-of-the-art methods for data integration and highlight their advantages and disadvantages in addressing important biological problems. We identify the important computational challenges of these methods and provide a general guideline for which methods are suited for specific biological problems, or specific data types. Moreover, we propose that recent non-negative matrix factorization-based approaches may become the integration methodology of choice, as they are well suited and accurate in dealing with heterogeneous data and have many opportunities for further development. PMID:26490630
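A minimal sketch of the non-negative matrix factorization approach this record highlights, using Lee-Seung multiplicative updates: X ≈ W @ H with W, H ≥ 0. In integrative analyses, X would be a relational matrix between two types of biological entities, and co-factorizing several such matrices with shared factors is the actual integration step (not shown here). All sizes and the iteration count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((20, 15))           # stand-in for a non-negative data matrix
k = 4                              # number of latent factors
W = rng.random((20, k)) + 0.1
H = rng.random((k, 15)) + 0.1

for _ in range(300):
    # Multiplicative updates preserve non-negativity by construction.
    H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
    W *= (X @ H.T) / (W @ H @ H.T + 1e-9)

err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.2f}")
```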

  9. Lateral Information Processing by Spiking Neurons: A Theoretical Model of the Neural Correlate of Consciousness

    PubMed Central

    Ebner, Marc; Hameroff, Stuart

    2011-01-01

Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on “autopilot”). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the “conscious pilot”) suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone, moves through the brain as an executive agent, converting nonconscious “auto-pilot” cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study we implemented sideways “gap junctions” in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions forms a reconfigurable resistive grid, or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities.
The system demonstrates a moving sub-network zone of synchrony, within which the contents of perception are represented and contained. This mobile zone can be viewed as a model of the neural correlate of consciousness in the brain. PMID:22046178
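The lateral "gap junction" idea can be illustrated on a pixel grid: a junction between neighbours opens when their activities are similar, so similarly active pixels (the "figure") merge into one synchronized sub-network. The sketch below is a toy stand-in for the paper's resistive grid; the image, threshold, and flood-fill shortcut are illustrative assumptions.

```python
IMG = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
]

def neighbours(r, c):
    # 4-connected lateral neighbours within the grid.
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= r + dr < len(IMG) and 0 <= c + dc < len(IMG[0]):
            yield r + dr, c + dc

def figure_component(seed):
    # Flood-fill through "open" junctions (activity difference < 0.5),
    # a stand-in for the gap-junction-defined sub-network zone.
    zone, stack = {seed}, [seed]
    while stack:
        r, c = stack.pop()
        for nr, nc in neighbours(r, c):
            if (nr, nc) not in zone and abs(IMG[nr][nc] - IMG[r][c]) < 0.5:
                zone.add((nr, nc))
                stack.append((nr, nc))
    return zone

zone = figure_component((1, 1))
print(len(zone))   # → 6 : the "figure" pixels form one sub-network
```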

  10. Lateral information processing by spiking neurons: a theoretical model of the neural correlate of consciousness.

    PubMed

    Ebner, Marc; Hameroff, Stuart

    2011-01-01

Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on "autopilot"). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the "conscious pilot") suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone, moves through the brain as an executive agent, converting nonconscious "auto-pilot" cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study we implemented sideways "gap junctions" in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions forms a reconfigurable resistive grid, or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities.
The system demonstrates a moving sub-network zone of synchrony, within which the contents of perception are represented and contained. This mobile zone can be viewed as a model of the neural correlate of consciousness in the brain.

  11. A National Virtual Specimen Database for Early Cancer Detection

    NASA Technical Reports Server (NTRS)

    Crichton, Daniel; Kincaid, Heather; Kelly, Sean; Thornquist, Mark; Johnsey, Donald; Winget, Marcy

    2003-01-01

Access to biospecimens is essential for enabling cancer biomarker discovery. The National Cancer Institute's (NCI) Early Detection Research Network (EDRN) integrates a large number of laboratories into a network in order to establish a collaborative scientific environment to discover and validate disease markers. The diversity of both the institutions and the collaborative focus has created the need to establish cross-disciplinary teams that integrate expertise in biomedical research, computational biology and biostatistics, and computer science. Given the collaborative design of the network, the EDRN needed an informatics infrastructure. The Fred Hutchinson Cancer Research Center, the National Cancer Institute, and NASA's Jet Propulsion Laboratory (JPL) teamed up to build an informatics infrastructure creating a collaborative, science-driven research environment despite the geographic distribution and architectural differences of the information systems that existed within the diverse network. EDRN investigators identified the need to share biospecimen data captured across the country and managed in disparate databases. As a result, the informatics team initiated an effort to create a virtual tissue database whereby scientists could search and locate details about specimens located at collaborating laboratories. Each database, however, was locally implemented and integrated into collection processes and methods unique to each institution. This meant that efforts to integrate databases needed to be done in a manner that did not require redesign or re-implementation of existing systems

  12. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks

    DOE PAGES

    Shen, Yiwen; Hattink, Maarten; Samadi, Payman; ...

    2018-04-13

Silicon photonics-based switches offer an effective option for the delivery of dynamic bandwidth in future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. Here, we present a scalable software-defined networking control plane to integrate silicon photonic switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. Latencies observed for each step of the switching procedure demonstrate a total control-plane latency of 344 microseconds for data-center and high-performance computing platforms.
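The hybrid arbitration idea in this record can be sketched as a controller policy that steers short flows through the electronic packet switches and sets up an optical circuit for large flows. The threshold and flow sizes below are illustrative assumptions, not values from the paper.

```python
CIRCUIT_THRESHOLD = 1_000_000  # bytes; illustrative cutoff for circuit setup

def route(flow_bytes):
    # Large flows justify the circuit-setup latency of the photonic switch;
    # small flows stay on the packet-switched path.
    return "optical_circuit" if flow_bytes >= CIRCUIT_THRESHOLD else "packet"

flows = [1500, 64_000, 5_000_000]
print([route(f) for f in flows])   # → ['packet', 'packet', 'optical_circuit']
```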

  13. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Yiwen; Hattink, Maarten; Samadi, Payman

Silicon photonics-based switches offer an effective option for the delivery of dynamic bandwidth in future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. Here, we present a scalable software-defined networking control plane to integrate silicon photonic switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. Latencies observed for each step of the switching procedure demonstrate a total control-plane latency of 344 microseconds for data-center and high-performance computing platforms.

  14. Cytoscape: a software environment for integrated models of biomolecular interaction networks.

    PubMed

    Shannon, Paul; Markiel, Andrew; Ozier, Owen; Baliga, Nitin S; Wang, Jonathan T; Ramage, Daniel; Amin, Nada; Schwikowski, Benno; Ideker, Trey

    2003-11-01

    Cytoscape is an open source software project for integrating biomolecular interaction networks with high-throughput expression data and other molecular states into a unified conceptual framework. Although applicable to any system of molecular components and interactions, Cytoscape is most powerful when used in conjunction with large databases of protein-protein, protein-DNA, and genetic interactions that are increasingly available for humans and model organisms. Cytoscape's software Core provides basic functionality to layout and query the network; to visually integrate the network with expression profiles, phenotypes, and other molecular states; and to link the network to databases of functional annotations. The Core is extensible through a straightforward plug-in architecture, allowing rapid development of additional computational analyses and features. Several case studies of Cytoscape plug-ins are surveyed, including a search for interaction pathways correlating with changes in gene expression, a study of protein complexes involved in cellular recovery to DNA damage, inference of a combined physical/functional interaction network for Halobacterium, and an interface to detailed stochastic/kinetic gene regulatory models.
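The core idea of this record, a molecular interaction network as a graph whose nodes carry expression states that can be queried together, can be shown in a few lines. The edge list, expression values, and threshold below are illustrative toy data, not Cytoscape's API.

```python
# Toy interaction network: edges between genes, plus a per-gene
# expression attribute (log-ratios) overlaid on the graph.
edges = [("geneA", "geneB"), ("geneA", "geneC"), ("geneB", "geneD")]
expression = {"geneA": 2.1, "geneB": -0.3, "geneC": 1.8, "geneD": 0.0}

def neighbours(node):
    # All interaction partners of `node` in the undirected edge list.
    return [b if a == node else a for a, b in edges if node in (a, b)]

# Query combining topology and state: up-regulated partners of geneA.
hits = [n for n in neighbours("geneA") if expression[n] > 1.0]
print(hits)   # → ['geneC']
```

Cytoscape's plug-in architecture generalizes exactly this pattern: the Core holds the graph plus attributes, and analyses are queries layered on top.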

  15. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek

Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.

  16. A graph-based computational framework for simulation and optimisation of coupled infrastructure networks

    DOE PAGES

    Jalving, Jordan; Abhyankar, Shrirang; Kim, Kibaek; ...

    2017-04-24

Here, we present a computational framework that facilitates the construction, instantiation, and analysis of large-scale optimization and simulation applications of coupled energy networks. The framework integrates the optimization modeling package PLASMO and the simulation package DMNetwork (built around PETSc). These tools use a common graph-based abstraction that enables us to achieve compatibility between data structures and to build applications that use network models of different physical fidelity. We also describe how to embed these tools within complex computational workflows using SWIFT, a tool that facilitates parallel execution of multiple simulation runs and management of input and output data. We discuss how to use these capabilities to target coupled natural gas and electricity systems.
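A minimal sketch of the shared graph abstraction described above: both networks are plain node/edge graphs, and a coupling element (a gas-fired generator) maps gas withdrawal on one graph to power injection on the other. All names and the conversion factor are illustrative assumptions, not PLASMO/DMNetwork APIs.

```python
# Two infrastructure networks as simple node/edge graphs.
gas = {"nodes": ["well", "junction", "plant"],
       "edges": [("well", "junction"), ("junction", "plant")]}
power = {"nodes": ["plant_bus", "city"],
         "edges": [("plant_bus", "city")]}
coupling = ("plant", "plant_bus")   # gas node feeding an electric bus

def couple(gas_flow, heat_rate=8.0):
    # Toy conversion of gas withdrawal into MW injected at the coupled bus;
    # heat_rate is an illustrative MMBtu-per-MWh style factor.
    return gas_flow * 1000.0 / heat_rate

# The coupling edge must reference a valid node in each graph.
assert coupling[0] in gas["nodes"] and coupling[1] in power["nodes"]
mw = couple(gas_flow=4.0)
print(mw)   # → 500.0
```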

  17. Volcano Monitoring: A Case Study in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Peterson, Nina; Anusuya-Rangappa, Lohith; Shirazi, Behrooz A.; Song, Wenzhan; Huang, Renjie; Tran, Daniel; Chien, Steve; Lahusen, Rick

Recent advances in wireless sensor network technology have provided robust and reliable solutions for sophisticated pervasive computing applications such as environmental monitoring of inhospitable terrain. We present a case study of developing a real-time pervasive computing system, called OASIS (optimized autonomous space in situ sensorweb), which combines ground assets (a sensor network) and space assets (NASA's Earth Observing-1 (EO-1) satellite) to monitor volcanic activity at Mount St. Helens. OASIS's primary goals are: to integrate complementary space and in situ ground sensors into an interactive and autonomous sensorweb; to optimize power and communication resource management of the sensorweb; and to provide mechanisms for seamless and scalable fusion of future space and in situ components. The OASIS in situ ground sensor network development addresses issues related to power management, bandwidth management, quality-of-service management, topology and routing management, and test-bed design. The space segment development consists of EO-1 architectural enhancements, feedback of EO-1 data into the in situ component, command and control integration, data ingestion and dissemination, and field demonstrations.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

The Computing and Communications (C) Division is responsible for the Laboratory's Integrated Computing Network (ICN) as well as Laboratory-wide communications. Our computing network, used by 8,000 people distributed throughout the nation, constitutes one of the most powerful scientific computing facilities in the world. In addition to the stable production environment of the ICN, we have taken a leadership role in high-performance computing and have established the Advanced Computing Laboratory (ACL), the site of research on experimental, massively parallel computers; high-speed communication networks; distributed computing; and a broad variety of advanced applications. The computational resources available in the ACL are of the type needed to solve problems critical to national needs, the so-called "Grand Challenge" problems. The purpose of this publication is to inform our clients of our strategic and operating plans in these important areas. We review major accomplishments since late 1990 and describe our strategic planning goals and specific projects that will guide our operations over the next few years. Our mission statement, planning considerations, and management policies and practices are also included.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

The Computing and Communications (C) Division is responsible for the Laboratory's Integrated Computing Network (ICN) as well as Laboratory-wide communications. Our computing network, used by 8,000 people distributed throughout the nation, constitutes one of the most powerful scientific computing facilities in the world. In addition to the stable production environment of the ICN, we have taken a leadership role in high-performance computing and have established the Advanced Computing Laboratory (ACL), the site of research on experimental, massively parallel computers; high-speed communication networks; distributed computing; and a broad variety of advanced applications. The computational resources available in the ACL are of the type needed to solve problems critical to national needs, the so-called "Grand Challenge" problems. The purpose of this publication is to inform our clients of our strategic and operating plans in these important areas. We review major accomplishments since late 1990 and describe our strategic planning goals and specific projects that will guide our operations over the next few years. Our mission statement, planning considerations, and management policies and practices are also included.

  20. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    NASA Astrophysics Data System (ADS)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eyetracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
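The "modality server" pattern described above can be sketched as a small stable interface wrapping each underlying engine, so clients are insulated from vendor changes. Class and method names below are hypothetical; real servers would sit behind sockets rather than in-process calls.

```python
class ModalityServer:
    # The stable high-level interface every modality exposes.
    def __init__(self, name):
        self.name = name
    def handle(self, request: str) -> str:
        raise NotImplementedError

class SpeechServer(ModalityServer):
    # Stand-in for a wrapped commercial speech recognizer.
    def handle(self, request):
        return f"recognized:{request.lower()}"

class GazeServer(ModalityServer):
    # Stand-in for a wrapped eye-tracking engine.
    def handle(self, request):
        return f"gaze_target:{request}"

# A client multiplexes modalities through the one abstract interface.
servers = {"speech": SpeechServer("speech"), "gaze": GazeServer("gaze")}
out = [servers["speech"].handle("OPEN MAP"), servers["gaze"].handle("panel-3")]
print(out)   # → ['recognized:open map', 'gaze_target:panel-3']
```

Swapping a vendor engine then only changes a server's internals, never the client-facing `handle` interface.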

  1. A Double Dwell High Sensitivity GPS Acquisition Scheme Using Binarized Convolution Neural Network

    PubMed Central

    Wang, Zhen; Zhuang, Yuan; Yang, Jun; Zhang, Hengfeng; Dong, Wei; Wang, Min; Hua, Luchi; Liu, Bo; Shi, Longxing

    2018-01-01

Conventional GPS acquisition methods, such as Max selection and threshold crossing (MAX/TC), estimate GPS code/Doppler by its correlation peak. Unlike MAX/TC, a multi-layer binarized convolution neural network (BCNN) is proposed in this article to recognize the GPS acquisition correlation envelope. The proposed method is a double-dwell acquisition in which a short integration is adopted in the first dwell and a long integration is applied in the second one. To reduce the parameter search space, BCNN detects the possible envelope that contains the auto-correlation peak in the first dwell, compressing the initial search space to 1/1023. Although a long integration is used in the second dwell, the acquisition computation overhead is still low thanks to the compressed search space. Overall, the total computation overhead of the proposed method is only 1/5 that of conventional ones. Experiments show that the proposed double dwell/correlation envelope identification (DD/CEI) neural network achieves a 2 dB improvement over MAX/TC under the same specification. PMID:29747373
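The double-dwell idea can be illustrated numerically: a short first-dwell integration flags a few candidate code phases, then the long second-dwell integration runs only on that compressed search space. The 31-chip code, noise level, and candidate count below are illustrative (real GPS C/A codes are 1023 chips, and the paper's first dwell uses a BCNN rather than a sort by correlation magnitude).

```python
import random

random.seed(1)
N = 31
code = [random.choice((-1, 1)) for _ in range(N)]   # toy spreading code
true_phase = 11
rx = [code[(i + true_phase) % N] + random.gauss(0, 0.3) for i in range(N)]

def correlate(signal, phase, length):
    # Coherent integration over `length` chips at a trial code phase.
    return sum(signal[i] * code[(i + phase) % N] for i in range(length)) / length

# First dwell: short integration, keep only the strongest few candidates.
candidates = sorted(range(N), key=lambda p: -abs(correlate(rx, p, 12)))[:4]
# Second dwell: long integration over the compressed search space only.
best = max(candidates, key=lambda p: abs(correlate(rx, p, N)))
print(best == true_phase)
```

The cost saving mirrors the abstract: the expensive long integration touches 4 phases instead of all 31.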

  2. Benefits of Cooperative Learning in Weblog Networks

    ERIC Educational Resources Information Center

    Wang, Jenny; Fang, Yuehchiu

    2005-01-01

    The purpose of this study was to explore the benefits of cooperative learning in weblog networks, focusing particularly on learning outcomes in college writing curriculum integrated with computer-mediated learning tool-weblog. The first section addressed the advantages of using weblogs in cooperative learning structure on teaching and learning.…

  3. Network-Centric Warfare: Implications for Applying the Principles of War

    DTIC Science & Technology

    1999-05-17

    Noting the competitive advantage that a computer network system completely integrated into a firm’s structure and operations has provided to...businesses, individuals have begun to argue that adoption of this concept by the United States armed forces would produce a comparable, competitive advantage in

  4. SATWG networked quality function deployment

    NASA Technical Reports Server (NTRS)

    Brown, Don

    1992-01-01

    The objective of this work is to develop a cooperative process for continual evolution of an integrated, time-phased avionics technology plan that involves customers, technologists, developers, and managers. This will be accomplished by demonstrating a computer network technology to augment Quality Function Deployment (QFD). All results are presented in viewgraph format.

  5. A hybrid optical switch architecture to integrate IP into optical networks to provide flexible and intelligent bandwidth on demand for cloud computing

    NASA Astrophysics Data System (ADS)

    Yang, Wei; Hall, Trevor J.

    2013-12-01

    The Internet is entering an era of cloud computing to provide more cost effective, eco-friendly and reliable services to consumer and business users. As a consequence, the nature of the Internet traffic has been fundamentally transformed from a pure packet-based pattern to today's predominantly flow-based pattern. Cloud computing has also brought about an unprecedented growth in the Internet traffic. In this paper, a hybrid optical switch architecture is presented to deal with the flow-based Internet traffic, aiming to offer flexible and intelligent bandwidth on demand to improve fiber capacity utilization. The hybrid optical switch is capable of integrating IP into optical networks for cloud-based traffic with predictable performance, for which the delay performance of the electronic module in the hybrid optical switch architecture is evaluated through simulation.

  6. Integrating System Dynamics and Bayesian Networks with Application to Counter-IED Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kenneth D.; Brothers, Alan J.; Whitney, Paul D.

    2010-06-06

    The practice of choosing a single modeling paradigm for predictive analysis can limit the scope and relevance of predictions and their utility to decision-making processes. Considering multiple modeling methods simultaneously may improve this situation, but a better solution provides a framework for directly integrating different, potentially complementary modeling paradigms to enable more comprehensive modeling and predictions, and thus better-informed decisions. The primary challenges of this kind of model integration are to bridge language and conceptual gaps between modeling paradigms, and to determine whether natural and useful linkages can be made in a formal mathematical manner. To address these challenges in the context of two specific modeling paradigms, we explore mathematical and computational options for linking System Dynamics (SD) and Bayesian network (BN) models and incorporating data into the integrated models. We demonstrate that integrated SD/BN models can naturally be described as either state space equations or Dynamic Bayes Nets, which enables the use of many existing computational methods for simulation and data integration. To demonstrate, we apply our model integration approach to techno-social models of insurgent-led attacks and security force counter-measures centered on improvised explosive devices.
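    The record's observation that integrated SD/BN models reduce to state-space equations can be illustrated with a minimal hypothetical example: a single System Dynamics stock updated by Euler integration (the SD side), corrected by a linear-Gaussian Bayesian measurement update (the BN side). All rates, variances, and the scenario itself are invented for illustration and bear no relation to the paper's counter-IED models.

```python
def sd_step(stock, inflow_rate, outflow_frac, dt=1.0):
    """One System Dynamics step for a single stock (Euler update):
    d(stock)/dt = inflow - outflow_frac * stock."""
    return stock + dt * (inflow_rate - outflow_frac * stock)

def bayes_update(prior_mean, prior_var, obs, obs_var):
    """Linear-Gaussian Bayesian measurement update: fuse a noisy
    observation of the stock with the SD prediction."""
    gain = prior_var / (prior_var + obs_var)
    mean = prior_mean + gain * (obs - prior_mean)
    var = (1.0 - gain) * prior_var
    return mean, var

def run_filter(observations, x0=0.0, p0=1.0,
               inflow=2.0, outflow_frac=0.1,
               process_var=0.05, obs_var=0.5):
    """Alternate SD prediction and Bayesian correction -- a state-space
    view of an integrated SD/BN model (illustrative sketch only)."""
    mean, var = x0, p0
    estimates = []
    for z in observations:
        # Predict with the SD equation; inflate variance by process noise.
        mean = sd_step(mean, inflow, outflow_frac)
        var = (1.0 - outflow_frac) ** 2 * var + process_var
        # Correct with the observation.
        mean, var = bayes_update(mean, var, z, obs_var)
        estimates.append(mean)
    return estimates
```

    With steady observations at the stock's equilibrium (inflow/outflow_frac = 20 here), the estimates converge to that equilibrium, which is exactly the behavior a Dynamic Bayes Net formulation of the same model would exhibit.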

  7. Educational Systems Integrators/Integrated Learning System Project: Titan Schools 1993-94. OER Report.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Office of Educational Research.

    The 1993-94 Integrated Learning System (ILS) project, a means of delivering individualized instruction through a computer network, involved approximately 70 schools from New York City school districts. To help schools learn about and operate the technology in an ILS, districts were given the option of hiring one of the following companies…

  8. Bacterial molecular networks: bridging the gap between functional genomics and dynamical modelling.

    PubMed

    van Helden, Jacques; Toussaint, Ariane; Thieffry, Denis

    2012-01-01

    This introductory review synthesizes the contents of the volume Bacterial Molecular Networks of the series Methods in Molecular Biology. This volume gathers 9 reviews and 16 method chapters describing computational protocols for the analysis of metabolic pathways, protein interaction networks, and regulatory networks. Each protocol is documented by concrete case studies dedicated to model bacteria or interacting populations. Altogether, the chapters provide a representative overview of state-of-the-art methods for data integration and retrieval, network visualization, graph analysis, and dynamical modelling.

  9. Adaptation disrupts motion integration in the primate dorsal stream

    PubMed Central

    Patterson, Carlyn A.; Wissig, Stephanie C.; Kohn, Adam

    2014-01-01

    Sensory systems adjust continuously to the environment. The effects of recent sensory experience—or adaptation—are typically assayed by recording in a relevant subcortical or cortical network. However, adaptation effects cannot be localized to a single, local network. Adjustments in one circuit or area will alter the input provided to others, with unclear consequences for computations implemented in the downstream circuit. Here we show that prolonged adaptation with drifting gratings, which alters responses in the early visual system, impedes the ability of area MT neurons to integrate motion signals in plaid stimuli. Perceptual experiments reveal a corresponding loss of plaid coherence. A simple computational model shows how the altered representation of motion signals in early cortex can derail integration in MT. Our results suggest that the effects of adaptation cascade through the visual system, derailing the downstream representation of distinct stimulus attributes. PMID:24507198

  10. About Distributed Simulation-based Optimization of Forming Processes using a Grid Architecture

    NASA Astrophysics Data System (ADS)

    Grauer, Manfred; Barth, Thomas

    2004-06-01

    The permanently increasing complexity of products and their manufacturing processes, combined with a shorter "time-to-market," leads to more and more use of simulation and optimization software systems for product design. Finding a "good" design of a product implies the solution of computationally expensive optimization problems based on the results of simulation. Due to the computational load caused by the solution of these problems, the requirements on the Information & Telecommunication (IT) infrastructure of an enterprise or research facility are shifting from stand-alone resources towards the integration of software and hardware resources in a distributed environment for high-performance computing. Resources can comprise software systems, hardware systems, or communication networks. An appropriate IT infrastructure must provide the means to integrate all these resources and enable their use even across a network to cope with requirements from geographically distributed scenarios, e.g. in computational engineering and/or collaborative engineering. Integrating experts' knowledge into the optimization process is indispensable in order to reduce the complexity caused by the number of design variables and the high dimensionality of the design space. Hence, utilization of knowledge-based systems must be supported by providing data management facilities as a basis for knowledge extraction from product data. In this paper, the focus is put on a distributed problem solving environment (PSE) capable of providing access to a variety of necessary resources and services. A distributed approach integrating simulation and optimization on a network of workstations and cluster systems is presented. For geometry generation, the CAD system CATIA is used, coupled with the FEM simulation system INDEED for simulation of sheet-metal forming processes and with the problem solving environment OpTiX for distributed optimization.

  11. A Parallel Trade Study Architecture for Design Optimization of Complex Systems

    NASA Technical Reports Server (NTRS)

    Kim, Hongman; Mullins, James; Ragon, Scott; Soremekun, Grant; Sobieszczanski-Sobieski, Jaroslaw

    2005-01-01

    Design of a successful product requires evaluating many design alternatives in a limited design cycle time. This can be achieved through leveraging design space exploration tools and available computing resources on the network. This paper presents a parallel trade study architecture to integrate trade study clients and computing resources on a network using Web services. The parallel trade study solution is demonstrated to accelerate design of experiments, genetic algorithm optimization, and a cost as an independent variable (CAIV) study for a space system application.
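    The fan-out/fan-in pattern this record describes, farming design-point evaluations out to networked resources and collecting the results, can be sketched locally with Python's standard concurrency tools. `evaluate_design` and its (mass, thrust) scoring are hypothetical stand-ins for the Web-service call, not the paper's architecture.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def evaluate_design(point):
    """Hypothetical stand-in for a remote analysis service: score a
    (mass, thrust) design point. A real deployment would invoke a
    Web service here rather than compute locally."""
    mass, thrust = point
    return {"point": point, "score": thrust / mass}

def parallel_trade_study(masses, thrusts, max_workers=4):
    """Run a full-factorial design of experiments in parallel and return
    the best-scoring design point -- the fan-out/fan-in pattern the
    record describes, minus the Web-services plumbing."""
    points = list(product(masses, thrusts))
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(evaluate_design, points))
    return max(results, key=lambda r: r["score"])
```

    The same skeleton accommodates the record's other workloads (genetic-algorithm generations, CAIV sweeps) by swapping the point generator and the evaluation call.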

  12. Suborbital Telepresence and Over-the-Horizon Networking

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.

    2007-01-01

    A viewgraph presentation describing the suborbital telepresence project utilizing in-flight network computing is shown. The topics include: 1) Motivation; 2) Suborbital Telepresence and Global Test Range; 3) Tropical Composition, Cloud, and Climate Coupling Experiment (TC4); 4) Data Sets for TC4 Real-time Monitoring; 5) TC-4 Notional Architecture; 6) An Application Integration View; 7) Telepresence: Architectural Framework; and 8) Disruption Tolerant Networks.

  13. ITEP: an integrated toolkit for exploration of microbial pan-genomes.

    PubMed

    Benedict, Matthew N; Henriksen, James R; Metcalf, William W; Whitaker, Rachel J; Price, Nathan D

    2014-01-03

    Comparative genomics is a powerful approach for studying variation in physiological traits as well as the evolution and ecology of microorganisms. Recent technological advances have enabled sequencing large numbers of related genomes in a single project, requiring computational tools for their integrated analysis. In particular, accurate annotations and identification of gene presence and absence are critical for understanding and modeling the cellular physiology of newly sequenced genomes. Although many tools are available to compare the gene contents of related genomes, new tools are necessary to enable close examination and curation of protein families from large numbers of closely related organisms, to integrate curation with the analysis of gain and loss, and to generate metabolic networks linking the annotations to observed phenotypes. We have developed ITEP, an Integrated Toolkit for Exploration of microbial Pan-genomes, to curate protein families, compute similarities to externally-defined domains, analyze gene gain and loss, and generate draft metabolic networks from one or more curated reference network reconstructions in groups of related microbial species among which the combination of core and variable genes constitutes their "pan-genomes". The ITEP toolkit consists of: (1) a series of modular command-line scripts for identification, comparison, curation, and analysis of protein families and their distribution across many genomes; (2) a set of Python libraries for programmatic access to the same data; and (3) pre-packaged scripts to perform common analysis workflows on a collection of genomes. ITEP's capabilities include de novo protein family prediction, ortholog detection, analysis of functional domains, identification of core and variable genes and gene regions, sequence alignments and tree generation, annotation curation, and the integration of cross-genome analysis and metabolic networks for study of metabolic network evolution.
ITEP is a powerful, flexible toolkit for generation and curation of protein families. ITEP's modular design allows for straightforward extension as analysis methods and tools evolve. By integrating comparative genomics with the development of draft metabolic networks, ITEP harnesses the power of comparative genomics to build confidence in links between genotype and phenotype and helps disambiguate gene annotations when they are evaluated in both evolutionary and metabolic network contexts.
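    The core/variable gene partition at the heart of pan-genome analysis can be sketched in a few lines. This is a conceptual illustration over hypothetical presence/absence data, not ITEP's implementation, which works from curated protein families rather than raw gene-ID sets.

```python
def partition_pan_genome(genomes):
    """Partition a pan-genome into core gene families (present in every
    genome) and variable families (present in some but not all), given
    a mapping of genome name -> set of gene family IDs. A conceptual
    sketch of the core/variable split, not ITEP's implementation."""
    if not genomes:
        return set(), set()
    all_families = set().union(*genomes.values())
    core = set.intersection(*genomes.values())
    return core, all_families - core
```

    On three toy genomes sharing only family "a", the core is {"a"} and everything else is variable; downstream steps like draft metabolic network generation then operate over these two sets.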

  14. SIPP ACCESS: Information Tools Improve Access to National Longitudinal Panel Surveys.

    ERIC Educational Resources Information Center

    Robbin, Alice; David, Martin

    1988-01-01

    A computer-based, integrated information system incorporating data and information about the data, SIPP ACCESS systematically links technologies of laser disk, mainframe computer, microcomputer, and electronic networks, and applies relational technology to provide access to information about complex statistical data collections. Examples are given…

  15. Peering into the Future of Advertising.

    ERIC Educational Resources Information Center

    Hsia, H. J.

    All areas in mass communications (i.e., newspapers, magazines, television, radio, films, photos, and books) will be transformed because of the increasing sophistication of computer users, the decreasing costs for interactive computer systems, and the global adoption of integrated services digital networks (ISDN). ISDN refer to the digitization of…

  16. Advances in Integrating Autonomy with Acoustic Communications for Intelligent Networks of Marine Robots

    DTIC Science & Technology

    2013-02-01

    [Abstract unavailable; the indexed text is residue from a UML deployment diagram showing Bluefin 21 AUVs (Unicorn, Macrura), the NURC AUV (OEX Ocean Explorer), Hammerhead, and Iver2 vehicles, each running a MOOS computer, linked to a topside MOOS computer via acoustic Micro-Modem and Edgetech links, serial/wired GPS connections, 5.0 GHz WiLan WiFi, and a Google Earth display.]

  17. TTEthernet for Integrated Spacecraft Networks

    NASA Technical Reports Server (NTRS)

    Loveless, Andrew

    2015-01-01

    Aerospace projects have traditionally employed federated avionics architectures, in which each computer system is designed to perform one specific function (e.g. navigation). There are obvious downsides to this approach, including excessive weight (from so much computing hardware), and inefficient processor utilization (since modern processors are capable of performing multiple tasks). There has therefore been a push for integrated modular avionics (IMA), in which common computing platforms can be leveraged for different purposes. This consolidation of multiple vehicle functions to shared computing platforms can significantly reduce spacecraft cost, weight, and design complexity. However, the application of IMA principles introduces significant challenges, as the data network must accommodate traffic of mixed criticality and performance levels - potentially all related to the same shared computer hardware. Because individual network technologies are rarely so competent, the development of truly integrated network architectures often proves unreasonable. Several different types of networks are utilized - each suited to support a specific vehicle function. Critical functions are typically driven by precise timing loops, requiring networks with strict guarantees regarding message latency (i.e. determinism) and fault-tolerance. Alternatively, non-critical systems generally employ data networks prioritizing flexibility and high performance over reliable operation. Switched Ethernet has seen widespread success filling this role in terrestrial applications. Its high speed, flexibility, and the availability of inexpensive commercial off-the-shelf (COTS) components make it desirable for inclusion in spacecraft platforms. Basic Ethernet configurations have been incorporated into several preexisting aerospace projects, including both the Space Shuttle and International Space Station (ISS). 
However, classical switched Ethernet cannot provide the high level of network determinism required by real-time spacecraft applications. Even with modern advancements, the uncoordinated (i.e. event-driven) nature of Ethernet communication unavoidably leads to message contention within network switches. The arbitration process used to resolve such conflicts introduces variation in the time it takes for messages to be forwarded. TTEthernet introduces decentralized clock synchronization to switched Ethernet, enabling message transmission according to a time-triggered (TT) paradigm. A network planning tool is used to allocate each device a finite amount of time in which it may transmit a frame. Each time slot is repeated sequentially to form a periodic communication schedule that is then loaded onto each TTEthernet device (e.g. switches and end systems). Each network participant references the synchronized time in order to dispatch messages at predetermined instances. This schedule guarantees that no contention exists between time-triggered Ethernet frames in the network switches, therefore eliminating the need for arbitration (and the timing variation it causes). Besides time-triggered messaging, TTEthernet networks may provide two additional traffic classes to support communication of different criticality levels. In the rate-constrained (RC) traffic class, the frame payload size and rate of transmission along each communication channel are limited to predetermined maximums. The network switches can therefore be configured to accommodate the known worst-case traffic pattern, and buffer overflows can be eliminated. The best-effort (BE) traffic class behaves akin to classical Ethernet. No guarantees are provided regarding transmission latency or successful message delivery.
TTEthernet coordinates transmission of all three traffic classes over the same physical connections, therefore accommodating the full spectrum of traffic criticality levels required in IMA architectures. Common computing platforms (e.g. LRUs) can share networking resources in such a way that failures in non-critical systems (using BE or RC communication modes) cannot impact flight-critical functions (using TT communication). Furthermore, TTEthernet hardware (e.g. switches, cabling) can be shared by both TTEthernet and classical Ethernet traffic.
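    The planning-tool invariant described above, that no two time-triggered frames may contend for a link within the periodic communication schedule, can be sketched as a simple overlap check. The (offset, duration) slot representation is a simplification assumed for illustration; real TTEthernet planning also models synchronization precision, dataflow paths, and multi-hop timing.

```python
def schedule_is_contention_free(slots, period):
    """Verify the planning-tool invariant for one shared link: every
    time-triggered slot (offset, duration) fits inside the communication
    cycle, and no two slots overlap. A simplified sketch -- not actual
    TTEthernet tooling."""
    slots = sorted(slots)
    for offset, duration in slots:
        if offset < 0 or offset + duration > period:
            return False              # frame does not fit in the cycle
    for (o1, d1), (o2, _d2) in zip(slots, slots[1:]):
        if o1 + d1 > o2:
            return False              # two frames contend for the link
    return True
```

    A schedule passing this check needs no switch arbitration for its TT frames, which is precisely what eliminates the forwarding-time variation the record attributes to classical Ethernet.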

  18. Ontology-supported research on vaccine efficacy, safety and integrative biological networks.

    PubMed

    He, Yongqun

    2014-07-01

    While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including Vaccine Ontology, Ontology of Adverse Events and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network ('OneNet') Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms.

  19. Ontology-supported Research on Vaccine Efficacy, Safety, and Integrative Biological Networks

    PubMed Central

    He, Yongqun

    2016-01-01

    While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including the Vaccine Ontology, Ontology of Adverse Events, and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network (“OneNet”) Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms. PMID:24909153

  20. Ubiquitous virtual private network: a solution for WSN seamless integration.

    PubMed

    Villa, David; Moya, Francisco; Villanueva, Félix Jesús; Aceña, Óscar; López, Juan Carlos

    2014-01-06

    Sensor networks are becoming an essential part of ubiquitous systems and applications. However, there are no well-defined protocols or mechanisms to access the sensor network from the enterprise information system. We consider this issue as a heterogeneous network interconnection problem, and as a result, the same concepts may be applied. Specifically, we propose the use of object-oriented middlewares to provide a virtual private network in which all involved elements (sensor nodes or computer applications) will be able to communicate as if all of them were in a single and uniform network.

  1. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) fundamental cybersecurity research and development challenges, strategies, and roadmaps for future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts.
The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.

  2. Reduced integration and improved segregation of functional brain networks in Alzheimer’s disease

    NASA Astrophysics Data System (ADS)

    Kabbara, A.; Eid, H.; El Falou, W.; Khalil, M.; Wendling, F.; Hassan, M.

    2018-04-01

    Objective. Emerging evidence shows that cognitive deficits in Alzheimer’s disease (AD) are associated with disruptions in brain functional connectivity. Thus, the identification of alterations in AD functional networks has become a topic of increasing interest. However, to what extent AD induces disruption of the balance of local and global information processing in the human brain remains elusive. The main objective of this study is to explore the dynamic topological changes of AD networks in terms of brain network segregation and integration. Approach. We used electroencephalography (EEG) data recorded from 20 participants (10 AD patients and 10 healthy controls) during resting state. Functional brain networks were reconstructed using EEG source connectivity computed in different frequency bands. Graph theoretical analyses were performed to assess differences between both groups. Main results. Results revealed that AD networks, compared to networks of age-matched healthy controls, are characterized by lower global information processing (integration) and higher local information processing (segregation). Results also showed a significant correlation between the alterations in the AD patients’ functional brain networks and their cognitive scores. Significance. These findings may contribute to the development of an EEG network-based test that could strengthen results obtained from currently used neurophysiological tests in neurodegenerative diseases.

  3. Reduced integration and improved segregation of functional brain networks in Alzheimer's disease.

    PubMed

    Kabbara, A; Eid, H; El Falou, W; Khalil, M; Wendling, F; Hassan, M

    2018-04-01

    Emerging evidence shows that cognitive deficits in Alzheimer's disease (AD) are associated with disruptions in brain functional connectivity. Thus, the identification of alterations in AD functional networks has become a topic of increasing interest. However, to what extent AD induces disruption of the balance of local and global information processing in the human brain remains elusive. The main objective of this study is to explore the dynamic topological changes of AD networks in terms of brain network segregation and integration. We used electroencephalography (EEG) data recorded from 20 participants (10 AD patients and 10 healthy controls) during resting state. Functional brain networks were reconstructed using EEG source connectivity computed in different frequency bands. Graph theoretical analyses were performed to assess differences between both groups. Results revealed that AD networks, compared to networks of age-matched healthy controls, are characterized by lower global information processing (integration) and higher local information processing (segregation). Results also showed a significant correlation between the alterations in the AD patients' functional brain networks and their cognitive scores. These findings may contribute to the development of an EEG network-based test that could strengthen results obtained from currently used neurophysiological tests in neurodegenerative diseases.
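    The integration and segregation measures used in studies like this one are standard graph metrics: global efficiency (mean inverse shortest-path length) for integration, and the average clustering coefficient for segregation. As an illustrative sketch (not the authors' analysis pipeline), both can be computed from an unweighted adjacency structure as follows:

```python
from collections import deque

def global_efficiency(adj):
    """Integration metric: mean inverse shortest-path length over all
    ordered node pairs of an unweighted graph given as
    {node: set(neighbors)}; unreachable pairs contribute zero."""
    nodes = list(adj)
    total, pairs = 0.0, 0
    for src in nodes:
        # BFS shortest path lengths from src.
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for dst in nodes:
            if dst != src:
                pairs += 1
                if dst in dist:
                    total += 1.0 / dist[dst]
    return total / pairs if pairs else 0.0

def mean_clustering(adj):
    """Segregation metric: average local clustering coefficient, i.e.
    the fraction of each node's neighbor pairs that are connected."""
    coeffs = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for u in nbrs for v in nbrs
                    if u < v and v in adj[u])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)
```

    On a triangle both metrics equal 1; on a three-node path, efficiency drops and clustering vanishes, matching the intuition that AD networks in this record trade integration for segregation.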

  4. Enhancing gene regulatory network inference through data integration with markov random fields

    DOE PAGES

    Banf, Michael; Rhee, Seung Y.

    2017-02-01

    Here, a gene regulatory network links transcription factors to their target genes and represents a map of transcriptional regulation. Much progress has been made in deciphering gene regulatory networks computationally. However, gene regulatory network inference for most eukaryotic organisms remains challenging. To improve the accuracy of gene regulatory network inference and facilitate candidate selection for experimentation, we developed an algorithm called GRACE (Gene Regulatory network inference ACcuracy Enhancement). GRACE exploits a priori biological knowledge and heterogeneous data integration to generate high-confidence network predictions for eukaryotic organisms using Markov Random Fields in a semi-supervised fashion. GRACE uses a novel optimization scheme to integrate regulatory evidence and biological relevance. It is particularly suited for model learning with sparse regulatory gold standard data. We show GRACE’s potential to produce high-confidence regulatory networks compared to state-of-the-art approaches using Drosophila melanogaster and Arabidopsis thaliana data. In an A. thaliana developmental gene regulatory network, GRACE recovers cell cycle related regulatory mechanisms and further hypothesizes several novel regulatory links, including a putative control mechanism of vascular structure formation due to modifications in cell proliferation.

  5. Enhancing gene regulatory network inference through data integration with markov random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banf, Michael; Rhee, Seung Y.

    Here, a gene regulatory network links transcription factors to their target genes and represents a map of transcriptional regulation. Much progress has been made in deciphering gene regulatory networks computationally. However, gene regulatory network inference for most eukaryotic organisms remains challenging. To improve the accuracy of gene regulatory network inference and facilitate candidate selection for experimentation, we developed an algorithm called GRACE (Gene Regulatory network inference ACcuracy Enhancement). GRACE exploits a priori biological knowledge and heterogeneous data integration to generate high-confidence network predictions for eukaryotic organisms using Markov Random Fields in a semi-supervised fashion. GRACE uses a novel optimization scheme to integrate regulatory evidence and biological relevance. It is particularly suited for model learning with sparse regulatory gold standard data. We show GRACE’s potential to produce high-confidence regulatory networks compared to state-of-the-art approaches using Drosophila melanogaster and Arabidopsis thaliana data. In an A. thaliana developmental gene regulatory network, GRACE recovers cell cycle related regulatory mechanisms and further hypothesizes several novel regulatory links, including a putative control mechanism of vascular structure formation due to modifications in cell proliferation.

  6. The DYNES Instrument: A Description and Overview

    NASA Astrophysics Data System (ADS)

    Zurawski, Jason; Ball, Robert; Barczyk, Artur; Binkley, Mathew; Boote, Jeff; Boyd, Eric; Brown, Aaron; Brown, Robert; Lehman, Tom; McKee, Shawn; Meekhof, Benjeman; Mughal, Azher; Newman, Harvey; Rozsa, Sandor; Sheldon, Paul; Tackett, Alan; Voicu, Ramiro; Wolff, Stephen; Yang, Xi

    2012-12-01

    Scientific innovation continues to increase requirements for the computing and networking infrastructures of the world. Collaborative partners, instrumentation, storage, and processing facilities are often geographically and topologically separated, as is the case with LHC virtual organizations. These separations challenge the technology used to interconnect available resources, often delivered by Research and Education (R&E) networking providers, and lead to complications in the overall process of end-to-end data management. Capacity and traffic management are key concerns of R&E network operators; a delicate balance is required to serve both long-lived, high-capacity network flows and more traditional end-user activities. The advent of dynamic circuit services, a technology that enables the creation of variable-duration, guaranteed-bandwidth networking channels, allows for the efficient use of common network infrastructures. These gains are seen particularly in locations where overall capacity is scarce compared to the (sustained peak) needs of user communities. Related efforts, including those of the LHCOPN [3] operations group and the emerging LHCONE [4] project, may take advantage of available resources by designating specific network activities as “high priority”, allowing reservation of dedicated bandwidth or optimizing for deadline scheduling and predictable delivery patterns. This paper presents the DYNES instrument, an NSF-funded cyberinfrastructure project designed to facilitate end-to-end dynamic circuit services [2]. This combination of hardware and software innovation is being deployed across R&E networks in the United States at selected end sites located on university campuses. DYNES is peering with international efforts in other countries using similar solutions, increasing the reach of this emerging technology. This global data movement solution could be integrated into computing paradigms such as cloud and grid computing platforms, and through the use of APIs it can be integrated into existing data movement software.
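    The admission decision at the heart of a guaranteed-bandwidth circuit service can be sketched as interval bookkeeping on a single link. This is an illustrative sketch only; production systems such as DYNES negotiate reservations through dedicated provisioning protocols rather than a local function call.

```python
# Admission control for a guaranteed-bandwidth circuit on one link:
# admit a new reservation only if, at every instant of its lifetime,
# committed bandwidth plus the request stays within link capacity.

def peak_usage(reservations, start, end):
    """Maximum bandwidth already committed at any instant in [start, end)."""
    events = []
    for (s, e, bw) in reservations:
        s, e = max(s, start), min(e, end)
        if s < e:
            events.append((s, bw))    # reservation becomes active
            events.append((e, -bw))   # reservation releases bandwidth
    usage = peak = 0
    for _, delta in sorted(events):
        usage += delta
        peak = max(peak, usage)
    return peak

def admit(reservations, capacity, start, end, bw):
    """True if the new circuit never pushes the link over capacity."""
    return peak_usage(reservations, start, end) + bw <= capacity

existing = [(0, 10, 4), (5, 15, 3)]            # (start, end, gigabits)
ok = admit(existing, capacity=10, start=4, end=8, bw=3)
full = admit(existing, capacity=10, start=4, end=8, bw=4)
```

Sorting release events before same-instant starts (negative deltas sort first) avoids counting a false peak when one circuit ends exactly as another begins.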

  7. Structure Shapes Dynamics and Directionality in Diverse Brain Networks: Mathematical Principles and Empirical Confirmation in Three Species

    NASA Astrophysics Data System (ADS)

    Moon, Joon-Young; Kim, Junhyeok; Ko, Tae-Wook; Kim, Minkyung; Iturria-Medina, Yasser; Choi, Jee-Hyun; Lee, Joseph; Mashour, George A.; Lee, Uncheol

    2017-04-01

    Identifying how spatially distributed information becomes integrated in the brain is essential to understanding higher cognitive functions. Previous computational and empirical studies suggest a significant influence of brain network structure on brain network function. However, there have been few analytical approaches to explain the role of network structure in shaping regional activities and directionality patterns. In this study, analytical methods are applied to a coupled oscillator model implemented in inhomogeneous networks. We first derive a mathematical principle that explains the emergence of directionality from the underlying brain network structure. We then apply the analytical methods to the anatomical brain networks of human, macaque, and mouse, successfully predicting simulation and empirical electroencephalographic data. The results demonstrate that the global directionality patterns in resting state brain networks can be predicted solely by their unique network structures. This study forms a foundation for a more comprehensive understanding of how neural information is directed and integrated in complex brain networks.
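    The coupled-oscillator models used in studies like this one are typically Kuramoto-type phase oscillators on a graph. A minimal sketch follows; the network, frequencies, and coupling strength are illustrative and not the authors' brain-network setup.

```python
import math
import random

# Kuramoto-type phase oscillators on a network: each node's phase advances
# at its natural frequency plus a coupling term pulling it toward neighbours.

def simulate(adj, omega, K, steps=2000, dt=0.01, seed=0):
    rng = random.Random(seed)
    theta = [rng.uniform(0, 2 * math.pi) for _ in omega]
    for _ in range(steps):
        new = []
        for i, th in enumerate(theta):
            coupling = sum(math.sin(theta[j] - th) for j in adj[i])
            new.append(th + dt * (omega[i] + K * coupling))
        theta = new
    return theta

def order_parameter(theta):
    """|mean of e^{i*theta}|: 1.0 means full phase synchrony."""
    re = sum(math.cos(t) for t in theta) / len(theta)
    im = sum(math.sin(t) for t in theta) / len(theta)
    return math.hypot(re, im)

# All-to-all network of 5 oscillators with slightly different frequencies.
adj = {i: [j for j in range(5) if j != i] for i in range(5)}
omega = [1.0, 1.02, 0.98, 1.01, 0.99]
r_weak = order_parameter(simulate(adj, omega, K=0.0))
r_strong = order_parameter(simulate(adj, omega, K=1.0))
```

With coupling switched on, the phases lock and the order parameter approaches 1; structure-dependent phase lags in such models are what the paper links to directionality.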

  8. Analysis and synthesis of distributed-lumped-active networks by digital computer

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The use of digital computational techniques in the analysis and synthesis of DLA (distributed lumped active) networks is considered. This class of networks consists of three distinct types of elements, namely, distributed elements (modeled by partial differential equations), lumped elements (modeled by algebraic relations and ordinary differential equations), and active elements (modeled by algebraic relations). Such a characterization is applicable to a broad class of circuits, especially including those usually referred to as linear integrated circuits, since the fabrication techniques for such circuits readily produce elements which may be modeled as distributed, as well as the more conventional lumped and active ones.

  9. HNET - A National Computerized Health Network

    PubMed Central

    Casey, Mark; Hamilton, Richard

    1988-01-01

    The HNET system demonstrated conceptually and technically a national text (and limited bit-mapped graphics) computer network for use between innovative members of the health care industry. The HNET configuration of a leased high-speed national packet-switching network connecting any number of mainframe, mini, and micro computers was unique in its relatively low capital costs and freedom from obsolescence. With multiple simultaneous conferences, databases, bulletin boards, calendars, and advanced electronic mail and surveys, it is marketable to innovative hospitals, clinics, physicians, health care associations and societies, nurses, multisite research projects, libraries, etc. Electronic publishing and education capabilities, along with integrated voice and video transmission, are identified as future enhancements.

  10. Prediction of enzymatic pathways by integrative pathway mapping

    PubMed Central

    Wichelecki, Daniel J; San Francisco, Brian; Zhao, Suwen; Rodionov, Dmitry A; Vetting, Matthew W; Al-Obaidi, Nawar F; Lin, Henry; O'Meara, Matthew J; Scott, David A; Morris, John H; Russel, Daniel; Almo, Steven C; Osterman, Andrei L

    2018-01-01

    The functions of most proteins are yet to be determined. The function of an enzyme is often defined by its interacting partners, including its substrate and product, and its role in larger metabolic networks. Here, we describe a computational method that predicts the functions of orphan enzymes by organizing them into a linear metabolic pathway. Given candidate enzyme and metabolite pathway members, this aim is achieved by finding those pathways that satisfy structural and network restraints implied by varied input information, including that from virtual screening, chemoinformatics, genomic context analysis, and ligand-binding experiments. We demonstrate this integrative pathway mapping method by predicting the L-gulonate catabolic pathway in Haemophilus influenzae Rd KW20. The prediction was subsequently validated experimentally by enzymology, crystallography, and metabolomics. Integrative pathway mapping by satisfaction of structural and network restraints is extensible to molecular networks in general and thus formally bridges the gap between structural biology and systems biology. PMID:29377793
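    The core search, choosing the linear ordering of candidate enzymes that best satisfies pairwise restraints, can be sketched by brute force for a small pathway. The enzymes and scores below are invented; the actual method optimizes far richer restraints than a single pairwise table.

```python
from itertools import permutations

# Pathway mapping as restraint satisfaction, reduced to a toy: enumerate
# linear orderings of candidate enzymes and keep the one best supported
# by pairwise evidence (e.g. genome context, docking scores).

def best_pathway(enzymes, pair_score):
    """Return the ordering maximizing the summed score of adjacent pairs."""
    def score(order):
        return sum(pair_score.get((a, b), 0.0) for a, b in zip(order, order[1:]))
    return max(permutations(enzymes), key=score)

evidence = {("E1", "E2"): 0.9, ("E2", "E3"): 0.8, ("E1", "E3"): 0.1}
order = best_pathway(["E1", "E2", "E3"], evidence)
```

Exhaustive enumeration is only viable for short pathways; the published approach scales by sampling configurations that satisfy the restraints rather than scoring every permutation.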

  11. [The Development of Information Centralization and Management Integration System for Monitors Based on Wireless Sensor Network].

    PubMed

    Xu, Xiu; Zhang, Honglei; Li, Yiming; Li, Bin

    2015-07-01

    We developed an information centralization and management integration system for monitors of different brands and models, built on the existing wireless network with wireless sensor network technologies such as wireless location and wireless communication. With adaptive implementation and low cost, the system, which possesses the advantages of real-time operation, efficiency and elaboration, is able to collect the status and data of the monitors, locate the monitors, and provide services with a web server, video server and locating server via the local network. Using an intranet computer, clinical and device management staff can access the status and parameters of the monitors. Applications of this system provide convenience and save human resources for clinical departments, as well as promote efficiency, accuracy and elaboration in device management. The successful achievement of this system provides a solution for the integrated and elaborated management of mobile devices including ventilators and infusion pumps.

  12. Are Computer Science Students Ready for the Real World.

    ERIC Educational Resources Information Center

    Elliot, Noreen

    The typical undergraduate program in computer science includes an introduction to hardware and operating systems, file processing and database organization, data communication and networking, and programming. However, many graduates may lack the ability to integrate the concepts "learned" into a skill set and pattern of approaching problems that…

  13. Education of Engineering Students within a Multimedia/Hypermedia Environment--A Review.

    ERIC Educational Resources Information Center

    Anderl, R.; Vogel, U. R.

    This paper summarizes the activities of the Darmstadt University Department of Computer Integrated Design (Germany) related to: (1) distributed lectures (i.e., lectures distributed online through computer networks), including equipment used and ensuring sound and video quality; (2) lectures on demand, including providing access through the World…

  14. Integrating publicly-available data to generate computationally ...

    EPA Pesticide Factsheets

    The adverse outcome pathway (AOP) framework provides a way of organizing knowledge related to the key biological events that result in a particular health outcome. For the majority of environmental chemicals, the availability of curated pathways characterizing potential toxicity is limited. Methods are needed to assimilate large amounts of available molecular data and quickly generate putative AOPs for further testing and use in hazard assessment. A graph-based workflow was used to facilitate the integration of multiple data types to generate computationally-predicted (cp) AOPs. Edges between graph entities were identified through direct experimental or literature information or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways measured by differential gene expression and high-throughput screening targets. Sub-networks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (hepatic steatosis) were extracted using the network topology. Comparison of the cpAOP subnetworks to published mechanistic descriptions for both CCl4 toxicity and hepatic steatosis demonstrate that computational approaches can be used to replicate manually curated AOPs and identify pathway targets that lack genomic mar…
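    Edge inference by co-occurrence across treatments, a simplified stand-in for the frequent itemset mining mentioned above, can be sketched as follows. The observations and support threshold are invented for illustration.

```python
from itertools import combinations
from collections import Counter

# Infer candidate cpAOP-style edges from co-occurrence: two entities
# (chemical, phenotype, pathway) are linked if they appear together in
# at least a minimum fraction of observations.

def frequent_pairs(transactions, min_support):
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    n = len(transactions)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

observations = [
    {"CCl4", "steatosis", "lipid_pathway"},
    {"CCl4", "steatosis"},
    {"ethanol", "steatosis"},
    {"CCl4", "lipid_pathway"},
]
edges = frequent_pairs(observations, min_support=0.5)
```

Real frequent itemset miners (e.g. Apriori) prune the search over larger itemsets; pair counting is the degenerate but instructive case.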

  15. PathCase-SB architecture and database design

    PubMed Central

    2011-01-01

    Background Integration of metabolic pathway resources and regulatory metabolic network models, and deploying new tools on the integrated platform, can help perform more effective and more efficient systems biology research on understanding the regulation of metabolic networks. Therefore, the tasks of (a) integrating regulatory metabolic networks and existing models under a single database environment, and (b) building tools to help with modeling and analysis, are desirable and intellectually challenging computational tasks. Description PathCase Systems Biology (PathCase-SB) has been built and released. The PathCase-SB database provides data and an API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate data of selected biological data sources on the web (currently, the BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions The PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889

  16. The Intellectual Structure of Metacognitive Scaffolding in Science Education: A Co-Citation Network Analysis

    ERIC Educational Resources Information Center

    Tang, Kai-Yu; Wang, Chia-Yu; Chang, Hsin-Yi; Chen, Sufen; Lo, Hao-Chang; Tsai, Chin-Chung

    2016-01-01

    The issues of metacognitive scaffolding in science education (MSiSE) have become increasingly popular and important. Differing from previous content reviews, this study proposes a series of quantitative computer-based analyses by integrating document co-citation analysis, social network analysis, and exploratory factor analysis to explore the…
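    Document co-citation analysis, one of the methods this study combines, rests on a simple count: two references are linked whenever a later paper cites both. A sketch with invented citing papers:

```python
from itertools import combinations
from collections import Counter

# Build a co-citation network: for each citing paper's reference list,
# increment the link count for every pair of references cited together.

def cocitation_counts(reference_lists):
    counts = Counter()
    for refs in reference_lists:
        for a, b in combinations(sorted(set(refs)), 2):
            counts[(a, b)] += 1
    return counts

citing_papers = [
    ["Flavell1979", "Brown1987", "White1998"],
    ["Flavell1979", "Brown1987"],
    ["White1998", "Brown1987"],
]
links = cocitation_counts(citing_papers)
```

The resulting weighted network is what social network analysis and factor analysis are then applied to in studies of this kind.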

  17. [Developing a home care nursing information system by utilizing wire-wireless network and mobile computing system].

    PubMed

    Park, Jung-Ho; Park, Sung-Ae; Yoon, Soon-Nyoung; Kang, Sung-Rye

    2004-04-01

    The purpose of this study was to develop a home care nursing network system for operating home care effectively and efficiently by utilizing a wire-wireless network and mobile computing, in order to record and send patients' data in real time, and by connecting the headquarters office and the local offices with home care nurses over the Internet. It complements the preceding research from 1999 by adding home care nursing standard guidelines and upgrading the PDA program. Method/1 and prototyping were adopted to develop the main network system. The detailed research process is as follows: 1) home care nursing standard guidelines for diabetes, cancer and peritoneal dialysis were added to the 12 domains of nursing problem fields with nursing assessment/intervention algorithms; 2) the PDA program was complemented by omitting and integrating home care nursing algorithm paths that were unnecessary or duplicated. The PDA system was also upgraded by adopting hardware in which the PDA and the data transmission modem are integrated, on a CDMA-1X base, in order to reduce transmission errors and failures.

  18. Analyzing and interpreting genome data at the network level with ConsensusPathDB.

    PubMed

    Herwig, Ralf; Hardt, Christopher; Lienhard, Matthias; Kamburov, Atanas

    2016-10-01

    ConsensusPathDB consists of a comprehensive collection of human (as well as mouse and yeast) molecular interaction data integrated from 32 different public repositories and a web interface featuring a set of computational methods and visualization tools to explore these data. This protocol describes the use of ConsensusPathDB (http://consensuspathdb.org) with respect to the functional and network-based characterization of biomolecules (genes, proteins and metabolites) that are submitted to the system either as a priority list or together with associated experimental data such as RNA-seq. The tool reports interaction network modules, biochemical pathways and functional information that are significantly enriched by the user's input, applying computational methods for statistical over-representation, enrichment and graph analysis. The results of this protocol can be observed within a few minutes, even with genome-wide data. The resulting network associations can be used to interpret high-throughput data mechanistically, to characterize and prioritize biomarkers, to integrate different omics levels, to design follow-up functional assay experiments and to generate topology for kinetic models at different scales.
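    The over-representation statistic behind such enrichment reports is typically a hypergeometric tail probability. A self-contained sketch follows; the exact test and correction ConsensusPathDB applies may differ in detail.

```python
from math import comb

# Hypergeometric over-representation test: given N background genes,
# K members of a pathway, and a user list of n genes containing k pathway
# hits, compute the upper tail P(X >= k).

def enrichment_p(N, K, n, k):
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Example: 4 of 10 submitted genes fall in a 20-gene pathway
# drawn from a background of 1000 genes.
p = enrichment_p(N=1000, K=20, n=10, k=4)
```

A small p-value indicates the pathway is hit far more often than chance would predict for a random gene list of the same size; in practice such p-values are corrected for testing many pathways at once.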

  19. Pathway connectivity and signaling coordination in the yeast stress-activated signaling network

    PubMed Central

    Chasman, Deborah; Ho, Yi-Hsuan; Berry, David B; Nemec, Corey M; MacGilvray, Matthew E; Hose, James; Merrill, Anna E; Lee, M Violet; Will, Jessica L; Coon, Joshua J; Ansari, Aseem Z; Craven, Mark; Gasch, Audrey P

    2014-01-01

    Stressed cells coordinate a multi-faceted response spanning many levels of physiology. Yet knowledge of the complete stress-activated regulatory network as well as design principles for signal integration remains incomplete. We developed an experimental and computational approach to integrate available protein interaction data with gene fitness contributions, mutant transcriptome profiles, and phospho-proteome changes in cells responding to salt stress, to infer the salt-responsive signaling network in yeast. The inferred subnetwork presented many novel predictions by implicating new regulators, uncovering unrecognized crosstalk between known pathways, and pointing to previously unknown ‘hubs’ of signal integration. We exploited these predictions to show that Cdc14 phosphatase is a central hub in the network and that modification of RNA polymerase II coordinates induction of stress-defense genes with reduction of growth-related transcripts. We find that the orthologous human network is enriched for cancer-causing genes, underscoring the importance of the subnetwork's predictions in understanding stress biology. PMID:25411400

  20. An integrated decision support system for TRAC: A proposal

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    Optimal allocation and usage of resources is a key to effective management. Resources of concern to TRAC are: Manpower (PSY), Money (travel, contracts), Computing, Data, Models, etc. Management activities of TRAC include: Planning, Programming, Tasking, Monitoring, Updating, and Coordinating. Existing systems are insufficient and not completely automated, they are manpower intensive, and the potential for data inconsistency exists. A system is proposed that integrates all project management activities of TRAC through the development of sophisticated software and by utilizing the existing computing systems and network resources. The systems integration proposal is examined in detail.

  1. Performance Evaluation and Control of Distributed Computer Communication Networks.

    DTIC Science & Technology

    1985-09-01

    Zukerman, S. Katz, P. Rodriguez, R. Pazos, S. Resheff, Z. Tsai, Z. Zhang, L. Jong, V. Minh. Other participants are the following visiting... Pazos-Rangel, "Bandwidth Allocation and Routing in ISDN's," IEEE Communications Magazine, February 1984. Abstract: The goal of communications network design... location and routing for integrated networks - is formulated, and efficient methods for its solution are presented. (2) R.A. Pazos-Rangel "Evaluation

  2. Communication Environments for Local Networks.

    DTIC Science & Technology

    1982-12-01

    San Francisco, February-March 1979, pp. 272-275. [Frank 75] Frank, H., I. Gitman, and R. Van Slyke, "Packet radio system - Network considerations..." in AFIPS Conference Proceedings, Volume 44: National Computer Conference, Anaheim, Calif., May 1975, pp. 217-231. [Frank 76a] Frank, H., I. Gitman, ... Local, Regional and Larger Scale Integrated Networks, Volume 2, 4 February 1976. [Frank 76b] Frank, H., I. Gitman, and R. Van Slyke, Local and Regional

  3. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    NASA Astrophysics Data System (ADS)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.
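    The Bayesian reasoning such a system builds on can be reduced to a single update: the posterior probability of compromise given an alert. The probabilities below are illustrative, not values from the paper.

```python
# One-step Bayesian update for cyber situational awareness: combine a base
# rate of compromise with the alert's true- and false-positive rates.

def posterior(prior, p_alert_given_attack, p_alert_given_benign):
    """P(attack | alert) via Bayes' rule."""
    evidence = prior * p_alert_given_attack + (1 - prior) * p_alert_given_benign
    return prior * p_alert_given_attack / evidence

# A rare attack (1% base rate) with a sensitive but noisy detector.
p = posterior(prior=0.01, p_alert_given_attack=0.9, p_alert_given_benign=0.05)
```

Even a 90%-sensitive detector yields a posterior of only about 15% here, which is why full Bayesian Network analysis chains many such pieces of evidence before raising an incident.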

  4. Radiology: "killer app" for next generation networks?

    PubMed

    McNeill, Kevin M

    2004-03-01

    The core principles of digital radiology were well developed by the end of the 1980s. During the following decade, tremendous improvements in computer technology enabled realization of those principles at an affordable cost. In this decade work can focus on highly distributed radiology in the context of the integrated health care enterprise. Over the same period computer networking has evolved from a relatively obscure field used by a small number of researchers across low-speed serial links to a pervasive technology that affects nearly all facets of society. Development directions in network technology will ultimately provide end-to-end data paths with speeds that match or exceed the speeds of data paths within the local network and even within workstations. This article describes key developments in Next Generation Networks, potential obstacles, and scenarios in which digital radiology can become a "killer app" that helps to drive deployment of new network infrastructure.

  5. On Using Home Networks and Cloud Computing for a Future Internet of Things

    NASA Astrophysics Data System (ADS)

    Niedermayer, Heiko; Holz, Ralph; Pahl, Marc-Oliver; Carle, Georg

    In this position paper we state four requirements for a Future Internet and sketch our initial concept. The requirements: (1) more comfort, (2) integration of home networks, (3) resources like service clouds in the network, and (4) access anywhere on any machine. A Future Internet needs future quality and future comfort. There need to be new possibilities for everyone. Our focus is on higher layers and is related to the many overlay proposals, which we consider to run on top of a basic Future Internet core. A new user experience means including all user devices. Home networks and services should be a fundamental part of the Future Internet: home networks extend access and allow interaction with the environment, and Cloud Computing can provide reliable resources beyond local boundaries. For access anywhere, we also need secure storage for data and profiles in the network, in particular for access with non-personal devices (Internet terminal, ticket machine, ...).

  6. Dallas County Community College District.

    ERIC Educational Resources Information Center

    Rudy, Julia

    1989-01-01

    Management of information technology at Dallas County Community College District is centralized. Information technology organization and planning, integrated data network, computer services, end user services, and educational technology are discussed. (MLW)

  7. A comparison of queueing, cluster and distributed computing systems

    NASA Technical Reports Server (NTRS)

    Kaplan, Joseph A.; Nelson, Michael L.

    1993-01-01

    Using workstation clusters for distributed computing has become popular with the proliferation of inexpensive, powerful workstations. Workstation clusters offer both a cost-effective alternative to batch processing and an easy entry into parallel computing. However, a number of workstations on a network does not constitute a cluster; cluster management software is necessary to harness the collective computing power. A variety of cluster management and queuing systems are compared: Distributed Queueing Systems (DQS), Condor, Load Leveler, Load Balancer, Load Sharing Facility (LSF - formerly Utopia), Distributed Job Manager (DJM), Computing in Distributed Networked Environments (CODINE), and NQS/Exec. The systems differ in their design philosophy and implementation. Based on published reports on the different systems and conversations with the systems' developers and vendors, a comparison of the systems is made on the integral issues of clustered computing.
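    The essential placement decision all of these queueing systems make can be caricatured as greedy least-loaded dispatch. This is a sketch of the general idea, not any particular system's policy:

```python
# Greedy least-loaded job placement: send each arriving job to the node
# with the smallest committed load, then charge the job's cost to it.

def dispatch(jobs, nodes):
    load = {n: 0.0 for n in nodes}
    placement = {}
    for job, cost in jobs:
        target = min(load, key=load.get)   # ties resolve to the first node
        placement[job] = target
        load[target] += cost
    return placement, load

placement, load = dispatch(
    [("j1", 4), ("j2", 2), ("j3", 2), ("j4", 1)],
    ["nodeA", "nodeB"],
)
```

Real systems such as Condor or LSF layer priorities, job requirements, and checkpointing on top of this basic balancing step.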

  8. GLOBECOM '85 - Global Telecommunications Conference, New Orleans, LA, December 2-5, 1985, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Various papers on global telecommunications are presented. The general topics addressed include: multiservice integration with optical fibers, multicompany-owned telecommunication networks, software quality and reliability, advanced on-board processing, impact of new services and systems on operations and maintenance, analytical studies of protocols for data communication networks, topics in packet radio networking, CCITT No. 7 to support new services, document processing and communication, antenna technology and system aspects in satellite communications. Also considered are: communication systems modelling methodology, experimental integrated local area voice/data nets, spread spectrum communications, motion video at the DS-0 rate, optical and data communications, intelligent work stations, switch performance analysis, novel radio communication systems, wireless local networks, ISDN services, LAN communication protocols, user-system interface, radio propagation and performance, mobile satellite systems, software for computer networks, VLSI for ISDN terminals, quality management, man-machine interfaces in switching, and local area network performance.

  9. Linking and integrating computers for maternity care.

    PubMed

    Lumb, M; Fawdry, R

    1990-12-01

    Functionally separate computer systems have been developed for many different areas relevant to maternity care, e.g. maternity data collection, pathology and imaging reports, staff rostering, personnel, accounting, audit, primary care, etc. Using land lines, modems and network gateways, many such quite distinct computer programs or databases can be made accessible from a single terminal. If computer systems are to attain their full potential for the improvement of maternity care, there will be a need not only for terminal emulation but also for more complex integration. Major obstacles must be overcome before such integration is widely achieved. Technical and conceptual progress towards overcoming these problems is discussed, with particular reference to the OSI (open systems interconnection) initiative, to the Read clinical classification and to the MUMMIES CBS (Common Basic Specification) Maternity Care Project. The issue of confidentiality is also briefly explored.

  10. Computer Simulations of the Tumor Vasculature: Applications to Interstitial Fluid Flow, Drug Delivery, and Oxygen Supply.

    PubMed

    Welter, Michael; Rieger, Heiko

    2016-01-01

    Tumor vasculature, the blood vessel network supplying a growing tumor with nutrients such as oxygen or glucose, is in many respects different from the hierarchically organized arterio-venous blood vessel network in normal tissues. Angiogenesis (the formation of new blood vessels), vessel cooption (the integration of existing blood vessels into the tumor vasculature), and vessel regression remodel the healthy vascular network into a tumor-specific vasculature. Integrative models, based on detailed experimental data and physical laws, implement, in silico, the complex interplay of molecular pathways, cell proliferation, migration, and death, tissue microenvironment, mechanical and hydrodynamic forces, and the fine structure of the host tissue vasculature. With the help of computer simulations high-precision information about blood flow patterns, interstitial fluid flow, drug distribution, oxygen and nutrient distribution can be obtained and a plethora of therapeutic protocols can be tested before clinical trials. This chapter provides an overview over the current status of computer simulations of vascular remodeling during tumor growth including interstitial fluid flow, drug delivery, and oxygen supply within the tumor. The model predictions are compared with experimental and clinical data and a number of longstanding physiological paradigms about tumor vasculature and intratumoral solute transport are critically scrutinized.

  11. Energy-efficient STDP-based learning circuits with memristor synapses

    NASA Astrophysics Data System (ADS)

    Wu, Xinyu; Saxena, Vishal; Campbell, Kristy A.

    2014-05-01

    It is now accepted that the traditional von Neumann architecture, with processor and memory separation, is ill-suited to process parallel data streams which a mammalian brain can efficiently handle. Moreover, researchers now envision computing architectures which enable cognitive processing of massive amounts of data by identifying spatio-temporal relationships in real time and solving complex pattern recognition problems. Memristor cross-point arrays, integrated with standard CMOS technology, are expected to result in massively parallel and low-power Neuromorphic computing architectures. Recently, significant progress has been made in spiking neural networks (SNN) which emulate data processing in the cortical brain. These architectures comprise a dense network of neurons and the synapses formed between the axons and dendrites. Further, unsupervised or supervised competitive learning schemes are being investigated for global training of the network. In contrast to a software implementation, hardware realization of these networks requires massive circuit overhead for addressing and individually updating network weights. Instead, we employ bio-inspired learning rules such as spike-timing-dependent plasticity (STDP) to efficiently update the network weights locally. To realize SNNs on a chip, we propose densely integrating mixed-signal integrate-and-fire neurons (IFNs) and cross-point arrays of memristors in the back-end-of-line (BEOL) of CMOS chips. Novel IFN circuits have been designed to drive memristive synapses in parallel while maintaining overall power efficiency (<1 pJ/spike/synapse), even at spike rates greater than 10 MHz. We present circuit design details and simulation results of the IFN with memristor synapses, its response to incoming spike trains and STDP learning characterization.
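    The pair-based STDP rule referenced above has a standard closed form: potentiation decaying exponentially with the spike-timing gap when the presynaptic spike precedes the postsynaptic one, depression otherwise. The amplitudes and time constant below are illustrative, not the paper's circuit values.

```python
import math

# Pair-based STDP weight update as a function of post-minus-pre spike
# timing dt (milliseconds). Parameters are illustrative defaults.

def stdp_dw(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Synaptic weight change for a single pre/post spike pair."""
    if dt_ms > 0:       # pre before post: long-term potentiation
        return a_plus * math.exp(-dt_ms / tau_ms)
    elif dt_ms < 0:     # post before pre: long-term depression
        return -a_minus * math.exp(dt_ms / tau_ms)
    return 0.0

ltp = stdp_dw(10.0)     # positive weight change
ltd = stdp_dw(-10.0)    # negative weight change
```

The memristor circuits described in the paper realize this curve in analog hardware, with the exponential windows emerging from overlapping pre- and post-synaptic pulse shapes rather than an explicit computation.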

  12. ChloroKB: A Web Application for the Integration of Knowledge Related to the Chloroplast Metabolic Network

    PubMed Central

    Gloaguen, Pauline; Alban, Claude; Ravanel, Stéphane; Seigneurin-Berny, Daphné; Matringe, Michel; Ferro, Myriam; Bruley, Christophe; Rolland, Norbert; Vandenbrouck, Yves

    2017-01-01

    Higher plants, as autotrophic organisms, are effective sources of molecules. They hold great promise for metabolic engineering, but the behavior of plant metabolism at the network level is still incompletely described. Although structural models (stoichiometry matrices) and pathway databases are extremely useful, they cannot describe the complexity of the metabolic context, and new tools are required to visually represent integrated biocurated knowledge for use by both humans and computers. Here, we describe ChloroKB, a Web application (http://chlorokb.fr/) for visual exploration and analysis of the Arabidopsis (Arabidopsis thaliana) metabolic network in the chloroplast and related cellular pathways. The network was manually reconstructed through extensive biocuration to provide transparent traceability of experimental data. Proteins and metabolites were placed in their biological context (spatial distribution within cells, connectivity in the network, participation in supramolecular complexes, and regulatory interactions) using CellDesigner software. The network contains 1,147 reviewed proteins (559 localized exclusively in plastids, 68 in at least one additional compartment, and 520 outside the plastid), 122 proteins awaiting biochemical/genetic characterization, and 228 proteins for which genes have not yet been identified. The visual presentation is intuitive and browsing is fluid, providing instant access to the graphical representation of integrated processes and to a wealth of refined qualitative and quantitative data. ChloroKB will be a significant support for structural and quantitative kinetic modeling, for biological reasoning, when comparing novel data with established knowledge, for computer analyses, and for educational purposes. ChloroKB will be enhanced by continuous updates following contributions from plant researchers. PMID:28442501

  13. The Advantages of Using Technology in Second Language Education: Technology Integration in Foreign Language Teaching Demonstrates the Shift from a Behavioral to a Constructivist Learning Approach

    ERIC Educational Resources Information Center

    Wang, Li

    2005-01-01

    With the advent of networked computers and Internet technology, computer-based instruction has been widely used in language classrooms throughout the United States. Computer technologies have dramatically changed the way people gather information, conduct research and communicate with others worldwide. Considering the tremendous startup expenses,…

  14. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Expert systems that require access to databases, complex simulations, and real-time instrumentation have both symbolic and algorithmic computing needs. These needs could be met either by a general-purpose workstation running both symbolic and algorithmic code, or by separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system developed to demonstrate the ability of an expert system to autonomously control the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation and will be linked to a microVAX computer that will control a thermal test bed. Integration options are explored and several possible solutions are presented.

  15. Sig2BioPAX: Java tool for converting flat files to BioPAX Level 3 format.

    PubMed

    Webb, Ryan L; Ma'ayan, Avi

    2011-03-21

    The World Wide Web plays a critical role in enabling molecular, cell, systems and computational biologists to exchange, search, visualize, integrate, and analyze experimental data. Such efforts can be further enhanced through the development of semantic web concepts. The semantic web idea is to enable machines to understand data through the development of protocol-free data exchange formats such as Resource Description Framework (RDF) and the Web Ontology Language (OWL). These standards provide formal descriptors of objects, object properties and their relationships within a specific knowledge domain. However, the overhead of converting datasets typically stored in data tables such as Excel, text or PDF into RDF or OWL formats is not trivial for non-specialists, and as such creates a barrier to seamless data exchange between researchers, databases and analysis tools. This problem is particularly important in the field of network systems biology, where biochemical interactions between genes and their protein products are abstracted to networks. For the purpose of converting biochemical interactions into the BioPAX format, which is the leading standard developed by the computational systems biology community, we developed an open-source command line tool that takes as input tabular data describing different types of molecular biochemical interactions. The tool converts such interactions into the BioPAX level 3 OWL format. We used the tool to convert several existing and new mammalian networks of protein interactions, signalling pathways, and transcriptional regulatory networks into BioPAX. Some of these networks were deposited into PathwayCommons, a repository for consolidating and organizing biochemical networks. The software tool Sig2BioPAX is a resource that enables experimental and computational systems biologists to contribute their identified networks and pathways of molecular interactions for integration and reuse with the rest of the research community.
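
    The conversion idea can be sketched in a few lines (this is an illustrative stand-in, not the actual Sig2BioPAX code, and the namespace URI and predicate names are hypothetical): each tabular interaction row becomes one RDF triple, here serialized in N-Triples syntax using only the standard library.

```python
# Illustrative sketch: turning tabular interaction records into RDF triples
# in N-Triples syntax. The namespace and predicate names are made up.
from urllib.parse import quote

BASE = "http://example.org/interactions/"  # hypothetical namespace

def row_to_ntriples(source, interaction_type, target):
    """Convert one (source, type, target) table row into an N-Triples line."""
    s = f"<{BASE}{quote(source)}>"
    p = f"<{BASE}{quote(interaction_type)}>"
    o = f"<{BASE}{quote(target)}>"
    return f"{s} {p} {o} ."

rows = [
    ("EGFR", "phosphorylates", "ERK2"),             # signalling interaction
    ("TP53", "activates_expression_of", "CDKN1A"),  # transcriptional regulation
]
triples = [row_to_ntriples(*r) for r in rows]
for t in triples:
    print(t)
```

    A real converter additionally maps each interaction type onto the appropriate BioPAX level 3 OWL class rather than a flat predicate.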

  16. The application of data encryption technology in computer network communication security

    NASA Astrophysics Data System (ADS)

    Gong, Lina; Zhang, Li; Zhang, Wei; Li, Xuhong; Wang, Xia; Pan, Wenwen

    2017-04-01

    With the rapid development of the Internet and the extensive application of computer technology, information security has become an increasingly serious concern, and information security technology, with data encryption at its core, has developed greatly. Data encryption technology can not only encrypt and decrypt data but also realize digital signatures, authentication, and other functions, thus ensuring the confidentiality, integrity, and non-repudiation of data transmitted over the network. To improve the security of data in network communication, this paper uses a hybrid encryption system: data are encrypted and decrypted with the highly secure triple DES algorithm, and the triple DES keys are in turn encrypted with the RSA algorithm, ensuring the security of the triple DES keys and solving the problem of key management. Digital signatures are implemented with the Java security API to ensure data integrity and non-repudiation. Finally, the data encryption system is developed in the Java language. The resulting system is simple and effective, with good security and practicality.
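
    The key-management structure of such a hybrid scheme can be sketched as follows. This is a deliberately insecure toy: textbook RSA on tiny primes stands in for real RSA, and a SHA-256-derived XOR keystream stands in for triple DES; all parameters are made up. It only shows the pattern of wrapping the symmetric session key with the asymmetric cipher.

```python
# Toy hybrid-encryption sketch: encrypt the payload with a symmetric session
# key, and encrypt only that key with RSA. NOT secure; illustration only.
import hashlib

# Textbook RSA with the classic toy parameters p=61, q=53.
n, e, d = 3233, 17, 2753          # d = e^-1 mod (p-1)(q-1)

def keystream(key_int, length):
    """Deterministic byte stream derived from the session key (stand-in for 3DES)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{key_int}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def xor_cipher(data, key_int):
    return bytes(a ^ b for a, b in zip(data, keystream(key_int, len(data))))

session_key = 1234                         # symmetric key, must be < n
wrapped_key = pow(session_key, e, n)       # "encrypt the key with RSA"
message = b"network communication security"
ciphertext = xor_cipher(message, session_key)

# Receiver: unwrap the session key with the RSA private exponent, then decrypt.
recovered_key = pow(wrapped_key, d, n)
plaintext = xor_cipher(ciphertext, recovered_key)
print(plaintext)  # b'network communication security'
```

    The point of the pattern is that the slow asymmetric cipher touches only the short key, while the fast symmetric cipher handles the bulk data.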

  17. Influence of neural adaptation on dynamics and equilibrium state of neural activities in a ring neural network

    NASA Astrophysics Data System (ADS)

    Takiyama, Ken

    2017-12-01

    How neural adaptation affects neural information processing (i.e. the dynamics and equilibrium state of neural activities) is a central question in computational neuroscience. In my previous works, I analytically clarified the dynamics and equilibrium state of neural activities in a ring-type neural network model that is widely used to model the visual cortex, motor cortex, and several other brain regions. The neural dynamics and the equilibrium state in the neural network model corresponded to a Bayesian computation and statistically optimal multiple information integration, respectively, under a biologically inspired condition. These results were revealed in an analytically tractable manner; however, adaptation effects were not considered. Here, I analytically reveal how the dynamics and equilibrium state of neural activities in a ring neural network are influenced by spike-frequency adaptation (SFA). SFA is an adaptation that causes gradual inhibition of neural activity when a sustained stimulus is applied, and the strength of this inhibition depends on neural activities. I reveal that SFA plays three roles: (1) SFA amplifies the influence of external input in neural dynamics; (2) SFA allows the history of the external input to affect neural dynamics; and (3) the equilibrium state corresponds to the statistically optimal multiple information integration independent of the existence of SFA. In addition, the equilibrium state in a ring neural network model corresponds to the statistically optimal integration of multiple information sources under biologically inspired conditions, independent of the existence of SFA.
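
    The qualitative effect of SFA on the equilibrium state can be reproduced with a minimal rate-model sketch (hypothetical parameters, not the paper's exact model): a ring network with cosine-profile recurrent weights is integrated with and without an activity-dependent adaptation current, which lowers the equilibrium rates.

```python
# Ring rate network with optional spike-frequency adaptation (SFA).
# Euler integration of tau*dr/dt = -r + [W r + I - g*a]_+,
# tau_a*da/dt = -a + r. Parameters are illustrative.
import math

N, dt, steps = 12, 0.05, 2000
tau, tau_a, g_sfa = 1.0, 5.0, 0.5      # g = 0 disables adaptation

def simulate(g):
    r = [0.0] * N                       # firing rates
    a = [0.0] * N                       # adaptation currents
    theta = [2 * math.pi * i / N for i in range(N)]
    I = [1.0 + 0.5 * math.cos(t) for t in theta]   # tuned external input
    for _ in range(steps):
        rec = [sum(((-1.0 + math.cos(theta[i] - theta[j])) / N) * r[j]
                   for j in range(N)) for i in range(N)]
        r = [r[i] + dt / tau * (-r[i] + max(0.0, rec[i] + I[i] - g * a[i]))
             for i in range(N)]
        a = [a[i] + dt / tau_a * (-a[i] + r[i]) for i in range(N)]
    return r

r_plain = simulate(0.0)
r_sfa = simulate(g_sfa)
print(max(r_plain), max(r_sfa))  # SFA lowers the peak equilibrium rate
```

    At equilibrium the adaptation variable tracks the rate (a = r), so SFA acts as an extra activity-dependent inhibition, consistent with the gradual suppression described in the abstract.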

  18. SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks.

    PubMed

    Zenke, Friedemann; Ganguli, Surya

    2018-06-01

    A vast majority of computation in the brain is performed by spiking neural networks. Despite the ubiquity of such spiking, we currently lack an understanding of how biological spiking neural circuits learn and compute in vivo, as well as how we can instantiate such capabilities in artificial spiking circuits in silico. Here we revisit the problem of supervised learning in temporally coding multilayer spiking neural networks. First, by using a surrogate gradient approach, we derive SuperSpike, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of deterministic integrate-and-fire neurons to perform nonlinear computations on spatiotemporal spike patterns. Second, inspired by recent results on feedback alignment, we compare the performance of our learning rule under different credit assignment strategies for propagating output errors to hidden units. Specifically, we test uniform, symmetric, and random feedback, finding that simpler tasks can be solved with any type of feedback, while more complex tasks require symmetric feedback. In summary, our results open the door to obtaining a better scientific understanding of learning and computation in spiking neural networks by advancing our ability to train them to solve nonlinear problems involving transformations between different spatiotemporal spike time patterns.
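
    The core trick can be sketched in a few lines (a hypothetical toy, not the paper's full three-factor rule): the spike nonlinearity's ill-defined derivative is replaced by a smooth "fast sigmoid" surrogate, so an error signal can flow through membrane potentials even when no spike occurs.

```python
# Surrogate-gradient sketch: replace d(spike)/dv with a fast-sigmoid surrogate
# and combine it with an error signal and a presynaptic trace (three factors).
def spike(v, threshold=1.0):
    """Non-differentiable spike nonlinearity."""
    return 1.0 if v >= threshold else 0.0

def surrogate_grad(v, threshold=1.0, beta=10.0):
    """Fast-sigmoid surrogate for d(spike)/dv: 1 / (beta*|v - thr| + 1)^2."""
    return 1.0 / (beta * abs(v - threshold) + 1.0) ** 2

def delta_w(error, v, pre_trace, lr=0.01):
    """Three-factor-flavoured update: error x surrogate x presynaptic trace."""
    return lr * error * surrogate_grad(v) * pre_trace

v = 0.9                       # membrane potential just below threshold
print(surrogate_grad(1.0))    # 1.0 at threshold: gradient flows even without a spike
print(delta_w(error=0.5, v=v, pre_trace=0.8))
```

    The surrogate is largest near threshold and decays smoothly away from it, which is what makes credit assignment through hidden spiking units tractable.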

  19. Ubiquitous Virtual Private Network: A Solution for WSN Seamless Integration

    PubMed Central

    Villa, David; Moya, Francisco; Villanueva, Félix Jesús; Aceña, Óscar; López, Juan Carlos

    2014-01-01

    Sensor networks are becoming an essential part of ubiquitous systems and applications. However, there are no well-defined protocols or mechanisms to access the sensor network from the enterprise information system. We consider this issue as a heterogeneous network interconnection problem, and as a result, the same concepts may be applied. Specifically, we propose the use of object-oriented middlewares to provide a virtual private network in which all involved elements (sensor nodes or computer applications) will be able to communicate as if all of them were in a single and uniform network. PMID:24399154

  20. Networking the Home and University: How Families Can Be Integrated into Proximate/Distant Computer Systems.

    ERIC Educational Resources Information Center

    Watson, J. Allen; And Others

    1989-01-01

    Describes study that was conducted to determine the feasibility of networking home microcomputers with a university mainframe system in order to investigate a new family process research paradigm, as well as the design and function of the microcomputer/mainframe system. Test instrumentation is described and systems' reliability and validity are…

  1. The Mind Research Network - Mental Illness Neuroscience Discovery Grant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, J.; Calhoun, V.

    The scientific and technological programs of the Mind Research Network (MRN) reflect DOE missions in basic science and associated instrumentation, computational modeling, and experimental techniques. MRN's technical goals over the course of this project have been to develop and apply integrated, multi-modality functional imaging techniques derived from a decade of DOE-supported research and technology development.

  2. First-Strike Advantage: The United States’ Counter to China’s Preemptive Integrated Network Electronic Warfare Strategy

    DTIC Science & Technology

    2013-06-01

    Excerpts from cited sources: "Chinese Capabilities for Computer Network Operations and Cyber Espionage"; Lolita C. Baldor, "Chinese Cyber Attacks on U.S. Continue Totally Unabated, Leon Panetta," Huffington Post (2012); Office of the Secretary of Defense (2009).

  3. Design and implementation of a Windows NT network to support CNC activities

    NASA Technical Reports Server (NTRS)

    Shearrow, C. A.

    1996-01-01

    The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer-Aided Design and Computer-Aided Manufacturing (CAD/CAM) capabilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relationship to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM approach capable of completing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes an assessment of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, the computers and software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.

  4. Air Force Roadmap 2006-2025

    DTIC Science & Technology

    2006-06-01

    systems. Cyberspace is the electronic medium of net-centric operations, communications systems, and computers, in which horizontal integration and online... will be interoperable, more robust, responsive, and able to support faster spacecraft initialization times. This Integrated Satellite Control... horizontally and vertically integrated information through machine-to-machine conversations enabled by a peer-based network of sensors, command

  5. School on Cloud: Towards a Paradigm Shift

    ERIC Educational Resources Information Center

    Koutsopoulos, Kostis C.; Kotsanis, Yannis C.

    2014-01-01

    This paper presents the basic concept of the EU Network School on Cloud: Namely, that present conditions require a new teaching and learning paradigm based on the integrated dimension of education, when considering the use of cloud computing. In other words, it is suggested that there is a need for an integrated approach which is simultaneously…

  6. Anterior Cingulate Cortex Instigates Adaptive Switches in Choice by Integrating Immediate and Delayed Components of Value in Ventromedial Prefrontal Cortex

    PubMed Central

    Guitart-Masip, Marc; Kurth-Nelson, Zeb; Dolan, Raymond J.

    2014-01-01

    Actions can lead to an immediate reward or punishment and a complex set of delayed outcomes. Adaptive choice necessitates that the brain track and integrate both of these potential consequences. Here, we designed a sequential task whereby the decision to exploit or forego an available offer was contingent on comparing immediate value and a state-dependent future cost of expending a limited resource. Crucially, the dynamics of the task demanded frequent switches in policy based on an online computation of changing delayed consequences. We found that human subjects choose on the basis of a near-optimal integration of immediate reward and delayed consequences, with the latter computed in a prefrontal network. Within this network, anterior cingulate cortex (ACC) was dynamically coupled to ventromedial prefrontal cortex (vmPFC) when adaptive switches in choice were required. Our results suggest a choice architecture whereby interactions between ACC and vmPFC underpin an integration of immediate and delayed components of value to support flexible policy switching that accommodates the potential delayed consequences of an action. PMID:24573291

  7. Integration science and distributed networks

    NASA Astrophysics Data System (ADS)

    Landauer, Christopher; Bellman, Kirstie L.

    2002-07-01

    Our work on integration of data and knowledge sources is based on a common theoretical treatment of 'Integration Science', which leads to systematic processes for combining formal logical and mathematical systems, computational and physical systems, and human systems and organizations. The theory is based on the processing of explicit meta-knowledge about the roles played by the different knowledge sources and the methods of analysis and semantic implications of the different data values, together with information about the context in which and the purpose for which they are being combined. The research treatment is primarily mathematical, and though this kind of integration mathematics is still under development, some applicable common threads have already emerged. Instead of describing the current state of the mathematical investigations, since they are not yet crystallized enough for formalisms, we describe our applications of the approach in several different areas, including our focus area of 'Constructed Complex Systems', which are complex heterogeneous systems managed or mediated by computing systems. In this context, it is important to remember that all systems are embedded, all systems are autonomous, and all systems are distributed networks.

  8. Anterior cingulate cortex instigates adaptive switches in choice by integrating immediate and delayed components of value in ventromedial prefrontal cortex.

    PubMed

    Economides, Marcos; Guitart-Masip, Marc; Kurth-Nelson, Zeb; Dolan, Raymond J

    2014-02-26

    Actions can lead to an immediate reward or punishment and a complex set of delayed outcomes. Adaptive choice necessitates that the brain track and integrate both of these potential consequences. Here, we designed a sequential task whereby the decision to exploit or forego an available offer was contingent on comparing immediate value and a state-dependent future cost of expending a limited resource. Crucially, the dynamics of the task demanded frequent switches in policy based on an online computation of changing delayed consequences. We found that human subjects choose on the basis of a near-optimal integration of immediate reward and delayed consequences, with the latter computed in a prefrontal network. Within this network, anterior cingulate cortex (ACC) was dynamically coupled to ventromedial prefrontal cortex (vmPFC) when adaptive switches in choice were required. Our results suggest a choice architecture whereby interactions between ACC and vmPFC underpin an integration of immediate and delayed components of value to support flexible policy switching that accommodates the potential delayed consequences of an action.

  9. Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons.

    PubMed

    Probst, Dimitri; Petrovici, Mihai A; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz

    2015-01-01

    The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems.
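
    The network-level computation described here amounts to drawing samples of binary states from a Boltzmann distribution. The sketch below abstracts away the LIF neurons and shows that target computation as a plain Gibbs sampler, checking its marginals against exact enumeration (the biases and couplings are arbitrary toy values).

```python
# Gibbs sampling over binary variables z in {0,1}^4 from
# p(z) ∝ exp(sum_i b_i z_i + sum_{i<j} W_ij z_i z_j).
import itertools
import math
import random

b = [0.5, -0.3, 0.2, 0.0]                      # biases
W = [[0.0, 0.8, 0.0, -0.5],
     [0.8, 0.0, 0.4, 0.0],
     [0.0, 0.4, 0.0, 0.6],
     [-0.5, 0.0, 0.6, 0.0]]                    # symmetric couplings

def unnorm(z):
    s = sum(b[i] * z[i] for i in range(4))
    s += sum(W[i][j] * z[i] * z[j] for i in range(4) for j in range(i + 1, 4))
    return math.exp(s)

# Exact marginals by enumerating all 16 states.
Z = sum(unnorm(z) for z in itertools.product([0, 1], repeat=4))
exact = [sum(unnorm(z) for z in itertools.product([0, 1], repeat=4) if z[i]) / Z
         for i in range(4)]

# Gibbs sampling: resample each unit given the others.
random.seed(0)
z, counts, sweeps = [0, 0, 0, 0], [0, 0, 0, 0], 20000
for _ in range(sweeps):
    for i in range(4):
        field = b[i] + sum(W[i][j] * z[j] for j in range(4))
        z[i] = 1 if random.random() < 1 / (1 + math.exp(-field)) else 0
    for i in range(4):
        counts[i] += z[i]

sampled = [c / sweeps for c in counts]
print(exact, sampled)   # the two sets of marginals agree closely
```

    The framework in the paper shows how deterministic LIF neurons with appropriate couplings can implement this resampling step implicitly in their spiking dynamics.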

  10. Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons

    PubMed Central

    Probst, Dimitri; Petrovici, Mihai A.; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz

    2015-01-01

    The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems. PMID:25729361

  11. On dynamics of integrate-and-fire neural networks with conductance based synapses.

    PubMed

    Cessac, Bruno; Viéville, Thierry

    2008-01-01

    We present a mathematical analysis of networks of integrate-and-fire (IF) neurons with conductance based synapses. Taking into account the realistic fact that the spike time is only known within some finite precision, we propose a model where spikes are effective at times multiple of a characteristic time scale delta, where delta can be arbitrarily small (in particular, well beyond the numerical precision). We make a complete mathematical characterization of the model dynamics and obtain the following results. The asymptotic dynamics is composed of finitely many stable periodic orbits, whose number and period can be arbitrarily large and can diverge in a region of the synaptic weights space, traditionally called the "edge of chaos", a notion mathematically well defined in the present paper. Furthermore, except at the edge of chaos, there is a one-to-one correspondence between the membrane potential trajectories and the raster plot. This shows that the neural code is entirely "in the spikes" in this case. As a key tool, we introduce an order parameter, easy to compute numerically, and closely related to a natural notion of entropy, providing a relevant characterization of the computational capabilities of the network. This allows us to compare the computational capabilities of leaky integrate-and-fire models and conductance-based models. The present study considers networks with constant input and without time-dependent plasticity, but the framework has been designed for both extensions.

  12. Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership

    NASA Astrophysics Data System (ADS)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information-processing infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Secure Socket Layer-Virtual Private Network) technology for access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen security; we also set a fine-grained access-control policy for shared tools and data and used a shared-key encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as the user interface and developed a Web application providing functions to support the sharing of tools and data. Using the WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system in the Grid infrastructure for atomic energy research, AEGIS (Atomic Energy Grid Infrastructure), developed by CCSE/JAEA. The prototype system was applied for trial use in the first period of GNEP.

  13. Maximizing Information Diffusion in the Cyber-physical Integrated Network †

    PubMed Central

    Lu, Hongliang; Lv, Shaohe; Jiao, Xianlong; Wang, Xiaodong; Liu, Juan

    2015-01-01

    Nowadays, our living environment is embedded with smart objects, such as smart sensors, smart watches and smart phones. Their abundant abilities of sensing, communication and computation integrate cyberspace and physical space, forming a cyber-physical integrated network. In order to maximize information diffusion in such a network, a group of objects is selected as the forwarding points. To optimize the selection, a minimum connected dominating set (CDS) strategy is adopted. However, existing approaches focus on minimizing the size of the CDS, neglecting an important factor: the weight of links. In this paper, we propose a distributed algorithm for maximizing the probability of information diffusion (DMPID) in the cyber-physical integrated network. Unlike previous approaches that only consider the size of the CDS, DMPID also considers the information spread probability, which depends on the weight of links. To weaken the effects of excessively weighted links, we also present an optimization strategy that properly balances the two factors. The results of extensive simulation show that DMPID can nearly double the information diffusion probability, while keeping a reasonable selection size with low overhead in different distributed networks. PMID:26569254
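
    For contrast with DMPID, the baseline the paper improves on can be sketched as a simple greedy connected-dominating-set construction: it grows the CDS by coverage alone and ignores link weights, which is exactly the limitation the paper addresses. The graph and node names below are made up.

```python
# Greedy connected dominating set: start from the highest-degree node, then
# repeatedly add the neighbour of the current set that dominates the most
# still-uncovered nodes. Unweighted; link weights are deliberately ignored.
graph = {
    "a": {"b", "c"}, "b": {"a", "c", "d"}, "c": {"a", "b", "e"},
    "d": {"b", "f"}, "e": {"c", "f"}, "f": {"d", "e", "g"}, "g": {"f"},
}

def greedy_cds(graph):
    start = max(graph, key=lambda v: len(graph[v]))
    cds = {start}
    covered = {start} | graph[start]
    while covered != set(graph):
        # Candidates adjacent to the CDS keep the result connected.
        frontier = {v for u in cds for v in graph[u]} - cds
        best = max(frontier, key=lambda v: len((graph[v] | {v}) - covered))
        cds.add(best)
        covered |= graph[best] | {best}
    return cds

cds = greedy_cds(graph)
print(sorted(cds))
```

    DMPID would instead score candidates by the spread probability along their incident links rather than by raw coverage.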

  14. COMPUTING THERAPY FOR PRECISION MEDICINE: COLLABORATIVE FILTERING INTEGRATES AND PREDICTS MULTI-ENTITY INTERACTIONS.

    PubMed

    Regenbogen, Sam; Wilkins, Angela D; Lichtarge, Olivier

    2016-01-01

    Biomedicine produces copious information it cannot fully exploit. Specifically, there is considerable need to integrate knowledge from disparate studies to discover connections across domains. Here, we used a Collaborative Filtering approach, inspired by online recommendation algorithms, in which non-negative matrix factorization (NMF) predicts interactions among chemicals, genes, and diseases only from pairwise information about their interactions. Our approach, applied to matrices derived from the Comparative Toxicogenomics Database, successfully recovered Chemical-Disease, Chemical-Gene, and Disease-Gene networks in 10-fold cross-validation experiments. Additionally, we could predict each of these interaction matrices from the other two. Integrating all three CTD interaction matrices with NMF led to good predictions of STRING, an independent, external network of protein-protein interactions. Finally, this approach could integrate the CTD and STRING interaction data to improve Chemical-Gene cross-validation performance significantly, and, in a time-stamped study, it predicted information added to CTD after a given date, using only data prior to that date. We conclude that collaborative filtering can integrate information across multiple types of biological entities, and that as a first step towards precision medicine it can compute drug repurposing hypotheses.
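
    The NMF step at the heart of this approach can be illustrated on a tiny synthetic matrix (not the CTD data): multiplicative updates factor a non-negative "interaction" matrix V into W*H, and the product W*H is what fills in unobserved interactions.

```python
# Non-negative matrix factorization by standard multiplicative updates,
# on a synthetic exactly-rank-2 matrix. Pure-Python, illustration only.
import random

def matmul(A, B):
    Bt = list(zip(*B))
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

random.seed(1)
k, n, m = 2, 5, 4
W0 = [[random.random() for _ in range(k)] for _ in range(n)]
H0 = [[random.random() for _ in range(m)] for _ in range(k)]
V = matmul(W0, H0)                       # exactly rank-2, non-negative

W = [[random.random() for _ in range(k)] for _ in range(n)]
H = [[random.random() for _ in range(m)] for _ in range(k)]
eps = 1e-9
for _ in range(500):
    Wt = transpose(W)
    num, den = matmul(Wt, V), matmul(Wt, matmul(W, H))
    H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
         for i in range(k)]
    Ht = transpose(H)
    num, den = matmul(V, Ht), matmul(matmul(W, H), Ht)
    W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
         for i in range(n)]

WH = matmul(W, H)
err = sum((V[i][j] - WH[i][j]) ** 2 for i in range(n) for j in range(m)) ** 0.5
print(err)   # small reconstruction error: the rank-2 structure is recovered
```

    In the paper's setting the rows and columns index chemicals, genes, and diseases, and held-out entries of W*H serve as interaction predictions.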

  15. COMPUTING THERAPY FOR PRECISION MEDICINE: COLLABORATIVE FILTERING INTEGRATES AND PREDICTS MULTI-ENTITY INTERACTIONS

    PubMed Central

    REGENBOGEN, SAM; WILKINS, ANGELA D.; LICHTARGE, OLIVIER

    2015-01-01

    Biomedicine produces copious information it cannot fully exploit. Specifically, there is considerable need to integrate knowledge from disparate studies to discover connections across domains. Here, we used a Collaborative Filtering approach, inspired by online recommendation algorithms, in which non-negative matrix factorization (NMF) predicts interactions among chemicals, genes, and diseases only from pairwise information about their interactions. Our approach, applied to matrices derived from the Comparative Toxicogenomics Database, successfully recovered Chemical-Disease, Chemical-Gene, and Disease-Gene networks in 10-fold cross-validation experiments. Additionally, we could predict each of these interaction matrices from the other two. Integrating all three CTD interaction matrices with NMF led to good predictions of STRING, an independent, external network of protein-protein interactions. Finally, this approach could integrate the CTD and STRING interaction data to improve Chemical-Gene cross-validation performance significantly, and, in a time-stamped study, it predicted information added to CTD after a given date, using only data prior to that date. We conclude that collaborative filtering can integrate information across multiple types of biological entities, and that as a first step towards precision medicine it can compute drug repurposing hypotheses. PMID:26776170

  16. Smart photonic networks and computer security for image data

    NASA Astrophysics Data System (ADS)

    Campello, Jorge; Gill, John T.; Morf, Martin; Flynn, Michael J.

    1998-02-01

    Work reported here is part of a larger project on 'Smart Photonic Networks and Computer Security for Image Data', studying the interactions of coding and security, switching architecture simulations, and basic technologies. Coding and security: coding methods that are appropriate for data security in data fusion networks were investigated. These networks have several characteristics that distinguish them from other currently employed networks, such as Ethernet LANs or the Internet. The most significant characteristics are very high maximum data rates; predominance of image data; narrowcasting - transmission of data from one source to a designated set of receivers; data fusion - combining related data from several sources; and simple sensor nodes with limited buffering. These characteristics affect both the lower-level network design and the higher-level coding methods. Data security encompasses privacy, integrity, reliability, and availability. Privacy, integrity, and reliability can be provided through encryption and coding for error detection and correction. Availability is primarily a network issue; network nodes must be protected against failure or routed around in the case of failure. One of the more promising techniques is the use of 'secret sharing'. We consider this method a special case of our new space-time code diversity based algorithms for secure communication. These algorithms enable us to exploit parallelism and scalable multiplexing schemes to build photonic network architectures. A number of very high-speed switching and routing architectures and their relationships with very high performance processor architectures were studied. Indications are that routers for very high speed photonic networks can be designed using the very robust and distributed TCP/IP protocol, if suitable processor architecture support is available.
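
    The 'secret sharing' technique mentioned above can be sketched as classic Shamir k-of-n sharing over a small prime field (toy parameters for illustration; real deployments use large fields and authenticated channels).

```python
# Shamir k-of-n secret sharing over GF(P): the secret is the constant term of
# a random degree k-1 polynomial; any k shares reconstruct it by Lagrange
# interpolation at x = 0, while fewer than k reveal nothing.
import random

P = 2087                      # field prime (must exceed the secret)

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

random.seed(7)
shares = make_shares(1042, k=3, n=5)
print(reconstruct(shares[:3]))   # 1042, from any 3 of the 5 shares
print(reconstruct(shares[1:4]))  # 1042
```

    In the networking setting above, shares of image data can be routed along diverse paths, so no single compromised link reveals the payload.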

  17. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
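
    The first formalism in the list, Boolean networks, can be sketched as a three-gene toy model with synchronous updates (the regulatory rules here are made up): iterating the update map until a state repeats locates the attractor.

```python
# Synchronous Boolean network: each gene's next state is a Boolean function
# of the current states. Iterate until a state recurs to find the attractor.
def step(state):
    x1, x2, x3 = state
    return (x2, x1 and x3, int(not x1))  # made-up regulatory rules

state, seen = (1, 0, 0), []
while state not in seen:
    seen.append(state)
    state = step(state)
print(state)   # (0, 0, 1): a fixed-point attractor of this toy network
```

    Attractors of such models are commonly interpreted as stable cellular phenotypes, which is what makes the formalism useful for gene regulatory networks.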

  18. Finding gene regulatory network candidates using the gene expression knowledge base.

    PubMed

    Venkatesan, Aravind; Tripathi, Sushil; Sanz de Galdeano, Alejandro; Blondé, Ward; Lægreid, Astrid; Mironov, Vladimir; Kuiper, Martin

    2014-12-10

    Network-based approaches for the analysis of large-scale genomics data have become well established. Biological networks provide a knowledge scaffold against which the patterns and dynamics of 'omics' data can be interpreted. The background information required for the construction of such networks is often dispersed across a multitude of knowledge bases in a variety of formats. The seamless integration of this information is one of the main challenges in bioinformatics. The Semantic Web offers powerful technologies for the assembly of integrated knowledge bases that are computationally comprehensible, thereby providing a potentially powerful resource for constructing biological networks and network-based analysis. We have developed the Gene eXpression Knowledge Base (GeXKB), a semantic web technology based resource that contains integrated knowledge about gene expression regulation. To affirm the utility of GeXKB we demonstrate how this resource can be exploited for the identification of candidate regulatory network proteins. We present four use cases that were designed from a biological perspective in order to find candidate members relevant for the gastrin hormone signaling network model. We show how a combination of specific query definitions and additional selection criteria derived from gene expression data and prior knowledge concerning candidate proteins can be used to retrieve a set of proteins that constitute valid candidates for regulatory network extensions. Semantic web technologies provide the means for processing and integrating various heterogeneous information sources. The GeXKB offers biologists such an integrated knowledge resource, allowing them to address complex biological questions pertaining to gene expression. This work illustrates how GeXKB can be used in combination with gene expression results and literature information to identify new potential candidates that may be considered for extending a gene regulatory network.

  19. Fault tolerant architectures for integrated aircraft electronics systems

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Melliar-Smith, P. M.; Schwartz, R. L.

    1983-01-01

    Work into possible architectures for future flight control computer systems is described. Ada for Fault-Tolerant Systems, the NETS Network Error-Tolerant System architecture, and voting in asynchronous systems are covered.

  20. Multimedia and Some of Its Technical Issues.

    ERIC Educational Resources Information Center

    Wang, Shousan

    2000-01-01

    Discusses multimedia and its use in classroom teaching. Describes integrated services digital networks (ISDN); video-on-demand, which uses streaming technology via the Internet; and computer-assisted instruction. (Contains 19 references.) (LRW)

  1. An integrated network of Arabidopsis growth regulators and its use for gene prioritization.

    PubMed

    Sabaghian, Ehsan; Drebert, Zuzanna; Inzé, Dirk; Saeys, Yvan

    2015-12-01

    Elucidating the molecular mechanisms that govern plant growth has been an important topic in plant research, and current advances in large-scale data generation call for computational tools that efficiently combine these different data sources to generate novel hypotheses. In this work, we present a novel, integrated network that combines multiple large-scale data sources to characterize growth regulatory genes in Arabidopsis, one of the main plant model organisms. The contributions of this work are twofold: first, we characterized a set of carefully selected growth regulators with respect to their connectivity patterns in the integrated network, and, subsequently, we explored to which extent these connectivity patterns can be used to suggest new growth regulators. Using a large-scale comparative study, we designed new supervised machine learning methods to prioritize growth regulators. Our results show that these methods significantly improve current state-of-the-art prioritization techniques, and are able to suggest meaningful new growth regulators. In addition, the integrated network is made available to the scientific community, providing a rich data source that will be useful for many biological processes, not necessarily restricted to plant growth.

  2. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems

    PubMed Central

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D.

    2016-01-01

    In modern networked control applications, confidentiality and integrity are important features to address in order to protect against attacks. Moreover, networked control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim of enabling secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication Code (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems. PMID:27463718
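
    For orientation, the kind of per-frame authentication the paper evaluates can be sketched with Python's standard hmac module; the key, frame payload, and 8-byte tag truncation below are illustrative assumptions, not details from the paper:

```python
# Sketch of HMAC tagging for a time-triggered frame; the key, payload
# layout, and truncation length are illustrative assumptions.
import hmac
import hashlib

key = b"shared-network-key"
frame = b"steer=+0.12;brake=0.00;seq=42"  # hypothetical TT frame payload

# Sender appends a (possibly truncated) authentication tag to each frame.
tag = hmac.new(key, frame, hashlib.sha256).digest()[:8]

# Receiver recomputes the tag and compares in constant time before acting.
ok = hmac.compare_digest(tag, hmac.new(key, frame, hashlib.sha256).digest()[:8])
assert ok
```

    compare_digest performs a constant-time comparison to avoid timing side channels, and the tag length trades communication overhead against forgery resistance, which is precisely the overhead trade-off the paper quantifies.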

  3. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems.

    PubMed

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D

    2016-07-25

    In modern networked control applications, confidentiality and integrity are important features to address in order to protect against attacks. Moreover, networked control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim of enabling secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication Code (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems.

  4. Microcosm to Cosmos: The Growth of a Divisional Computer Network

    PubMed Central

    Johannes, R.S.; Kahane, Stephen N.

    1987-01-01

    In 1982, we reported the deployment of a network of microcomputers in the Division of Gastroenterology[1]. This network was based upon Corvus Systems' Omninet®; Corvus was one of the very first firms to offer networking products for PCs. This PC development occurred coincident with the planning phase of the Johns Hopkins Hospital's multisegment Ethernet project. A rich communications infrastructure is now in place at the Johns Hopkins Medical Institutions[2,3]. Shortly after the hospital development began under the direction of the Operational and Clinical Systems Division (OCS), the Johns Hopkins School of Medicine began an Integrated Academic Information Management Systems (IAIMS) planning effort. We now present a model that uses aspects of all three planning efforts (PC networks, hospital information systems, and IAIMS) to build a divisional computing facility. This facility is viewed as a terminal leaf on the institutional network diagram. Nevertheless, it is noteworthy that this leaf, the divisional resource in the Division of Gastroenterology (GASNET), has a rich substructure and functionality of its own, perhaps revealing the recursive nature of network architecture. The current status, design, and function of the GASNET computational facility are discussed. Among the major positive aspects of this design are the sharing and centralization of MS-DOS software and the high-speed DOS/Unix link that makes available most of our institution's computing resources.

  5. Data Movement Dominates: Advanced Memory Technology to Address the Real Exascale Power Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Keren

    Energy is the fundamental barrier to Exascale supercomputing and is dominated by the cost of moving data from one point to another, not computation. Similarly, performance is dominated by data movement, not computation. The solution to this problem requires three critical technologies: 3D integration, optical chip-to-chip communication, and a new communication model. The central goal of the Sandia-led "Data Movement Dominates" project was to develop memory systems and new architectures based on these technologies that have the potential to lower the cost of local memory accesses by orders of magnitude and provide substantially more bandwidth. Only through these transformational advances can future systems reach the goals of Exascale computing with a manageable power budget. The Sandia-led team included co-PIs from Columbia University, Lawrence Berkeley Lab, and the University of Maryland. The Columbia effort of Data Movement Dominates focused on developing a physically accurate simulation environment and experimental verification for optically-connected memory (OCM) systems that can enable continued performance scaling through high-bandwidth capacity, energy-efficient bit-rate transparency, and time-of-flight latency. With OCM, memory device parallelism and total capacity can scale to match future high-performance computing requirements without sacrificing data-movement efficiency. When we consider systems with integrated photonics, links to memory can be seamlessly integrated with the interconnection network; in a sense, memory becomes a primary aspect of the interconnection network. At the core of the Columbia effort, toward expanding our understanding of OCM-enabled computing, we have created an integrated modeling and simulation environment that uniquely integrates the physical behavior of the optical layer. The PhoenixSim suite of design and software tools developed under this effort has enabled the co-design and performance evaluation of photonics-enabled OCM architectures on Exascale computing systems.

  6. Definition, analysis and development of an optical data distribution network for integrated avionics and control systems. Part 2: Component development and system integration

    NASA Technical Reports Server (NTRS)

    Yen, H. W.; Morrison, R. J.

    1984-01-01

    Fiber optic transmission is emerging as an attractive concept in data distribution onboard civil aircraft. Development of an Optical Data Distribution Network for Integrated Avionics and Control Systems for commercial aircraft will provide a data distribution network that gives freedom from EMI-RFI and ground loop problems, eliminates crosstalk and short circuits, provides protection and immunity from lightning induced transients and give a large bandwidth data transmission capability. In addition there is a potential for significantly reducing the weight and increasing the reliability over conventional data distribution networks. Wavelength Division Multiplexing (WDM) is a candidate method for data communication between the various avionic subsystems. With WDM all systems could conceptually communicate with each other without time sharing and requiring complicated coding schemes for each computer and subsystem to recognize a message. However, the state of the art of optical technology limits the application of fiber optics in advanced integrated avionics and control systems. Therefore, it is necessary to address the architecture for a fiber optics data distribution system for integrated avionics and control systems as well as develop prototype components and systems.

  7. Feasibility Study of a Vision-Based Landing System for Unmanned Fixed-Wing Aircraft

    DTIC Science & Technology

    2017-06-01

    International Journal of Computer Science and Network Security 7, no. 3: 112–117. Accessed April 7, 2017. http://www.sciencedirect.com/science/article/pii... the feasibility of applying computer vision techniques and visual feedback in the control loop for an autonomous system. This thesis examines the... integration into an autonomous aircraft control system. Subject terms: autonomous systems, auto-land, computer vision, image processing

  8. Systems Toxicology: Real World Applications and Opportunities.

    PubMed

    Hartung, Thomas; FitzGerald, Rex E; Jennings, Paul; Mirams, Gary R; Peitsch, Manuel C; Rostami-Hodjegan, Amin; Shah, Imran; Wilks, Martin F; Sturla, Shana J

    2017-04-17

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams ("big data"), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity.

  9. Systems Toxicology: Real World Applications and Opportunities

    PubMed Central

    2017-01-01

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams (“big data”), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity. PMID:28362102

  10. Dynamic Optical Networks for Future Internet Environments

    NASA Astrophysics Data System (ADS)

    Matera, Francesco

    2014-05-01

    This article reports an overview on the evolution of the optical network scenario taking into account the exponential growth of connected devices, big data, and cloud computing that is driving a concrete transformation impacting the information and communication technology world. This hyper-connected scenario is deeply affecting relationships between individuals, enterprises, citizens, and public administrations, fostering innovative use cases in practically any environment and market, and introducing new opportunities and new challenges. The successful realization of this hyper-connected scenario depends on different elements of the ecosystem. In particular, it builds on connectivity and functionalities allowed by converged next-generation networks and their capacity to support and integrate with the Internet of Things, machine-to-machine, and cloud computing. This article aims at providing some hints of this scenario to contribute to analyze impacts on optical system and network issues and requirements. In particular, the role of the software-defined network is investigated by taking into account all scenarios regarding data centers, cloud computing, and machine-to-machine and trying to illustrate all the advantages that could be introduced by advanced optical communications.

  11. Similarity network fusion for aggregating data types on a genomic scale.

    PubMed

    Wang, Bo; Mezlini, Aziz M; Demir, Feyyaz; Fiume, Marc; Tu, Zhuowen; Brudno, Michael; Haibe-Kains, Benjamin; Goldenberg, Anna

    2014-03-01

    Recent technologies have made it cost-effective to collect diverse types of genome-wide data. Computational methods are needed to combine these data to create a comprehensive view of a given disease or a biological process. Similarity network fusion (SNF) solves this problem by constructing networks of samples (e.g., patients) for each available data type and then efficiently fusing these into one network that represents the full spectrum of underlying data. For example, to create a comprehensive view of a disease given a cohort of patients, SNF computes and fuses patient similarity networks obtained from each of their data types separately, taking advantage of the complementarity in the data. We used SNF to combine mRNA expression, DNA methylation and microRNA (miRNA) expression data for five cancer data sets. SNF substantially outperforms single data type analysis and established integrative approaches when identifying cancer subtypes and is effective for predicting survival.
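
    As a deliberately simplified caricature of the fusion idea (not the published SNF algorithm, which uses KNN-sparsified local kernels and a specific cross-diffusion update), each data type can be represented as a patient-similarity matrix that is repeatedly diffused toward the average of the others; all matrices and parameters below are synthetic:

```python
# Simplified caricature of similarity-network fusion: each data type's
# similarity matrix diffuses toward the average of the other matrices.
# This is NOT the published SNF update; it only conveys the idea.
import numpy as np

def fuse(mats, iters=20):
    # Row-normalize each similarity matrix into a transition matrix.
    mats = [m / m.sum(axis=1, keepdims=True) for m in mats]
    for _ in range(iters):
        # For each data type, average the other types' matrices...
        avg = [sum(mats[:v] + mats[v + 1:]) / (len(mats) - 1)
               for v in range(len(mats))]
        # ...and diffuse the current matrix through that average.
        mats = [m @ a @ m.T for m, a in zip(mats, avg)]
        mats = [m / m.sum(axis=1, keepdims=True) for m in mats]
    return sum(mats) / len(mats)  # the fused network

rng = np.random.default_rng(0)
sims = [rng.random((5, 5)) + np.eye(5) for _ in range(3)]  # synthetic data
sims = [(s + s.T) / 2 for s in sims]                       # symmetrize
fused = fuse(sims)
```

    The point of the cross-diffusion step is that edges supported by several data types reinforce each other while noise specific to one type is damped, which is what lets the fused network outperform any single data type.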

  12. Integrated regulatory network reveals novel candidate regulators in the development of negative energy balance in cattle.

    PubMed

    Mozduri, Z; Bakhtiarizadeh, M R; Salehi, A

    2018-06-01

    Negative energy balance (NEB) is an altered metabolic state in modern high-yielding dairy cows. This metabolic state occurs in the early postpartum period, when energy demands for milk production and maintenance exceed energy intake. Negative energy balance, or poor adaptation to this metabolic state, has important effects on the liver and can lead to metabolic disorders and reduced fertility. The roles of regulatory factors, including transcription factors (TFs) and microRNAs (miRNAs), have often been studied separately in the evaluation of NEB. However, the adaptive response to NEB is controlled by complex gene networks and is still not fully understood. In this study, we aimed to discover the integrated gene regulatory networks involved in NEB development in liver tissue. We downloaded data sets including mRNA and miRNA expression profiles related to three and four cows with severe and moderate NEB, respectively. Our method integrated two independent types of information: module inference networks by TFs, miRNAs and mRNA expression profiles (RNA-seq data) and computational target predictions. In total, 176 modules were predicted by using gene expression data, and 64 miRNAs and 63 TFs were assigned to these modules. By using our integrated computational approach, we identified 13 TF-module and 19 miRNA-module interactions. Most of these modules were associated with liver metabolic processes as well as immune and stress responses, which might play crucial roles in NEB development. A literature survey also showed that several regulators and gene targets have already been characterized as important factors in liver metabolic processes. These results provide novel insights into regulatory mechanisms at the TF and miRNA levels during NEB. In addition, the method described in this study seems to be applicable to the construction of integrated regulatory networks for different diseases or disorders.

  13. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm, has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that were designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations generally do not have the computing power to tackle complex scientific applications on their own, making them primarily useful for visualization, data reduction, and filtering. There is a tremendous amount of computing power that is left unused in a network of workstations; very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.

  14. A distributed data base management system. [for Deep Space Network

    NASA Technical Reports Server (NTRS)

    Bryan, A. I.

    1975-01-01

    Major system design features of a distributed data management system for the NASA Deep Space Network (DSN) designed for continuous two-way deep space communications are described. The reasons for which the distributed data base utilizing third-generation minicomputers is selected as the optimum approach for the DSN are threefold: (1) with a distributed master data base, valid data is available in real-time to support DSN management activities at each location; (2) data base integrity is the responsibility of local management; and (3) the data acquisition/distribution and processing power of a third-generation computer enables the computer to function successfully as a data handler or as an on-line process controller. The concept of the distributed data base is discussed along with the software, data base integrity, and hardware used. The data analysis/update constraint is examined.

  15. Expectation propagation for large scale Bayesian inference of non-linear molecular networks from perturbation data.

    PubMed

    Narimani, Zahra; Beigy, Hamid; Ahmad, Ashar; Masoudi-Nejad, Ali; Fröhlich, Holger

    2017-01-01

    Inferring the structure of molecular networks from time series protein or gene expression data provides valuable information about the complex biological processes of the cell. Causal network structure inference has been approached using different methods in the past. Most causal network inference techniques, such as Dynamic Bayesian Networks and ordinary differential equations, are limited by their computational complexity and thus make large scale inference infeasible. This is specifically true if a Bayesian framework is applied in order to deal with the unavoidable uncertainty about the correct model. We devise a novel Bayesian network reverse engineering approach using ordinary differential equations with the ability to include non-linearity. Besides modeling arbitrary, possibly combinatorial and time dependent perturbations with unknown targets, one of our main contributions is the use of Expectation Propagation, an algorithm for approximate Bayesian inference over large scale network structures in short computation time. We further explore the possibility of integrating prior knowledge into network inference. We evaluate the proposed model on DREAM4 and DREAM8 data and find it competitive against several state-of-the-art existing network inference methods.

  16. Mapping an Emergent Field of "Computational Education Policy": Policy Rationalities, Prediction and Data in the Age of Artificial Intelligence

    ERIC Educational Resources Information Center

    Gulson, Kalervo N.; Webb, P. Taylor

    2017-01-01

    Contemporary education policy involves the integration of novel forms of data and the creation of new data platforms, in addition to the infusion of business principles into school governance networks, and intensification of socio-technical relations. In this paper, we examine how "computational rationality" may be understood as…

  17. Technologies and Approaches to Elucidate and Model the Virulence Program of Salmonella.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Yoon, Hyunjin; Nakayasu, Ernesto S.

    Salmonella is a primary cause of enteric diseases in a variety of animals. During its evolution into a pathogenic bacterium, Salmonella acquired an elaborate regulatory network that responds to multiple environmental stimuli within host animals and integrates them, resulting in fine regulation of the virulence program. The coordinated action by this regulatory network involves numerous virulence regulators, necessitating genome-wide profiling analysis to assess and combine efforts from multiple regulons. In this review we discuss recent high-throughput analytic approaches to understand the regulatory network of Salmonella that controls virulence processes. Application of high-throughput analyses has generated a large amount of data and driven the development of computational approaches required for data integration. Therefore, we also cover computer-aided network analyses to infer regulatory networks, and demonstrate how genome-scale data can be used to construct regulatory and metabolic systems models of Salmonella pathogenesis. Genes that are coordinately controlled by multiple virulence regulators under infectious conditions are more likely to be important for pathogenesis. Thus, reconstructing the global regulatory network during infection or, at the very least, under conditions that mimic the host cellular environment not only provides a bird's-eye view of the Salmonella survival strategy in response to hostile host environments but also serves as an efficient means to identify novel virulence factors that are essential for Salmonella to accomplish systemic infection in the host.

  18. [Networking and integrated disease management. Advantages and disadvantages from the medical point of view].

    PubMed

    Hänsch, H; Fleck, E

    2005-07-01

    At the moment, the terms "networking", "cost reduction" and "integrated disease management" are frequently discussed in all branches of the German health care system. Unfortunately, there are different interpretations of these terms. "Integrated disease management", in the sense of communication between clinical and outpatient health care, has already existed for years. Traditional ways of communication lead to information loss, which is a cause of low cost effectiveness and of a prolonged healing process that directly harms the patient. A computer network may prevent information loss and may increase the performance of data transfer. Different sides have already started networking, and it is now necessary to bundle these interests. This necessity has been recognized by the German legislature. To lead this project to success, it is important to know and to fulfil certain medical criteria. Defining and describing these conditions is the topic of this paper. Our special intent is to show that digital technology is necessary to improve cooperation among physicians.

  19. INTEGRATING GENETIC AND STRUCTURAL DATA ON HUMAN PROTEIN KINOME IN NETWORK-BASED MODELING OF KINASE SENSITIVITIES AND RESISTANCE TO TARGETED AND PERSONALIZED ANTICANCER DRUGS.

    PubMed

    Verkhivker, Gennady M

    2016-01-01

    The human protein kinome presents one of the largest protein families that orchestrate functional processes in complex cellular networks and, when perturbed, can cause various cancers. The abundance and diversity of genetic, structural, and biochemical data underlies the complexity of mechanisms by which targeted and personalized drugs can combat mutational profiles in protein kinases. Coupled with the evolution of systems biology approaches, genomic and proteomic technologies are rapidly identifying and characterizing novel resistance mechanisms with the goal of informing the rational design of personalized kinase drugs. Integration of experimental and computational approaches can help to bring these data into a unified conceptual framework and develop robust models for predicting clinical drug resistance. In the current study, we employ a battery of synergistic computational approaches that integrate genetic, evolutionary, biochemical, and structural data to characterize the effect of cancer mutations in protein kinases. We provide a detailed structural classification and analysis of genetic signatures associated with oncogenic mutations. By integrating genetic and structural data, we employ network modeling to dissect mechanisms of kinase drug sensitivities to oncogenic EGFR mutations. Using biophysical simulations and analysis of protein structure networks, we show that conformation-specific drug binding of Lapatinib may elicit resistant mutations in the EGFR kinase that are linked with the ligand-mediated changes in the residue interaction networks and global network properties of key residues that are responsible for structural stability of specific functional states. A strong network dependency on high-centrality residues in the conformation-specific Lapatinib-EGFR complex may explain the vulnerability of drug binding to a broad spectrum of mutations and the emergence of drug resistance.
Our study offers a systems-based perspective on drug design by unravelling complex relationships between robustness of targeted kinase genes and binding specificity of targeted kinase drugs. We discuss how these approaches can exploit advances in chemical biology and network science to develop novel strategies for rationally tailored and robust personalized drug therapies.
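
    The residue-interaction-network analysis described in this record can be illustrated with a small sketch: build a contact graph from pairwise distances and rank residues by degree centrality. The coordinates, the 7 Å contact cutoff, and the residue labels below are illustrative assumptions, not values from the study.

```python
# Toy residue-interaction network: residues in "contact" (within a
# distance cutoff) become graph neighbors; degree centrality then flags
# highly connected residues. All numbers here are made up for illustration.
from itertools import combinations
import math

# Hypothetical C-alpha coordinates for a toy six-residue fragment.
residues = {
    "T790": (0.0, 0.0, 0.0),
    "M793": (3.8, 0.0, 0.0),
    "L718": (2.0, 3.0, 0.0),
    "G796": (9.0, 9.0, 9.0),
    "C797": (10.0, 9.0, 9.0),
    "D800": (20.0, 20.0, 20.0),
}

CUTOFF = 7.0  # contact threshold in angstroms (assumed)

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Edge list: residue pairs closer than the cutoff are "in contact".
edges = [(r1, r2) for (r1, c1), (r2, c2) in
         combinations(residues.items(), 2) if dist(c1, c2) <= CUTOFF]

# Degree centrality: contacts per residue, normalized by (n - 1).
degree = {r: 0 for r in residues}
for r1, r2 in edges:
    degree[r1] += 1
    degree[r2] += 1
n = len(residues)
centrality = {r: d / (n - 1) for r, d in degree.items()}

ranked = sorted(centrality, key=centrality.get, reverse=True)
print(ranked[0])  # the most connected residue in the toy fragment
```

    In a real protein structure network, betweenness or closeness centrality computed over thousands of residue contacts would play the role that degree centrality plays in this toy fragment.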

  20. Network integrity of the parental brain in infancy supports the development of children's social competencies.

    PubMed

    Abraham, Eyal; Hendler, Talma; Zagoory-Sharon, Orna; Feldman, Ruth

    2016-11-01

    The cross-generational transmission of mammalian sociality, initiated by the parent's postpartum brain plasticity and species-typical behavior that buttress offspring's socialization, has not been studied in humans. In this longitudinal study, we measured brain response of 45 primary-caregiving parents to their infant's stimuli, observed parent-infant interactions, and assayed parental oxytocin (OT). Intra- and inter-network connectivity were computed in three main networks of the human parental brain: core limbic, embodied simulation and mentalizing. During preschool, two key child social competencies were observed: emotion regulation and socialization. Parent's network integrity in infancy predicted preschoolers' social outcomes, with subcortical and cortical network integrity foreshadowing simple evolutionary-based regulatory tactics vs complex self-regulatory strategies and advanced socialization. Parent-infant synchrony mediated the links between connectivity of the parent's embodied simulation network and preschoolers' ability to use cognitive/executive emotion regulation strategies, highlighting the inherently dyadic nature of this network and its long-term effects on tuning young to social life. Parent's inter-network core limbic-embodied simulation connectivity predicted children's OT as moderated by parental OT. Findings challenge solipsistic neuroscience perspectives by demonstrating how the parent-offspring interface enables the brain of one human to profoundly impact long-term adaptation of another. © The Author (2016). Published by Oxford University Press.

  1. Network integrity of the parental brain in infancy supports the development of children’s social competencies

    PubMed Central

    Abraham, Eyal; Hendler, Talma; Zagoory-Sharon, Orna

    2016-01-01

    The cross-generational transmission of mammalian sociality, initiated by the parent’s postpartum brain plasticity and species-typical behavior that buttress offspring’s socialization, has not been studied in humans. In this longitudinal study, we measured brain response of 45 primary-caregiving parents to their infant’s stimuli, observed parent–infant interactions, and assayed parental oxytocin (OT). Intra- and inter-network connectivity were computed in three main networks of the human parental brain: core limbic, embodied simulation and mentalizing. During preschool, two key child social competencies were observed: emotion regulation and socialization. Parent’s network integrity in infancy predicted preschoolers’ social outcomes, with subcortical and cortical network integrity foreshadowing simple evolutionary-based regulatory tactics vs complex self-regulatory strategies and advanced socialization. Parent–infant synchrony mediated the links between connectivity of the parent’s embodied simulation network and preschoolers' ability to use cognitive/executive emotion regulation strategies, highlighting the inherently dyadic nature of this network and its long-term effects on tuning young to social life. Parent’s inter-network core limbic-embodied simulation connectivity predicted children’s OT as moderated by parental OT. Findings challenge solipsistic neuroscience perspectives by demonstrating how the parent–offspring interface enables the brain of one human to profoundly impact long-term adaptation of another. PMID:27369068
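
    The intra- and inter-network connectivity measures described in this record can be sketched in a few lines: average the node-wise correlations within a network and between networks. The six-node correlation matrix and the node-to-network assignment below are illustrative assumptions, not the study's data.

```python
# Toy computation of intra- vs inter-network functional connectivity
# from a node-wise correlation matrix. Within-network pairs were given
# higher correlations than between-network pairs on purpose.

labels = ["limbic", "limbic", "embodied", "embodied",
          "mentalizing", "mentalizing"]

C = [
    [1.0, 0.8, 0.3, 0.2, 0.1, 0.2],
    [0.8, 1.0, 0.2, 0.3, 0.2, 0.1],
    [0.3, 0.2, 1.0, 0.7, 0.3, 0.2],
    [0.2, 0.3, 0.7, 1.0, 0.2, 0.3],
    [0.1, 0.2, 0.3, 0.2, 1.0, 0.9],
    [0.2, 0.1, 0.2, 0.3, 0.9, 1.0],
]

intra_vals, inter_vals = [], []
n = len(labels)
for i in range(n):
    for j in range(n):
        if i == j:
            continue  # skip self-correlations
        (intra_vals if labels[i] == labels[j] else inter_vals).append(C[i][j])

intra = sum(intra_vals) / len(intra_vals)   # "network integrity" proxy
inter = sum(inter_vals) / len(inter_vals)
print(round(intra, 3), round(inter, 3))
```
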

  2. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open-source solution for volunteer grid computing over the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with, at least, one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour and to compare it with the electroencephalographic signals measured in real brains.
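
    The probabilistic cellular-automaton update described in this record can be sketched as follows. The network size, the 80/20 excitatory/inhibitory split, the transmission probabilities, and the firing threshold are assumed illustrative values, not the project's parameters.

```python
import random

# One plausible cellular-automaton integrate-and-fire rule on a random
# network: active excitatory nodes probabilistically drive neighbors
# toward firing, inhibitory nodes suppress them.
random.seed(42)

N, K = 200, 5                      # neurons, random out-degree
P_TRANSMIT = 0.9                   # transmission probability (assumed)
THRESHOLD = 2                      # net input needed to fire

neighbors = [[random.randrange(N) for _ in range(K)] for _ in range(N)]
sign = [1 if random.random() < 0.8 else -1 for _ in range(N)]  # exc/inh
active = [random.random() < 0.1 for _ in range(N)]             # seed activity

def step(active):
    drive = [0] * N
    for i in range(N):
        if not active[i]:
            continue
        for j in neighbors[i]:
            if random.random() < P_TRANSMIT:
                drive[j] += sign[i]   # excite or inhibit the neighbor
    return [d >= THRESHOLD for d in drive]

for _ in range(50):
    active = step(active)

fraction_active = sum(active) / N   # order parameter for a phase diagram
print(fraction_active)
```

    Sweeping parameters such as the inhibitory fraction or the threshold and recording the steady-state activity is how a phase diagram of such a model would be assembled, albeit at the million-neuron scale the project targets.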

  3. The Research Path to the Virtual Class. ZIFF Papiere 105.

    ERIC Educational Resources Information Center

    Rajasingham, Lalita

    This paper describes a project conducted in 1991-92, based on research conducted in 1986-87 that demonstrated the need for a telecommunications system with the capacity of integrated services digital networks (ISDN) that would allow for sound, vision, and integrated computer services. Called the Tri-Centre Project, it set out to explore, from the…

  4. Novel technology for enhanced security and trust in communication networks

    NASA Astrophysics Data System (ADS)

    Milovanov, Alexander; Bukshpun, Leonid; Pradhan, Ranjit; Jannson, Tomasz

    2011-06-01

    A novel technology that significantly enhances security and trust in wireless and wired communication networks has been developed. It is based on the integration of a novel encryption mechanism and a novel data packet structure with enhanced security tools. This novel data packet structure results in an unprecedented level of security and trust, while at the same time reducing power consumption and computing/communication overhead in networks. As a result, networks are provided with protection against intrusion, exploitation, and cyber attacks, and possess self-building, self-awareness, self-configuring, self-healing, and self-protecting intelligence.

  5. Review On Applications Of Neural Network To Computer Vision

    NASA Astrophysics Data System (ADS)

    Li, Wei; Nasrabadi, Nasser M.

    1989-03-01

    Neural network models have many potential applications to computer vision due to their parallel structures, learnability, implicit representation of domain knowledge, fault tolerance, and ability to handle statistical data. This paper demonstrates the basic principles, typical models, and their applications in this field. A variety of neural models, such as associative memory, the multilayer back-propagation perceptron, the self-stabilized adaptive resonance network, the hierarchically structured neocognitron, high-order correlators, networks with gating control, and other models, can be applied to visual signal recognition, reinforcement, recall, stereo vision, motion, object tracking, and other vision processes. Most of the algorithms have been simulated on computers. Some have been implemented with special hardware. Some systems use image features, such as edges and profiles, as the input data form. Other systems use raw data as input signals to the networks. We will present some novel ideas contained in these approaches and provide a comparison of these methods. Some unsolved problems are mentioned, such as extracting the intrinsic properties of the input information, integrating low-level functions into a high-level cognitive system, and achieving invariances. Perspectives on applications of human vision models and neural network models are analyzed.

  6. ChloroKB: A Web Application for the Integration of Knowledge Related to Chloroplast Metabolic Network.

    PubMed

    Gloaguen, Pauline; Bournais, Sylvain; Alban, Claude; Ravanel, Stéphane; Seigneurin-Berny, Daphné; Matringe, Michel; Tardif, Marianne; Kuntz, Marcel; Ferro, Myriam; Bruley, Christophe; Rolland, Norbert; Vandenbrouck, Yves; Curien, Gilles

    2017-06-01

    Higher plants, as autotrophic organisms, are effective sources of molecules. They hold great promise for metabolic engineering, but the behavior of plant metabolism at the network level is still incompletely described. Although structural models (stoichiometry matrices) and pathway databases are extremely useful, they cannot describe the complexity of the metabolic context, and new tools are required to visually represent integrated biocurated knowledge for use by both humans and computers. Here, we describe ChloroKB, a Web application (http://chlorokb.fr/) for visual exploration and analysis of the Arabidopsis (Arabidopsis thaliana) metabolic network in the chloroplast and related cellular pathways. The network was manually reconstructed through extensive biocuration to provide transparent traceability of experimental data. Proteins and metabolites were placed in their biological context (spatial distribution within cells, connectivity in the network, participation in supramolecular complexes, and regulatory interactions) using CellDesigner software. The network contains 1,147 reviewed proteins (559 localized exclusively in plastids, 68 in at least one additional compartment, and 520 outside the plastid), 122 proteins awaiting biochemical/genetic characterization, and 228 proteins for which genes have not yet been identified. The visual presentation is intuitive and browsing is fluid, providing instant access to the graphical representation of integrated processes and to a wealth of refined qualitative and quantitative data. ChloroKB will be a significant support for structural and quantitative kinetic modeling, for biological reasoning, when comparing novel data with established knowledge, for computer analyses, and for educational purposes. ChloroKB will be enhanced by continuous updates following contributions from plant researchers. © 2017 American Society of Plant Biologists. All Rights Reserved.

  7. Multi-petascale highly efficient parallel supercomputer

    DOEpatents

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen -Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Smith, Brian; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2015-07-14

    A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaOPS-scale computing, at decreased cost, power, and footprint, that allows for a maximum packaging density of processing nodes from an interconnect point of view. The Supercomputer exploits technological advances in VLSI that enable a computing model where many processors can be integrated into a single Application Specific Integrated Circuit (ASIC). Each ASIC computing node comprises a system-on-chip ASIC utilizing four or more processors integrated into one die, each having full access to all system resources, enabling adaptive partitioning of the processors to functions such as compute or messaging I/O on an application-by-application basis and, preferably, adaptive partitioning of functions in accordance with various algorithmic phases within an application; if I/O or other processors are underutilized, they can participate in computation or communication. Nodes are interconnected by a five-dimensional torus network with DMA that maximizes the throughput of packet communications between nodes and minimizes latency.

  8. A network of spiking neurons for computing sparse representations in an energy efficient way

    PubMed Central

    Hu, Tao; Genkin, Alexander; Chklovskii, Dmitri B.

    2013-01-01

    Computing sparse redundant representations is an important problem both in applied mathematics and neuroscience. In many applications, this problem must be solved in an energy efficient way. Here, we propose a hybrid distributed algorithm (HDA), which solves this problem on a network of simple nodes communicating via low-bandwidth channels. HDA nodes perform both gradient-descent-like steps on analog internal variables and coordinate-descent-like steps via quantized external variables communicated to each other. Interestingly, such an operation is equivalent to a network of integrate-and-fire neurons, suggesting that HDA may serve as a model of neural computation. We compare the numerical performance of HDA with existing algorithms and show that in the asymptotic regime the representation error of HDA decays with time, t, as 1/t. We show that HDA is stable against time-varying noise, specifically, the representation error decays as 1/√t for Gaussian white noise. PMID:22920853

  9. A network of spiking neurons for computing sparse representations in an energy-efficient way.

    PubMed

    Hu, Tao; Genkin, Alexander; Chklovskii, Dmitri B

    2012-11-01

    Computing sparse redundant representations is an important problem in both applied mathematics and neuroscience. In many applications, this problem must be solved in an energy-efficient way. Here, we propose a hybrid distributed algorithm (HDA), which solves this problem on a network of simple nodes communicating by low-bandwidth channels. HDA nodes perform both gradient-descent-like steps on analog internal variables and coordinate-descent-like steps via quantized external variables communicated to each other. Interestingly, the operation is equivalent to a network of integrate-and-fire neurons, suggesting that HDA may serve as a model of neural computation. We show that the numerical performance of HDA is on par with existing algorithms. In the asymptotic regime, the representation error of HDA decays with time, t, as 1/t. HDA is stable against time-varying noise; specifically, the representation error decays as 1/√t for Gaussian white noise.

  10. Mediator infrastructure for information integration and semantic data integration environment for biomedical research.

    PubMed

    Grethe, Jeffrey S; Ross, Edward; Little, David; Sanders, Brian; Gupta, Amarnath; Astakhov, Vadim

    2009-01-01

    This paper presents current progress in the development of a semantic data integration environment which is part of the Biomedical Informatics Research Network (BIRN; http://www.nbirn.net) project. BIRN is sponsored by the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). A goal is the development of a cyberinfrastructure for biomedical research that supports advanced data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services over the Internet. Each participating institution maintains storage of its experimental or computationally derived data. A mediator-based data integration system performs semantic integration over the databases to enable researchers to perform analyses based on larger and broader datasets than would be available from any single institution's data. This paper describes a recent revision of the system architecture, implementation, and capabilities of the semantically based data integration environment for BIRN.

  11. Real-time computing platform for spiking neurons (RT-spike).

    PubMed

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.

  12. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support

    PubMed Central

    Camargo, João; Rochol, Juergen; Gerla, Mario

    2018-01-01

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of Cloud Computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be actuated by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. In addition, we evaluate such a migration of video services. Finally, we present potential research challenges and trends. PMID:29364172

  13. Memristor-Based Analog Computation and Neural Network Classification with a Dot Product Engine.

    PubMed

    Hu, Miao; Graves, Catherine E; Li, Can; Li, Yunning; Ge, Ning; Montgomery, Eric; Davila, Noraica; Jiang, Hao; Williams, R Stanley; Yang, J Joshua; Xia, Qiangfei; Strachan, John Paul

    2018-03-01

    Using memristor crossbar arrays to accelerate computations is a promising approach to efficiently implement algorithms in deep neural networks. Early demonstrations, however, are limited to simulations or small-scale problems primarily due to materials and device challenges that limit the size of the memristor crossbar arrays that can be reliably programmed to stable and analog values, which is the focus of the current work. High-precision analog tuning and control of memristor cells across a 128 × 64 array is demonstrated, and the resulting vector matrix multiplication (VMM) computing precision is evaluated. Single-layer neural network inference is performed in these arrays, and the performance compared to a digital approach is assessed. The memristor computing system used here reaches a VMM accuracy equivalent of 6 bits, and an 89.9% recognition accuracy is achieved for the 10k MNIST handwritten digit test set. Forecasts show that with integrated (on chip) and scaled memristors, a computational efficiency greater than 100 trillion operations per second per Watt is possible. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
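
    The dot-product-engine idea in this record can be sketched numerically: a vector-matrix multiply realized as input voltages times a crossbar conductance matrix (Ohm's and Kirchhoff's laws), with conductances quantized to a fixed number of analog levels. The array size and the 6-bit level count mirror the numbers in the abstract; the weights themselves are random stand-ins.

```python
import random

# Simulate a 128 x 64 crossbar: each column current is the dot product of
# the input voltage vector with that column's (quantized) conductances.
random.seed(1)
ROWS, COLS, BITS = 128, 64, 6
levels = 2 ** BITS                      # programmable conductance levels

W = [[random.uniform(0.0, 1.0) for _ in range(COLS)] for _ in range(ROWS)]
v = [random.uniform(0.0, 1.0) for _ in range(ROWS)]   # input voltages

def quantize(w):
    # Map an ideal weight onto the nearest of `levels` conductance states.
    return round(w * (levels - 1)) / (levels - 1)

G = [[quantize(w) for w in row] for row in W]

# Column currents: i_j = sum_k v_k * G[k][j]  (Kirchhoff current summation)
exact = [sum(v[k] * W[k][j] for k in range(ROWS)) for j in range(COLS)]
analog = [sum(v[k] * G[k][j] for k in range(ROWS)) for j in range(COLS)]

max_rel_err = max(abs(a - e) / abs(e) for a, e in zip(analog, exact))
print(max_rel_err)  # quantizing to 64 levels costs little VMM accuracy
```

    Real devices add further error sources (programming noise, wire resistance, read nonlinearity) that this sketch omits; the abstract's 6-bit figure folds all of them together.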

  14. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support.

    PubMed

    Rosário, Denis; Schimuneck, Matias; Camargo, João; Nobre, Jéferson; Both, Cristiano; Rochol, Juergen; Gerla, Mario

    2018-01-24

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of Cloud Computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be actuated by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. In addition, we evaluate such a migration of video services. Finally, we present potential research challenges and trends.

  15. Towards a cyber-physical era: soft computing framework based multi-sensor array for water quality monitoring

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv

    2018-02-01

    New concepts and techniques are replacing traditional methods of water quality parameter measurement systems. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors and actuators can be designed to sense and interact with the water environment. The proposed CPS is comprised of sensing framework integrated with five different water quality parameter sensor nodes and soft computing framework for computational modelling. Soft computing framework utilizes the applications of Python for user interface and fuzzy sciences for decision making. Introduction of multiple sensors in a water distribution network generates a huge number of data matrices, which are sometimes highly complex, difficult to understand and convoluted for effective decision making. Therefore, the proposed system framework also intends to simplify the complexity of obtained sensor data matrices and to support decision making for water engineers through a soft computing framework. The target of this proposed research is to provide a simple and efficient method to identify and detect presence of contamination in a water distribution network using applications of CPS.
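
    The fuzzy decision step in the soft computing framework described here can be sketched with triangular membership functions and a simple Mamdani-style rule. The parameter ranges and the rule itself are assumed for illustration; they are not the paper's calibrated values.

```python
# Fuzzify two sensor readings and flag possible contamination with the
# rule: risk is high when turbidity is high AND residual chlorine is low
# (fuzzy AND taken as min, as in Mamdani-style inference).

def tri(x, a, b, c):
    """Triangular membership: rises a -> b, falls b -> c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def contamination_risk(turbidity_ntu, chlorine_mg_l):
    # Fuzzy sets (assumed ranges).
    turbidity_high = tri(turbidity_ntu, 1.0, 5.0, 9.0)
    chlorine_low = tri(chlorine_mg_l, -0.1, 0.0, 0.4)
    return min(turbidity_high, chlorine_low)

print(contamination_risk(5.0, 0.05))   # turbid, little chlorine: high risk
print(contamination_risk(0.5, 1.0))    # clear, well-chlorinated: no risk
```

    A full framework would fuzzify all five parameters, aggregate many such rules, and defuzzify the result into a single water-quality decision for the engineer.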

  16. An integrated approach to characterize transcription factor and microRNA regulatory networks involved in Schwann cell response to peripheral nerve injury

    PubMed Central

    2013-01-01

    Background The regenerative response of Schwann cells (SCs) after peripheral nerve injury is a critical process directly related to the pathophysiology of a number of neurodegenerative diseases. This SC injury response is dependent on an intricate gene regulatory program coordinated by a number of transcription factors and microRNAs, but the interactions among them remain largely unknown. Uncovering the transcriptional and post-transcriptional regulatory networks governing the Schwann cell injury response is a key step towards a better understanding of Schwann cell biology and may help develop novel therapies for related diseases. Performing such comprehensive network analysis requires systematic bioinformatics methods to integrate multiple genomic datasets. Results In this study we present a computational pipeline to infer transcription factor and microRNA regulatory networks. Our approach combined mRNA and microRNA expression profiling data, ChIP-Seq data of transcription factors, and computational transcription factor and microRNA target prediction. Using mRNA and microRNA expression data collected in a Schwann cell injury model, we constructed a regulatory network and studied regulatory pathways involved in the Schwann cell response to injury. Furthermore, we analyzed network motifs and obtained insights on cooperative regulation by transcription factors and microRNAs in Schwann cell injury recovery. Conclusions This work demonstrates a systematic method for gene regulatory network inference that may be used to gain new information on gene regulation by transcription factors and microRNAs. PMID:23387820

  17. The evolution of the ISOLDE control system

    NASA Astrophysics Data System (ADS)

    Jonsson, O. C.; Catherall, R.; Deloose, I.; Drumm, P.; Evensen, A. H. M.; Gase, K.; Focker, G. J.; Fowler, A.; Kugler, E.; Lettry, J.; Olesen, G.; Ravn, H. L.; Isolde Collaboration

    The ISOLDE on-line mass separator facility has been operating on a Personal Computer-based control system since spring 1992. Front End Computers accessing the hardware are controlled from consoles running Microsoft Windows™ through a Novell NetWare4™ local area network. The control system is transparently integrated into the CERN-wide office network and makes heavy use of the CERN standard office application programs to control and to document the running of the ISOLDE isotope separators. This paper recalls the architecture of the control system, shows its recent developments and gives some examples of its graphical user interface.

  18. The evolution of the ISOLDE control system

    NASA Astrophysics Data System (ADS)

    Jonsson, O. C.; Catherall, R.; Deloose, I.; Evensen, A. H. M.; Gase, K.; Focker, G. J.; Fowler, A.; Kugler, E.; Lettry, J.; Olesen, G.; Ravn, H. L.; Drumm, P.

    1996-04-01

    The ISOLDE on-line mass separator facility has been operating on a Personal Computer-based control system since spring 1992. Front End Computers accessing the hardware are controlled from consoles running Microsoft Windows® through a Novell NetWare4® local area network. The control system is transparently integrated into the CERN-wide office network and makes heavy use of the CERN standard office application programs to control and to document the running of the ISOLDE isotope separators. This paper recalls the architecture of the control system, shows its recent developments and gives some examples of its graphical user interface.

  19. Computerized power supply analysis: State equation generation and terminal models

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.

    1978-01-01

    To aid engineers who design power supply systems, two analysis tools were developed for use with the state equation analysis package. These tools include integration routines that start from a description of a power supply in state-equation form and yield analytical results. The first tool is a computer program that works with the SUPER SCEPTRE circuit analysis program and prints the state equations for an electrical network. The state equations developed automatically by this program are then used in an algorithm that reduces the number of state variables required to describe an electrical network. In this way a second tool is obtained: the order of the network is reduced and a simpler terminal model results.
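
    The state-equation form this record works with can be sketched for a simple network: a two-capacitor RC ladder written as dx/dt = A x + B u and integrated numerically. The component values are arbitrary illustrative assumptions, and the terminal-model reduction is only noted in the comments.

```python
# Circuit: step source u -- R1 -- node1 (C1 to ground) -- R2 -- node2
# (C2 to ground). KCL at each node yields the state matrix directly, with
# the capacitor voltages x = [v1, v2] as state variables.

R1 = R2 = 1.0      # ohms
C1 = C2 = 1.0      # farads
u = 1.0            # step input voltage

A = [[-(1/(R1*C1) + 1/(R2*C1)), 1/(R2*C1)],
     [1/(R2*C2),               -1/(R2*C2)]]
B = [1/(R1*C1), 0.0]

x = [0.0, 0.0]
dt, steps = 0.001, 20000          # integrate to t = 20 s (near steady state)
for _ in range(steps):
    dx = [sum(A[i][j] * x[j] for j in range(2)) + B[i] * u
          for i in range(2)]
    x = [x[i] + dt * dx[i] for i in range(2)]

# At steady state no DC current flows, so there is no drop across R1 or
# R2 and both capacitors charge to the input voltage. A reduced terminal
# model would keep only the slow mode of A and still match this behavior.
print(x)
```
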

  20. The Open System Interconnection as a building block in a health sciences information network.

    PubMed Central

    Boss, R W

    1985-01-01

    The interconnection of integrated health sciences library systems with other health sciences computer systems to achieve information networks will require either custom linkages among specific devices or the adoption of standards that all systems support. The most appropriate standards appear to be those being developed under the Open System Interconnection (OSI) reference model, which specifies a set of rules and functions that computers must follow to exchange information. The protocols have been modularized into seven different layers. The lowest three layers are generally available as off-the-shelf interfacing products. The higher layers require special development for particular applications. This paper describes the OSI, its application in health sciences networks, and specific tasks that remain to be undertaken. PMID:4052672

  1. Tracking the Reorganization of Module Structure in Time-Varying Weighted Brain Functional Connectivity Networks.

    PubMed

    Schmidt, Christoph; Piper, Diana; Pester, Britta; Mierau, Andreas; Witte, Herbert

    2018-05-01

    Identification of module structure in brain functional networks is a promising way to obtain novel insights into neural information processing, as modules correspond to delineated brain regions in which interactions are strongly increased. Tracking of network modules in time-varying brain functional networks is not yet commonly considered in neuroscience despite its potential for gaining an understanding of the time evolution of functional interaction patterns and associated changing degrees of functional segregation and integration. We introduce a general computational framework for extracting consensus partitions from defined time windows in sequences of weighted directed edge-complete networks and show how the temporal reorganization of the module structure can be tracked and visualized. Part of the framework is a new approach for computing edge weight thresholds for individual networks based on multiobjective optimization of module structure quality criteria as well as an approach for matching modules across time steps. By testing our framework using synthetic network sequences and applying it to brain functional networks computed from electroencephalographic recordings of healthy subjects that were exposed to a major balance perturbation, we demonstrate the framework's potential for gaining meaningful insights into dynamic brain function in the form of evolving network modules. The precise chronology of the neural processing inferred with our framework and its interpretation helps to improve the currently incomplete understanding of the cortical contribution for the compensation of such balance perturbations.
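
    One step of the framework in this record, matching modules across consecutive time windows, can be sketched by pairing each current module with its best-overlapping predecessor under the Jaccard index. The module partitions below are toy data; the actual framework derives them from thresholded EEG connectivity networks.

```python
# Match modules across two time windows by maximum Jaccard overlap of
# their node sets.

def jaccard(a, b):
    return len(a & b) / len(a | b)

def match_modules(prev, curr):
    """Pair each current module with its best-overlapping predecessor."""
    matches = {}
    for name, nodes in curr.items():
        best = max(prev, key=lambda p: jaccard(prev[p], nodes))
        matches[name] = (best, round(jaccard(prev[best], nodes), 2))
    return matches

t1 = {"M1": {1, 2, 3, 4}, "M2": {5, 6, 7}}
t2 = {"A": {1, 2, 3},     "B": {4, 5, 6, 7}}   # node 4 migrated modules

matches = match_modules(t1, t2)
print(matches)  # A continues M1, B continues M2; node 4 reorganized
```

    Tracking such matches over a full sequence of windows is what lets the reorganization of module structure be visualized as evolving, splitting, and merging modules.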

  2. Framework and Method for Controlling a Robotic System Using a Distributed Computer Network

    NASA Technical Reports Server (NTRS)

    Sanders, Adam M. (Inventor); Strawser, Philip A. (Inventor); Barajas, Leandro G. (Inventor); Permenter, Frank Noble (Inventor)

    2015-01-01

    A robotic system for performing an autonomous task includes a humanoid robot having a plurality of compliant robotic joints, actuators, and other integrated system devices that are controllable in response to control data from various control points, and having sensors for measuring feedback data at the control points. The system includes a multi-level distributed control framework (DCF) for controlling the integrated system components over multiple high-speed communication networks. The DCF has a plurality of first controllers each embedded in a respective one of the integrated system components, e.g., the robotic joints, a second controller coordinating the components via the first controllers, and a third controller for transmitting a signal commanding performance of the autonomous task to the second controller. The DCF virtually centralizes all of the control data and the feedback data in a single location to facilitate control of the robot across the multiple communication networks.

  3. RAIN: RNA–protein Association and Interaction Networks

    PubMed Central

    Junge, Alexander; Refsgaard, Jan C.; Garde, Christian; Pan, Xiaoyong; Santos, Alberto; Alkan, Ferhat; Anthon, Christian; von Mering, Christian; Workman, Christopher T.; Jensen, Lars Juhl; Gorodkin, Jan

    2017-01-01

    Protein association networks can be inferred from a range of resources including experimental data, literature mining and computational predictions. These types of evidence are emerging for non-coding RNAs (ncRNAs) as well. However, integration of ncRNAs into protein association networks is challenging due to data heterogeneity. Here, we present a database of ncRNA–RNA and ncRNA–protein interactions and its integration with the STRING database of protein–protein interactions. These ncRNA associations cover four organisms and have been established from curated examples, experimental data, interaction predictions and automatic literature mining. RAIN uses an integrative scoring scheme to assign a confidence score to each interaction. We demonstrate that RAIN outperforms the underlying microRNA-target predictions in inferring ncRNA interactions. RAIN can be operated through an easily accessible web interface and all interaction data can be downloaded. Database URL: http://rth.dk/resources/rain PMID:28077569

  4. Hybrid architecture for building secure sensor networks

    NASA Astrophysics Data System (ADS)

    Owens, Ken R., Jr.; Watkins, Steve E.

    2012-04-01

    Sensor networks have various communication and security architectural concerns. Three approaches are defined to address these concerns for sensor networks. The first area is the utilization of new computing architectures that leverage embedded virtualization software on the sensor. Deploying a small, embedded virtualization operating system on the sensor nodes that is designed to communicate to low-cost cloud computing infrastructure in the network is the foundation to delivering low-cost, secure sensor networks. The second area focuses on securing the sensor. Sensor security components include developing an identification scheme, and leveraging authentication algorithms and protocols that address security assurance within the physical, communication network, and application layers. This function will primarily be accomplished through encrypting the communication channel and integrating sensor network firewall and intrusion detection/prevention components to the sensor network architecture. Hence, sensor networks will be able to maintain high levels of security. The third area addresses the real-time and high priority nature of the data that sensor networks collect. This function requires that a quality-of-service (QoS) definition and algorithm be developed for delivering the right data at the right time. A hybrid architecture is proposed that combines software and hardware features to handle network traffic with diverse QoS requirements.

  5. Reward-Modulated Hebbian Plasticity as Leverage for Partially Embodied Control in Compliant Robotics

    PubMed Central

    Burms, Jeroen; Caluwaerts, Ken; Dambre, Joni

    2015-01-01

    In embodied computation (or morphological computation), part of the complexity of motor control is offloaded to the body dynamics. We demonstrate that a simple Hebbian-like learning rule can be used to train systems with (partial) embodiment, and can be extended outside of the scope of traditional neural networks. To this end, we apply the learning rule to optimize the connection weights of recurrent neural networks with different topologies and for various tasks. We then apply this learning rule to a simulated compliant tensegrity robot by optimizing static feedback controllers that directly exploit the dynamics of the robot body. This leads to partially embodied controllers, i.e., hybrid controllers that naturally integrate the computations that are performed by the robot body into a neural network architecture. Our results demonstrate the universal applicability of reward-modulated Hebbian learning. Furthermore, they demonstrate the robustness of systems trained with the learning rule. This study strengthens our belief that compliant robots should or can be seen as computational units, instead of dumb hardware that needs a complex controller. This link between compliant robotics and neural networks is also the main reason for our search for simple universal learning rules for both neural networks and robotics. PMID:26347645
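The family of learning rules used here can be sketched as follows: a node-perturbation variant of reward-modulated Hebbian learning, applied to a toy linear task of our own devising rather than the tensegrity controller from the paper. The weight change is a Hebbian product (presynaptic activity times an exploratory output perturbation) gated by how much the scalar reward exceeds its running baseline.

```python
import numpy as np

rng = np.random.default_rng(0)

target = np.array([0.5, -0.3, 0.8])     # unknown mapping to be learned
w = np.zeros(3)
eta, sigma, baseline = 0.02, 0.1, 0.0   # illustrative hyperparameters

for _ in range(20000):
    x = rng.normal(size=3)              # presynaptic activity
    xi = rng.normal(scale=sigma)        # exploratory postsynaptic noise
    y = w @ x + xi                      # noisy readout
    reward = -(y - target @ x) ** 2     # scalar reward: closeness to target
    # Hebbian term (pre x * exploration xi), gated by reward deviation
    w += eta * (reward - baseline) * xi * x
    baseline += 0.05 * (reward - baseline)  # running reward baseline

print(np.round(w, 2))                   # w moves toward the target weights
```

Only a scalar reward is needed, which is what makes the rule applicable beyond conventional neural networks, e.g. to feedback gains acting on a robot body.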

  6. Predictive modelling-based design and experiments for synthesis and spinning of bioinspired silk fibres

    PubMed Central

    Gronau, Greta; Jacobsen, Matthew M.; Huang, Wenwen; Rizzo, Daniel J.; Li, David; Staii, Cristian; Pugno, Nicola M.; Wong, Joyce Y.; Kaplan, David L.; Buehler, Markus J.

    2016-01-01

    Scalable computational modelling tools are required to guide the rational design of complex hierarchical materials with predictable functions. Here, we utilize mesoscopic modelling, integrated with genetic block copolymer synthesis and bioinspired spinning process, to demonstrate de novo materials design that incorporates chemistry, processing and material characterization. We find that intermediate hydrophobic/hydrophilic block ratios observed in natural spider silks and longer chain lengths lead to outstanding silk fibre formation. This design by nature is based on the optimal combination of protein solubility, self-assembled aggregate size and polymer network topology. The original homogeneous network structure becomes heterogeneous after spinning, enhancing the anisotropic network connectivity along the shear flow direction. Extending beyond the classical polymer theory, with insights from the percolation network model, we illustrate the direct proportionality between network conductance and fibre Young's modulus. This integrated approach provides a general path towards de novo functional network materials with enhanced mechanical properties and beyond (optical, electrical or thermal) as we have experimentally verified. PMID:26017575

  7. Predictive modelling-based design and experiments for synthesis and spinning of bioinspired silk fibres.

    PubMed

    Lin, Shangchao; Ryu, Seunghwa; Tokareva, Olena; Gronau, Greta; Jacobsen, Matthew M; Huang, Wenwen; Rizzo, Daniel J; Li, David; Staii, Cristian; Pugno, Nicola M; Wong, Joyce Y; Kaplan, David L; Buehler, Markus J

    2015-05-28

    Scalable computational modelling tools are required to guide the rational design of complex hierarchical materials with predictable functions. Here, we utilize mesoscopic modelling, integrated with genetic block copolymer synthesis and bioinspired spinning process, to demonstrate de novo materials design that incorporates chemistry, processing and material characterization. We find that intermediate hydrophobic/hydrophilic block ratios observed in natural spider silks and longer chain lengths lead to outstanding silk fibre formation. This design by nature is based on the optimal combination of protein solubility, self-assembled aggregate size and polymer network topology. The original homogeneous network structure becomes heterogeneous after spinning, enhancing the anisotropic network connectivity along the shear flow direction. Extending beyond the classical polymer theory, with insights from the percolation network model, we illustrate the direct proportionality between network conductance and fibre Young's modulus. This integrated approach provides a general path towards de novo functional network materials with enhanced mechanical properties and beyond (optical, electrical or thermal) as we have experimentally verified.

  8. Passing messages between biological networks to refine predicted interactions.

    PubMed

    Glass, Kimberly; Huttenhower, Curtis; Quackenbush, John; Yuan, Guo-Cheng

    2013-01-01

    Regulatory network reconstruction is a fundamental problem in computational biology. There are significant limitations to such reconstruction using individual datasets, and increasingly people attempt to construct networks using multiple, independent datasets obtained from complementary sources, but methods for this integration are lacking. We developed PANDA (Passing Attributes between Networks for Data Assimilation), a message-passing model using multiple sources of information to predict regulatory relationships, and used it to integrate protein-protein interaction, gene expression, and sequence motif data to reconstruct genome-wide, condition-specific regulatory networks in yeast as a model. The resulting networks were not only more accurate than those produced using individual data sets and other existing methods, but they also captured information regarding specific biological mechanisms and pathways that were missed using other methodologies. PANDA is scalable to higher eukaryotes, applicable to specific tissue or cell type data and conceptually generalizable to include a variety of regulatory, interaction, expression, and other genome-scale data. An implementation of the PANDA algorithm is available at www.sourceforge.net/projects/panda-net.

  9. NMESys: An expert system for network fault detection

    NASA Technical Reports Server (NTRS)

    Nelson, Peter C.; Warpinski, Janet

    1991-01-01

    The problem of network management is becoming an increasingly difficult and challenging task. It is very common today to find heterogeneous networks consisting of many different types of computers, operating systems, and protocols. Implementing a network with this many components is difficult enough; maintaining such a network is an even larger problem. A prototype network management expert system, NMESys, was implemented in the C Language Integrated Production System (CLIPS). NMESys concentrates on solving some of the critical problems encountered in managing a large network. The major goal of NMESys is to provide a network operator with an expert system tool to quickly and accurately detect hard failures and potential failures, and to minimize or eliminate user downtime in a large network.

  10. An integrate-and-fire model for synchronized bursting in a network of cultured cortical neurons.

    PubMed

    French, D A; Gruenstein, E I

    2006-12-01

    It has been suggested that spontaneous synchronous neuronal activity is an essential step in the formation of functional networks in the central nervous system. The key features of this type of activity consist of bursts of action potentials with associated spikes of elevated cytoplasmic calcium. These features are also observed in networks of rat cortical neurons that have been formed in culture. Experimental studies of these cultured networks have led to several hypotheses for the mechanisms underlying the observed synchronized oscillations. In this paper, bursting integrate-and-fire type mathematical models for regular spiking (RS) and intrinsic bursting (IB) neurons are introduced and incorporated through a small-world connection scheme into a two-dimensional excitatory network similar to those in the cultured network. This computer model exhibits spontaneous synchronous activity through mechanisms similar to those hypothesized for the cultured experimental networks. Traces of the membrane potential and cytoplasmic calcium from the model closely match those obtained from experiments. We also consider the impact on network behavior of the IB neurons, the geometry and the small world connection scheme.
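The clock-driven core of an integrate-and-fire model can be sketched in a few lines; the parameters below are generic textbook values, not those fitted to the cultured networks in the paper.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential v
# integrates a steady drive and emits a spike on crossing threshold.
dt, tau = 0.1, 10.0                   # time step and membrane constant (ms)
v_rest, v_thresh, v_reset = -65.0, -50.0, -70.0   # potentials (mV)
drive = 20.0                          # steady input drive (mV)

v = v_rest
spikes = []
for step in range(5000):              # 500 ms of simulated time
    v += dt / tau * (-(v - v_rest) + drive)   # leaky integration (Euler)
    if v >= v_thresh:                 # threshold crossing -> spike
        spikes.append(step * dt)      # record the spike time (ms)
        v = v_reset                   # reset after the spike

print(f"{len(spikes)} spikes in 500 ms")
```

The bursting variants used in the paper add slower adaptation variables on top of this basic integrate-fire-reset cycle, and the network couples many such units through synaptic currents.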

  11. Portable Computer Technology (PCT) Research and Development Program Phase 2

    NASA Technical Reports Server (NTRS)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focuses on: (1) the design and development of two Advanced Portable Workstation 2 (APW 2) units, which incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces; (2) the use of these units to integrate and demonstrate advanced wireless network and portable video capabilities; and (3) the qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, with a focus on developing optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.


  12. A Markovian event-based framework for stochastic spiking neural networks.

    PubMed

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives, and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce from a spike train the next spike time, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e., one based on the computation of the spike times. We show that the firing times of the neurons in the networks constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived for such classical neural network models as linear integrate-and-fire neurons with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.
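The event-based viewpoint can be sketched as follows: the state is advanced from spike to spike, with each next spike time sampled from an interspike-interval (ISI) distribution that depends only on the state at the last spike, which is precisely the Markov-chain structure described here. The gamma ISI below is an illustrative stand-in for the first-passage-time distribution of a noisy integrate-and-fire neuron.

```python
import random

random.seed(42)

def next_spike(t_last):
    """Markov transition: the next spike time depends only on t_last."""
    return t_last + random.gammavariate(4.0, 5.0)   # ISI ~ Gamma, mean 20 ms

t, spike_times = 0.0, []
while t < 1000.0:                     # one second of activity
    t = next_spike(t)
    spike_times.append(t)

print(len(spike_times), "spikes; no per-timestep membrane integration needed")
```

Contrast this with a clock-driven simulation, which must integrate the membrane potential at every time step even between spikes.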

  13. Efficient Use of Distributed Systems for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques

    2000-01-01

    Distributed computing has been regarded as the future of high performance computing. Nationwide high speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, different finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which also must be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency by up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes with number of elements ranging from 30,269 elements for the Barth5 mesh to 11,451 elements for the Barth4 mesh. 
Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular, this application, illustrated in the document, entails an integration of finite element and fluid dynamic simulations to address the cooling of turbine blades in a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with millions of degrees of freedom; this results from the complexity of the various components of the airfoils, which requires fine-grain meshing for accuracy. Additional information is contained in the original.
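The core of simulated-annealing partitioning can be sketched on a tiny element graph. PART additionally weights its objective by heterogeneous processor and network performance; this sketch (our own, with illustrative parameters) keeps only the cut-size core plus a balance penalty.

```python
import math
import random

random.seed(0)

n = 8
edges = [(0, 1), (1, 2), (2, 3), (3, 0),    # two stacked 4-cycles of
         (4, 5), (5, 6), (6, 7), (7, 4),    # mesh elements, joined by
         (0, 4), (1, 5), (2, 6), (3, 7)]    # four cross edges

def cost(part):
    cut = sum(1 for u, v in edges if part[u] != part[v])
    imbalance = abs(2 * sum(part) - n)      # deviation from an even split
    return cut + 2 * imbalance              # penalize unbalanced partitions

part = [random.randint(0, 1) for _ in range(n)]
temp = 2.0
for _ in range(5000):
    i = random.randrange(n)
    candidate = part[:]
    candidate[i] ^= 1                       # move one element across
    delta = cost(candidate) - cost(part)
    # Accept improvements always; accept worsening moves with a
    # probability that shrinks as the temperature cools.
    if delta <= 0 or random.random() < math.exp(-delta / temp):
        part = candidate
    temp = max(0.01, temp * 0.999)          # geometric cooling

print(part, "cost =", cost(part))
```

In a heterogeneity-aware version, each element's computational weight and each edge's communication weight would be scaled by the speed of the processor and network link it is assigned to.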

  14. Databases, data integration, and expert systems: new directions in mineral resource assessment and mineral exploration

    USGS Publications Warehouse

    McCammon, Richard B.; Ramani, Raja V.; Mozumdar, Bijoy K.; Samaddar, Arun B.

    1994-01-01

    Overcoming future difficulties in searching for ore deposits deeper in the earth's crust will require closer attention to the collection and analysis of more diverse types of data and to more efficient use of current computer technologies. Computer technologies of greatest interest include methods of storage and retrieval of resource information, methods for integrating geologic, geochemical, and geophysical data, and the introduction of advanced computer technologies such as expert systems, multivariate techniques, and neural networks. Much experience has been gained in the past few years in applying these technologies. More experience is needed if they are to be implemented for everyday use in future assessments and exploration.

  15. Technology Needs for Teachers Web Development and Curriculum Adaptations

    NASA Technical Reports Server (NTRS)

    Carroll, Christy J.

    1999-01-01

    Computer-based mathematics and science curricula focusing on NASA inventions and technologies will enhance current teacher knowledge and skills. Materials and interactive software developed by educators will allow students to integrate their various courses, to work cooperatively, and to collaborate with both NASA scientists and students at other locations by using computer networks, email and the World Wide Web.

  16. Integrating Network Management for Cloud Computing Services

    DTIC Science & Technology

    2015-06-01

    abstraction and system design. In this dissertation, we make three major contributions. We first propose to consolidate the traffic and infrastructure management... 1.3.1 Safe Datacenter Traffic/Infrastructure Management; 1.3.2 End-host/Network Cooperative Traffic Management; 1.3.3 Direct

  17. Nonvolatile Array Of Synapses For Neural Network

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul

    1993-01-01

    Elements of array programmed with help of ultraviolet light. A 32 x 32 very-large-scale integrated-circuit array of electronic synapses serves as building-block chip for analog neural-network computer. Synaptic weights stored in nonvolatile manner. Makes information content of array invulnerable to loss of power, and, by eliminating need for circuitry to refresh volatile synaptic memory, makes architecture simpler and more compact.

  18. Time Series Analysis for Spatial Node Selection in Environment Monitoring Sensor Networks

    PubMed Central

    Bhandari, Siddhartha; Jurdak, Raja; Kusy, Branislav

    2017-01-01

    Wireless sensor networks are widely used in environmental monitoring. The number of sensor nodes to be deployed will vary depending on the desired spatio-temporal resolution. Selecting an optimal number, position and sampling rate for an array of sensor nodes in environmental monitoring is a challenging question. Most of the current solutions are either theoretical or simulation-based where the problems are tackled using random field theory, computational geometry or computer simulations, limiting their specificity to a given sensor deployment. Using an empirical dataset from a mine rehabilitation monitoring sensor network, this work proposes a data-driven approach where co-integrated time series analysis is used to select the number of sensors from a short-term deployment of a larger set of potential node positions. Analyses conducted on temperature time series show 75% of sensors are co-integrated. Using only 25% of the original nodes can generate a complete dataset within a 0.5 °C average error bound. Our data-driven approach to sensor position selection is applicable for spatiotemporal monitoring of spatially correlated environmental parameters to minimize deployment cost without compromising data resolution. PMID:29271880
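The selection idea can be illustrated with a simplified, synthetic stand-in: if one sensor's temperature series can be reconstructed from a reference sensor by a linear fit within an error bound, it is redundant and can be dropped. A plain least-squares fit here stands in for the co-integration analysis, and the data, names, and thresholds are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 4 * np.pi, 500)
base = 20 + 5 * np.sin(t)                        # shared temperature cycle
# Three sensors track the cycle with small offsets; the last one also sees
# an independent local fluctuation and is therefore not redundant.
sensors = np.stack([base + rng.normal(scale=0.1, size=t.size) + offset
                    for offset in (0.0, 0.5, -0.3, 4.0 + 2 * np.sin(2 * t))])

keep = [0]                                       # always keep the reference
for i in range(1, sensors.shape[0]):
    a, b = np.polyfit(sensors[0], sensors[i], 1) # linear reconstruction
    rmse = np.sqrt(np.mean((a * sensors[0] + b - sensors[i]) ** 2))
    if rmse > 0.5:                               # 0.5 degC error bound
        keep.append(i)                           # not reconstructable: keep

print("sensors needed for long-term deployment:", keep)
```

A formal co-integration test additionally checks that the fit residuals are stationary, which guards against spurious relationships between trending series.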

  19. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    The cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network, and the processing unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, traditional architectures cannot optimize and schedule resources for high-level service guarantees because of the communication obstacles among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances the responsiveness to dynamic end-to-end user demands and globally optimizes radio frequency, optical network, and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes, to demonstrate the efficiency of the MDRI architecture.

  20. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-28

    The cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network, and the processing unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, traditional architectures cannot optimize and schedule resources for high-level service guarantees because of the communication obstacles among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances the responsiveness to dynamic end-to-end user demands and globally optimizes radio frequency, optical network, and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes, to demonstrate the efficiency of the MDRI architecture.

  1. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    PubMed Central

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    The cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network, and the processing unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, traditional architectures cannot optimize and schedule resources for high-level service guarantees because of the communication obstacles among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances the responsiveness to dynamic end-to-end user demands and globally optimizes radio frequency, optical network, and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes, to demonstrate the efficiency of the MDRI architecture. PMID:27465296

  2. Analysis of an algorithm for distributed recognition and accountability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, C.; Frincke, D.A.; Goan, T. Jr.

    1993-08-01

    Computer and network systems are vulnerable to attacks. Abandoning the existing huge infrastructure of possibly insecure computer and network systems is impossible, and replacing them with totally secure systems may not be feasible or cost-effective. A common element in many attacks is that a single user will often attempt to intrude upon multiple resources throughout a network. Detecting the attack can become significantly easier by compiling and integrating evidence of such intrusion attempts across the network rather than attempting to assess the situation from the vantage point of only a single host. To solve this problem, we suggest an approach for distributed recognition and accountability (DRA), which consists of algorithms that "process," at a central location, distributed and asynchronous "reports" generated by computers (or a subset thereof) throughout the network. Our highest-priority objectives are to observe the ways in which an individual moves around in a network of computers, including changing user names to possibly hide his or her true identity, and to associate all activities of multiple instances of the same individual with the same network-wide user. We present the DRA algorithm and a sketch of its proof under an initial set of simplifying albeit realistic assumptions. Later, we relax these assumptions to accommodate pragmatic aspects such as missing or delayed "reports," clock skew, tampered "reports," etc. We believe that such algorithms will have widespread applications in the future, particularly in intrusion-detection systems.

  3. Antenna analysis using neural networks

    NASA Technical Reports Server (NTRS)

    Smith, William T.

    1992-01-01

    Conventional computing schemes have long been used to analyze problems in electromagnetics (EM). The vast majority of EM applications require computationally intensive algorithms involving numerical integration and solutions to large systems of equations. The feasibility of using neural network computing algorithms for antenna analysis is investigated. The ultimate goal is to use a trained neural network algorithm to reduce the computational demands of existing reflector surface error compensation techniques. Neural networks are computational algorithms based on neurobiological systems. Neural nets consist of massively parallel interconnected nonlinear computational elements. They are often employed in pattern recognition and image processing problems. Recently, neural network analysis has been applied in the electromagnetics area for the design of frequency selective surfaces and beam forming networks. The backpropagation training algorithm was employed to simulate classical antenna array synthesis techniques. The Woodward-Lawson (W-L) and Dolph-Chebyshev (D-C) array pattern synthesis techniques were used to train the neural network. The inputs to the network were samples of the desired synthesis pattern. The outputs are the array element excitations required to synthesize the desired pattern. Once trained, the network is used to simulate the W-L or D-C techniques. Various sector patterns and cosecant-type patterns (27 total) generated using W-L synthesis were used to train the network. Desired pattern samples were then fed to the neural network. The outputs of the network were the simulated W-L excitations. A 20 element linear array was used. There were 41 input pattern samples with 40 output excitations (20 real parts, 20 imaginary). A comparison between the simulated and actual W-L techniques is shown for a triangular-shaped pattern. 
Dolph-Chebyshev is a different class of synthesis technique in that D-C is used for side lobe control as opposed to pattern shaping. The interesting thing about D-C synthesis is that the side lobes have the same amplitude. Five-element arrays were used. Again, 41 pattern samples were used for the input. Nine actual D-C patterns ranging from -10 dB to -30 dB side lobe levels were used to train the network. A comparison between simulated and actual D-C techniques for a pattern with -22 dB side lobe level is shown. The goal for this research was to evaluate the performance of neural network computing with antennas. Future applications will employ the backpropagation training algorithm to drastically reduce the computational complexity involved in performing EM compensation for surface errors in large space reflector antennas.

  4. Antenna analysis using neural networks

    NASA Astrophysics Data System (ADS)

    Smith, William T.

    1992-09-01

    Conventional computing schemes have long been used to analyze problems in electromagnetics (EM). The vast majority of EM applications require computationally intensive algorithms involving numerical integration and solutions to large systems of equations. The feasibility of using neural network computing algorithms for antenna analysis is investigated. The ultimate goal is to use a trained neural network algorithm to reduce the computational demands of existing reflector surface error compensation techniques. Neural networks are computational algorithms based on neurobiological systems. Neural nets consist of massively parallel interconnected nonlinear computational elements. They are often employed in pattern recognition and image processing problems. Recently, neural network analysis has been applied in the electromagnetics area for the design of frequency selective surfaces and beam forming networks. The backpropagation training algorithm was employed to simulate classical antenna array synthesis techniques. The Woodward-Lawson (W-L) and Dolph-Chebyshev (D-C) array pattern synthesis techniques were used to train the neural network. The inputs to the network were samples of the desired synthesis pattern. The outputs are the array element excitations required to synthesize the desired pattern. Once trained, the network is used to simulate the W-L or D-C techniques. Various sector patterns and cosecant-type patterns (27 total) generated using W-L synthesis were used to train the network. Desired pattern samples were then fed to the neural network. The outputs of the network were the simulated W-L excitations. A 20 element linear array was used. There were 41 input pattern samples with 40 output excitations (20 real parts, 20 imaginary).

  5. Two-Dimensional High-Lift Aerodynamic Optimization Using Neural Networks

    NASA Technical Reports Server (NTRS)

    Greenman, Roxana M.

    1998-01-01

The high-lift performance of a multi-element airfoil was optimized by using neural-net predictions that were trained using a computational data set. The numerical data was generated using a two-dimensional, incompressible, Navier-Stokes algorithm with the Spalart-Allmaras turbulence model. Because it is difficult to predict maximum lift for high-lift systems, an empirically based maximum lift criterion was used in this study to determine both the maximum lift and the angle at which it occurs. The 'pressure difference rule,' which states that the maximum lift condition corresponds to a certain pressure difference between the peak suction pressure and the pressure at the trailing edge of the element, was applied and verified with experimental observations for this configuration. Multiple input, single output networks were trained using the NASA Ames variation of the Levenberg-Marquardt algorithm for each of the aerodynamic coefficients (lift, drag, and moment). The artificial neural networks were integrated with a gradient-based optimizer. Using independent numerical simulations and experimental data for this high-lift configuration, it was shown that this design process successfully optimized flap deflection, gap, overlap, and angle of attack to maximize lift. Once the neural nets were trained and integrated with the optimizer, minimal additional computer resources were required to perform optimization runs with different initial conditions and parameters. Applying the neural networks within the high-lift rigging optimization process reduced the amount of computational time and resources by 44% compared with traditional gradient-based optimization procedures for multiple optimization runs.
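The design loop (train surrogates on CFD data, then run a gradient-based optimizer against the cheap surrogates) can be sketched with a toy one-parameter surrogate standing in for the trained lift network. The function below is hypothetical and purely illustrative; the real networks map flap deflection, gap, overlap, and angle of attack to the lift, drag, and moment coefficients.

```python
# Toy stand-in for the trained lift surrogate: a smooth function of a single
# rigging parameter (say, flap deflection), chosen to peak at x = 2.
def lift_surrogate(x):
    return 3.0 - (x - 2.0) ** 2

def d_lift(x, h=1e-6):
    # Central finite-difference gradient of the surrogate.
    return (lift_surrogate(x + h) - lift_surrogate(x - h)) / (2 * h)

def maximize(x, lr=0.1, steps=200):
    # Gradient ascent: each step costs one cheap surrogate evaluation,
    # not a Navier-Stokes solve, which is where the savings come from.
    for _ in range(steps):
        x += lr * d_lift(x)
    return x
```

Restarting `maximize` from different initial conditions costs almost nothing once the surrogate is trained, mirroring the abstract's observation about repeated optimization runs.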

  6. Networked Instructional Chemistry: Using Technology To Teach Chemistry

    NASA Astrophysics Data System (ADS)

    Smith, Stanley; Stovall, Iris

    1996-10-01

Networked multimedia microcomputers provide new ways to help students learn chemistry and to help instructors manage the learning environment. This technology is used to replace some traditional laboratory work, collect on-line experimental data, enhance lectures and quiz sections with multimedia presentations, provide prelaboratory training for the beginning nonchemistry-major organic laboratory, provide electronic homework for organic chemistry students, give graduate students access to real NMR data for analysis, and provide access to molecular modeling tools. The integration of all of these activities into an active learning environment is made possible by a client-server network of hundreds of computers. This requires not only instructional software but also classroom and course management software, computers, networking, and room management. Combining computer-based work with traditional course material is made possible with software management tools that allow the instructor to monitor the progress of each student and make available an on-line gradebook so students can see their grades and class standing. This client-server based system extends the capabilities of the earlier mainframe-based PLATO system, which was used for instructional computing. This paper outlines the components of a technology center used to support over 5,000 students per semester.

  7. Some issues related to simulation of the tracking and communications computer network

    NASA Technical Reports Server (NTRS)

    Lacovara, Robert C.

    1989-01-01

    The Communications Performance and Integration branch of the Tracking and Communications Division has an ongoing involvement in the simulation of its flight hardware for Space Station Freedom. Specifically, the communication process between central processor(s) and orbital replaceable units (ORU's) is simulated with varying degrees of fidelity. The results of investigations into three aspects of this simulation effort are given. The most general area involves the use of computer assisted software engineering (CASE) tools for this particular simulation. The second area of interest is simulation methods for systems of mixed hardware and software. The final area investigated is the application of simulation methods to one of the proposed computer network protocols for space station, specifically IEEE 802.4.

  8. Computational models of airway branching morphogenesis.

    PubMed

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Some issues related to simulation of the tracking and communications computer network

    NASA Astrophysics Data System (ADS)

    Lacovara, Robert C.

    1989-12-01

    The Communications Performance and Integration branch of the Tracking and Communications Division has an ongoing involvement in the simulation of its flight hardware for Space Station Freedom. Specifically, the communication process between central processor(s) and orbital replaceable units (ORU's) is simulated with varying degrees of fidelity. The results of investigations into three aspects of this simulation effort are given. The most general area involves the use of computer assisted software engineering (CASE) tools for this particular simulation. The second area of interest is simulation methods for systems of mixed hardware and software. The final area investigated is the application of simulation methods to one of the proposed computer network protocols for space station, specifically IEEE 802.4.

  10. A convolutional neural network approach to calibrating the rotation axis for X-ray computed tomography.

    PubMed

Yang, Xiaogang; De Carlo, Francesco; Phatak, Charudatta; Gürsoy, Doğa

    2017-03-01

This paper presents an algorithm to calibrate the center-of-rotation for X-ray tomography by using a machine learning approach, the Convolutional Neural Network (CNN). The algorithm shows excellent accuracy in the evaluation of synthetic data with various noise ratios. It is further validated with experimental data from four different shale samples measured at the Advanced Photon Source and at the Swiss Light Source. The results are as good as those determined by visual inspection and show better robustness than conventional methods. CNN also has great potential for reducing or removing other artifacts caused by instrument instability, detector non-linearity, etc. An open-source toolbox, which integrates the CNN methods described in this paper, is freely available through GitHub at tomography/xlearn and can be easily integrated into existing computational pipelines available at various synchrotron facilities. Source code, documentation and information on how to contribute are also provided.
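For context, one common conventional approach the CNN is compared against can be sketched: in parallel-beam CT, the 180° projection is the mirror image of the 0° projection about the rotation axis, so the axis offset can be recovered by registering one against the flipped other. A pure-Python sketch under that idealized assumption (not the paper's CNN method, and not the xlearn API):

```python
def find_axis_shift(p0, p180, max_shift=10):
    """Estimate the rotation-axis offset (in pixels) from the 0-degree and
    180-degree projections by scanning candidate shifts and picking the one
    that best matches p180 to the mirrored p0."""
    n = len(p0)
    best_s, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err, cnt = 0.0, 0
        for i in range(n):
            j = n - 1 - i + 2 * s          # mirror index for an axis shifted by s
            if 0 <= j < n:
                err += (p180[i] - p0[j]) ** 2
                cnt += 1
        err /= cnt
        if err < best_err:
            best_err, best_s = err, s
    return best_s
```

Registration methods like this degrade with noise and instrument drift, which is the robustness gap the CNN approach is designed to close.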

  11. A scalable silicon photonic chip-scale optical switch for high performance computing systems.

    PubMed

    Yu, Runxiang; Cheung, Stanley; Li, Yuliang; Okamoto, Katsunari; Proietti, Roberto; Yin, Yawei; Yoo, S J B

    2013-12-30

This paper discusses the architecture and provides performance studies of a silicon photonic chip-scale optical switch for a scalable interconnect network in high performance computing systems. The proposed switch exploits optical wavelength parallelism and the wavelength routing characteristics of an Arrayed Waveguide Grating Router (AWGR) to allow contention resolution in the wavelength domain. Simulation results from a cycle-accurate network simulator indicate that, even with only two transmitter/receiver pairs per node, the switch exhibits lower end-to-end latency and higher throughput at high (>90%) input loads compared with electronic switches. On the device integration level, we propose to integrate all the components (ring modulators, photodetectors and AWGR) on a CMOS-compatible silicon photonic platform to ensure a compact, energy-efficient and cost-effective device. We successfully demonstrate proof-of-concept routing functions on an 8 × 8 prototype fabricated using foundry services provided by OpSIS-IME.
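The wavelength routing characteristic the switch exploits can be sketched with the idealized cyclic AWGR routing rule (a standard textbook idealization, not the specifics of the OpSIS-IME device): the output port is determined jointly by the input port and the wavelength channel.

```python
def awgr_output(inp, wl, n=8):
    """Idealized N x N AWGR cyclic routing: a signal entering input port
    `inp` on wavelength channel `wl` exits output port (inp + wl) mod n."""
    return (inp + wl) % n
```

Two consequences follow directly: each wavelength induces a permutation of inputs onto outputs (no two inputs collide on one output), and each input-output pair is reachable on exactly one wavelength, which is what lets contention be resolved in the wavelength domain.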

  12. An integrated approach to infer dynamic protein-gene interactions - A case study of the human P53 protein.

    PubMed

    Wang, Junbai; Wu, Qianqian; Hu, Xiaohua Tony; Tian, Tianhai

    2016-11-01

Investigating the dynamics of genetic regulatory networks through high throughput experimental data, such as microarray gene expression profiles, is a very important but challenging task. One of the major hindrances in building detailed mathematical models for genetic regulation is the large number of unknown model parameters. To tackle this challenge, a new integrated method is proposed by combining a top-down approach and a bottom-up approach. First, the top-down approach uses probabilistic graphical models to predict the network structure of the DNA repair pathway that is regulated by the p53 protein. Two networks are predicted, namely a network of eight genes with eight inferred interactions and an extended network of 21 genes with 17 interactions. Then, the bottom-up approach using differential equation models is developed to study the detailed genetic regulations based on either a fully connected regulatory network or a gene network obtained by the top-down approach. Model simulation error, parameter identifiability, and robustness are used as criteria to select the optimal network. Simulation results together with permutation tests of input gene network structures indicate that the prediction accuracy and robustness of the two predicted networks using the top-down approach are better than those of the corresponding fully connected networks. In particular, the proposed approach reduces computational cost significantly for inferring model parameters. Overall, the new integrated method is a promising approach for investigating the dynamics of genetic regulation. Copyright © 2016 Elsevier Inc. All rights reserved.
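The simplest building block of the bottom-up differential-equation models is a production-degradation equation for a gene product, dx/dt = a - b*x. A minimal Euler-integration sketch with hypothetical rate constants (not the paper's p53 pathway model, whose terms couple regulator concentrations):

```python
def simulate_gene(a=2.0, b=0.5, x0=0.0, dt=0.01, t_end=30.0):
    """Euler integration of dx/dt = a - b*x: constant production at rate a,
    first-order degradation at rate b (a and b are illustrative values).
    The trajectory relaxes to the steady state x = a/b."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * (a - b * x)
        t += dt
    return x
```

In a full network model, every unknown rate constant like `a` and `b` must be fitted to data, which is why restricting the network structure with the top-down prediction cuts the parameter-inference cost so sharply.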

  13. The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)

    NASA Astrophysics Data System (ADS)

    2017-09-01

The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) has aimed to provide a platform to discuss computer science and mathematics related issues including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure Of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agents Systems, All topics related Image/Signal Processing, Any topics related Computer Networks, Any topics related ISO SC-27 and SC-17 standards, Any topics related PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognitions, Authentication/Authorization Issues, Biometric authentication and algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundation of Computer Security, Data Base (D.B.) Management & Information Retrievals, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID, Applications, Fingerprint/Hand/Biometrics Recognitions and Technologies, Foundations of High-performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam mail, Anti-virus issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Services and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrievals, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors around the world, from several countries, which allowed a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only become successful through a team effort, so herewith we want to thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary

  14. Discrepancy between mRNA and protein abundance: Insight from information retrieval process in computers

    PubMed Central

    Wang, Degeng

    2008-01-01

    Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described as surprises and/or technical difficulties in the literature. Protein and RNA represent different steps of the multi-stepped cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers - multi-step information flow from storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to the computer primary memory and the computer cache memory respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks – biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with a goal of optimizing system performance. Cellular counterparts can be easily identified for these regulatory techniques. In other words, this comparative study attempted to utilize theoretical insight from computer system design principles as catalysis to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In context of this bird’s-eye view, discrepancy between protein and RNA abundance became a logical observation one would expect. It was suggested that this discrepancy, when interpreted in the context of system operation, serves as a potential source of information to decipher regulatory logics underneath biochemical network operation. 
PMID:18757239

  15. Discrepancy between mRNA and protein abundance: insight from information retrieval process in computers.

    PubMed

    Wang, Degeng

    2008-12-01

    Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described as surprises and/or technical difficulties in the literature. Protein and RNA represent different steps of the multi-stepped cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers-multi-step information flow from storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to the computer primary memory and the computer cache memory, respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks-biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with a goal of optimizing system performance. Cellular counterparts can be easily identified for these regulatory techniques. In other words, this comparative study attempted to utilize theoretical insight from computer system design principles as catalysis to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In context of this bird's-eye view, discrepancy between protein and RNA abundance became a logical observation one would expect. It was suggested that this discrepancy, when interpreted in the context of system operation, serves as a potential source of information to decipher regulatory logics underneath biochemical network operation.

  16. The application of integrated knowledge-based systems for the Biomedical Risk Assessment Intelligent Network (BRAIN)

    NASA Technical Reports Server (NTRS)

    Loftin, Karin C.; Ly, Bebe; Webster, Laurie; Verlander, James; Taylor, Gerald R.; Riley, Gary; Culbert, Chris

    1992-01-01

One of NASA's goals for long duration space flight is to maintain acceptable levels of crew health, safety, and performance. One way of meeting this goal is through BRAIN, an integrated network of both human and computer elements. BRAIN will function as an advisor to mission managers by assessing the risk of inflight biomedical problems and recommending appropriate countermeasures. Described here is a joint effort among various NASA elements to develop BRAIN and the Infectious Disease Risk Assessment (IDRA) prototype. The implementation of this effort addresses the technological aspects of knowledge acquisition, integration of IDRA components, the use of expert systems to automate the biomedical prediction process, development of a user-friendly interface, and integration of the IDRA and ExerCISys systems. Because C language, CLIPS, and the X-Window System are portable and easily integrated, they were chosen as the tools for the initial IDRA prototype.

  17. SMC: SCENIC Model Control

    NASA Technical Reports Server (NTRS)

    Srivastava, Priyaka; Kraus, Jeff; Murawski, Robert; Golden, Bertsel, Jr.

    2015-01-01

NASA's Space Communications and Navigation (SCaN) program manages three active networks: the Near Earth Network, the Space Network, and the Deep Space Network. These networks simultaneously support NASA missions and provide communications services to customers worldwide. To efficiently manage these resources and their capabilities, a team of student interns at the NASA Glenn Research Center is developing a distributed system to model the SCaN networks. Once complete, the system shall provide a platform that enables users to perform capacity modeling of current and prospective missions with finer-grained control of information between several simulation and modeling tools. This will enable the SCaN program to access a holistic view of its networks and simulate the effects of modifications in order to provide NASA with decisional information. The development of this capacity modeling system is managed by NASA's Strategic Center for Education, Networking, Integration, and Communication (SCENIC). Three primary third-party software tools offer their unique abilities in different stages of the simulation process. MagicDraw provides UML/SysML modeling, AGI's Systems Tool Kit simulates the physical transmission parameters and de-conflicts scheduled communication, and Riverbed Modeler (formerly OPNET) simulates communication protocols and packet-based networking. SCENIC developers are building custom software extensions to integrate these components in an end-to-end space communications modeling platform. A central control module acts as the hub for report-based messaging between client wrappers. Backend databases provide information related to mission parameters and ground station configurations, while the end user defines scenario-specific attributes for the model. The eight SCENIC interns are working under the direction of their mentors to complete an initial version of this capacity modeling system during the summer of 2015.
The intern team is composed of four students in Computer Science, two in Computer Engineering, one in Electrical Engineering, and one studying Space Systems Engineering.

  18. Integration of the Chinese HPC Grid in ATLAS Distributed Computing

    NASA Astrophysics Data System (ADS)

    Filipčič, A.; ATLAS Collaboration

    2017-10-01

Fifteen Chinese High-Performance Computing sites, many of them on the TOP500 list of most powerful supercomputers, are integrated into a common infrastructure providing users coherent access through a RESTful interface called SCEAPI. These resources have been integrated into the ATLAS Grid production system using a bridge between ATLAS and SCEAPI which translates the authorization and job submission protocols between the two environments. The ARC Computing Element (ARC-CE) forms the bridge using an extended batch system interface to allow job submission to SCEAPI. The ARC-CE was set up at the Institute for High Energy Physics, Beijing, in order to be as close as possible to the SCEAPI front-end interface at the Computing Network Information Center, also in Beijing. This paper describes the technical details of the integration between ARC-CE and SCEAPI and presents results so far with two supercomputer centers, Tianhe-IA and ERA. These two centers have been the pilots for ATLAS Monte Carlo Simulation in SCEAPI and have been providing CPU power since fall 2015.

  19. Solar-Terrestrial and Astronomical Research Network (STAR-Network) - A Meaningful Practice of New Cyberinfrastructure on Space Science

    NASA Astrophysics Data System (ADS)

    Hu, X.; Zou, Z.

    2017-12-01

For the next decades, a comprehensive big data application environment is the dominant direction of cyberinfrastructure development in space science. To make the concept of such a BIG cyberinfrastructure (e.g. Digital Space) a reality, several capabilities should be focused on and integrated, including the science data system, the digital space engine, big data applications (tools and models), and the IT infrastructure. In the past few years, the CAS Chinese Space Science Data Center (CSSDC) has made a helpful attempt in this direction. A cloud-enabled virtual research platform on space science, called the Solar-Terrestrial and Astronomical Research Network (STAR-Network), has been developed to serve the full lifecycle of space science missions and research activities. It integrates a wide range of disciplinary and interdisciplinary resources to provide science-problem-oriented data retrieval and query services, collaborative mission demonstration services, mission operation support services, space weather computing and analysis services, and other self-help services. The platform is supported by persistent infrastructure, including cloud storage, cloud computing, and supercomputing. Different varieties of resources are interconnected: for example, science data can be displayed in the browser by visualization tools, data analysis tools and physical models can be driven by the applicable science data, and computing results can be saved in the cloud. So far, STAR-Network has served a series of space science missions in China, involving the Strategic Pioneer Program on Space Science (this program has launched space science satellites such as DAMPE, HXMT, and QUESS, with more satellites to be launched around 2020) and the Meridian Space Weather Monitor Project. Scientists have obtained new findings by using the science data from these missions with STAR-Network's contribution.
We are confident that STAR-Network is an exciting practice of new cyberinfrastructure architecture on space science.

  20. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal-to-noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
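The Noisy-OR combination can be sketched directly: an interaction makes it into the consensus prior unless every information source independently fails to support it, so one strongly supporting source is enough to dominate. A minimal sketch with illustrative support probabilities (the paper's actual prior construction involves additional calibration):

```python
def noisy_or(supports):
    """Noisy-OR combination of per-source support probabilities for one
    candidate interaction: the interaction is absent only if every source
    independently fails to support it."""
    p_absent = 1.0
    for q in supports:
        p_absent *= 1.0 - q
    return 1.0 - p_absent
```

For example, sources supporting an edge with probabilities 0.9, 0.0, and 0.2 yield a combined prior of 0.92: the strong source carries the result even though one source gives no support at all.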

  1. The modeling and simulation of visuospatial working memory

    PubMed Central

    Liang, Lina; Zhang, Zhikang

    2010-01-01

Camperi and Wang (Comput Neurosci 5:383–405, 1998) presented a network model for working memory that combines intrinsic cellular bistability with the recurrent network architecture of the neocortex, while Fall and Rinzel (Comput Neurosci 20:97–107, 2006) replaced this intrinsic bistability with a biological mechanism, the Ca2+ release subsystem. In this study, we aim to further expand the above work. We integrate the traditional firing-rate network with Ca2+ subsystem-induced bistability, amend the synaptic weights, and suggest that Ca2+ concentration only increases the efficacy of synaptic input but has nothing to do with the external input for the transient cue. We found that our network model maintained persistent activity in response to a brief transient stimulus, like the previous two models, and that working memory performance was resistant to noise and distraction stimuli when the Ca2+ subsystem was tuned to be bistable. PMID:22132045
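How bistability supports persistent activity can be illustrated with a single firing-rate unit with strong recurrent excitation: a brief cue flips the unit from the low to the high stable state, and the activity then persists after the cue is removed. The parameters below are hypothetical values chosen to make the unit bistable; this is a generic sketch, not the paper's Ca2+ release model.

```python
import math

def rate_unit(i_cue, t_cue=2.0, t_end=8.0, dt=0.02, w=10.0, theta=3.0, beta=0.5):
    """Euler integration of dr/dt = -r + f(w*r + I) for a single unit with
    sigmoid gain f and recurrent weight w; I equals i_cue until t_cue, then 0.
    With these (illustrative) parameters the unit has two stable states."""
    f = lambda x: 1.0 / (1.0 + math.exp(-(x - theta) / beta))
    r, t = 0.0, 0.0
    while t < t_end:
        i_ext = i_cue if t < t_cue else 0.0
        r += dt * (-r + f(w * r + i_ext))
        t += dt
    return r
```

With the cue, the rate settles near the high fixed point and stays there long after the stimulus ends (the memory); without the cue it remains near zero.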

  2. The grand challenge of managing the petascale facility.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, R. J.; Mathematics and Computer Science

    2007-02-28

This report is the result of a study of networks and how they may need to evolve to support petascale leadership computing and science. As Dr. Ray Orbach, director of the Department of Energy's Office of Science, says in the spring 2006 issue of SciDAC Review, 'One remarkable example of growth in unexpected directions has been in high-end computation'. In the same article Dr. Michael Strayer states, 'Moore's law suggests that before the end of the next cycle of SciDAC, we shall see petaflop computers'. Given the Office of Science's strong leadership and support for petascale computing and facilities, we should expect to see petaflop computers in operation in support of science before the end of the decade, and DOE/SC Advanced Scientific Computing Research programs are focused on making this a reality. This study took its lead from this strong focus on petascale computing and the networks required to support such facilities, but it grew to include almost all aspects of the DOE/SC petascale computational and experimental science facilities, all of which will face daunting challenges in managing and analyzing the voluminous amounts of data expected. In addition, trends indicate the increased coupling of unique experimental facilities with computational facilities, along with the integration of multidisciplinary datasets and high-end computing with data-intensive computing; and we can expect these trends to continue at the petascale level and beyond. Coupled with recent technology trends, they clearly indicate the need for including capability petascale storage, networks, and experiments, as well as collaboration tools and programming environments, as integral components of the Office of Science's petascale capability metafacility. The objective of this report is to recommend a new cross-cutting program to support the management of petascale science and infrastructure.
The appendices of the report document current and projected DOE computation facilities, science trends, and technology trends, whose combined impact can affect the manageability and stewardship of DOE's petascale facilities. This report is not meant to be all-inclusive. Rather, the facilities, science projects, and research topics presented are to be considered examples to clarify a point.

3. Testing the Model: A Phase I/II Randomized Double Blind Placebo Control Trial of Targeted Therapeutics: Liposomal Glutathione and Curcumin

    DTIC Science & Technology

    2016-10-01

Can non-specific cellular immunity protect HIV-infected persons with very low CD4 counts? Presented at Conference on Integrating Psychology and...Under Review. 50. Nierenberg B, Cooper S, Feuer SJ, Broderick G. Applying Network Medicine to Chronic Illness: A Model for Integrating Psychology ...function in these subjects as compared to GW era sedentary healthy controls. We applied an integrative systems-based approach rooted in computational

  4. Integration of photoactive and electroactive components with vertical cavity surface emitting lasers

    DOEpatents

    Bryan, R.P.; Esherick, P.; Jewell, J.L.; Lear, K.L.; Olbright, G.R.

    1997-04-29

    A monolithically integrated optoelectronic device is provided which integrates a vertical cavity surface emitting laser and either a photosensitive or an electrosensitive device either as input or output to the vertical cavity surface emitting laser either in parallel or series connection. Both vertical and side-by-side arrangements are disclosed, and optical and electronic feedback means are provided. Arrays of these devices can be configured to enable optical computing and neural network applications. 9 figs.

  5. Integration of photoactive and electroactive components with vertical cavity surface emitting lasers

    DOEpatents

    Bryan, Robert P.; Esherick, Peter; Jewell, Jack L.; Lear, Kevin L.; Olbright, Gregory R.

    1997-01-01

    A monolithically integrated optoelectronic device is provided which integrates a vertical cavity surface emitting laser and either a photosensitive or an electrosensitive device either as input or output to the vertical cavity surface emitting laser either in parallel or series connection. Both vertical and side-by-side arrangements are disclosed, and optical and electronic feedback means are provided. Arrays of these devices can be configured to enable optical computing and neural network applications.

  6. CyPhyTown

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A series of software programs that enables students to progress from completely unsecured control of devices to control that protects network commands with authentication, integrity and confidentiality. The working example provided is for turning LED lights on and off on a Raspberry Pi computer.
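    The protected-command idea in this record can be sketched in miniature: sign each device command with a shared-key HMAC so the receiver can check authenticity and integrity before switching the LED. All names, the key, and the replay counter below are illustrative assumptions, not details of CyPhyTown itself.

```python
import hashlib
import hmac

SHARED_KEY = b"classroom-demo-key"  # hypothetical pre-shared key

def sign_command(command: str, counter: int) -> bytes:
    """Tag a command (e.g. 'led_on') plus a replay counter with HMAC-SHA256."""
    msg = f"{counter}:{command}".encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()

def verify_command(command: str, counter: int, tag: bytes) -> bool:
    """Constant-time verification on the receiving device (e.g. the Pi)."""
    return hmac.compare_digest(sign_command(command, counter), tag)
```

    Confidentiality, the third property the record mentions, would additionally require encrypting the command, for example with an authenticated cipher.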

  7. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of super-computers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation, which will support the development of strategies for improving aviation safety and identifying precursors to component failure.

  8. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure, so that stakeholders can control their own resources and to help ensure fair use of resources. The collaborative control room is being developed using the open-source Access Grid software that enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP and included tools for run preparation, submission, monitoring and management. This approach saves user sites from the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  9. Temporal coding in a silicon network of integrate-and-fire neurons.

    PubMed

    Liu, Shih-Chii; Douglas, Rodney

    2004-09-01

    Spatio-temporal processing of spike trains by neuronal networks depends on a variety of mechanisms distributed across synapses, dendrites, and somata. In natural systems, the spike trains and the processing mechanisms cohere though their common physical instantiation. This coherence is lost when the natural system is encoded for simulation on a general purpose computer. By contrast, analog VLSI circuits are, like neurons, inherently related by their real-time physics, and so, could provide a useful substrate for exploring neuronlike event-based processing. Here, we describe a hybrid analog-digital VLSI chip comprising a set of integrate-and-fire neurons and short-term dynamical synapses that can be configured into simple network architectures with some properties of neocortical neuronal circuits. We show that, despite considerable fabrication variance in the properties of individual neurons, the chip offers a viable substrate for exploring real-time spike-based processing in networks of neurons.
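    In software, the chip's basic cell can be approximated by a leaky integrate-and-fire model. The sketch below is a minimal discrete-time simulation with illustrative parameters, not the silicon circuit's actual dynamics.

```python
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0, r_m=1.0):
    """Leaky integrate-and-fire: tau * dv/dt = -(v - v_rest) + r_m * I.
    Integrates with forward Euler; returns the indices of spike events."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        v += dt * (-(v - v_rest) + r_m * i_in) / tau  # leaky integration
        if v >= v_thresh:                             # threshold crossing
            spikes.append(step)
            v = v_reset                               # reset after spike
    return spikes
```

    With a constant super-threshold input the model fires periodically, which is the regular-spiking regime such chips are typically characterized in.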

  10. Integrative approach for inference of gene regulatory networks using lasso-based random featuring and application to psychiatric disorders.

    PubMed

    Kim, Dongchul; Kang, Mingon; Biswas, Ashis; Liu, Chunyu; Gao, Jean

    2016-08-10

    Inferring gene regulatory networks is one of the most interesting research areas in systems biology. Many inference methods have been developed using a variety of computational models and approaches. However, two issues remain. First, depending on the structural or computational model of an inference method, results tend to be inconsistent because of the innately different advantages and limitations of each method; combining dissimilar approaches is therefore needed to overcome the limitations of standalone methods through complementary integration. Second, sparse linear regression penalized by a regularization parameter (lasso) and bootstrapping-based sparse linear regression were suggested in state-of-the-art methods for network inference, but they are not effective for small sample sizes, and a true regulator can be missed if the target gene is strongly affected by an indirect regulator with high correlation or by another true regulator. We present two novel network inference methods based on the integration of three different criteria: (i) a z-score to measure the variation of gene expression from knockout data, (ii) mutual information for the dependency between two genes, and (iii) linear regression-based feature selection. Based on these criteria, we propose a lasso-based random feature selection algorithm (LARF) that overcomes the limitations of bootstrapping mentioned above. This work makes three main contributions. First, our z-score-based method for measuring gene expression variation from knockout data is more effective than similar criteria in related work. Second, we confirmed that true regulator selection can be effectively improved by LARF. Lastly, we verified that an integrative approach can clearly outperform a single method when two different methods are effectively joined.
In the experiments, our methods were validated by outperforming the state of the art methods on DREAM challenge data, and then LARF was applied to inferences of gene regulatory network associated with psychiatric disorders.
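    The first criterion, a z-score for expression change under a regulator knockout, can be illustrated with a small sketch. This is a simplified stand-in, not the paper's exact scoring: each target gene's knockout expression is standardized against its wild-type mean and standard deviation, and a large |z| suggests regulation by the knocked-out gene.

```python
import statistics

def knockout_zscores(wildtype_expr, knockout_expr):
    """wildtype_expr: gene -> list of wild-type expression replicates.
    knockout_expr: gene -> expression under the regulator knockout.
    Returns gene -> z-score of the knockout value vs. wild type."""
    zs = {}
    for gene, wt_values in wildtype_expr.items():
        mu = statistics.mean(wt_values)
        sd = statistics.stdev(wt_values)  # sample standard deviation
        zs[gene] = (knockout_expr[gene] - mu) / sd
    return zs
```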

  11. Real-time optimizations for integrated smart network camera

    NASA Astrophysics Data System (ADS)

    Desurmont, Xavier; Lienard, Bruno; Meessen, Jerome; Delaigle, Jean-Francois

    2005-02-01

    We present an integrated real-time smart network camera. This system is composed of an image sensor, an embedded PC-based electronic card for image processing, and some network capabilities. The application detects events of interest in visual scenes, highlights alarms and computes statistics. The system also produces meta-data information that can be shared with other cameras in a network. We describe the requirements of such a system and then show how its design is optimized to process and compress video in real time. Indeed, typical video-surveillance algorithms such as background differencing, tracking and event detection must be highly optimized and simplified to run on this hardware. To achieve a good fit between hardware and software in this light embedded system, the software management is written on top of the Java-based middleware specification established by the OSGi alliance. We can easily integrate software and hardware in complex environments thanks to the Java Real-Time specification for the virtual machine and several network- and service-oriented Java specifications (such as RMI and Jini). Finally, we report some outcomes and typical case studies of such a camera, such as counter-flow detection.
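    The kind of background-differencing step such cameras optimize can be sketched as a running-average background model (pure-Python, grayscale frames as nested lists; the threshold and update rate below are illustrative assumptions):

```python
def detect_motion(frames, alpha=0.05, thresh=30.0):
    """Running-average background differencing. For each frame, pixels
    deviating from the background model by more than `thresh` are flagged
    as foreground; static pixels slowly update the model. Returns the
    per-frame fraction of foreground pixels."""
    h, w = len(frames[0]), len(frames[0][0])
    background = [row[:] for row in frames[0]]
    fractions = []
    for frame in frames:
        fg = 0
        for i in range(h):
            for j in range(w):
                if abs(frame[i][j] - background[i][j]) > thresh:
                    fg += 1     # foreground: keep the background frozen
                else:           # static pixel: adapt the background slowly
                    background[i][j] = ((1 - alpha) * background[i][j]
                                        + alpha * frame[i][j])
        fractions.append(fg / (h * w))
    return fractions
```

    An embedded implementation would vectorize or offload this inner loop, which is exactly the kind of optimization the paper discusses.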

  12. SDN-NGenIA, a software defined next generation integrated architecture for HEP and data intensive science

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.

    2017-10-01

    The SDN Next Generation Integrated Architecture (SDN-NGenIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields, whose potential discoveries depend on their ability to distribute, process and analyze globally distributed Petascale to Exascale datasets. The SDN-NGenIA system under development by Caltech and partner HEP and network teams is focused on the coordinated use of network, computing and storage infrastructures. It builds on the experience gained in recently completed and previous projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goal of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Leadership HPC facility at Argonne National Lab (ALCF) for data-intensive applications.

  13. A General theory of Signal Integration for Fault-Tolerant Dynamic Distributed Sensor Networks

    DTIC Science & Technology

    1993-10-01

    related to a) the architecture and fault-tolerance of the distributed sensor network, b) the proper synchronisation of sensor signals, c) the...Computational complexities of the problem of distributed detection. 5) Issues related to recording of events and synchronization in distributed sensor...Intervals for Synchronization in Real Time Distributed Systems", Submitted to Electronic Encyclopedia. 3. V. G. Hegde and S. S. Iyengar "Efficient

  14. Technological requirements of teleneuropathological systems.

    PubMed

    Szymaś, J

    2000-01-01

    Teleneuropathology is the practice of conducting remote neuropathological examinations with the use of telecommunication links. Because of the limited number of expert neuropathologists, some departments, especially smaller ones, have the equipment to conduct an examination but lack a specialist able to evaluate material from the central nervous system. In teleneuropathology, a neuropathologist examines tissue fragments taken during an operation by means of a telemicroscope connected to a computer through a telecommunications network, which enables the neuropathologist to operate the microscope and camera remotely. Two basic systems exist for performing remote neuropathological examination: static and dynamic. They differ in their medical, computing and telecommunication requirements. Depending on the type of service, the public telephone network, the integrated services digital network, or optical fibre should be used. The Internet can conditionally be used as a link for a teleneuropathological system. However, the newest developments in teleneuropathology, such as teleconferencing and remote operation of a robotized microscope, require transmission over the integrated services digital network, which guarantees the necessary transmission speed. Because images are the basic information element in teleneuropathological systems, high-capacity acquisition, processing, storage, transmission, and visualization equipment is necessary. Further development of telecommunications, as well as standardization of recording and transmission procedures for pictorial data, is necessary.

  15. Methods for design and evaluation of integrated hardware-software systems for concurrent computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) sparse matrix iterative solver in PSICES Fortran; (5) image processing application of PISCES; and (6) a formal model of concurrent computation being developed.

  16. Smart BIT/TSMD Integration

    DTIC Science & Technology

    1991-12-01

    user using the ':knn' option in the do-scenario command line). An instance of the K-Nearest Neighbor object is first created and initialized before...Navigation Computer HF High Frequency ILS Instrument Landing System KNN K-Nearest Neighbor LRU Line Replaceable Unit MC Mission Computer MTCA...approaches have been investigated here, K-Nearest Neighbors (KNN) and neural networks (NN). Both approaches require that previously classified examples of

  17. Nonlinear Dynamic Model-Based Multiobjective Sensor Network Design Algorithm for a Plant with an Estimator-Based Control System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul, Prokash; Bhattacharyya, Debangsu; Turton, Richard

    Here, a novel sensor network design (SND) algorithm is developed for maximizing process efficiency while minimizing sensor network cost for a nonlinear dynamic process with an estimator-based control system. The multiobjective optimization problem is solved following a lexicographic approach where the process efficiency is maximized first followed by minimization of the sensor network cost. The partial net present value, which combines the capital cost due to the sensor network and the operating cost due to deviation from the optimal efficiency, is proposed as an alternative objective. The unscented Kalman filter is considered as the nonlinear estimator. The large-scale combinatorial optimization problem is solved using a genetic algorithm. The developed SND algorithm is applied to an acid gas removal (AGR) unit as part of an integrated gasification combined cycle (IGCC) power plant with CO2 capture. Due to the computational expense, a reduced order nonlinear model of the AGR process is identified and parallel computation is performed during implementation.
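    The lexicographic two-stage scheme can be illustrated on a toy instance: first find the best achievable efficiency, then pick the cheapest sensor set that attains it. The sensor names, costs, and efficiency function below are hypothetical stand-ins; the paper evaluates candidate networks through a UKF-based plant model and searches with a genetic algorithm rather than exhaustively.

```python
from itertools import combinations

SENSORS = {"T1": 5.0, "T2": 3.0, "P1": 4.0, "F1": 2.0}  # name -> cost

def efficiency(subset):
    # Hypothetical stand-in for the estimator/plant evaluation: efficiency
    # saturates once a temperature and a flow measurement are available.
    has_temp = any(s.startswith("T") for s in subset)
    has_flow = any(s.startswith("F") for s in subset)
    return 0.9 + 0.05 * has_temp + 0.05 * has_flow

def lexicographic_snd(sensors):
    subsets = [frozenset(c) for r in range(1, len(sensors) + 1)
               for c in combinations(sensors, r)]
    best_eff = max(efficiency(s) for s in subsets)            # stage 1
    optimal = [s for s in subsets if efficiency(s) == best_eff]
    return min(optimal, key=lambda s: sum(sensors[x] for x in s))  # stage 2
```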

  18. Nonlinear Dynamic Model-Based Multiobjective Sensor Network Design Algorithm for a Plant with an Estimator-Based Control System

    DOE PAGES

    Paul, Prokash; Bhattacharyya, Debangsu; Turton, Richard; ...

    2017-06-06

    Here, a novel sensor network design (SND) algorithm is developed for maximizing process efficiency while minimizing sensor network cost for a nonlinear dynamic process with an estimator-based control system. The multiobjective optimization problem is solved following a lexicographic approach where the process efficiency is maximized first followed by minimization of the sensor network cost. The partial net present value, which combines the capital cost due to the sensor network and the operating cost due to deviation from the optimal efficiency, is proposed as an alternative objective. The unscented Kalman filter is considered as the nonlinear estimator. The large-scale combinatorial optimization problem is solved using a genetic algorithm. The developed SND algorithm is applied to an acid gas removal (AGR) unit as part of an integrated gasification combined cycle (IGCC) power plant with CO2 capture. Due to the computational expense, a reduced order nonlinear model of the AGR process is identified and parallel computation is performed during implementation.

  19. Mining integrated semantic networks for drug repositioning opportunities

    PubMed Central

    Mullen, Joseph; Tipney, Hannah

    2016-01-01

    Current research and development approaches to drug discovery have become less fruitful and more costly. One alternative paradigm is that of drug repositioning. Many marketed examples of repositioned drugs have been identified through serendipitous or rational observations, highlighting the need for more systematic methodologies to tackle the problem. Systems level approaches have the potential to enable the development of novel methods to understand the action of therapeutic compounds, but they require an integrative approach to biological data. Integrated networks can facilitate systems level analyses by combining multiple sources of evidence to provide a rich description of drugs, their targets and their interactions. Classically, such networks can be mined manually, where a skilled person is able to identify portions of the graph (semantic subgraphs) that are indicative of relationships between drugs and highlight possible repositioning opportunities. However, this approach is not scalable. Automated approaches are required to systematically mine integrated networks for these subgraphs and bring them to the attention of the user. We introduce a formal framework for the definition of integrated networks and their associated semantic subgraphs for drug interaction analysis and describe DReSMin, an algorithm for mining semantically-rich networks for occurrences of a given semantic subgraph. This algorithm allows instances of complex semantic subgraphs that contain data about putative drug repositioning opportunities to be identified in a computationally tractable fashion, scaling close to linearly with network data. We demonstrate the utility of our approach by mining an integrated drug interaction network built from 11 sources. This work identified and ranked 9,643,061 putative drug-target interactions, showing a strong correlation between highly scored associations and those supported by literature.
We discuss the 20 top ranked associations in more detail, of which 14 are novel and 6 are supported by the literature. We also show that our approach better prioritizes known drug-target interactions than other state-of-the-art approaches for predicting such interactions. PMID:26844016
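    A path-shaped special case of the semantic-subgraph mining idea can be sketched directly: enumerate node sequences whose types match a given pattern such as Drug → Protein → Disease. This toy matcher handles only path patterns, whereas DReSMin mines general semantic subgraphs; all node names and types below are made up.

```python
def find_semantic_paths(edges, node_types, pattern):
    """Return all directed paths whose node types match `pattern`."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
    # seed with every node of the first required type
    paths = [[n] for n, t in node_types.items() if t == pattern[0]]
    for wanted in pattern[1:]:
        paths = [p + [m] for p in paths
                 for m in adj.get(p[-1], []) if node_types[m] == wanted]
    return paths
```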

  20. Integrating computational methods to retrofit enzymes to synthetic pathways.

    PubMed

    Brunk, Elizabeth; Neri, Marilisa; Tavernelli, Ivano; Hatzimanikatis, Vassily; Rothlisberger, Ursula

    2012-02-01

    Microbial production of desired compounds provides an efficient framework for the development of renewable energy resources. To be competitive to traditional chemistry, one requirement is to utilize the full capacity of the microorganism to produce target compounds with high yields and turnover rates. We use integrated computational methods to generate and quantify the performance of novel biosynthetic routes that contain highly optimized catalysts. Engineering a novel reaction pathway entails addressing feasibility on multiple levels, which involves handling the complexity of large-scale biochemical networks while respecting the critical chemical phenomena at the atomistic scale. To pursue this multi-layer challenge, our strategy merges knowledge-based metabolic engineering methods with computational chemistry methods. By bridging multiple disciplines, we provide an integral computational framework that could accelerate the discovery and implementation of novel biosynthetic production routes. Using this approach, we have identified and optimized a novel biosynthetic route for the production of 3HP from pyruvate. Copyright © 2011 Wiley Periodicals, Inc.

  1. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  2. Mobility in hospital work: towards a pervasive computing hospital environment.

    PubMed

    Morán, Elisa B; Tentori, Monica; González, Víctor M; Favela, Jesus; Martínez-Garcia, Ana I

    2007-01-01

    Handheld computers are increasingly being used by hospital workers. With the integration of wireless networks into hospital information systems, handheld computers can provide the basis for a pervasive computing hospital environment. To develop this, designers need empirical information to understand how hospital workers interact with information while moving around. To characterise these phenomena, we report the results of a workplace study conducted in a hospital. We found that individuals spend about half of their time at their base location, where most of their interactions occur. On average, our informants spent 23% of their time performing information management tasks, followed by coordination (17.08%), clinical case assessment (15.35%) and direct patient care (12.6%). We discuss how our results offer insights for the design of pervasive computing technology, and directions for further research and development in this field, such as transferring information between heterogeneous devices and integration of the physical and digital domains.

  3. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Maxine D.; Leigh, Jason

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for "Development of the Next-Generation CAVE Virtual Environment (NG-CAVE)," enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications enabled by the CAVE2/Blaze visual computing system are advancing scientific research and education in the U.S. and globally, and helping to train the next-generation workforce.

  4. Real-time, interactive animation of deformable two- and three-dimensional objects

    DOEpatents

    Desbrun, Mathieu; Schroeder, Peter; Meyer, Mark; Barr, Alan H.

    2003-06-03

    A method of updating in real-time the locations and velocities of mass points of a two- or three-dimensional object represented by a mass-spring system. A modified implicit Euler integration scheme is employed to determine the updated locations and velocities. In an optional post-integration step, the updated locations are corrected to preserve angular momentum. A processor readable medium and a network server each tangibly embodying the method are also provided. A system comprising a processor in combination with the medium, and a system comprising the server in combination with a client for accessing the server over a computer network, are also provided.
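    For a single damped spring the implicit (backward) Euler update can be solved in closed form, which conveys why this class of scheme stays stable at large time steps. This one-mass sketch is an illustration, not the patented multi-mass method.

```python
def implicit_euler_step(x, v, dt, k, c, m):
    """One backward-Euler step for m * x'' = -k * x - c * x'.
    Substituting x1 = x + dt * v1 into the implicit velocity update and
    solving for v1 gives an unconditionally stable closed form."""
    v_new = (v - dt * (k / m) * x) / (1 + dt * (c / m) + dt**2 * (k / m))
    x_new = x + dt * v_new
    return x_new, v_new
```

    Even with a very stiff spring (large k) and a large dt, the update damps rather than explodes, which is what makes implicit integration attractive for interactive animation.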

  5. Tools and Models for Integrating Multiple Cellular Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerstein, Mark

    2015-11-06

    In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, available to download from GitHub, and can be incorporated in the Knowledgebase. We summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed CRIT for correlation analysis in systems biology [5]. For Aim 3, we have further investigated the scaling relationship whereby the number of transcription factors (TFs) in a genome is proportional to the square of the total number of genes. We have extended the analysis from transcription factors to various classes of functional categories, and from individual categories to their joint distribution [6]. By introducing a new analytical framework, we have generalized the original toolbox model to take into account metabolic networks with arbitrary topology [7].
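    The hierarchy determination of Aim 1 can be conveyed by a simple proxy: on a regulatory DAG, assign each node the length of the longest path from a top-level regulator, so information flows from level 0 downward. This is a textbook longest-path layering with Kahn-style topological processing, not the published algorithm [1].

```python
from collections import defaultdict

def hierarchy_levels(edges):
    """edges: (regulator, target) pairs of a DAG.
    Returns node -> hierarchy level (top regulators at level 0)."""
    nodes = {n for e in edges for n in e}
    indeg = {n: 0 for n in nodes}
    children = defaultdict(list)
    for src, dst in edges:
        children[src].append(dst)
        indeg[dst] += 1
    level = {n: 0 for n in nodes}
    frontier = [n for n in nodes if indeg[n] == 0]  # no incoming edges
    while frontier:
        n = frontier.pop()
        for m in children[n]:
            level[m] = max(level[m], level[n] + 1)  # longest-path level
            indeg[m] -= 1
            if indeg[m] == 0:
                frontier.append(m)
    return level
```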

  6. Comprehensive Anti-error Study on Power Grid Dispatching Based on Regional Regulation and Integration

    NASA Astrophysics Data System (ADS)

    Zhang, Yunju; Chen, Zhongyi; Guo, Ming; Lin, Shunsheng; Yan, Yinyang

    2018-01-01

    With the increasing capacity of the power system and the trend toward large generating units and higher voltages, dispatching operations are becoming more frequent and complicated, and the probability of operation errors increases. To address the lack of anti-error functions, the limited scheduling functionality, and the low working efficiency of the technical support systems used in regional regulation and integration, this paper proposes an integrated, cloud-computing-based architecture for comprehensive anti-error checking in power-network dispatching. An integrated anti-error system spanning the Energy Management System (EMS) and the Operation Management System (OMS) has also been constructed. The architecture has good scalability and adaptability; it can improve computational efficiency, reduce the cost of system operation and maintenance, and enhance the capability of regional regulation and anti-error checking, with broad development prospects.

  7. Workstations take over conceptual design

    NASA Technical Reports Server (NTRS)

    Kidwell, George H.

    1987-01-01

    Workstations provide sufficient computing memory and speed for early evaluations of aircraft design alternatives to identify those worthy of further study. It is recommended that the programming of such machines permit integrated calculations of the configuration and performance analysis of new concepts, along with the capability of changing up to 100 variables at a time and swiftly viewing the results. Computations can be augmented through links to mainframes and supercomputers. Programming, particularly debugging, is enhanced by the capability of working with one program line at a time and having on-screen error indices available. Workstation networks permit on-line communication among users and with persons and computers outside the facility. Application of these capabilities is illustrated through a description of NASA-Ames design efforts for an oblique wing for a jet, performed on a MicroVAX network.

  8. STICAP: A linear circuit analysis program with stiff systems capability. Volume 1: Theory manual. [network analysis

    NASA Technical Reports Server (NTRS)

    Cooke, C. H.

    1975-01-01

    STICAP (Stiff Circuit Analysis Program) is a FORTRAN IV computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time-invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making the program simple to use. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. The program structure is also depicted from a systems programmer's viewpoint, and flow charts and other software documentation are given.
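    The stiff-circuit motivation can be shown on the smallest possible example: an RC low-pass stage with a tiny time constant. A backward-Euler update, one of the implicit techniques a stiff-capable integrator relies on (the constants below are illustrative), remains stable even when the step size vastly exceeds R*C:

```python
def rc_step_backward_euler(v, u, dt, r, c):
    """Backward-Euler step for dv/dt = (u - v) / (R*C).
    Solving v1 = v + dt * (u - v1) / tau for v1 gives an update that is
    stable for any dt, unlike forward Euler which needs dt < 2 * tau."""
    tau = r * c
    return (v + dt * u / tau) / (1 + dt / tau)
```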

  9. The BlueGene/L supercomputer

    NASA Astrophysics Data System (ADS)

    Bhanota, Gyan; Chen, Dong; Gara, Alan; Vranas, Pavlos

    2003-05-01

    The architecture of the BlueGene/L massively parallel supercomputer is described. Each computing node consists of a single compute ASIC plus 256 MB of external memory. The compute ASIC integrates two 700 MHz PowerPC 440 integer CPU cores, two 2.8 Gflops floating point units, 4 MB of embedded DRAM as cache, a memory controller for external memory, six 1.4 Gbit/s bi-directional ports for a 3-dimensional torus network connection, three 2.8 Gbit/s bi-directional ports for connecting to a global tree network, and a Gigabit Ethernet for I/O. 65,536 such nodes are connected into a 3-d torus with a geometry of 32×32×64. The total peak performance of the system is 360 Teraflops and the total amount of memory is 16 TeraBytes.
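    The quoted figures can be cross-checked with back-of-the-envelope arithmetic (our own check, using only numbers from the abstract):

```python
# Cross-check of the BlueGene/L system totals from the per-node figures.
x, y, z = 32, 32, 64                      # 3-d torus geometry
nodes = x * y * z
gflops_per_node = 2 * 2.8                 # two 2.8 Gflops floating point units
peak_tflops = nodes * gflops_per_node / 1000.0
memory_tb = nodes * 256 / (1024 * 1024)   # 256 MB external memory per node

print(nodes)        # 65536
print(peak_tflops)  # ~367, consistent with the ~360 Tflops peak quoted
print(memory_tb)    # 16.0 TeraBytes
```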

  10. Neural networks and applications tutorial

    NASA Astrophysics Data System (ADS)

    Guyon, I.

    1991-09-01

    The importance of neural networks has grown dramatically during this decade. While only a few years ago they were primarily of academic interest, now dozens of companies and many universities are investigating the potential use of these systems, and products are beginning to appear. The idea of building a machine whose architecture is inspired by that of the brain has roots which go far back in history. Nowadays, technological advances in computers and the availability of custom integrated circuits permit simulations of hundreds or even thousands of neurons. In conjunction, the growing interest in learning machines, non-linear dynamics, and parallel computation has spurred renewed attention to artificial neural networks. Many tentative applications have been proposed, including decision systems (associative memories, classifiers, data compressors and optimizers), or parametric models for signal processing purposes (system identification, automatic control, noise canceling, etc.). While they do not always outperform standard methods, neural network approaches are already used in some real-world applications for pattern recognition and signal processing tasks. The tutorial is divided into six lectures, which were presented at the Third Graduate Summer Course on Computational Physics (September 3-7, 1990) on Parallel Architectures and Applications, organized by the European Physical Society: (1) Introduction: machine learning and biological computation. (2) Adaptive artificial neurons (perceptron, ADALINE, sigmoid units, etc.): learning rules and implementations. (3) Neural network systems: architectures, learning algorithms. (4) Applications: pattern recognition, signal processing, etc. (5) Elements of learning theory: how to build networks which generalize. (6) A case study: a neural network for on-line recognition of handwritten alphanumeric characters.
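    As a taste of the adaptive-neuron material in lecture 2, here is a sketch (ours, not from the tutorial) of the classic perceptron learning rule on a linearly separable task:

```python
# Perceptron learning rule on the AND function (linearly separable).
def predict(w, x):
    """Threshold unit: fire if the weighted sum is positive."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# Each input vector carries a constant 1 as a bias term.
data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
w = [0.0, 0.0, 0.0]

for _ in range(20):                  # perceptron rule: w += (target - y) * x
    for x, t in data:
        err = t - predict(w, x)
        w = [wi + err * xi for wi, xi in zip(w, x)]

print([predict(w, x) for x, _ in data])   # [0, 0, 0, 1]: AND learned
```

By the perceptron convergence theorem, the rule is guaranteed to settle on a separating weight vector for any linearly separable task.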

  11. On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    PubMed

    Kaabi, Mohamed Ghaith; Tonnelier, Arnaud; Martinez, Dominique

    2011-05-01

    In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced an approximate event-driven strategy, named voltage stepping, that allows the generic simulation of nonlinear spiking neurons. Promising results were achieved in the simulation of single quadratic integrate-and-fire neurons. Here, we assess the performance of voltage stepping in network simulations by considering more complex neurons (quadratic integrate-and-fire neurons with adaptation) coupled with multiple synapses. To handle the discrete nature of synaptic interactions, we recast voltage stepping in a general framework, the discrete event system specification. The efficiency of the method is assessed through simulations and comparisons with a modified time-stepping scheme of the Runge-Kutta type. We demonstrated numerically that the original order of voltage stepping is preserved when simulating connected spiking neurons, independent of the network activity and connectivity.
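    The voltage-stepping idea can be sketched as follows (a simplification of ours, not the authors' full discrete-event framework; the input current and threshold are invented): instead of stepping time and updating V, step the membrane potential by a fixed dV and accumulate the elapsed time dt = dV / f(V).

```python
import math

I = 1.0                      # constant input current (assumed for the demo)
f = lambda v: v * v + I      # quadratic integrate-and-fire dynamics dV/dt

def spike_time(v0, v_th, dv=0.01):
    """Time for V to travel from v0 to threshold v_th via voltage steps."""
    t, v = 0.0, v0
    while v < v_th:
        t += dv / f(v + dv / 2.0)    # midpoint evaluation per voltage step
        v += dv
    return t

t_num = spike_time(0.0, 10.0)
t_exact = math.atan(10.0)            # closed form for dV/dt = V^2 + 1
print(t_num, t_exact)                # close agreement (~1e-4 here)
```

Because the step is taken in voltage rather than time, the scheme adapts automatically: slow subthreshold drift gets large time increments, while the fast upswing toward threshold gets tiny ones.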

  12. A Small World of Neuronal Synchrony

    PubMed Central

    Yu, Shan; Huang, Debin; Singer, Wolf

    2008-01-01

    A small-world network has been suggested to be an efficient solution for achieving both modular and global processing—a property highly desirable for brain computations. Here, we investigated functional networks of cortical neurons using correlation analysis to identify functional connectivity. To reconstruct the interaction network, we applied the Ising model based on the principle of maximum entropy. This allowed us to infer the interactions from measured pairwise correlations and to estimate the strength of coupling from the degree of synchrony. Visual responses were recorded in the visual cortex of anesthetized cats, simultaneously from up to 24 neurons. First, pairwise correlations captured most of the patterns in the population's activity and, therefore, provided a reliable basis for the reconstruction of the interaction networks. Second, and most importantly, the resulting networks had small-world properties; the average path lengths were as short as in simulated random networks, but the clustering coefficients were larger. Neurons differed considerably with respect to the number and strength of interactions, suggesting the existence of “hubs” in the network. Notably, there was no evidence for scale-free properties. These results suggest that cortical networks are optimized for the coexistence of local and global computations: feature detection and feature integration or binding. PMID:18400792
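    The two small-world diagnostics used above, clustering coefficient and average path length, are simple to compute; this illustrative sketch (ours, on a synthetic ring lattice, not the cat-cortex data) shows both:

```python
from collections import deque

def clustering(adj):
    """Average local clustering coefficient of an undirected graph."""
    total = 0.0
    for v, nbrs in adj.items():
        nbrs = list(nbrs)
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nbrs[j] in adj[nbrs[i]])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def avg_path_length(adj):
    """Mean shortest-path length over all node pairs (BFS from each node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(d for v, d in dist.items() if v != s)
        pairs += len(dist) - 1
    return total / pairs

# Ring lattice: 12 nodes, each linked to its 2 nearest neighbours per side.
n = 12
ring = {i: {(i + d) % n for d in (-2, -1, 1, 2)} for i in range(n)}
print(clustering(ring))       # 0.5: highly clustered
print(avg_path_length(ring))  # ~1.9; random rewiring would shorten paths
```

A small-world network is one whose clustering stays near the lattice value while its average path length drops to near that of a random graph.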

  13. Boolean logic tree of graphene-based chemical system for molecular computation and intelligent molecular search query.

    PubMed

    Huang, Wei Tao; Luo, Hong Qun; Li, Nian Bing

    2014-05-06

    The most serious, and yet unsolved, problem in constructing molecular computing devices is connecting individual molecular events into a usable device. This report demonstrates the use of a Boolean logic tree for analyzing the chemical event network based on graphene, an organic dye, a thrombin aptamer, and the Fenton reaction, organizing and connecting these basic chemical events. This chemical event network can then be utilized to implement fluorescent combinatorial logic (including basic logic gates and complex integrated logic circuits) and fuzzy logic computing. On the basis of the Boolean logic tree analysis and logic computing, these basic chemical events can be considered programmable "words" and chemical interactions "syntax" logic rules with which to construct a molecular search engine for performing intelligent molecular search queries. Our approach is helpful in developing advanced logic programs based on molecules for applications in biosensing, nanotechnology, and drug delivery.
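    The notion of a Boolean logic tree over chemical events can be sketched generically (the event names below are hypothetical, purely to illustrate the tree evaluation, and are not taken from the paper):

```python
# A toy Boolean logic tree whose leaves are named "chemical events".
def evaluate(node, inputs):
    """Recursively evaluate a logic tree given truth values for the leaves."""
    if isinstance(node, str):            # leaf: a named chemical event
        return inputs[node]
    op, children = node
    vals = [evaluate(c, inputs) for c in children]
    if op == "AND":
        return all(vals)
    if op == "OR":
        return any(vals)
    if op == "NOT":
        return not vals[0]
    raise ValueError(op)

# Fluorescence ON iff (aptamer binds AND dye present) AND NOT quenching.
tree = ("AND", [("AND", ["aptamer_binding", "dye_present"]),
                ("NOT", ["fenton_quench"])])
print(evaluate(tree, {"aptamer_binding": True,
                      "dye_present": True,
                      "fenton_quench": False}))   # True
```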

  14. Evolution of the intelligent telecommunications network

    NASA Astrophysics Data System (ADS)

    Mayo, J. S.

    1982-02-01

    The development of the U.S. telecommunications network is described and traced from the invention of the telephone by Bell in 1876 to the use of integrated circuits and the UNIX system for interactive computers. The dialing system was introduced in the 19th century, and amplifiers were invented to permit coast to coast communication by 1914. Hierarchical switching was installed in the 1930s, along with telephoto and teletype services. PCM was invented in the 1930s, but was limited to military applications until the transistorized computer was fabricated in 1958, which coincided with spaceflight and the Telstar satellite in 1962. Fiber optics systems with laser pulse transmission are now entering widespread application, following the 1976 introduction of superfast digital switches controlled by a computer and capable of handling 1/2 million calls per hour. Projected advances are in increased teleconferencing, electronic mail, and full computer terminal services.

  15. WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, K; Kagadis, G; Xing, L

    As defined by the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such “on-demand” access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.

  16. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
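    One of the operations listed above, path finding, reduces to shortest-path search on the interaction graph. A minimal sketch (not NeAT's implementation; the mini-network and node names are made up):

```python
from collections import deque

def shortest_path(adj, src, dst):
    """Breadth-first search returning one shortest path, or None."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:                 # reconstruct the path by backtracking
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for w in adj.get(u, ()):
            if w not in prev:
                prev[w] = u
                q.append(w)
    return None

# Hypothetical mini interaction network (illustrative only).
ppi = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D"},
       "D": {"B", "C", "E"}, "E": {"D"}}
path = shortest_path(ppi, "A", "E")
print(path)          # a 4-node path from A to E, e.g. ['A', 'B', 'D', 'E']
```

BFS runs in time linear in nodes plus edges, which is consistent with networks of several thousand nodes and arcs being analyzable within minutes.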

  17. Distributed communications and control network for robotic mining

    NASA Technical Reports Server (NTRS)

    Schiffbauer, William H.

    1989-01-01

    The application of robotics to coal mining machines is one approach pursued to increase productivity while providing enhanced safety for the coal miner. Toward that end, a network composed of microcontrollers, computers, expert systems, real-time operating systems, and a variety of programming languages is being integrated that will act as the backbone for intelligent machine operation. Actual mining machines, including a few customized ones, have been given telerobotic semiautonomous capabilities by applying the described network. Control devices, intelligent sensors, and computers onboard these machines are showing promise of achieving improved mining productivity and safety benefits. Current research using these machines involves navigation, multiple machine interaction, machine diagnostics, mineral detection, and graphical machine representation. Guidance sensors and systems employed include sonar, laser rangers, gyroscopes, magnetometers, clinometers, and accelerometers. Information on the network of hardware/software and its implementation on mining machines is presented. Anticipated coal production operations using the network are discussed. A parallel is also drawn between the direction of present-day underground coal mining research and how the lunar soil (regolith) may be mined. A conceptual lunar mining operation that employs a distributed communication and control network is detailed.

  18. Receiver-Assisted Congestion Control to Achieve High Throughput in Lossy Wireless Networks

    NASA Astrophysics Data System (ADS)

    Shi, Kai; Shu, Yantai; Yang, Oliver; Luo, Jiarong

    2010-04-01

    Many applications nowadays require fast data transfer in high-speed wireless networks. However, due to its conservative congestion control algorithm, the Transmission Control Protocol (TCP) cannot effectively utilize the network capacity in lossy wireless networks. In this paper, we propose a receiver-assisted congestion control mechanism (RACC) in which the sender performs loss-based control while the receiver performs delay-based control. The receiver measures the network bandwidth based on the packet interarrival interval and uses it to compute a congestion window size deemed appropriate for the sender. After receiving the advertised value fed back from the receiver, the sender then uses the additive increase, multiplicative decrease (AIMD) mechanism to compute the appropriate congestion window size to be used. By integrating loss-based and delay-based congestion control, our mechanism can mitigate the effect of wireless losses, alleviate the timeout effect, and therefore make better use of network bandwidth. Simulation and experiment results in various scenarios show that our mechanism can outperform conventional TCP in high-speed and lossy wireless environments.
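    The division of labor described above can be sketched roughly (our simplification, not the paper's exact RACC algorithm; the packet size and RTT are assumed values): the receiver turns interarrival gaps into a bandwidth estimate and advertises a window, and the sender's AIMD step is capped by that advertisement.

```python
PKT_SIZE = 1500 * 8          # bits per packet (assumed)
RTT = 0.05                   # round-trip time in seconds (assumed)

def receiver_window(arrival_times):
    """Estimate the bandwidth-delay product, in packets, from interarrivals."""
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    bw = PKT_SIZE / (sum(gaps) / len(gaps))    # bits/s from the mean gap
    return bw * RTT / PKT_SIZE                 # BDP expressed in packets

def sender_aimd(cwnd, advertised, loss):
    """Additive increase, multiplicative decrease, capped by the receiver."""
    cwnd = cwnd / 2.0 if loss else cwnd + 1.0
    return min(cwnd, advertised)

adv = receiver_window([0.000, 0.001, 0.002, 0.003])  # 1 ms packet spacing
print(adv)                                  # 12 Mbit/s * 50 ms / pkt = 50
print(sender_aimd(20.0, adv, loss=False))   # 21.0: grows toward capacity
print(sender_aimd(20.0, adv, loss=True))    # 10.0: backoff on loss
```

The cap from the receiver keeps the sender from blindly probing past the measured capacity, which is how the combined scheme softens the impact of random (non-congestion) wireless losses.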

  19. Integrative Computational Network Analysis Reveals Site-Specific Mediators of Inflammation in Alzheimer's Disease

    PubMed Central

    Ravichandran, Srikanth; Michelucci, Alessandro; del Sol, Antonio

    2018-01-01

    Alzheimer's disease (AD) is a major neurodegenerative disease and one of the most common causes of dementia in older adults. Among several factors, neuroinflammation is known to play a critical role in the pathogenesis of chronic neurodegenerative diseases. In particular, studies of brains affected by AD show a clear involvement of several inflammatory pathways. Furthermore, depending on the brain regions affected by the disease, the nature and the effect of inflammation can vary. Here, in order to shed more light on distinct and common features of inflammation in different brain regions affected by AD, we employed a computational approach to analyze gene expression data of six site-specific neuronal populations from AD patients. Our network-based computational approach is driven by the concept that a sustained inflammatory environment could result in neurotoxicity leading to the disease. Thus, our method aims to infer intracellular signaling pathways/networks that are likely to be constantly activated or inhibited due to persistent inflammatory conditions. The computational analysis identified several inflammatory mediators, such as the tumor necrosis factor alpha (TNF-α)-associated pathway, as key upstream receptors/ligands that are likely to transmit sustained inflammatory signals. Further, the analysis revealed that several inflammatory mediators were mainly region-specific, with few commonalities across different brain regions. Taken together, our results show that our integrative approach aids identification of inflammation-related signaling pathways that could be responsible for the onset or the progression of AD and can be applied to study other neurodegenerative diseases. Furthermore, such computational approaches can enable the translation of clinical omics data toward the development of novel therapeutic strategies for neurodegenerative diseases. PMID:29551980

  20. A pedagogical walkthrough of computational modeling and simulation of Wnt signaling pathway using static causal models in MATLAB.

    PubMed

    Sinha, Shriprakash

    2016-12-01

    Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on the modeling of the pathway. This paucity might be due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments and is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader in a step-by-step process through how (a) the collection and the transformation of the available biological information from the literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) a hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help readers get hands-on experience of a perturbation project. Description of the Matlab files is made available under the GNU GPL v3 license at the Google code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found at the latter website.
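    The flavor of inference such static causal models support can be shown with a hand-rolled two-node example (ours, in Python rather than the manuscript's MATLAB/BNT code, and with made-up probabilities):

```python
# Minimal two-node static Bayesian network: pathway state -> target gene.
# All numbers are invented for illustration.
p_wnt = 0.3                              # prior P(Wnt pathway active)
p_expr_given = {True: 0.9, False: 0.1}   # P(target expressed | pathway state)

# Posterior P(pathway active | target observed expressed), by Bayes' rule.
num = p_wnt * p_expr_given[True]
den = num + (1 - p_wnt) * p_expr_given[False]
posterior = num / den
print(round(posterior, 4))               # 0.7941
```

Scaled up to many nodes with conditional probability tables, this is the same kind of query (posterior over hidden pathway states given expression evidence) that the Bayesian network toolbox evaluates.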

  1. Systematic review of computational methods for identifying miRNA-mediated RNA-RNA crosstalk.

    PubMed

    Li, Yongsheng; Jin, Xiyun; Wang, Zishan; Li, Lili; Chen, Hong; Lin, Xiaoyu; Yi, Song; Zhang, Yunpeng; Xu, Juan

    2017-10-25

    Posttranscriptional crosstalk and communication between RNAs yield large regulatory competing endogenous RNA (ceRNA) networks via shared microRNAs (miRNAs), as well as miRNA synergistic networks. The ceRNA crosstalk represents a novel layer of gene regulation that controls both physiological and pathological processes such as development and complex diseases. The rapidly expanding catalogue of ceRNA regulation has provided evidence for exploitation as a general model to predict the ceRNAs in silico. In this article, we first reviewed the current progress of RNA-RNA crosstalk in human complex diseases. Then, the widely used computational methods for modeling ceRNA-ceRNA interaction networks are further summarized into five types: two types of global ceRNA regulation prediction methods and three types of context-specific prediction methods, which are based on miRNA-messenger RNA regulation alone, or by integrating heterogeneous data, respectively. To provide guidance in the computational prediction of ceRNA-ceRNA interactions, we finally performed a comparative study of different combinations of miRNA-target methods as well as five types of ceRNA identification methods by using literature-curated ceRNA regulation and gene perturbation. The results revealed that integration of different miRNA-target prediction methods and context-specific miRNA/gene expression profiles increased the performance for identifying ceRNA regulation. Moreover, different computational methods were complementary in identifying ceRNA regulation and captured different functional parts of similar pathways. We believe that the application of these computational techniques provides valuable functional insights into ceRNA regulation and is a crucial step for informing subsequent functional validation studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Integrative Computational Network Analysis Reveals Site-Specific Mediators of Inflammation in Alzheimer's Disease.

    PubMed

    Ravichandran, Srikanth; Michelucci, Alessandro; Del Sol, Antonio

    2018-01-01

    Alzheimer's disease (AD) is a major neurodegenerative disease and one of the most common causes of dementia in older adults. Among several factors, neuroinflammation is known to play a critical role in the pathogenesis of chronic neurodegenerative diseases. In particular, studies of brains affected by AD show a clear involvement of several inflammatory pathways. Furthermore, depending on the brain regions affected by the disease, the nature and the effect of inflammation can vary. Here, in order to shed more light on distinct and common features of inflammation in different brain regions affected by AD, we employed a computational approach to analyze gene expression data of six site-specific neuronal populations from AD patients. Our network-based computational approach is driven by the concept that a sustained inflammatory environment could result in neurotoxicity leading to the disease. Thus, our method aims to infer intracellular signaling pathways/networks that are likely to be constantly activated or inhibited due to persistent inflammatory conditions. The computational analysis identified several inflammatory mediators, such as the tumor necrosis factor alpha (TNF-α)-associated pathway, as key upstream receptors/ligands that are likely to transmit sustained inflammatory signals. Further, the analysis revealed that several inflammatory mediators were mainly region-specific, with few commonalities across different brain regions. Taken together, our results show that our integrative approach aids identification of inflammation-related signaling pathways that could be responsible for the onset or the progression of AD and can be applied to study other neurodegenerative diseases. Furthermore, such computational approaches can enable the translation of clinical omics data toward the development of novel therapeutic strategies for neurodegenerative diseases.

  3. Towards cortex sized artificial neural systems.

    PubMed

    Johansson, Christopher; Lansner, Anders

    2007-01-01

    We propose, implement, and discuss an abstract model of the mammalian neocortex. This model is instantiated with a sparse recurrently connected neural network that has spiking leaky integrator units and continuous Hebbian learning. First we study the structure, modularization, and size of neocortex, and then we describe a generic computational model of the cortical circuitry. A characterizing feature of the model is that it is based on the modularization of neocortex into hypercolumns and minicolumns. Both floating- and fixed-point arithmetic implementations of the model are presented along with simulation results. We conclude that an implementation on a cluster computer is not communication but computation bounded. Mouse and rat cortex sized versions of our model execute in 44% and 23% of real time, respectively. Further, an instance of the model with 1.6 × 10^6 units and 2 × 10^11 connections performed noise reduction and pattern completion. These implementations represent the current frontier of large-scale abstract neural network simulations in terms of network size and running speed.
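    The model's building blocks, spiking leaky integrator units with Hebbian learning, can be sketched minimally (our toy version, not the authors' hypercolumn implementation; all constants are invented):

```python
# A leaky integrator unit with threshold spiking and a Hebbian weight update.
def step(v, inputs, w, tau=10.0, threshold=1.0):
    """One Euler step of a leaky integrator; returns (new_v, spiked)."""
    drive = sum(wi * xi for wi, xi in zip(w, inputs))
    v = v + (drive - v) / tau          # leak toward the weighted input
    if v >= threshold:
        return 0.0, True               # reset after a spike
    return v, False

def hebb(w, pre, post, lr=0.01):
    """Hebbian learning: strengthen weights where pre and post are co-active."""
    return [wi + lr * xi * post for wi, xi in zip(w, pre)]

w = [0.6, 0.6]
v, spikes = 0.0, 0
for _ in range(200):                   # constant input drives v past threshold
    v, spiked = step(v, [1.0, 1.0], w)
    if spiked:
        spikes += 1
        w = hebb(w, [1.0, 1.0], 1.0)   # potentiate on each output spike
print(spikes > 0, w[0] > 0.6)          # the unit fires and its weights grow
```

Because each unit's state update is local and cheap, a cluster implementation spends most of its time in exactly this arithmetic, consistent with the computation-bounded conclusion above.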

  4. On an LAS-integrated soft PLC system based on WorldFIP fieldbus.

    PubMed

    Liang, Geng; Li, Zhijun; Li, Wen; Bai, Yan

    2012-01-01

    With traditional WorldFIP field intelligent nodes, communication efficiency is low and real-time performance is inadequate when the scale of field control is large. A soft PLC system based on the WorldFIP fieldbus was therefore designed and implemented. The Link Activity Scheduler (LAS) was integrated into the system, and field intelligent I/O modules acted as networked basic nodes. Discrete control logic was implemented with the LAS-integrated soft PLC system. The proposed system was composed of a configuration and supervisory sub-system and running sub-systems. The configuration and supervisory sub-system was implemented with a personal computer or an industrial personal computer; the running sub-systems were designed and implemented on embedded hardware and software. Communication and scheduling in the running sub-system were implemented with one embedded sub-module; discrete control and system self-diagnosis were implemented with another. The structure of the proposed system is presented, and the methodology for the design of the sub-systems is expounded. Experiments were carried out to evaluate the performance of the proposed system in both discrete and process control by investigating the effect of CPU workload and of the network data transmission delay induced by the soft PLC in the WorldFIP network on the resulting control performance. The experimental observations indicated that the proposed system is practically applicable. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Networked Virtual Organizations: A Chance for Small and Medium Sized Enterprises on Global Markets

    NASA Astrophysics Data System (ADS)

    Cellary, Wojciech

    Networked Virtual Organizations (NVOs) are an apt answer to the challenges of a globalized, diversified, and dynamic contemporary economy. NVOs need more than e-trade and outsourcing; namely, they need out-tasking and e-collaboration. To out-task, yet retain control over the way a task is performed by an external partner, two integrations are required: (1) integration of the computer management systems of enterprises cooperating within an NVO; and (2) integration of the cooperating representatives of NVO member enterprises into a virtual team. NVOs provide a particular chance for Small and Medium sized Enterprises (SMEs) to find their place on global markets and to play a significant role on them. Requirements for SMEs to be able to successfully join an NVO are analyzed in the paper.

  6. Neural network pattern recognition of thermal-signature spectra for chemical defense

    NASA Astrophysics Data System (ADS)

    Carrieri, Arthur H.; Lim, Pascal I.

    1995-05-01

    We treat infrared patterns of absorption or emission by nerve and blister agent compounds (and simulants of this chemical group) as features for training neural networks to detect the compounds' liquid layers on the ground or their vapor plumes during evaporation by external heating. Training of the four-layer network architecture combines a backward-error-propagation algorithm with a gradient-descent paradigm. We conduct testing by feed-forwarding preprocessed spectra through the network in a scaled format consistent with the structure of the training-data-set representation. The best-performance weight matrix (spectral filter) evolved from final network training and testing with software simulation trials is electronically transferred to a set of eight artificial intelligence integrated circuits (ICs) in a specific modular form (splitting of weight matrices). This form makes full use of all input-output IC nodes. This neural network computer serves an important real-time detection function when it is integrated into the pre- and postprocessing data-handling units of a tactical prototype thermoluminescence sensor now under development at the Edgewood Research, Development, and Engineering Center.
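    The described training scheme, backward error propagation with gradient descent through a four-layer architecture, can be sketched on a toy example (ours, not the sensor's spectral data; the input, target, and learning rate are invented):

```python
import math, random

random.seed(0)
sig = lambda x: 1.0 / (1.0 + math.exp(-x))

sizes = [2, 4, 4, 1]                       # four layers, as in the abstract
W = [[[random.uniform(-1, 1) for _ in range(sizes[l])]
      for _ in range(sizes[l + 1])] for l in range(3)]

def forward(x):
    """Return the activations of every layer for input x."""
    acts = [x]
    for layer in W:
        x = [sig(sum(w * xi for w, xi in zip(row, x))) for row in layer]
        acts.append(x)
    return acts

def train_step(x, target, lr=0.5):
    """One backward-error-propagation / gradient-descent weight update."""
    acts = forward(x)
    delta = [(a - target) * a * (1 - a) for a in acts[-1]]
    for l in range(2, -1, -1):             # walk the layers backwards
        prev = acts[l]
        new_delta = [prev[i] * (1 - prev[i]) *
                     sum(W[l][j][i] * delta[j] for j in range(len(delta)))
                     for i in range(len(prev))]
        for j, d in enumerate(delta):
            for i, p in enumerate(prev):
                W[l][j][i] -= lr * d * p   # gradient-descent step
        delta = new_delta
    return (acts[-1][0] - target) ** 2

x, target = [1.0, 0.0], 0.9                # one toy "spectrum" and label
first = train_step(x, target)
for _ in range(200):
    train_step(x, target)
last = (forward(x)[-1][0] - target) ** 2
print(last < first)    # loss shrinks as the weight matrix ("filter") adapts
```

The trained weight matrices are what the abstract calls the spectral filter; splitting them layer by layer is what maps onto the set of eight ICs.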

  7. Passing Messages between Biological Networks to Refine Predicted Interactions

    PubMed Central

    Glass, Kimberly; Huttenhower, Curtis; Quackenbush, John; Yuan, Guo-Cheng

    2013-01-01

    Regulatory network reconstruction is a fundamental problem in computational biology. There are significant limitations to such reconstruction using individual datasets, and increasingly people attempt to construct networks using multiple, independent datasets obtained from complementary sources, but methods for this integration are lacking. We developed PANDA (Passing Attributes between Networks for Data Assimilation), a message-passing model using multiple sources of information to predict regulatory relationships, and used it to integrate protein-protein interaction, gene expression, and sequence motif data to reconstruct genome-wide, condition-specific regulatory networks in yeast as a model. The resulting networks were not only more accurate than those produced using individual data sets and other existing methods, but they also captured information regarding specific biological mechanisms and pathways that were missed using other methodologies. PANDA is scalable to higher eukaryotes, applicable to specific tissue or cell type data and conceptually generalizable to include a variety of regulatory, interaction, expression, and other genome-scale data. An implementation of the PANDA algorithm is available at www.sourceforge.net/projects/panda-net. PMID:23741402
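    The idea of integrating networks by passing information between data sources can be illustrated with a toy update (our simplification for illustration only, not PANDA's actual message-passing rules; the edge scores are invented):

```python
# Nudge a motif-prior regulatory network toward agreement with an
# independent coexpression evidence network, iteratively.
motif_prior = {("TF1", "geneA"): 1.0, ("TF1", "geneB"): 0.0,
               ("TF2", "geneB"): 1.0}
coexpression = {("TF1", "geneA"): 0.9, ("TF1", "geneB"): 0.8,
                ("TF2", "geneB"): 0.1}

alpha = 0.3                      # step size toward the other data source
net = dict(motif_prior)
for _ in range(10):
    net = {e: (1 - alpha) * net[e] + alpha * coexpression[e] for e in net}

for e in sorted(net):
    print(e, round(net[e], 3))   # edges supported by both sources stay high
```

An edge supported by both sources (TF1→geneA) keeps a high score, while an edge seen in only one source (TF2→geneB) is pulled down; PANDA's real updates are more elaborate but share this agreement-seeking character.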

  8. NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Carter, David; Wetzel, Scott

    2000-01-01

    The NASA SLR Operational Center is responsible for: 1) NASA SLR network control, sustaining engineering, and logistics; 2) ILRS mission operations; and 3) ILRS and NASA SLR data operations. NASA SLR network control and sustaining engineering tasks include technical support, daily system performance monitoring, system scheduling, operator training, station status reporting, system relocation, logistics, and support of the ILRS Networks and Engineering Working Group. These activities ensure the NASA SLR systems are meeting ILRS and NASA mission support requirements. ILRS mission operations tasks include mission planning, mission analysis, mission coordination, development of mission support plans, and support of the ILRS Missions Working Group. These activities ensure that new mission and campaign requirements are coordinated with the ILRS. Global Normal Point (NP) data, NASA SLR Full-Rate (FR) data, and satellite predictions are managed as part of data operations. Part of this operation includes supporting the ILRS Data Formats and Procedures Working Group. Global NP data operations consist of receipt, format and data integrity verification, archiving, and merging. This activity culminates in the daily electronic transmission of NP files to the CDDIS. Currently, all of these functions are automated. However, to ensure the timely and accurate flow of data, regular monitoring and maintenance of the operational software systems, computer systems, and computer networking are performed. Tracking statistics between the stations and the data centers are compared periodically to eliminate lost data. Future activities in this area include sub-daily (i.e., hourly) NP data management, more stringent data integrity tests, and automatic station notification of format and data integrity issues.

  9. Integration of healthcare information: from enterprise PACS to patient centered multimedia health record.

    PubMed

    Soriano, Enrique; Plazzotta, Fernando; Campos, Fernando; Kaminker, Diego; Cancio, Alfredo; Aguilera Díaz, Jerónimo; Luna, Daniel; Seehaus, Alberto; Carcía Mónaco, Ricardo; de Quirós, Fernán González Bernaldo

    2010-01-01

    Every single piece of healthcare information should be fully integrated and transparent within the electronic health record. The Italian Hospital of Buenos Aires initiated the Multimedia Health Record project with the goal of achieving this integration while maintaining a holistic view of the current structure of the Hospital's systems, in which the axes remain the patient and the longitudinal history, beginning with the Computed Tomography section. The DICOM standard was implemented for communication and image storage, and a PACS was purchased. It was necessary to adapt our generic reporting system to live up to a commercial RIS. The Computed Tomography (CT) scanners of our hospital were easily integrated into the DICOM network, and all the CT scans generated by our radiology service were stored in the PACS, reported using the Structured Reporting System (we installed diagnostic terminals equipped with 3 monitors), and displayed in the EHR at any point of HIBA's healthcare network.

  10. Overview of NASA communications infrastructure

    NASA Technical Reports Server (NTRS)

    Arnold, Ray J.; Fuechsel, Charles

    1991-01-01

    The infrastructure of NASA communications systems for effecting coordination across NASA offices and with the national and international research and technological communities is discussed. The offices and networks of the communication system include the Office of Space Science and Applications (OSSA), which manages all NASA missions, and the Office of Space Operations, which furnishes communication support through the NASCOM, the mission critical communications support network, and the Program Support Communications network. The NASA Science Internet was established by OSSA to centrally manage, develop, and operate an integrated computer network service dedicated to NASA's space science and application research. Planned for the future is the National Research and Education Network, which will provide communications infrastructure to enhance science resources at a national level.

  11. Protein-Protein Interaction Network and Gene Ontology

    NASA Astrophysics Data System (ADS)

    Choi, Yunkyu; Kim, Seok; Yi, Gwan-Su; Park, Jinah

    Advances in computer technology make it possible to access large amounts and various kinds of biological data via the Internet, such as DNA sequences, proteomics data, and the information discovered about them. It is expected that combining various data could help researchers find further knowledge about them. The role of a visualization system is to invoke human abilities to integrate information and to recognize certain patterns in the data. Thus, when various kinds of data are examined and analyzed manually, an effective visualization system is an essential part. One instance of such integrated visualization is the combination of protein-protein interaction (PPI) data and the Gene Ontology (GO), which could enhance the analysis of PPI networks. We introduce a simple but comprehensive visualization system that integrates GO and PPI data, in which GO and PPI graphs are visualized side-by-side, and which supports quick cross-reference functions between them. Furthermore, the proposed system provides several interactive visualization methods for efficiently analyzing the PPI network and the GO directed acyclic graph, such as context-based browsing and common-ancestor finding.
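
    The common-ancestor lookup described above can be sketched as a breadth-first traversal of the GO parent relation; the term IDs and hierarchy below are hypothetical toy data, not real GO terms:

```python
from collections import deque

def ancestors(dag, term):
    """Collect all ancestors of `term` by BFS over is-a parent edges.
    `dag` maps each term to the list of its direct parents."""
    seen = set()
    queue = deque(dag.get(term, []))
    while queue:
        parent = queue.popleft()
        if parent not in seen:
            seen.add(parent)
            queue.extend(dag.get(parent, []))
    return seen

def common_ancestors(dag, term_a, term_b):
    """Terms reachable as ancestors from both inputs."""
    return ancestors(dag, term_a) & ancestors(dag, term_b)

# Toy is-a hierarchy (hypothetical GO-like IDs).
go = {
    "GO:0003": ["GO:0001"],
    "GO:0004": ["GO:0001", "GO:0002"],
    "GO:0001": ["GO:0000"],
    "GO:0002": ["GO:0000"],
}

print(sorted(common_ancestors(go, "GO:0003", "GO:0004")))
# → ['GO:0000', 'GO:0001']
```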

  12. Airborne Tactical Data Network Gateways: Evaluating EPLRS’ Ability to Integrate With Wireless Meshed Networks

    DTIC Science & Technology

    2005-09-01

    Computer Memory Card International Association PHY Physical PLI Position Location Information PLRS Position Location Reporting System PoP Point of...it is widely acknowledged that the JTRS program will not be providing any substantive operational capability prior to FY'09. This reality has...Figure 5, and a man-packed antenna (AS-3448/PSQ-4). Back-up (cryptographic key) memory is maintained by a traditional 9v

  13. Integrated DoD Voice and Data Networks and Ground Packet Radio Technology

    DTIC Science & Technology

    1976-08-01

    as the traffic requirement level increases. Moreover, the satellite switch selection problem is only meaningful over a limited traffic range. When...5: CPU TIMES VS. NUMBER OF SWITCHES SATELLITE SWITCH SELECTION ALGORITHM Computer Used: PDP-10 ("0'5" means 0 minutes and 5 seconds). 5.30...Saturation Algorithm for Topological Design of Packet-Switched Communications Networks," National Telecommunications Conference Proceedings, San

  14. The Johns Hopkins Medical Institutions' Premise Distribution Plan

    PubMed Central

    Barta, Wendy; Buckholtz, Howard; Johnston, Mark; Lenhard, Raymond; Tolchin, Stephen; Vienne, Donald

    1987-01-01

    A Premise Distribution Plan is being developed to address the growing voice and data communications needs at Johns Hopkins Medical Institutions. More specifically, the use of a rapidly expanding Ethernet computer network and a new Integrated Services Digital Network (ISDN) Digital Centrex system must be planned to provide easy, reliable and cost-effective data and voice communications services. Existing Premise Distribution Systems are compared along with voice and data technologies which would use them.

  15. Central Limit Theorem: New SOCR Applet and Demonstration Activity

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicholas; Sanchez, Juana

    2008-01-01

    Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multi-faceted learning environments, which may facilitate student comprehension and information…

  16. Rural District's Partnerships Bear Fruit in Three Years.

    ERIC Educational Resources Information Center

    Jensen, Dennis

    1996-01-01

    A partnership between Wayne State College, Wayne (Nebraska) community schools, and the local chamber of commerce produced fiber-optic telecommunications, Internet access, technology integration, automated libraries, computer networking, and a technology curriculum. The article describes project design, implementation, and growth, as well as…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sathasivam, Saratha

    A new activation function is examined for its ability to accelerate the performance of logic programming in a Hopfield network. This method has a higher capacity and improves neuro-symbolic integration. Computer simulations are carried out to validate the effectiveness of the new activation function. The empirical results obtained support our theory.

  18. Delay/Disruption Tolerant Networking for the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Schlesinger, Adam; Willman, Brett M.; Pitts, Lee; Davidson, Suzanne R.; Pohlchuck, William A.

    2017-01-01

    Disruption Tolerant Networking (DTN) is an emerging data networking technology designed to abstract the hardware communication layer from the spacecraft/payload computing resources. DTN is specifically designed to operate in environments where link delays and disruptions are common (e.g., space-based networks). The National Aeronautics and Space Administration (NASA) has demonstrated DTN on several missions, such as the Deep Impact Networking (DINET) experiment, the Earth Observing Mission 1 (EO-1) and the Lunar Laser Communication Demonstration (LLCD). To further the maturation of DTN, NASA is implementing DTN protocols on the International Space Station (ISS). This paper explains the architecture of the ISS DTN network, the operational support for the system, the results from integrated ground testing, and the future work for DTN expansion.
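
    The core DTN idea, tolerating link delays and disruptions by holding data in custody rather than dropping it, can be illustrated with a minimal store-and-forward sketch (illustrative only; not the ISS bundle-protocol implementation):

```python
class DtnNode:
    """Minimal store-and-forward sketch: bundles are queued while the
    link is disrupted and forwarded once a contact opens."""

    def __init__(self):
        self.link_up = False
        self.stored = []      # custody of bundles awaiting a contact
        self.delivered = []

    def receive(self, bundle):
        if self.link_up:
            self.delivered.append(bundle)
        else:
            self.stored.append(bundle)   # hold rather than drop

    def contact(self):
        """Link restored: flush stored bundles in arrival order."""
        self.link_up = True
        self.delivered.extend(self.stored)
        self.stored.clear()

node = DtnNode()
node.receive("telemetry-1")   # link down: stored, not lost
node.receive("telemetry-2")
node.contact()                # contact window opens
node.receive("telemetry-3")
print(node.delivered)         # all three bundles arrive, in order
```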

  19. Enabling Discoveries in Earth Sciences Through the Geosciences Network (GEON)

    NASA Astrophysics Data System (ADS)

    Seber, D.; Baru, C.; Memon, A.; Lin, K.; Youn, C.

    2005-12-01

    Taking advantage of the state-of-the-art information technology resources GEON researchers are building a cyberinfrastructure designed to enable data sharing, semantic data integration, high-end computations and 4D visualization in easy-to-use web-based environments. The GEON Network currently allows users to search and register Earth science resources such as data sets (GIS layers, GMT files, geoTIFF images, ASCII files, relational databases etc), software applications or ontologies. Portal based access mechanisms enable developers to built dynamic user interfaces to conduct advanced processing and modeling efforts across distributed computers and supercomputers. Researchers and educators can access the networked resources through the GEON portal and its portlets that were developed to conduct better and more comprehensive science and educational studies. For example, the SYNSEIS portlet in GEON enables users to access in near-real time seismic waveforms from the IRIS Data Management Center, easily build a 3D geologic model within the area of the seismic station(s) and the epicenter and perform a 3D synthetic seismogram analysis to understand the lithospheric structure and earthquake source parameters for any given earthquake in the US. Similarly, GEON's workbench area enables users to create their own work environment and copy, visualize and analyze any data sets within the network, and create subsets of the data sets for their own purposes. Since all these resources are built as part of a Service-oriented Architecture (SOA), they are also used in other development platforms. One such platform is Kepler Workflow system which can access web service based resources and provides users with graphical programming interfaces to build a model to conduct computations and/or visualization efforts using the networked resources. 
Developments in the area of semantic integration of the networked datasets continue to advance and prototype studies can be accessed via the GEON portal at www.geongrid.org

  20. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework

    PubMed Central

    Wang, Xiao-Jing

    2016-01-01

    The ability to simultaneously record from large numbers of neurons in behaving animals has ushered in a new era for the study of the neural circuit mechanisms underlying cognitive functions. One promising approach to uncovering the dynamical and computational principles governing population responses is to analyze model recurrent neural networks (RNNs) that have been optimized to perform the same tasks as behaving animals. Because the optimization of network parameters specifies the desired output but not the manner in which to achieve this output, “trained” networks serve as a source of mechanistic hypotheses and a testing ground for data analyses that link neural computation to behavior. Complete access to the activity and connectivity of the circuit, and the ability to manipulate them arbitrarily, make trained networks a convenient proxy for biological circuits and a valuable platform for theoretical investigation. However, existing RNNs lack basic biological features such as the distinction between excitatory and inhibitory units (Dale’s principle), which are essential if RNNs are to provide insights into the operation of biological circuits. Moreover, trained networks can achieve the same behavioral performance but differ substantially in their structure and dynamics, highlighting the need for a simple and flexible framework for the exploratory training of RNNs. Here, we describe a framework for gradient descent-based training of excitatory-inhibitory RNNs that can incorporate a variety of biological knowledge. We provide an implementation based on the machine learning library Theano, whose automatic differentiation capabilities facilitate modifications and extensions. We validate this framework by applying it to well-known experimental paradigms such as perceptual decision-making, context-dependent integration, multisensory integration, parametric working memory, and motor sequence generation. 
Our results demonstrate the wide range of neural activity patterns and behavior that can be modeled, and suggest a unified setting in which diverse cognitive computations and mechanisms can be studied. PMID:26928718
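
    Dale's principle, noted above as a missing feature of many trained RNNs, can be imposed on a recurrent weight matrix by rectifying the trained magnitudes and fixing each presynaptic unit's sign. A minimal sketch of that constraint (not the paper's Theano code):

```python
def dale_weights(raw, signs):
    """Effective recurrent weights under Dale's principle:
    magnitudes are rectified to be nonnegative, and column j inherits
    the fixed sign of presynaptic unit j (+1 excitatory, -1 inhibitory)."""
    n = len(signs)
    return [[max(raw[i][j], 0.0) * signs[j] for j in range(n)]
            for i in range(n)]

# Unconstrained trained parameters (toy values).
raw = [[0.5, -0.3, 1.2],
       [-0.7, 0.4, 0.1],
       [0.2, 0.9, -0.6]]
signs = [1, 1, -1]          # two excitatory units, one inhibitory

W = dale_weights(raw, signs)
# Every outgoing weight of unit 2 is now <= 0 (inhibitory),
# and negative raw magnitudes have been clipped to zero.
```

Gradient descent then acts on the unconstrained `raw` values while the network always runs with the sign-consistent `W`.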

  1. NDEx: A Community Resource for Sharing and Publishing of Biological Networks.

    PubMed

    Pillich, Rudolf T; Chen, Jing; Rynkov, Vladimir; Welker, David; Pratt, Dexter

    2017-01-01

    Networks are a powerful and flexible paradigm that facilitates communication and computation about interactions of any type, whether social, economic, or biological. NDEx, the Network Data Exchange, is an online commons to enable new modes of collaboration and publication using biological networks. NDEx creates an access point and interface to a broad range of networks, whether they express molecular interactions, curated relationships from literature, or the outputs of systematic analysis of big data. Research organizations can use NDEx as a distribution channel for networks they generate or curate. Developers of bioinformatic applications can store and query NDEx networks via a common programmatic interface. NDEx can also facilitate the integration of networks as data in electronic publications, thus making a step toward an ecosystem in which networks bearing data, hypotheses, and findings flow seamlessly between scientists.

  2. Strategic Integration of Multiple Bioinformatics Resources for System Level Analysis of Biological Networks.

    PubMed

    D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia

    2017-01-01

    Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present an integrated computational platform, Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) ( http://lynx.cri.uchicago.edu ), a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.

  3. A Framework for Integration of Heterogeneous Medical Imaging Networks

    PubMed Central

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged, and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve interoperability problems between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and is widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross-Enterprise Document Sharing for imaging (XDS-i). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID:25279021

  4. A framework for integration of heterogeneous medical imaging networks.

    PubMed

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged, and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve interoperability problems between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and is widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross-Enterprise Document Sharing for imaging (XDS-i). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS.

  5. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review

    PubMed Central

    McClelland, James L.

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
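
    The equivalence noted above, that a softmax unit computes an exact Bayesian posterior when its bias is the log prior and its net input adds the log likelihood, can be checked numerically with a toy two-hypothesis example (values chosen arbitrarily for illustration):

```python
import math

def softmax(zs):
    m = max(zs)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

# Two hypotheses with priors and likelihoods of one observed input.
priors = [0.7, 0.3]
likelihoods = [0.2, 0.8]              # P(input | hypothesis)

# Direct application of Bayes' rule.
joint = [p * l for p, l in zip(priors, likelihoods)]
posterior = [j / sum(joint) for j in joint]

# Softmax unit whose bias is the log prior and whose net input adds
# the log likelihood: net_i = log P(H_i) + log P(input | H_i).
nets = [math.log(p) + math.log(l) for p, l in zip(priors, likelihoods)]
via_softmax = softmax(nets)

print(posterior)     # ≈ [0.368, 0.632]
print(via_softmax)   # identical, up to floating point
```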

  6. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review.

    PubMed

    McClelland, James L

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered.

  7. Computational prediction of protein-protein interactions in Leishmania predicted proteomes.

    PubMed

    Rezende, Antonio M; Folador, Edson L; Resende, Daniela de M; Ruiz, Jeronimo C

    2012-01-01

    The Trypanosomatid parasites Leishmania braziliensis, Leishmania major and Leishmania infantum are important human pathogens. Despite years of study and genome availability, no effective vaccine has been developed yet, and the chemotherapy is highly toxic. It is therefore clear that only interdisciplinary, integrated studies will succeed in the search for new targets for vaccine and drug development. An essential part of this rationale is the study of protein-protein interaction (PPI) networks, which can provide a better understanding of complex protein interactions in biological systems. Thus, we modeled PPIs for Trypanosomatids through computational methods, using sequence comparison against public databases of protein or domain interactions for interaction prediction (interolog mapping), and developed a dedicated combined system score to address the robustness of the predictions. The confidence of the network prediction approach was evaluated using gold-standard positive and negative datasets, and the AUC value obtained was 0.94. As a result, 39,420, 43,531 and 45,235 interactions were predicted for L. braziliensis, L. major and L. infantum, respectively. For each predicted network the top 20 proteins were ranked by the MCC topological index. In addition, information related to immunological potential, degree of protein sequence conservation among orthologs, and degree of identity compared to proteins of potential parasite hosts was integrated. This integration improves the understanding and usefulness of the predicted networks, which can be valuable in selecting new potential biological targets for drug and vaccine development. Network modularity analysis, which is key when one is interested in destabilizing PPIs for drug or vaccine purposes, along with multiple alignments of the predicted PPIs, was performed, revealing patterns associated with protein turnover. 
In addition, around 50% of the hypothetical proteins present in the networks received some degree of functional annotation, an important contribution since approximately 60% of the Leishmania predicted proteomes have no predicted function.
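
    The interolog-mapping step can be sketched as transferring each template-species interaction through an ortholog table; the identifiers below are hypothetical, and the study's combined confidence score is omitted:

```python
def interolog_map(template_ppi, orthologs):
    """Predict interactions in a target species by transferring each
    template interaction (a, b) to the ortholog pair (a', b').
    `orthologs` maps a template protein to its target-species ortholog;
    pairs with an unmapped partner are skipped."""
    predicted = set()
    for a, b in template_ppi:
        if a in orthologs and b in orthologs:
            pair = tuple(sorted((orthologs[a], orthologs[b])))
            predicted.add(pair)       # undirected: store in sorted order
    return predicted

# Hypothetical template interactions and ortholog table.
template = [("yP1", "yP2"), ("yP2", "yP3"), ("yP4", "yP5")]
orth = {"yP1": "LmjF.01", "yP2": "LmjF.02", "yP3": "LmjF.03"}

print(sorted(interolog_map(template, orth)))
# → [('LmjF.01', 'LmjF.02'), ('LmjF.02', 'LmjF.03')]
```

The third template pair is dropped because neither partner has a known ortholog, which is why interolog predictions cover only part of a proteome.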

  8. Integrating Data Distribution and Data Assimilation Between the OOI CI and the NOAA DIF

    NASA Astrophysics Data System (ADS)

    Meisinger, M.; Arrott, M.; Clemesha, A.; Farcas, C.; Farcas, E.; Im, T.; Schofield, O.; Krueger, I.; Klacansky, I.; Orcutt, J.; Peach, C.; Chave, A.; Raymer, D.; Vernon, F.

    2008-12-01

    The Ocean Observatories Initiative (OOI) is an NSF funded program to establish the ocean observing infrastructure of the 21st century benefiting research and education. It is currently approaching final design and promises to deliver cyber and physical observatory infrastructure components as well as substantial core instrumentation to study environmental processes of the ocean at various scales, from coastal shelf-slope exchange processes to the deep ocean. The OOI's data distribution network lies at the heart of its cyber- infrastructure, which enables a multitude of science and education applications, ranging from data analysis, to processing, visualization and ontology supported query and mediation. In addition, it fundamentally supports a class of applications exploiting the knowledge gained from analyzing observational data for objective-driven ocean observing applications, such as automatically triggered response to episodic environmental events and interactive instrument tasking and control. The U.S. Department of Commerce through NOAA operates the Integrated Ocean Observing System (IOOS) providing continuous data in various formats, rates and scales on open oceans and coastal waters to scientists, managers, businesses, governments, and the public to support research and inform decision-making. The NOAA IOOS program initiated development of the Data Integration Framework (DIF) to improve management and delivery of an initial subset of ocean observations with the expectation of achieving improvements in a select set of NOAA's decision-support tools. Both OOI and NOAA through DIF collaborate on an effort to integrate the data distribution, access and analysis needs of both programs. 
We present details and early findings from this collaboration; one part of it is the development of a demonstrator combining web-based user access to oceanographic data through ERDDAP, efficient science data distribution, and scalable, self-healing deployment in a cloud computing environment. ERDDAP is a web-based front-end application integrating oceanographic data sources of various formats, for instance netCDF data files aggregated through NcML or presented using a THREDDS server. The OOI-designed data distribution network provides global traffic management and computational load balancing for observatory resources; it makes use of the OPeNDAP Data Access Protocol (DAP) for efficient canonical science data distribution over the network. A cloud computing strategy is the basis for scalable, self-healing organization of an observatory's computing and storage resources, independent of the physical location and technical implementation of these resources.
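
    ERDDAP's tabledap services are queried through RESTful URLs of the form base/tabledap/datasetID.fileType?variables&constraints. A small sketch of building such a request follows; the server address and dataset ID are assumptions for illustration, not OOI or NOAA endpoints:

```python
from urllib.parse import quote

def tabledap_url(base, dataset_id, variables, constraints, filetype="csv"):
    """Build an ERDDAP tabledap request URL:
    {base}/tabledap/{dataset}.{type}?var1,var2&constraint1&...
    Constraint operators like '>=' are percent-encoded ('=' kept literal)."""
    query = ",".join(variables)
    for c in constraints:
        query += "&" + quote(c, safe="=&")
    return f"{base}/tabledap/{dataset_id}.{filetype}?{query}"

url = tabledap_url(
    "https://example.org/erddap",         # assumed server
    "glider_ctd",                         # hypothetical dataset ID
    ["time", "temperature", "salinity"],
    ["time>=2008-01-01T00:00:00Z"],
)
print(url)
```

Changing `filetype` to, e.g., `"nc"` requests the same subset in a different output format, which is the feature that lets ERDDAP front disparate data sources uniformly.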

  9. The Application of Integrated Knowledge-based Systems for the Biomedical Risk Assessment Intelligent Network (BRAIN)

    NASA Technical Reports Server (NTRS)

    Loftin, Karin C.; Ly, Bebe; Webster, Laurie; Verlander, James; Taylor, Gerald R.; Riley, Gary; Culbert, Chris; Holden, Tina; Rudisill, Marianne

    1993-01-01

    One of NASA's goals for long duration space flight is to maintain acceptable levels of crew health, safety, and performance. One way of meeting this goal is through the Biomedical Risk Assessment Intelligent Network (BRAIN), an integrated network of both human and computer elements. The BRAIN will function as an advisor to flight surgeons by assessing the risk of in-flight biomedical problems and recommending appropriate countermeasures. This paper describes the joint effort among various NASA elements to develop BRAIN and an Infectious Disease Risk Assessment (IDRA) prototype. The implementation of this effort addresses the technological aspects of the following: (1) knowledge acquisition; (2) integration of IDRA components; (3) use of expert systems to automate the biomedical prediction process; (4) development of a user-friendly interface; and (5) integration of the IDRA prototype and Exercise Countermeasures Intelligent System (ExerCISys). Because the C Language, CLIPS (the C Language Integrated Production System), and the X-Window System were portable and easily integrated, they were chosen as the tools for the initial IDRA prototype. The feasibility was tested by developing an IDRA prototype that predicts the individual risk of influenza. The application of knowledge-based systems to risk assessment is of great market value to the medical technology industry.

  10. GLOBECOM '89 - IEEE Global Telecommunications Conference and Exhibition, Dallas, TX, Nov. 27-30, 1989, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    The present conference discusses topics in multiwavelength network technology and its applications, advanced digital radio systems in their propagation environment, mobile radio communications, switching programmability, advancements in computer communications, integrated-network management and security, HDTV and image processing in communications, basic exchange communications radio, advancements in digital switching, intelligent network evolution, speech coding for telecommunications, and multiple access communications. Also discussed are network designs for quality assurance, recent progress in coherent optical systems, digital radio applications, advanced communications technologies for mobile users, communication software for switching systems, AI and expert systems in network management, intelligent multiplexing nodes, video and image coding, network protocols and performance, system methods in quality and reliability, the design and simulation of lightwave systems, local radio networks, mobile satellite communications systems, fiber network restoration, packet video networks, human interfaces for future networks, and lightwave networking.

  11. Multilevel functional genomics data integration as a tool for understanding physiology: a network biology perspective.

    PubMed

    Davidsen, Peter K; Turan, Nil; Egginton, Stuart; Falciani, Francesco

    2016-02-01

    The overall aim of physiological research is to understand how living systems function in an integrative manner. Consequently, the discipline of physiology has since its infancy attempted to link multiple levels of biological organization. Increasingly this has involved mathematical and computational approaches, typically to model a small number of components spanning several levels of biological organization. With the advent of "omics" technologies, which can characterize the molecular state of a cell or tissue (intended as the level of expression and/or activity of its molecular components), the number of molecular components we can quantify has increased exponentially. Paradoxically, the unprecedented amount of experimental data has made it more difficult to derive conceptual models underlying essential mechanisms regulating mammalian physiology. We present an overview of state-of-the-art methods currently used to identify biological networks underlying genome-wide responses. These are based on a data-driven approach that relies on advanced computational methods designed to "learn" biology from observational data. In this review, we illustrate an application of these computational methodologies using a case study integrating an in vivo model representing the transcriptional state of hypoxic skeletal muscle with a clinical study representing muscle wasting in chronic obstructive pulmonary disease patients. The broader application of these approaches to modeling multiple levels of biological data in the context of modern physiology is discussed. Copyright © 2016 the American Physiological Society.

  12. Applicability of computational systems biology in toxicology.

    PubMed

    Kongsbak, Kristine; Hadrup, Niels; Audouze, Karine; Vinggaard, Anne Marie

    2014-07-01

    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. However, computational systems biology offers more advantages than providing a high-throughput literature search; it may form the basis for establishment of hypotheses on potential links between environmental chemicals and human diseases, which would be very difficult to establish experimentally. This is possible due to the existence of comprehensive databases containing information on networks of human protein-protein interactions and protein-disease associations. Experimentally determined targets of the specific chemical of interest can be fed into these networks to obtain additional information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method in the hypothesis-generating phase of toxicological research. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  13. Enhanced Contact Graph Routing (ECGR) MACHETE Simulation Model

    NASA Technical Reports Server (NTRS)

    Segui, John S.; Jennings, Esther H.; Clare, Loren P.

    2013-01-01

    Contact Graph Routing (CGR) for Delay/Disruption Tolerant Networking (DTN) space-based networks makes use of the predictable nature of node contacts to make real-time routing decisions given unpredictable traffic patterns. The contact graph will have been disseminated to all nodes before the start of route computation. CGR was designed for space-based networking environments where future contact plans are known or are independently computable (e.g., using known orbital dynamics). For each data item (known as a bundle in DTN), a node independently performs route selection by examining possible paths to the destination. Route computation could conceivably run thousands of times a second, so computational load is important. This work refers to the simulation software model of Enhanced Contact Graph Routing (ECGR) for DTN Bundle Protocol in JPL's MACHETE simulation tool. The simulation model was used for performance analysis of CGR and led to several performance enhancements. The simulation model was used to demonstrate the improvements of ECGR over CGR as well as other routing methods in space network scenarios. ECGR moved to using earliest arrival time because it is a global monotonically increasing metric that guarantees the safety properties needed for the solution's correctness since route re-computation occurs at each node to accommodate unpredicted changes (e.g., traffic pattern, link quality). Furthermore, using earliest arrival time enabled the use of the standard Dijkstra algorithm for path selection. The Dijkstra algorithm for path selection has a well-known inexpensive computational cost. These enhancements have been integrated into the open source CGR implementation. The ECGR model is also useful for route metric experimentation and comparisons with other DTN routing protocols particularly when combined with MACHETE's space networking models and Delay Tolerant Link State Routing (DTLSR) model.
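The earliest-arrival-time metric described above lends itself directly to a standard Dijkstra search over the contact plan. The following Python sketch is illustrative only (it is not the JPL/MACHETE implementation; the function name and the contact-tuple format are our assumptions): each contact is treated as a time-windowed edge, and the search relaxes on arrival time.

```python
import heapq

def earliest_arrival(contacts, source, dest, t_start=0.0):
    """Dijkstra over a contact plan using earliest arrival time as the
    monotonically increasing route metric, in the spirit of ECGR.

    contacts: list of (from_node, to_node, t_open, t_close, owlt) tuples,
    where owlt is the one-way light time (propagation delay).
    Returns the earliest time a bundle sent from `source` at `t_start`
    can reach `dest`, or None if unreachable.
    """
    best = {source: t_start}
    heap = [(t_start, source)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == dest:
            return t
        if t > best.get(node, float("inf")):
            continue  # stale heap entry
        for frm, to, t_open, t_close, owlt in contacts:
            if frm != node:
                continue
            depart = max(t, t_open)       # wait for the contact to open
            if depart > t_close:
                continue                  # contact already closed
            arrive = depart + owlt
            if arrive < best.get(to, float("inf")):
                best[to] = arrive
                heapq.heappush(heap, (arrive, to))
    return None

# Two routes from A to D: via B (arrives t=25) and via C (arrives t=35).
plan = [
    ("A", "B", 0, 100, 5), ("B", "D", 20, 100, 5),
    ("A", "C", 0, 100, 5), ("C", "D", 30, 100, 5),
]
print(earliest_arrival(plan, "A", "D"))  # -> 25.0
```

Because arrival time never decreases along a path, the first time the destination is popped from the heap its label is final, which is exactly the safety property the abstract attributes to the metric.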

  14. Genome Scale Modeling in Systems Biology: Algorithms and Resources

    PubMed Central

    Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali

    2014-01-01

    In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks in five sections. We also try to illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031

  15. Boosting Probabilistic Graphical Model Inference by Incorporating Prior Knowledge from Multiple Sources

    PubMed Central

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
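The Noisy-OR combination described above is compact enough to sketch in Python. This is a generic Noisy-OR gate, not the authors' exact parameterization (their model also estimates source-specific reliability from the data):

```python
def noisy_or_prior(supports, leak=0.0):
    """Combine per-source confidence scores s_k in [0, 1] for one
    candidate edge into a consensus prior probability via a Noisy-OR:
        P(edge) = 1 - (1 - leak) * prod_k (1 - s_k)
    A single strongly supporting source dominates the result, which is
    the 'strongest support' behavior described in the abstract.
    """
    p_absent = 1.0 - leak
    for s in supports:
        p_absent *= (1.0 - s)
    return 1.0 - p_absent

# One confident source (0.9) outweighs two weak ones (0.1, 0.2):
print(round(noisy_or_prior([0.9, 0.1, 0.2]), 3))  # -> 0.928
```

The resulting edge probabilities can then be assembled into a prior over network structures for Bayesian network inference.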

  16. Exploring MEDLINE Space with Random Indexing and Pathfinder Networks

    PubMed Central

    Cohen, Trevor

    2008-01-01

    The integration of disparate research domains is a prerequisite for the success of the translational science initiative. MEDLINE abstracts contain content from a broad range of disciplines, presenting an opportunity for the development of methods able to integrate the knowledge they contain. Latent Semantic Analysis (LSA) and related methods learn human-like associations between terms from unannotated text. However, their computational and memory demands limit their ability to address a corpus of this size. Furthermore, visualization methods previously used in conjunction with LSA have limited ability to define the local structure of the associative networks LSA learns. This paper explores these issues by (1) processing the entire MEDLINE corpus using Random Indexing, a variant of LSA, and (2) exploring learned associations using Pathfinder Networks. Meaningful associations are inferred from MEDLINE, including a drug-disease association undetected by PUBMED search. PMID:18999236
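Random Indexing itself is simple enough to sketch. The toy Python below is illustrative only (real systems use sliding-window co-occurrence, weighting, and far larger corpora; the corpus and all names are ours): each term gets a sparse ternary random index vector, and a term's context vector accumulates the index vectors of the terms it co-occurs with.

```python
import random, math
from collections import defaultdict

DIM, SEEDS = 300, 10  # vector dimensionality, number of nonzero seeds

def index_vector(rng):
    """Sparse ternary random vector: a few +1/-1 entries, rest zero."""
    v = [0.0] * DIM
    for pos in rng.sample(range(DIM), SEEDS):
        v[pos] = rng.choice((1.0, -1.0))
    return v

def train(documents, seed=0):
    """Accumulate, for each term, the index vectors of co-occurring
    terms (document-level co-occurrence, a simplified variant)."""
    rng = random.Random(seed)
    index = defaultdict(lambda: index_vector(rng))
    context = defaultdict(lambda: [0.0] * DIM)
    for doc in documents:
        for term in doc:
            for other in doc:
                if other != term:
                    iv, cv = index[other], context[term]
                    for i in range(DIM):
                        cv[i] += iv[i]
    return context

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

docs = [["aspirin", "headache"], ["aspirin", "fever"],
        ["ibuprofen", "headache"], ["ibuprofen", "fever"]]
ctx = train(docs)
# Terms sharing contexts end up with similar vectors; prints True:
print(cosine(ctx["aspirin"], ctx["ibuprofen"]) >
      cosine(ctx["aspirin"], ctx["headache"]))
```

Because the index vectors are nearly orthogonal, the memory footprint stays fixed at DIM dimensions no matter how large the vocabulary grows, which is what makes the method tractable at MEDLINE scale.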

  17. Exploring MEDLINE space with random indexing and pathfinder networks.

    PubMed

    Cohen, Trevor

    2008-11-06

    The integration of disparate research domains is a prerequisite for the success of the translational science initiative. MEDLINE abstracts contain content from a broad range of disciplines, presenting an opportunity for the development of methods able to integrate the knowledge they contain. Latent Semantic Analysis (LSA) and related methods learn human-like associations between terms from unannotated text. However, their computational and memory demands limit their ability to address a corpus of this size. Furthermore, visualization methods previously used in conjunction with LSA have limited ability to define the local structure of the associative networks LSA learns. This paper explores these issues by (1) processing the entire MEDLINE corpus using Random Indexing, a variant of LSA, and (2) exploring learned associations using Pathfinder Networks. Meaningful associations are inferred from MEDLINE, including a drug-disease association undetected by PUBMED search.

  18. Network and data security design for telemedicine applications.

    PubMed

    Makris, L; Argiriou, N; Strintzis, M G

    1997-01-01

    The maturing of telecommunication technologies has ushered in a whole new era of applications and services in the health care environment. Teleworking, teleconsultation, multimedia conferencing and medical data distribution are rapidly becoming commonplace in clinical practice. As a result, a set of problems arises, concerning data confidentiality and integrity. Public computer networks, such as the emerging ISDN technology, are vulnerable to eavesdropping. Therefore it is important for telemedicine applications to employ end-to-end encryption mechanisms securing the data channel from unauthorized access or modification. We propose a network access and encryption system that is both economical and easily implemented for integration in developing or existing applications, using well-known and thoroughly tested encryption algorithms. Public-key cryptography is used for session-key exchange, while symmetric algorithms are used for bulk encryption. Mechanisms for session-key generation and exchange are also provided.
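The protocol shape described, public-key exchange of a session key followed by fast symmetric bulk encryption, can be illustrated with a deliberately toy Python sketch. The stream cipher below is for illustration only and is not secure; a real deployment would wrap the session key with RSA or a Diffie-Hellman exchange and encrypt the bulk data with an authenticated cipher such as AES-GCM.

```python
import hashlib, secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric stream cipher: XOR with a SHA-256 counter-mode
    keystream. Illustrative only -- not a secure cipher."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# 1. Generate a fresh symmetric session key per connection.
session_key = secrets.token_bytes(32)

# 2. In the real protocol the session key is wrapped with the peer's
#    public key (e.g. RSA) and sent; here both sides simply share it.

# 3. Bulk medical data is encrypted with the fast symmetric cipher.
record = b"patient: 0042; ECG trace follows..."
ciphertext = keystream_xor(session_key, record)

# XOR stream ciphers are their own inverse; prints True:
print(keystream_xor(session_key, ciphertext) == record)
```

The asymmetric step is performed once per session, so its cost is negligible next to the symmetric encryption of the multimedia data stream.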

  19. The Role of Energy Reservoirs in Distributed Computing: Manufacturing, Implementing, and Optimizing Energy Storage in Energy-Autonomous Sensor Nodes

    NASA Astrophysics Data System (ADS)

    Cowell, Martin Andrew

    The world already hosts more internet connected devices than people, and that ratio is only increasing. These devices seamlessly integrate with people's lives to collect rich data and give immediate feedback about complex systems from business, health care, transportation, and security. Every aspect of global economies is integrating distributed computing into its industrial systems, and these systems benefit from rich datasets. Managing the power demands of these distributed computers will be paramount to ensure the continued operation of these networks, and is elegantly addressed by including local energy harvesting and storage on a per-node basis. By replacing non-rechargeable batteries with energy harvesting, wireless sensor nodes will increase their lifetimes by an order of magnitude. This work investigates the coupling of high power energy storage with energy harvesting technologies to power wireless sensor nodes; with sections covering device manufacturing, system integration, and mathematical modeling. First we consider the energy storage mechanism of supercapacitors and batteries, and identify favorable characteristics in both reservoir types. We then discuss experimental methods used to manufacture high power supercapacitors in our labs. We go on to detail the integration of our fabricated devices with collaborating labs to create functional sensor node demonstrations. With the practical knowledge gained through in-lab manufacturing and system integration, we build mathematical models to aid in device and system design. First, we model the mechanism of energy storage in porous graphene supercapacitors to aid in component architecture optimization. We then model the operation of entire sensor nodes for the purpose of optimally sizing the energy harvesting and energy reservoir components. 
In consideration of deploying these sensor nodes in real-world environments, we model the operation of our energy harvesting and power management systems subject to spatially and temporally varying energy availability in order to understand sensor node reliability. Looking to the future, we see an opportunity for further research to implement machine learning algorithms to control the energy resources of distributed computing networks.
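One step of the sizing problem above, choosing a reservoir capacity for a given harvest profile and load, can be sketched in a few lines of Python. The function name, numbers, and the curtailment assumption are ours, not the dissertation's:

```python
def required_capacity(harvest, load_w, dt=1.0, eff=0.9):
    """Reservoir capacity (Wh) needed to ride through the worst
    cumulative energy deficit, assuming surplus harvest beyond a
    full reservoir is curtailed.

    harvest: per-interval harvested power (W); load_w: constant node
    load (W); dt: interval length (h); eff: harvesting efficiency.
    """
    deficit = worst = 0.0
    for h in harvest:
        net = (h * eff - load_w) * dt
        deficit = min(0.0, deficit + net)  # running shortfall vs. full
        worst = min(worst, deficit)
    return -worst

# Day/night solar profile: 12 h at 0.5 W, 12 h at 0 W; 0.2 W node load.
profile = [0.5] * 12 + [0.0] * 12
print(round(required_capacity(profile, load_w=0.2), 2))  # -> 2.4
```

Here the node must store enough to cover the 12-hour night at 0.2 W, i.e. 2.4 Wh; longer harvest records (seasonal variation, cloudy spells) simply extend the profile list.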

  20. Some comparisons of complexity in dictionary-based and linear computational models.

    PubMed

    Gnecco, Giorgio; Kůrková, Věra; Sanguineti, Marcello

    2011-03-01

    Neural networks provide a more flexible approximation of functions than traditional linear regression. In the latter, one can only adjust the coefficients in linear combinations of fixed sets of functions, such as orthogonal polynomials or Hermite functions, while for neural networks, one may also adjust the parameters of the functions which are being combined. However, some useful properties of linear approximators (such as uniqueness, homogeneity, and continuity of best approximation operators) are not satisfied by neural networks. Moreover, optimization of parameters in neural networks becomes more difficult than in linear regression. Experimental results suggest that these drawbacks of neural networks are offset by substantially lower model complexity, allowing accuracy of approximation even in high-dimensional cases. We give some theoretical results comparing requirements on model complexity for two types of approximators, the traditional linear ones and so-called variable-basis types, which include neural networks, radial, and kernel models. We compare upper bounds on worst-case errors in variable-basis approximation with lower bounds on such errors for any linear approximator. Using methods from nonlinear approximation and integral representations tailored to computational units, we describe some cases where neural networks outperform any linear approximator. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. Reverse engineering biological networks :applications in immune responses to bio-toxins.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, Anthony A.; Sinclair, Michael B.; Davidson, George S.

    Our aim is to determine the network of events, or the regulatory network, that defines an immune response to a bio-toxin. As a model system, we are studying T cell regulatory network triggered through tyrosine kinase receptor activation using a combination of pathway stimulation and time-series microarray experiments. Our approach is composed of five steps (1) microarray experiments and data error analysis, (2) data clustering, (3) data smoothing and discretization, (4) network reverse engineering, and (5) network dynamics analysis and fingerprint identification. The technological outcome of this study is a suite of experimental protocols and computational tools that reverse engineer regulatory networks provided gene expression data. The practical biological outcome of this work is an immune response fingerprint in terms of gene expression levels. Inferring regulatory networks from microarray data is a new field of investigation that is no more than five years old. To the best of our knowledge, this work is the first attempt that integrates experiments, error analyses, data clustering, inference, and network analysis to solve a practical problem. Our systematic approach of counting, enumeration, and sampling networks matching experimental data is new to the field of network reverse engineering. The resulting mathematical analyses and computational tools lead to new results on their own and should be useful to others who analyze and infer networks.
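Step 3 of the pipeline, smoothing and discretization, is straightforward to illustrate. The Python sketch below is our simplification, not the authors' code: it applies a moving average to one gene's expression time series and then maps each value to down/neutral/up by z-score.

```python
from statistics import mean, stdev

def smooth(series, window=3):
    """Moving-average smoothing of one gene's expression time series."""
    half = window // 2
    return [mean(series[max(0, i - half):i + half + 1])
            for i in range(len(series))]

def discretize(series, z=1.0):
    """Map each smoothed value to {-1, 0, +1}: significantly below the
    series mean, near it, or significantly above (z-score threshold)."""
    m, s = mean(series), stdev(series)
    def level(x):
        if s == 0:
            return 0
        d = (x - m) / s
        return 1 if d > z else (-1 if d < -z else 0)
    return [level(x) for x in series]

raw = [0.1, 0.2, 0.1, 0.9, 2.1, 2.0, 2.2, 0.3]
print(discretize(smooth(raw)))  # -> [-1, -1, 0, 0, 0, 1, 0, 0]
```

Discrete trajectories like this one are what make the subsequent step, enumerating and sampling candidate regulatory networks consistent with the data, combinatorially tractable.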

  2. Failure probability analysis of optical grid

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical network, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In optical grid, the faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect for operators to consider. This paper presents a task-based method for analyzing application failure probability in optical grid. The failure probability of an entire application can then be quantified, and the effectiveness of different backup strategies in reducing it can be compared, so that the different requirements of different clients can be satisfied. In optical grid, when an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm guarantees the failure probability requirement while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in optical grid.
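The basic quantity involved, the failure probability of an application whose tasks must all succeed, can be sketched as follows. This is a simplified serial-composition model with independent failures, not the paper's MDSA algorithm:

```python
from math import prod

def app_failure_probability(task_fail, replicas=1):
    """Failure probability of an application in which every task must
    succeed. Each task runs on `replicas` independent resources, so a
    task fails only if all of its replicas fail."""
    per_task = [p ** replicas for p in task_fail]
    return 1.0 - prod(1.0 - p for p in per_task)

fails = [0.01, 0.02, 0.05]  # per-resource failure probabilities
print(round(app_failure_probability(fails), 4))     # no backup -> 0.0783
print(round(app_failure_probability(fails, 2), 6))  # one backup each
```

Comparing the two numbers makes the trade-off concrete: adding one backup per task cuts the application failure probability by more than an order of magnitude at the cost of doubled resource reservation.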

  3. Elementary signaling modes predict the essentiality of signal transduction network components

    PubMed Central

    2011-01-01

    Background Understanding how signals propagate through signaling pathways and networks is a central goal in systems biology. Quantitative dynamic models help to achieve this understanding, but are difficult to construct and validate because of the scarcity of known mechanistic details and kinetic parameters. Structural and qualitative analysis is emerging as a feasible and useful alternative for interpreting signal transduction. Results In this work, we present an integrative computational method for evaluating the essentiality of components in signaling networks. This approach expands an existing signaling network to a richer representation that incorporates the positive or negative nature of interactions and the synergistic behaviors among multiple components. Our method simulates both knockout and constitutive activation of components as node disruptions, and takes into account the possible cascading effects of a node's disruption. We introduce the concept of elementary signaling mode (ESM), as the minimal set of nodes that can perform signal transduction independently. Our method ranks the importance of signaling components by the effects of their perturbation on the ESMs of the network. Validation on several signaling networks describing the immune response of mammals to bacteria, guard cell abscisic acid signaling in plants, and T cell receptor signaling shows that this method can effectively uncover the essentiality of components mediating a signal transduction process and results in strong agreement with the results of Boolean (logical) dynamic models and experimental observations. Conclusions This integrative method is an efficient procedure for exploratory analysis of large signaling and regulatory networks where dynamic modeling or experimental tests are impractical. Its results serve as testable predictions, provide insights into signal transduction and regulatory mechanisms and can guide targeted computational or experimental follow-up studies. 
The source codes for the algorithms developed in this study can be found at http://www.phys.psu.edu/~ralbert/ESM. PMID:21426566
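For a purely positive network without synergy, elementary signaling modes reduce to simple input-to-output paths, which makes the ranking idea easy to sketch in Python. This is a simplification of the published method, which also handles inhibitory edges and composite nodes; the example network and names are ours:

```python
def simple_paths(edges, src, dst, path=None):
    """Enumerate all simple directed paths src -> dst: a stand-in for
    elementary signaling modes in a purely positive network."""
    path = [src] if path is None else path
    if src == dst:
        yield list(path)
        return
    for a, b in edges:
        if a == src and b not in path:
            yield from simple_paths(edges, b, dst, path + [b])

def essentiality(edges, src, dst):
    """Rank each intermediate node by the fraction of src -> dst paths
    (ESMs) that its knockout destroys."""
    paths = list(simple_paths(edges, src, dst))
    nodes = {n for e in edges for n in e} - {src, dst}
    return {n: sum(n in p for p in paths) / len(paths)
            for n in sorted(nodes)}

net = [("in", "a"), ("in", "b"), ("a", "c"), ("b", "c"), ("c", "out")]
print(essentiality(net, "in", "out"))  # -> {'a': 0.5, 'b': 0.5, 'c': 1.0}
```

Node c lies on every path, so its knockout abolishes signaling entirely, while a and b are redundant with each other; this mirrors the intuition behind the ESM-based essentiality ranking.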

  4. Understanding spatial and temporal patterning of astrocyte calcium transients via interactions between network transport and extracellular diffusion

    NASA Astrophysics Data System (ADS)

    Shtrahman, E.; Maruyama, D.; Olariu, E.; Fink, C. G.; Zochowski, M.

    2017-02-01

    Astrocytes form interconnected networks in the brain and communicate via calcium signaling. We investigate how modes of coupling between astrocytes influence the spatio-temporal patterns of calcium signaling within astrocyte networks and specifically how these network interactions promote coordination within this group of cells. To investigate these complex phenomena, we study reduced cultured networks of astrocytes and neurons. We image the spatial temporal patterns of astrocyte calcium activity and quantify how perturbing the coupling between astrocytes influences astrocyte activity patterns. To gain insight into the pattern formation observed in these cultured networks, we compare the experimentally observed calcium activity patterns to the patterns produced by a reduced computational model, where we represent astrocytes as simple units that integrate input through two mechanisms: gap junction coupling (network transport) and chemical release (extracellular diffusion). We examine the activity patterns in the simulated astrocyte network and their dependence upon these two coupling mechanisms. We find that gap junctions and extracellular chemical release interact in astrocyte networks to modulate the spatiotemporal patterns of their calcium dynamics. We show agreement between the computational and experimental findings, which suggests that the complex global patterns can be understood as a result of simple local coupling mechanisms.
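The reduced model described above, simple units integrating input through local gap-junction coupling and a diffusing extracellular signal, can be caricatured in a one-dimensional Python sketch. The parameters and the mean-field treatment of extracellular release are illustrative choices of ours, not the authors' model:

```python
def step(ca, g_gap, g_ext, drive=0.05, decay=0.1):
    """One Euler step for a 1-D chain of astrocyte-like units. Each
    unit's calcium integrates constant drive, decay, diffusive
    gap-junction coupling to its lattice neighbors, and a mean-field
    term standing in for extracellular chemical release."""
    n = len(ca)
    mean_ca = sum(ca) / n
    nxt = []
    for i, c in enumerate(ca):
        neigh = (ca[i - 1] if i > 0 else c) + (ca[i + 1] if i < n - 1 else c)
        d = (drive - decay * c
             + g_gap * (neigh - 2 * c)   # local network transport
             + g_ext * (mean_ca - c))    # extracellular diffusion
        nxt.append(c + d)
    return nxt

ca = [1.0] + [0.0] * 9  # one initially active unit
for _ in range(200):
    ca = step(ca, g_gap=0.2, g_ext=0.05)
# Coupling evens out the initial spatial heterogeneity; prints True 0.5:
print(max(ca) - min(ca) < 0.01, round(sum(ca) / len(ca), 2))
```

Varying g_gap against g_ext in sketches like this is the computational analogue of the paper's coupling perturbations: it shows how the balance of the two mechanisms shapes whether activity stays local or coordinates across the network.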

  5. Expert System Software

    NASA Technical Reports Server (NTRS)

    1989-01-01

    C Language Integrated Production System (CLIPS) is a software shell for developing expert systems, designed to allow research and development of artificial intelligence on conventional computers. Originally developed by Johnson Space Center, it enables highly efficient pattern matching. A collection of conditions and actions to be taken if the conditions are met is built into a rule network. Additional pertinent facts are matched to the rule network. Using the program, E.I. DuPont de Nemours & Co. is monitoring chemical production machines; California Polytechnic State University is investigating artificial intelligence in computer aided design; Mentor Graphics has built a new Circuit Synthesis system; and Brooke and Brooke, a law firm, can determine which facts from a file are most important.

  6. Wearable sensors for health monitoring

    NASA Astrophysics Data System (ADS)

    Suciu, George; Butca, Cristina; Ochian, Adelina; Halunga, Simona

    2015-02-01

    In this paper we describe several wearable sensors, designed for monitoring the health condition of patients, based on an experimental model. Wearable sensors enable long-term continuous physiological monitoring, which is important for the treatment and management of many chronic illnesses, neurological disorders, and mental health issues. The system is based on a wearable sensor network, which is connected to a computer or smartphone. The wearable sensor network integrates several wearable sensors that can measure different parameters such as body temperature, heart rate and carbon monoxide quantity in the air. After the wearable sensors measure the parameter values, a microprocessor transmits them over Bluetooth to an application on the computer or smartphone, where they are interpreted.

  7. Law of Large Numbers: The Theory, Applications and Technology-Based Education

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicolas; Gould, Robert

    2009-01-01

    Modern approaches for technology-based blended education utilize a variety of recently developed novel pedagogical, computational and network resources. Such attempts employ technology to deliver integrated, dynamically-linked, interactive-content and heterogeneous learning environments, which may improve student comprehension and information…

  8. Integrated risk/cost planning models for the US Air Traffic system

    NASA Technical Reports Server (NTRS)

    Mulvey, J. M.; Zenios, S. A.

    1985-01-01

    A prototype network planning model for the U.S. Air Traffic control system is described. The model encompasses the dual objectives of managing collision risks and transportation costs where traffic flows can be related to these objectives. The underlying structure is a network graph with nonseparable convex costs; the model is solved efficiently by capitalizing on its intrinsic characteristics. Two specialized algorithms for solving the resulting problems are described: (1) truncated Newton, and (2) simplicial decomposition. The feasibility of the approach is demonstrated using data collected from a control center in the Midwest. Computational results with different computer systems are presented, including a vector supercomputer (CRAY-XMP). The risk/cost model has two primary uses: (1) as a strategic planning tool using aggregate flight information, and (2) as an integrated operational system for forecasting congestion and monitoring (controlling) flow throughout the U.S. In the latter case, access to a supercomputer is required due to the model's enormous size.

  9. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    NASA Astrophysics Data System (ADS)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization process, the party and government have taken substantive strides in advancing the development and application of digital technology, promoting the evolution of e-government and its informatization. Meanwhile, as a service mode based on innovative resources, cloud computing connects huge resource pools to provide a variety of IT services, and has become a relatively mature technical pattern supported by further studies and massive practical applications. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified and physically dispersed fundamental database, and developed a national integrated information database system supporting main e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  10. Predicting clinical outcome of neuroblastoma patients using an integrative network-based approach.

    PubMed

    Tranchevent, Léon-Charles; Nazarov, Petr V; Kaoma, Tony; Schmartz, Georges P; Muller, Arnaud; Kim, Sang-Yoon; Rajapakse, Jagath C; Azuaje, Francisco

    2018-06-07

    One of the main current challenges in computational biology is to make sense of the huge amounts of multidimensional experimental data that are being produced. For instance, large cohorts of patients are often screened using different high-throughput technologies, effectively producing multiple patient-specific molecular profiles for hundreds or thousands of patients. We propose and implement a network-based method that integrates such patient omics data into Patient Similarity Networks. Topological features derived from these networks were then used to predict relevant clinical features. As part of the 2017 CAMDA challenge, we have successfully applied this strategy to a neuroblastoma dataset, consisting of genomic and transcriptomic data. In particular, we observe that models built on our network-based approach perform at least as well as state-of-the-art models. We furthermore explore the effectiveness of various topological features and observe, for instance, that redundant centrality metrics can be combined to build more powerful models. We demonstrate that the networks inferred from omics data contain clinically relevant information and that patient clinical outcomes can be predicted using only network topological data. This article was reviewed by Yang-Yu Liu, Tomislav Smuc and Isabel Nepomuceno.
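The first two stages of the approach, building a Patient Similarity Network from omics profiles and extracting topological features from it, can be sketched in plain Python. Cosine similarity, a hard threshold, and degree centrality are one simple instantiation chosen by us; the paper evaluates a broader set of topological metrics:

```python
import math

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = (math.sqrt(sum(a * a for a in u))
           * math.sqrt(sum(b * b for b in v)))
    return num / den if den else 0.0

def patient_similarity_network(profiles, threshold=0.9):
    """Nodes are patients; edges connect pairs whose omics profiles
    exceed a cosine-similarity threshold. Returns adjacency sets."""
    n = len(profiles)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if cosine(profiles[i], profiles[j]) >= threshold:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def degree_centrality(adj):
    """One simple topological feature usable for outcome prediction."""
    n = len(adj)
    return {i: len(nb) / (n - 1) for i, nb in adj.items()}

profiles = [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1],
            [0.0, 1.0, 0.9], [0.1, 0.9, 1.0]]
net = patient_similarity_network(profiles)
print(degree_centrality(net))
```

Feature vectors built from such centralities (degree, betweenness, eigenvector, and so on) then feed an ordinary supervised classifier for the clinical outcome of interest.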

  11. Fundamentally Distributed Information Processing Integrates the Motor Network into the Mental Workspace during Mental Rotation.

    PubMed

    Schlegel, Alexander; Konuthula, Dedeepya; Alexander, Prescott; Blackwood, Ethan; Tse, Peter U

    2016-08-01

    The manipulation of mental representations in the human brain appears to share similarities with the physical manipulation of real-world objects. In particular, some neuroimaging studies have found increased activity in motor regions during mental rotation, suggesting that mental and physical operations may involve overlapping neural populations. Does the motor network contribute information processing to mental rotation? If so, does it play a similar computational role in both mental and manual rotation, and how does it communicate with the wider network of areas involved in the mental workspace? Here we used multivariate methods and fMRI to study 24 participants as they mentally rotated 3-D objects or manually rotated their hands in one of four directions. We find that information processing related to mental rotations is distributed widely among many cortical and subcortical regions, that the motor network becomes tightly integrated into a wider mental workspace network during mental rotation, and that motor network activity during mental rotation only partially resembles that involved in manual rotation. Additionally, these findings provide evidence that the mental workspace is organized as a distributed core network that dynamically recruits specialized subnetworks for specific tasks as needed.

  12. Deep learning beyond Lefschetz thimbles

    NASA Astrophysics Data System (ADS)

    Alexandru, Andrei; Bedaque, Paulo F.; Lamm, Henry; Lawrence, Scott

    2017-11-01

    The generalized thimble method to treat field theories with sign problems requires repeatedly solving the computationally expensive holomorphic flow equations. We present a machine learning technique to bypass this problem. The central idea is to obtain a few field configurations via the flow equations to train a feed-forward neural network. The trained network defines a new manifold of integration which reduces the sign problem and can be rapidly sampled. We present results for the 1+1 dimensional Thirring model with Wilson fermions on sizable lattices. In addition to the gain in speed, the parametrization of the integration manifold we use avoids the "trapping" of Monte Carlo chains which plagues large-flow calculations, a considerable shortcoming of the previous attempts.

  13. Web-based network analysis and visualization using CellMaps

    PubMed Central

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-01-01

    Summary: CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. Availability and Implementation: The application is available at: http://cellmaps.babelomics.org/ and the code can be found in: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27296979

  14. Web-based network analysis and visualization using CellMaps.

    PubMed

    Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín

    2016-10-01

    CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. The application is available at: http://cellmaps.babelomics.org/ and the code can be found at: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  15. Design of on-board Bluetooth wireless network system based on fault-tolerant technology

    NASA Astrophysics Data System (ADS)

    You, Zheng; Zhang, Xiangqi; Yu, Shijie; Tian, Hexiang

    2007-11-01

    In this paper, Bluetooth wireless data transmission technology is applied to an on-board computer system to realize wireless data transmission between the peripherals of the micro-satellite integrated electronic system. In view of the high reliability demanded of a micro-satellite, a design of a Bluetooth wireless network based on fault-tolerant technology is introduced. The reliability of two fault-tolerant systems is first estimated using a Markov model; then the structural design of the fault-tolerant system is introduced. Several protocols are established to make the system operate correctly, and some related problems are listed and analyzed, with emphasis on the fault auto-diagnosis system, the active-standby switchover design and data-integrity processing.
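    The Markov reliability estimate mentioned above can be illustrated for the simplest active-standby pair: with per-unit failure rate lam and perfect fault detection and switchover, the chain moves from "both units good" to "one unit good" to "failed", and the closed form is R(t) = exp(-lam*t)*(1 + lam*t). The rate and mission time below are assumed illustrative values, not figures from the paper.

```python
import math

# Two-unit active-standby Markov chain with perfect switchover:
# R(t) = exp(-lam*t) * (1 + lam*t)
def standby_reliability(lam, t):
    return math.exp(-lam * t) * (1.0 + lam * t)

lam = 1e-4                             # failures per hour (assumed)
t = 8760.0                             # one year of operation
single = math.exp(-lam * t)            # reliability of a single unit
redundant = standby_reliability(lam, t)
print(single, redundant)               # redundancy raises one-year reliability
```

    Imperfect switchover is modeled by weighting the second term with a coverage factor, which is why the paper compares two fault-tolerant structures rather than assuming ideal switching.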

  16. A Computational framework for telemedicine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; von Laszewski, G.; Thiruvathukal, G. K.

    1998-07-01

    Emerging telemedicine applications require the ability to exploit diverse and geographically distributed resources. High-speed networks are used to integrate advanced visualization devices, sophisticated instruments, large databases, archival storage devices, PCs, workstations, and supercomputers. This form of telemedical environment is similar to networked virtual supercomputers, also known as metacomputers. Metacomputers are already being used in many scientific application areas. In this article, we analyze the requirements of a telemedical computing infrastructure and compare them with the requirements found in a typical metacomputing environment. We show that metacomputing environments can be used to enable a more powerful and unified computational infrastructure for telemedicine. The Globus metacomputing toolkit can provide the necessary low-level mechanisms to enable a large-scale telemedical infrastructure. The Globus toolkit components are designed in a modular fashion and can be extended to support the specific requirements of telemedicine.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christoph, G. G.; Jackson, K. A.; Neuman, M. C.

    An effective method for detecting computer misuse is the automatic auditing and analysis of on-line user activity. This activity is reflected in the system audit record, in changes to the vulnerability posture of the system configuration, and in other evidence found through active testing of the system. In 1989 we started developing an automatic misuse detection system for the Integrated Computing Network (ICN) at Los Alamos National Laboratory. Since 1990 this system has been operational, monitoring a variety of network systems and services. We call it the Network Anomaly Detection and Intrusion Reporter, or NADIR. During the last year and a half, we expanded NADIR to include processing of audit and activity records for the Cray UNICOS operating system. This new component is called the UNICOS Real-time NADIR, or UNICORN. UNICORN summarizes user activity and system configuration information in statistical profiles. In near real time, it can compare current activity to historical profiles and test activity against expert rules that express our security policy and define improper or suspicious behavior. It reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. UNICORN is currently operational on four Crays in Los Alamos' main computing network, the ICN.

  18. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software

    PubMed Central

    Mejías, Andrés; Herrera, Reyes S.; Márquez, Marco A.; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-01

    There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)—this one designed using the intuitive graphical system of EJS—located on the user’s computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented. PMID:28067801

  19. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software.

    PubMed

    Mejías, Andrés; Herrera, Reyes S; Márquez, Marco A; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-05

    There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)-this one designed using the intuitive graphical system of EJS-located on the user's computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented.

  20. Integrated Service Provisioning in an Ipv6 over ATM Research Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eli Dart; Helen Chen; Jerry Friesen

    1999-02-01

    During the past few years, the worldwide Internet has grown at a phenomenal rate, which has spurred the proposal of innovative network technologies to support the fast, efficient and low-latency transport of a wide spectrum of multimedia traffic types. Existing network infrastructures have been plagued by their inability to provide for real-time application traffic as well as their general lack of resources and resilience to congestion. This work proposes to address these issues by implementing a prototype high-speed network infrastructure consisting of Internet Protocol Version 6 (IPv6) on top of an Asynchronous Transfer Mode (ATM) transport medium. Since ATM is connection-oriented whereas IP uses a connection-less paradigm, the efficient integration of IPv6 over ATM is especially challenging and has generated much interest in the research community. We propose, in collaboration with an industry partner, to implement IPv6 over ATM using a unique approach that integrates IP over fast ATM hardware while still preserving IP's connection-less paradigm. This is achieved by replacing ATM's control software with IP's routing code and by caching IP's forwarding decisions in ATM's VPI/VCI translation tables. Prototype "VR" and distributed-parallel-computing applications will also be developed to exercise the real-time capability of our IPv6 over ATM network.
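    The caching idea described above can be sketched as: the first packet toward a destination takes the slow path (an IP routing decision), and the result is stored in a VPI/VCI-style translation table so subsequent traffic is switched without consulting IP again. The class, table layout and allocator below are illustrative, not taken from the actual system.

```python
# Hypothetical sketch of caching IP forwarding decisions in an ATM-style
# VPI/VCI translation table.
class IpOverAtmSwitch:
    def __init__(self, routes):
        self.routes = routes          # prefix -> next hop (IP control plane)
        self.vc_table = {}            # dest -> (vpi, vci, next hop), cached
        self.next_vci = 32            # simple virtual-circuit allocator

    def lookup_slow(self, dest):
        # simplified stand-in for a longest-prefix-match routing lookup
        for prefix, hop in self.routes.items():
            if dest.startswith(prefix):
                return hop
        return None

    def forward(self, dest):
        if dest not in self.vc_table:              # slow path, once per flow
            hop = self.lookup_slow(dest)
            self.vc_table[dest] = (0, self.next_vci, hop)
            self.next_vci += 1
        return self.vc_table[dest]                 # fast path thereafter

sw = IpOverAtmSwitch({"10.1.": "atm-port-1", "10.2.": "atm-port-2"})
first = sw.forward("10.1.5.7")      # triggers a routing decision + cache fill
again = sw.forward("10.1.5.7")      # served from the VPI/VCI table
print(first == again, len(sw.vc_table))
```

    The connection-less IP paradigm is preserved because the table is a cache of stateless routing decisions, not a signaled end-to-end connection.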

  1. Semantic Web meets Integrative Biology: a survey.

    PubMed

    Chen, Huajun; Yu, Tong; Chen, Jake Y

    2013-01-01

    Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargons, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine to highlight the technical features and benefits of SW applications in IB.
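    The "web of data" idea in the survey can be illustrated with plain subject-predicate-object triples: when two disciplines reuse shared identifiers, integrating their datasets is just a union, and a query can span both without schema alignment. All identifiers below are made up for the example; a real deployment would use RDF and SPARQL.

```python
# Two "domains" publish facts as triples that reuse shared URIs.
genomics = {
    ("ex:TP53", "ex:encodes", "ex:p53"),
    ("ex:TP53", "ex:locatedOn", "ex:chr17"),
}
pharma = {
    ("ex:DrugX", "ex:targets", "ex:p53"),
    ("ex:DrugX", "ex:trialPhase", "ex:phase2"),
}
web_of_data = genomics | pharma          # data integration is set union

# cross-domain query: which genes encode a protein that DrugX targets?
targets = {o for s, p, o in web_of_data
           if s == "ex:DrugX" and p == "ex:targets"}
genes = {s for s, p, o in web_of_data
         if p == "ex:encodes" and o in targets}
print(genes)  # → {'ex:TP53'}
```

    The shared identifier `ex:p53` is what makes the join possible; this is exactly the role ontologies and URIs play in points (i) and (ii) above.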

  2. Western State Hospital: implementing a MUMPS-based PC network.

    PubMed

    Russ, D C

    1991-06-01

    Western State Hospital, a state-administered 1,200-bed mental health institution near Tacoma, Wash., confronted the challenge of automating its large campus through the application of the Healthcare Integrated Information System (HIIS). It is the first adaptation of the Veterans Administration's Decentralized Hospital Computer Program software in a mental health institution of this size, and the first DHCP application to be installed on a PC client/server network in a large U.S. hospital.

  3. Harnessing Diversity towards the Reconstructing of Large Scale Gene Regulatory Networks

    PubMed Central

    Yamanaka, Ryota; Kitano, Hiroaki

    2013-01-01

    Elucidating gene regulatory networks (GRNs) from large-scale experimental data remains a central challenge in systems biology. Recently, numerous techniques, particularly consensus-driven approaches combining different algorithms, have become a potentially promising strategy for inferring accurate GRNs. Here, we develop a novel consensus inference algorithm, TopkNet, that can integrate multiple algorithms to infer GRNs. Comprehensive performance benchmarking on a cloud computing framework demonstrated that (i) a simple strategy of combining many algorithms does not always lead to performance improvement compared to the cost of consensus and (ii) TopkNet, integrating only high-performance algorithms, provides significant performance improvement compared to the best individual algorithms and community prediction. These results suggest that a priori determination of high-performance algorithms is key to reconstructing an unknown regulatory network. Similarity among gene-expression datasets can be useful for determining potentially optimal algorithms for the reconstruction of unknown regulatory networks; i.e., if the expression data associated with a known regulatory network are similar to those associated with an unknown regulatory network, the optimal algorithms determined for the known network can be repurposed to infer the unknown one. Based on this observation, we developed a quantitative measure of similarity among gene-expression datasets and demonstrated that, if similarity between the two expression datasets is high, TopkNet integrating algorithms that are optimal for the known dataset performs well on the unknown dataset. The consensus framework TopkNet, together with the similarity measure proposed in this study, provides a powerful strategy towards harnessing the wisdom of the crowds in the reconstruction of unknown regulatory networks. PMID:24278007
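    The consensus step can be sketched as rank aggregation restricted to the top-k algorithms: each algorithm scores candidate regulator-to-target edges, and only the algorithms with the best prior benchmark performance are averaged (Borda-style) into a consensus ranking. This is an illustrative sketch, not the published TopkNet code; algorithm names and scores are invented.

```python
# Borda-style consensus over the top-k performing inference algorithms.
def rank(scores):
    # higher edge score -> better (smaller) rank
    order = sorted(scores, key=scores.get, reverse=True)
    return {edge: r for r, edge in enumerate(order)}

def consensus(predictions, performance, k):
    best = sorted(performance, key=performance.get, reverse=True)[:k]
    ranks = [rank(predictions[a]) for a in best]
    edges = predictions[best[0]]
    avg = {e: sum(r[e] for r in ranks) / k for e in edges}
    return sorted(avg, key=avg.get)          # best consensus edges first

predictions = {   # edge scores from three hypothetical algorithms
    "genie3": {("A", "B"): 0.9, ("B", "C"): 0.4, ("A", "C"): 0.2},
    "clr":    {("A", "B"): 0.7, ("B", "C"): 0.6, ("A", "C"): 0.1},
    "random": {("A", "B"): 0.1, ("B", "C"): 0.2, ("A", "C"): 0.9},
}
performance = {"genie3": 0.8, "clr": 0.75, "random": 0.33}  # prior benchmark
top = consensus(predictions, performance, k=2)
print(top)  # → [('A', 'B'), ('B', 'C'), ('A', 'C')]
```

    With k=2 the weak "random" scorer is excluded, which is the paper's point (ii): integrating only high-performance algorithms avoids paying the cost of consensus for poor ones.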

  4. Quantitative and Systems Pharmacology 3. Network-Based Identification of New Targets for Natural Products Enables Potential Uses in Aging-Associated Disorders.

    PubMed

    Fang, Jiansong; Gao, Li; Ma, Huili; Wu, Qihui; Wu, Tian; Wu, Jun; Wang, Qi; Cheng, Feixiong

    2017-01-01

    Aging, the accumulation of genetic and physiological changes in cells and tissues over a lifetime, carries a high risk of developing various complex diseases, such as neurodegenerative disease, cardiovascular disease and cancer. Over the past several decades, natural products have been demonstrated to act as anti-aging interveners by extending lifespan and preventing aging-associated disorders. In this study, we developed an integrated systems pharmacology infrastructure to uncover new indications of natural products for aging-associated disorders. Specifically, we incorporated 411 high-quality aging-associated human genes or human-orthologous genes from Mus musculus (MM), Saccharomyces cerevisiae (SC), Caenorhabditis elegans (CE), and Drosophila melanogaster (DM). We constructed a global drug-target network of natural products by integrating both experimental and computationally predicted drug-target interactions (DTIs). We further built statistical network models for the identification of new anti-aging indications of natural products through integration of the curated aging-associated genes and the drug-target network of natural products. High accuracy was achieved by the network models. We showcase several network-predicted anti-aging indications of four typical natural products (caffeic acid, metformin, myricetin, and resveratrol) with new mechanisms of action. In summary, this study offers a powerful systems pharmacology infrastructure for identifying natural products for the treatment of aging-associated disorders.

  5. Quantitative and Systems Pharmacology 3. Network-Based Identification of New Targets for Natural Products Enables Potential Uses in Aging-Associated Disorders

    PubMed Central

    Fang, Jiansong; Gao, Li; Ma, Huili; Wu, Qihui; Wu, Tian; Wu, Jun; Wang, Qi; Cheng, Feixiong

    2017-01-01

    Aging, the accumulation of genetic and physiological changes in cells and tissues over a lifetime, carries a high risk of developing various complex diseases, such as neurodegenerative disease, cardiovascular disease and cancer. Over the past several decades, natural products have been demonstrated to act as anti-aging interveners by extending lifespan and preventing aging-associated disorders. In this study, we developed an integrated systems pharmacology infrastructure to uncover new indications of natural products for aging-associated disorders. Specifically, we incorporated 411 high-quality aging-associated human genes or human-orthologous genes from Mus musculus (MM), Saccharomyces cerevisiae (SC), Caenorhabditis elegans (CE), and Drosophila melanogaster (DM). We constructed a global drug-target network of natural products by integrating both experimental and computationally predicted drug-target interactions (DTIs). We further built statistical network models for the identification of new anti-aging indications of natural products through integration of the curated aging-associated genes and the drug-target network of natural products. High accuracy was achieved by the network models. We showcase several network-predicted anti-aging indications of four typical natural products (caffeic acid, metformin, myricetin, and resveratrol) with new mechanisms of action. In summary, this study offers a powerful systems pharmacology infrastructure for identifying natural products for the treatment of aging-associated disorders. PMID:29093681

  6. A computer network with scada and case tools for on-line process control in greenhouses

    NASA Astrophysics Data System (ADS)

    Gieling, Th. H.; van Meurs, W. Th. M.; Janssen, H. J. J.

    Climate control computers in greenhouses are used to control heating and ventilation, supply water and dilute and dispense nutrients. They integrate models into optimally controlled systems. This paper describes how information technology, as in use in other sectors of industry, is applied to greenhouse control. The introduction of modern software and hardware concepts in horticulture adds power and extra opportunities to climate control in greenhouses.

  7. Outcomes from the First Wingman Software in the Loop Integration Event: January 2017

    DTIC Science & Technology

    2017-06-28

    [No abstract was extracted for this record; the captured text consists of front-matter fragments (the public-release distribution statement and disclaimers) and table-of-contents entries concerning computers to run the SIL, computer networking, and installation of ARES.]

  8. A computer network with SCADA and case tools for on-line process control in greenhouses.

    PubMed

    Gieling, Th. H.; van Meurs, W. Th. M.; Janssen, H. J. J.

    1996-01-01

    Climate control computers in greenhouses are used to control heating and ventilation, supply water and dilute and dispense nutrients. They integrate models into optimally controlled systems. This paper describes how information technology, as in use in other sectors of industry, is applied to greenhouse control. The introduction of modern software and hardware concepts in horticulture adds power and extra opportunities to climate control in greenhouses.

  9. A survey of body sensor networks.

    PubMed

    Lai, Xiaochen; Liu, Quanli; Wei, Xin; Wang, Wei; Zhou, Guoqiao; Han, Guangyi

    2013-04-24

    The technology of sensors, pervasive computing, and intelligent information processing is widely used in Body Sensor Networks (BSNs), a branch of wireless sensor networks (WSNs). BSNs play an increasingly important role in the fields of medical treatment, social welfare and sports, and are changing the way humans use computers. Existing surveys have placed emphasis on the concept and architecture of BSNs, signal acquisition, context-aware sensing, and system technology, whereas this paper focuses on sensors, data fusion, and network communication. We introduce the research status of BSNs, analyze current hotspots and future development trends, and discuss the major challenges and technical problems currently faced. Typical research projects and practical applications of BSNs are introduced as well. BSNs are progressing in the direction of multi-technology integration and intelligence. Although many problems remain, the future of BSNs is fundamentally promising: they are profoundly changing human-machine relationships and improving the quality of people's lives.

  10. Using Equation-Free Computation to Accelerate Network-Free Stochastic Simulation of Chemical Kinetics.

    PubMed

    Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S

    2018-06-21

    The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
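    The projection loop described above can be sketched on a toy birth-death process (rates are illustrative, not from the paper): a short burst of exact stochastic simulation estimates the drift of the coarse variable by finite differencing, the variable is projected forward in time to skip the firing of many individual events, and the cycle repeats.

```python
import random

# Equation-free projective integration on a toy birth-death process.
random.seed(1)
BIRTH, DEATH = 50.0, 0.5                 # stationary mean = BIRTH/DEATH = 100

def ssa_burst(x, t_burst):
    """Exact (Gillespie-style) simulation of the population for t_burst."""
    t = 0.0
    while True:
        total = BIRTH + DEATH * x
        dt = random.expovariate(total)
        if t + dt > t_burst:
            return x
        t += dt
        x += 1 if random.random() < BIRTH / total else -1

def drift(x, t_burst=0.05, reps=20):
    # finite-difference drift estimate, averaged over bursts to damp noise
    return sum(ssa_burst(x, t_burst) - x for _ in range(reps)) / (reps * t_burst)

x = 10.0
for _ in range(40):
    # forward-Euler projection over burst time plus a projected jump,
    # bypassing the individual reaction events in the skipped interval
    x += 0.25 * drift(int(round(x)))
print(round(x))                          # should approach the fixed point ~100
```

    The ratio of projected time to simulated burst time sets both the speedup and the degree of approximation, mirroring the accuracy/efficiency trade-off discussed in the abstract.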

  11. Modeling Signaling Networks to Advance New Cancer Therapies.

    PubMed

    Saez-Rodriguez, Julio; MacNamara, Aidan; Cook, Simon

    2015-01-01

    Cell signaling pathways control cells' responses to their environment through an intricate network of proteins and small molecules partitioned by intracellular structures, such as the cytoskeleton and nucleus. Our understanding of these pathways has been revised recently with the advent of more advanced experimental techniques; no longer are signaling pathways viewed as linear cascades of information flowing from membrane-bound receptors to the nucleus. Instead, such pathways must be understood in the context of networks, and studying such networks requires an integration of computational and experimental approaches. This understanding is becoming more important in designing novel therapies for diseases such as cancer. Using the MAPK (mitogen-activated protein kinase) and PI3K (class I phosphoinositide-3' kinase) pathways as case studies of cellular signaling, we give an overview of these pathways and their functions. We then describe, using a number of case studies, how computational modeling has aided in understanding these pathways' deregulation in cancer, and how such understanding can be used to optimally tailor current therapies or help design new therapies against cancer.

  12. A Survey of Body Sensor Networks

    PubMed Central

    Lai, Xiaochen; Liu, Quanli; Wei, Xin; Wang, Wei; Zhou, Guoqiao; Han, Guangyi

    2013-01-01

    The technology of sensors, pervasive computing, and intelligent information processing is widely used in Body Sensor Networks (BSNs), a branch of wireless sensor networks (WSNs). BSNs play an increasingly important role in the fields of medical treatment, social welfare and sports, and are changing the way humans use computers. Existing surveys have placed emphasis on the concept and architecture of BSNs, signal acquisition, context-aware sensing, and system technology, whereas this paper focuses on sensors, data fusion, and network communication. We introduce the research status of BSNs, analyze current hotspots and future development trends, and discuss the major challenges and technical problems currently faced. Typical research projects and practical applications of BSNs are introduced as well. BSNs are progressing in the direction of multi-technology integration and intelligence. Although many problems remain, the future of BSNs is fundamentally promising: they are profoundly changing human-machine relationships and improving the quality of people's lives. PMID:23615581

  13. Computer-assisted stereotactic neurological surgery: pre-planning and on-site real-time operating control and simulation system

    NASA Astrophysics Data System (ADS)

    Zamorano, Lucia J.; Jiang, Charlie Z. W.

    1993-09-01

    In this decade the concept and development of computer-assisted stereotactic neurological surgery have improved dramatically. First, the computer network replaced tape as the data transportation medium. Second, newer systems include multi-modality image correlation and frameless stereotaxy as an integral part of their functionality, and offer extensive assistance to the neurosurgeon from the pre-planning stages through the operation itself. These are very important changes, and they have spurred the development of many interesting techniques. Successful systems include the ISG and NSPS-3.0.

  14. Providing image management and communication functionality as an integral part of an existing hospital information system

    NASA Astrophysics Data System (ADS)

    Dayhoff, Ruth E.; Maloney, Daniel L.

    1990-08-01

    The effective delivery of health care has become increasingly dependent on a wide range of medical data which includes a variety of images. Manual and computer-based medical records ordinarily do not contain image data, leaving the physician to deal with a fragmented patient record widely scattered throughout the hospital. The Department of Veterans Affairs (VA) is currently installing a prototype hospital information system (HIS) workstation network to demonstrate the feasibility of providing image management and communications (IMAC) functionality as an integral part of an existing hospital information system. The core of this system is a database management system adapted to handle images as a new data type. A general model for this integration is discussed and specifics of the hospital-wide network of image display workstations are given.

  15. Comprehensive security framework for the communication and storage of medical images

    NASA Astrophysics Data System (ADS)

    Slik, David; Montour, Mike; Altman, Tym

    2003-05-01

    Confidentiality, integrity verification and access control of medical imagery and associated metadata is critical for the successful deployment of integrated healthcare networks that extend beyond the department level. As medical imagery continues to become widely accessed across multiple administrative domains and geographically distributed locations, image data should be able to travel and be stored on untrusted infrastructure, including public networks and server equipment operated by external entities. Given these challenges associated with protecting large-scale distributed networks, measures must be taken to protect patient identifiable information while guarding against tampering, denial of service attacks, and providing robust audit mechanisms. The proposed framework outlines a series of security practices for the protection of medical images, incorporating Transport Layer Security (TLS), public and secret key cryptography, certificate management and a token based trusted computing base. It outlines measures that can be utilized to protect information stored within databases, online and nearline storage, and during transport over trusted and untrusted networks. In addition, it provides a framework for ensuring end-to-end integrity of image data from acquisition to viewing, and presents a potential solution to the challenges associated with access control across multiple administrative domains and institution user bases.

  16. Advanced Satellite Research Project: SCAR Research Database. Bibliographic analysis

    NASA Technical Reports Server (NTRS)

    Pelton, Joseph N.

    1991-01-01

    The literature search was performed to locate and analyze the most recent literature relevant to the research. This was done by cross-relating books, articles, monographs, and journals that relate to the following topics: (1) Experimental Systems - Advanced Communications Technology Satellite (ACTS), and (2) Integrated Services Digital Network (ISDN) and advanced communication techniques (ISDN and satellites, ISDN standards, broadband ISDN, frame relay and switching, computer networks and satellites, satellite orbits and technology, satellite transmission quality, and network configuration). A bibliographic essay on the literature citations and articles reviewed during the literature search task is provided.

  17. Comparison of Event Detection Methods for Centralized Sensor Networks

    NASA Technical Reports Server (NTRS)

    Sauvageon, Julien; Agogino, Alice M.; Farhang, Ali; Tumer, Irem Y.

    2006-01-01

    The development of Integrated Vehicle Health Management (IVHM) for space vehicles has become a great concern. Smart sensor networks are one of the promising technologies attracting a lot of attention. In this paper, we propose a qualitative comparison of several local event (hot spot) detection algorithms in centralized redundant sensor networks. The algorithms are compared regarding their ability to locate and evaluate an event under noise and sensor failures. The purpose of this study is to check whether the performance/computational-power ratio of the Mote Fuzzy Validation and Fusion algorithm is relevant compared to simpler methods.
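    A simple baseline of the kind such comparisons include can be sketched as majority voting over redundant sensors: each grid cell carries several sensors, and a hot spot is declared where most of them read above threshold, which masks isolated sensor failures and Gaussian noise. All values and the grid layout below are illustrative, not from the study.

```python
import random

# Majority-vote hot-spot detection over a redundant sensor grid.
random.seed(3)
AMBIENT, HOT, THRESHOLD = 20.0, 80.0, 50.0

def readings(truth, redundancy=5, fail_prob=0.1, noise=3.0):
    grid = []
    for is_hot in truth:
        base = HOT if is_hot else AMBIENT
        cell = [0.0 if random.random() < fail_prob          # dead sensor
                else base + random.gauss(0.0, noise)        # noisy reading
                for _ in range(redundancy)]
        grid.append(cell)
    return grid

def detect(grid):
    # a cell is a hot spot if a majority of its sensors exceed threshold
    return [sum(r > THRESHOLD for r in cell) > len(cell) // 2
            for cell in grid]

truth = [False, False, True, False]     # a single hot spot at cell 2
result = detect(readings(truth))
print(result)
```

    Fuzzy validation and fusion methods aim to beat this baseline when noise levels or failure modes are harder than a fixed threshold can handle, at a higher computational cost, which is precisely the ratio the study examines.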

  18. High-throughput Bayesian Network Learning using Heterogeneous Multicore Computers

    PubMed Central

    Linderman, Michael D.; Athalye, Vivek; Meng, Teresa H.; Asadi, Narges Bani; Bruggner, Robert; Nolan, Garry P.

    2017-01-01

    Aberrant intracellular signaling plays an important role in many diseases. The causal structure of signal transduction networks can be modeled as Bayesian Networks (BNs) and computationally learned from experimental data. However, learning BN structure is an NP-hard problem that, even with fast heuristics, is too time consuming for large, clinically important networks (20–50 nodes). In this paper, we present a novel graphics processing unit (GPU)-accelerated implementation of a Markov chain Monte Carlo-based algorithm for learning BNs that is up to 7.5-fold faster than current general-purpose processor (GPP)-based implementations. The GPU-based implementation is just one of several implementations within the larger application, each optimized for a different input or machine configuration. We describe the methodology we use to build an extensible application, assembled from these variants, that can target a broad range of heterogeneous systems, e.g., GPUs and multicore GPPs. Specifically, we show how we use the Merge programming model to efficiently integrate, test and intelligently select among the different potential implementations. PMID:28819655
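    The serial skeleton of such a Markov chain Monte Carlo structure search can be sketched as: propose toggling one edge, score the candidate graph, and accept with the Metropolis rule. The BIC-style family score and single-edge proposal below are standard textbook choices on toy binary data, not the paper's GPU implementation; the cycle guard here blocks only 2-cycles, with a full acyclicity check omitted for brevity.

```python
import math, random

random.seed(0)

# toy binary data over 3 variables: X1 copies X0 with 10% flips, X2 independent
data = [(a, a ^ (random.random() < 0.1), random.random() < 0.5)
        for a in [random.random() < 0.5 for _ in range(200)]]

def family_score(child, parents):
    # log-likelihood of the child column given each parent configuration,
    # with a BIC-like penalty on the number of parameters
    counts = {}
    for row in data:
        key = tuple(row[p] for p in parents)
        counts.setdefault(key, [0, 0])[row[child]] += 1
    ll = 0.0
    for c0, c1 in counts.values():
        for c in (c0, c1):
            if c:
                ll += c * math.log(c / (c0 + c1))
    return ll - 0.5 * math.log(len(data)) * 2 ** len(parents)

def score(g):
    return sum(family_score(child, sorted(ps)) for child, ps in g.items())

g = {0: frozenset(), 1: frozenset(), 2: frozenset()}   # child -> parent set
for _ in range(300):
    child = random.randrange(3)
    parent = random.choice([v for v in range(3) if v != child])
    if parent not in g[child] and child in g[parent]:
        continue                                       # crude 2-cycle guard
    cand = dict(g)
    cand[child] = g[child] ^ {parent}                  # toggle one edge
    if math.log(random.random()) < score(cand) - score(g):
        g = cand                                       # Metropolis accept
print(sorted(g[1]), sorted(g[0]))  # the X0-X1 dependence should be recovered
```

    The score evaluations dominate the run time and are independent across proposed families, which is the parallelism the GPU implementation exploits.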

  19. Ubiquitous healthcare computing with SEnsor Grid Enhancement with Data Management System (SEGEDMA).

    PubMed

    Preve, Nikolaos

    2011-12-01

    Wireless Sensor Networks (WSNs) can be deployed to monitor the health of patients suffering from critical diseases. A wireless network of biomedical sensors can also be implanted into a patient's body to monitor the patient's condition. These sensor devices, while highly capable of collecting data from their physical surroundings, are resource-constrained, with limited processing and communication ability. We therefore integrate them with Grid technology in order to process and store the data collected by the sensor nodes. In this paper, we propose the SEnsor Grid Enhancement Data Management system, called SEGEDMA, which ensures the integration of different network technologies and continuous data access for system users. The main contribution of this work is achieving the interoperability of both technologies through a novel network architecture that also ensures the interoperability of the Open Geospatial Consortium (OGC) and HL7 standards. According to the results, SEGEDMA can be applied successfully in a decentralized healthcare environment.

  20. Polytopol computing for multi-core and distributed systems

    NASA Astrophysics Data System (ADS)

    Spaanenburg, Henk; Spaanenburg, Lambert; Ranefors, Johan

    2009-05-01

    Multi-core computing poses new challenges to software engineering. The paper addresses these issues in the general setting of polytopol computing, which covers multi-core problems in areas as widely different as ambient-intelligence sensor networks and cloud computing. It argues that the essence lies in a suitable allocation of freely moving tasks. Where hardware is ubiquitous and pervasive, the network is virtualized into a collection of software snippets judiciously injected into hardware so that a system function again appears as a single whole. The concept of polytopol computing provides a further formalization in terms of the partitioning of labor between collector and sensor nodes. Collectors provide functions such as knowledge integration, awareness collection, situation display and reporting, communication of clues, and an inquiry interface. Sensors provide functions such as anomaly detection (communicating only singularities, not continuous observations); they are generally powered or self-powered, amorphous (not on a grid) with generation and attrition, field-reprogrammable, and plug-and-play-able. Together the collector and the sensor are part of the skeleton-injector mechanism, added to every node, which gives the network the ability to organize itself into one of many topologies. Finally we discuss a number of applications and indicate how a multi-core architecture supports the security aspects of the skeleton injector.
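
    The collector/sensor partition of labor described above can be caricatured in a few lines: sensor nodes communicate only singularities (anomalies), while a collector integrates the reports. This is an illustrative sketch only; the class names, the z-score threshold, and the sliding window are assumptions for the example, not part of the polytopol framework.

```python
import random
import statistics

class SensorNode:
    """Reports only singularities (anomalies), not the continuous signal."""
    def __init__(self, node_id, threshold=3.0, window=20):
        self.node_id = node_id
        self.threshold = threshold
        self.window = window
        self.readings = []

    def observe(self, value):
        self.readings.append(value)
        self.readings = self.readings[-self.window:]
        if len(self.readings) < self.window:
            return None                       # still warming up
        mean = statistics.fmean(self.readings)
        spread = statistics.stdev(self.readings)
        if spread and abs(value - mean) > self.threshold * spread:
            return (self.node_id, value)      # singularity worth communicating
        return None

class Collector:
    """Integrates anomaly reports into situation awareness."""
    def __init__(self):
        self.events = []

    def receive(self, report):
        if report is not None:
            self.events.append(report)

rng = random.Random(0)
collector = Collector()
nodes = [SensorNode(i) for i in range(5)]
for t in range(200):
    for node in nodes:
        value = rng.gauss(20.0, 1.0)
        if t == 150 and node.node_id == 2:
            value += 15.0                     # injected hot spot at one sensor
        collector.receive(node.observe(value))

print(collector.events)   # includes a report from node 2 near the injected spike
```

    The point of the partition is bandwidth: the continuous signal never leaves the sensor, so only the rare singularity reports cross the network to the collector.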

  1. China's Chemical Information Online Service: ChI2Net.

    ERIC Educational Resources Information Center

    Naiyan, Yu; And Others

    1997-01-01

    Describes the Chemical Integrated Information Service Network (ChI2Net), a comprehensive online information service system which includes chemical, technical, economic, market, news, and management information based on computer and modern communication technology that was built by the China National Chemical Information Centre. (Author/LRW)

  2. Software Trends and Trendsetters: How They're Shaping an Industry.

    ERIC Educational Resources Information Center

    McGinty, Tony; And Others

    1987-01-01

    Discusses trends in educational software and the effects of new developments on publishers and on the computer industry. Marketing prospects for software are examined, and recent advances are highlighted, including integrated learning systems, skill-based software, software tied to textbooks, networking, and freeware. (LRW)

  3. Implementing a frame representation in CLIPS/COOL

    NASA Technical Reports Server (NTRS)

    Myers, Leonard; Snyder, James

    1991-01-01

    An implementation of frames in COOL is described and evaluated. The test case is a frame-based semantic network previously implemented in CLIPS (C Language Integrated Production System) Version 4.3 as part of the Intelligent Computer Aided Design System (ICADS) and reported at the first CLIPS conference.

  4. From computers to ubiquitous computing by 2010: health care.

    PubMed

    Aziz, Omer; Lo, Benny; Pansiot, Julien; Atallah, Louis; Yang, Guang-Zhong; Darzi, Ara

    2008-10-28

    Over the past decade, miniaturization and cost reduction in semiconductors have led to computers smaller in size than a pinhead with powerful processing abilities that are affordable enough to be disposable. Similar advances in wireless communication, sensor design and energy storage have meant that the concept of a truly pervasive 'wireless sensor network', used to monitor environments and objects within them, has become a reality. The need for a wireless sensor network designed specifically for human body monitoring has led to the development of wireless 'body sensor network' (BSN) platforms composed of tiny integrated microsensors with on-board processing and wireless data transfer capability. The ubiquitous computing abilities of BSNs offer the prospect of continuous monitoring of human health in any environment, be it home, hospital, outdoors or the workplace. This pervasive technology comes at a time when Western world health care costs have sharply risen, reflected by increasing expenditure on health care as a proportion of gross domestic product over the last 20 years. Drivers of this rise include an ageing post 'baby boom' population, higher incidence of chronic disease and the need for earlier diagnosis. This paper outlines the role of pervasive health care technologies in providing more efficient health care.

  5. Function, dynamics and evolution of network motif modules in integrated gene regulatory networks of worm and plant.

    PubMed

    Defoort, Jonas; Van de Peer, Yves; Vermeirssen, Vanessa

    2018-06-05

    Gene regulatory networks (GRNs) consist of different molecular interactions that closely work together to establish proper gene expression in time and space. Especially in higher eukaryotes, many questions remain on how these interactions collectively coordinate gene regulation. We study high quality GRNs consisting of undirected protein-protein, genetic and homologous interactions, and directed protein-DNA, regulatory and miRNA-mRNA interactions in the worm Caenorhabditis elegans and the plant Arabidopsis thaliana. Our data-integration framework integrates interactions in composite network motifs, clusters these in biologically relevant, higher-order topological network motif modules, overlays these with gene expression profiles and discovers novel connections between modules and regulators. Similar modules exist in the integrated GRNs of worm and plant. We show how experimental or computational methodologies underlying a certain data type impact network topology. Through phylogenetic decomposition, we found that proteins of worm and plant tend to functionally interact with proteins of a similar age, while at the regulatory level TFs favor same age, but also older target genes. Despite some influence of the duplication mode difference, we also observe at the motif and module level for both species a preference for age homogeneity for undirected and age heterogeneity for directed interactions. This leads to a model where novel genes are added together to the GRNs in a specific biological functional context, regulated by one or more TFs that also target older genes in the GRNs. Overall, we detected topological, functional and evolutionary properties of GRNs that are potentially universal in all species.
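
    As a toy illustration of the kind of composite network motifs the abstract refers to (not the authors' data-integration framework), the sketch below counts feed-forward loops in a directed regulatory layer and a simple composite motif that mixes a directed layer with an undirected protein-protein layer; the gene names are made up.

```python
from itertools import permutations

def feed_forward_loops(directed):
    """Feed-forward loops: X -> Y, Y -> Z and X -> Z, with X, Y, Z distinct."""
    nodes = set()
    for a, b in directed:
        nodes.update((a, b))
    edges = set(directed)
    return [(x, y, z) for x, y, z in permutations(sorted(nodes), 3)
            if (x, y) in edges and (y, z) in edges and (x, z) in edges]

def co_regulated_interacting_pairs(directed, undirected):
    """Composite motif mixing layers: one regulator targets two genes (directed
    edges) that also interact with each other (undirected edge)."""
    edges = set(directed)
    und = {frozenset(pair) for pair in undirected}
    motifs = []
    for x, y in edges:
        for x2, z in edges:
            if x2 == x and y < z and frozenset((y, z)) in und:
                motifs.append((x, y, z))
    return motifs

# Toy regulatory layers with made-up gene names.
tf_dna = [("tf1", "geneA"), ("tf1", "geneB"), ("geneA", "geneB")]
ppi = [("geneA", "geneB")]

print(feed_forward_loops(tf_dna))                    # [('tf1', 'geneA', 'geneB')]
print(co_regulated_interacting_pairs(tf_dna, ppi))   # [('tf1', 'geneA', 'geneB')]
```

    Real GRN analyses then cluster such motif instances into higher-order modules and overlay expression data; the counting step above is only the first ingredient.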

  6. Photonics and other approaches to high speed communications

    NASA Technical Reports Server (NTRS)

    Maly, Kurt

    1992-01-01

    Our research group of four faculty and about 10-15 graduate students has been actively involved in the development of computer communication networks for the last five years, and many of its members have been involved in related research for much longer. The overall research goal is to extend network performance to higher data rates, to improve protocol performance at most ISO layers, and to improve network operational performance. We briefly state our research goals, then discuss the research accomplishments and direct attention to the attached and/or published papers, which cover the following topics: scalable parallel communications; high performance interconnection between high data rate networks; and a simple, effective media access protocol for integrated, high data rate networks.

  7. A Process Management System for Networked Manufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Tingting; Wang, Huifen; Liu, Linyan

    With the development of computer, communication and network technologies, networked manufacturing has become one of the main manufacturing paradigms of the 21st century. In a networked manufacturing environment there are a large number of cooperative tasks susceptible to alteration, conflicts over resources, and problems of cost and quality, all of which increase the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes; it supports efficient execution, effective management, conflict resolution, cost containment and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed, the architecture of the system is presented, and a process model considering process cost and quality is developed. Finally, a case study explains how the system runs efficiently.

  8. Time Triggered Protocol (TTP) for Integrated Modular Avionics

    NASA Technical Reports Server (NTRS)

    Motzet, Guenter; Gwaltney, David A.; Bauer, Guenther; Jakovljevic, Mirko; Gagea, Leonard

    2006-01-01

    Traditional avionics computing systems are federated, with each system provided on a number of dedicated hardware units. Federated applications are physically separated from one another and analysis of the systems is undertaken individually. Integrated Modular Avionics (IMA) takes these federated functions and integrates them on a common computing platform: a tightly deterministic distributed real-time network of computing modules in which the different applications can run. IMA supports different levels of criticality in the same computing resource and provides a platform for implementing fault tolerance through hardware and application redundancy. Modular implementation has distinct benefits in design, testing and system maintainability. This paper covers the requirements for fault tolerant bus systems used to provide reliable communication between IMA computing modules. An overview of the Time Triggered Protocol (TTP) specification and implementation as a reliable solution for IMA systems is presented. Application examples in aircraft avionics and a development system for future space application are covered. The commercially available TTP controller can also be implemented in an FPGA, and results from implementation studies are covered. Finally, future directions for the application of TTP and related development activities are presented.

  9. Architecture for hospital information integration

    NASA Astrophysics Data System (ADS)

    Chimiak, William J.; Janariz, Daniel L.; Martinez, Ralph

    1999-07-01

    The integration of hospital information systems (HIS) is ongoing. Data storage systems, data networks and computers improve, databases grow and health-care applications increase. Some computer operating systems continue to evolve and some fade. Health care delivery now depends on this computer-assisted environment. As a result, the critical harmonization of the various hospital information systems becomes increasingly difficult. The purpose of this paper is to present an architecture for HIS integration that is computer-language-neutral and computer-hardware-neutral for the informatics applications. The proposed architecture builds upon the work done at the University of Arizona on middleware, the work of the National Electrical Manufacturers Association, and the American College of Radiology. It is a fresh approach that allows applications engineers to access medical data easily and thus concentrate on the application techniques in which they are expert, without struggling with medical information syntaxes. The HIS can be modeled using a hierarchy of information sub-systems, thus facilitating its understanding. The architecture includes the resulting information model along with a strict but intuitive application programming interface, managed by CORBA. The CORBA requirement facilitates interoperability. It should also reduce software and hardware development times.

  10. Sociospace: A smart social framework based on the IP Multimedia Subsystem

    NASA Astrophysics Data System (ADS)

    Hasswa, Ahmed

    Advances in smart technologies, wireless networking, and increased interest in contextual services have led to the emergence of ubiquitous and pervasive computing as one of the most promising areas of computing in recent years. Smart Spaces, in particular, have gained significant interest within the research community. Currently, most Smart Spaces rely on physical components, such as sensors, to acquire information about the real-world environment. Although current sensor networks can acquire some useful contextual information from the physical environment, their information resources are often limited, and the data acquired is often unreliable. We argue that by introducing social network information into such systems, smarter and more adaptive spaces can be created. Social networks have recently become extremely popular, and are now an integral part of millions of people's daily lives. Through social networks, users create profiles, build relationships, and join groups, forming intermingled sets and communities. Social Networks contain a wealth of information, which, if exploited properly, can lead to a whole new level of smart contextual services. A mechanism is therefore needed to extract data from heterogeneous social networks, to link profiles across different networks, and to aggregate the data obtained. We therefore propose the design and implementation of a Smart Spaces framework that utilizes the social context. In order to manage services and sessions, we integrate our system with the IP Multimedia Subsystem. Our system, which we call SocioSpace, includes full design and implementation of all components, including the central server, the location management system, the social network interfacing system, the service delivery platform, and user agents. We have built a prototype for proof of concept and carried out exhaustive performance analysis; the results show that SocioSpace is scalable, extensible, and fault-tolerant. 
It is capable of creating Smart Spaces that can truly deliver adaptive services that enhance the users' overall experience, increase their satisfaction, and make the surroundings more beneficial and interesting to them.

  11. A principled dimension-reduction method for the population density approach to modeling networks of neurons with synaptic dynamics.

    PubMed

    Ly, Cheng

    2013-10-01

    The population density approach to neural network modeling has been utilized in a variety of contexts. The idea is to group many similar noisy neurons into populations and, rather than simulating individual neurons (i.e., Monte Carlo), track for each population a probability density function giving the proportion of neurons in a particular state. It is commonly used both for analytic insight and as a time-saving computational tool. The main shortcoming of this method is that when realistic attributes are incorporated in the underlying neuron model, the dimension of the probability density function increases, leading to intractable equations or, at best, computationally intensive simulations. Developing principled dimension-reduction methods is thus essential for the robustness of these powerful methods, and a pragmatic reduced model would be of great value to the larger theoretical neuroscience community. For exposition of this method, we consider a single uncoupled population of leaky integrate-and-fire neurons receiving external excitatory synaptic input only. We present a dimension-reduction method that reduces a two-dimensional partial integro-differential equation to a computationally efficient one-dimensional system and gives qualitatively accurate results in both the steady-state and nonequilibrium regimes. The method, termed the modified mean-field method, is based entirely on the governing equations, relies on no auxiliary variables or parameters, and does not require fine-tuning. The principles of the modified mean-field method have potential applicability to more realistic (i.e., higher-dimensional) neural networks.
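
    For contrast with the density approach described above, here is a minimal Monte Carlo simulation of an uncoupled leaky integrate-and-fire population with noisy excitatory drive; its empirical voltage histogram approximates the density that the PDE method would evolve directly. All parameter values are illustrative, not taken from the paper.

```python
import math
import random

# Illustrative parameters (not from the paper): dimensionless voltage in [0, 1).
tau, v_thresh, v_reset = 20.0, 1.0, 0.0      # membrane time constant (ms)
mu, sigma, dt = 0.04, 0.02, 0.1              # drift, noise intensity, step (ms)
n_neurons, t_end = 500, 150.0

rng = random.Random(42)
v = [0.0] * n_neurons
spikes = 0
for _ in range(int(t_end / dt)):
    for i in range(n_neurons):
        # Euler-Maruyama step for a leaky integrator with noisy excitatory drive.
        v[i] += (-v[i] / tau + mu) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        if v[i] >= v_thresh:                 # threshold crossing: spike, then reset
            v[i] = v_reset
            spikes += 1

rate = spikes / (n_neurons * t_end / 1000.0)  # population firing rate (Hz)

# Empirical voltage density over [0, 1): the object the PDE method evolves.
bins = [0] * 20
for vi in v:
    if 0.0 <= vi < 1.0:
        bins[int(vi * 20)] += 1
density = [count / n_neurons for count in bins]
print(f"rate = {rate:.1f} Hz, density mass = {sum(density):.2f}")
```

    The density approach replaces the inner loop over neurons with a single evolution equation for `density`; the paper's contribution is keeping that tractable when synaptic dynamics add a second dimension.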

  12. Combining the Finite Element Method with Structural Connectome-based Analysis for Modeling Neurotrauma: Connectome Neurotrauma Mechanics

    PubMed Central

    Kraft, Reuben H.; Mckee, Phillip Justin; Dagro, Amy M.; Grafton, Scott T.

    2012-01-01

    This article presents the integration of brain injury biomechanics and graph theoretical analysis of neuronal connections, or connectomics, to form a neurocomputational model that captures spatiotemporal characteristics of trauma. We relate localized mechanical brain damage predicted from biofidelic finite element simulations of the human head subjected to impact with degradation in the structural connectome for a single individual. The finite element model incorporates various length scales into the full head simulations by including anisotropic constitutive laws informed by diffusion tensor imaging. Coupling between the finite element analysis and network-based tools is established through experimentally-based cellular injury thresholds for white matter regions. Once edges are degraded, graph theoretical measures are computed on the “damaged” network. For a frontal impact, the simulations predict that the temporal and occipital regions undergo the most axonal strain and strain rate at short times (less than 24 hrs), leading to the initiation of cellular death and to damage that depends on the angle of impact and the underlying microstructure of brain tissue. The monotonic cellular death relationships predict a spatiotemporal change of structural damage. Interestingly, at 96 hrs post-impact, computations predict that no network nodes were completely disconnected from the network, despite significant damage to network edges. At early times, network measures of global and local efficiency were degraded little; however, as time increased to 96 hrs the network properties were significantly reduced. In the future, this computational framework could help inform functional networks from physics-based structural brain biomechanics to obtain not only a biomechanics-based understanding of injury, but also neurophysiological insight. PMID:22915997

  13. From in silico astrocyte cell models to neuron-astrocyte network models: A review.

    PubMed

    Oschmann, Franziska; Berry, Hugues; Obermayer, Klaus; Lenk, Kerstin

    2018-01-01

    The idea that astrocytes may be active partners in synaptic information processing has recently emerged from abundant experimental reports. Because of their spatial proximity to neurons and their bidirectional communication with them, astrocytes are now considered an important third element of the synapse. Astrocytes integrate and process synaptic information and by doing so generate cytosolic calcium signals that are believed to reflect neuronal transmitter release. Moreover, they regulate neuronal information transmission by releasing gliotransmitters into the synaptic cleft, affecting both pre- and postsynaptic receptors. Concurrent with the first experimental reports of the astrocytic impact on neural network dynamics, computational models describing astrocytic functions have been developed. In this review, we give an overview of the published computational models of astrocytic functions, from single-cell dynamics to the tripartite synapse level and network models of astrocytes and neurons. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. A neuronal network model with simplified tonotopicity for tinnitus generation and its relief by sound therapy.

    PubMed

    Nagashino, Hirofumi; Kinouchi, Yohsuke; Danesh, Ali A; Pandya, Abhijit S

    2013-01-01

    Tinnitus is the perception of sound in the ears or in the head where no external source is present. Sound therapy is one of the most effective techniques proposed for tinnitus treatment. In order to investigate mechanisms of tinnitus generation and the clinical effects of sound therapy, we have previously proposed conceptual and computational models with plasticity using a neural oscillator or a neuronal network model. In the present paper, we propose a neuronal network model with simplified tonotopicity of the auditory system as a more detailed structure. In this model an integrate-and-fire neuron model is employed and homeostatic plasticity is incorporated. The computer simulation results show that the model can reproduce the generation of oscillation and its cessation by external input, suggesting that the present framework is promising as a model of tinnitus generation and the effects of sound therapy.
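
    The interaction of integrate-and-fire dynamics and homeostatic plasticity invoked above can be sketched in a single-neuron toy model (not the paper's tonotopic network): a homeostatic rule drives the input gain toward a target firing rate, so losing afferent drive makes the gain creep up, the mechanism often invoked for tinnitus generation, while restored input would reverse the drift. All parameter values are made up for the illustration.

```python
import random

dt, tau, v_th = 1.0, 20.0, 1.0               # step (ms), membrane tau (ms), threshold
target_rate, eta = 0.02, 0.0005              # target spikes/ms, homeostatic rate

def run(steps, drive, gain, v, rate_est, rng):
    """Leaky integrate-and-fire unit with a homeostatically regulated input gain."""
    for _ in range(steps):
        noise = 0.02 * rng.gauss(0, 1)
        v += (-v / tau + gain * (drive + noise)) * dt
        spiked = v >= v_th
        if spiked:
            v = 0.0                           # integrate-and-fire reset
        rate_est += 0.001 * ((1.0 if spiked else 0.0) - rate_est)  # running rate
        gain = max(gain + eta * (target_rate - rate_est), 0.0)     # homeostasis
    return gain, v, rate_est

rng = random.Random(0)
gain, v, rate_est = run(20000, 0.06, 1.0, 0.0, target_rate, rng)   # normal input
g_normal = gain
gain, v, rate_est = run(20000, 0.0, gain, v, rate_est, rng)        # input loss
g_deaf = gain
print(g_normal, g_deaf)   # gain is higher after the loss of input
```

    In the paper's network the analogous runaway gain sustains an oscillation across tonotopically organized neurons, and external (sound therapy) input pulls the homeostatic variables back toward their normal operating point.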

  15. Hybrid expert system for decision supporting in the medical area: complexity and cognitive computing.

    PubMed

    Brasil, L M; de Azevedo, F M; Barreto, J M

    2001-09-01

    This paper proposes a hybrid expert system (HES) to minimise some complexity problems pervasive in artificial intelligence, such as: the knowledge elicitation process, known as the bottleneck of expert systems; the choice of model for knowledge representation to encode human reasoning; the number of neurons in the hidden layer and the topology used in the connectionist approach; and the difficulty of explaining how the network arrived at a conclusion. Two algorithms applied in developing the HES are also suggested: one is used to train the fuzzy neural network and the other to obtain explanations of how the fuzzy neural network reached a conclusion. To overcome these difficulties, cognitive computing was integrated into the developed system. A case study (epileptic crisis) is presented with the problem definition and simulations, and results are discussed.

  16. A DNA-based molecular motor that can navigate a network of tracks

    NASA Astrophysics Data System (ADS)

    Wickham, Shelley F. J.; Bath, Jonathan; Katsuda, Yousuke; Endo, Masayuki; Hidaka, Kumi; Sugiyama, Hiroshi; Turberfield, Andrew J.

    2012-03-01

    Synthetic molecular motors can be fuelled by the hydrolysis or hybridization of DNA. Such motors can move autonomously and programmably, and long-range transport has been observed on linear tracks. It has also been shown that DNA systems can compute. Here, we report a synthetic DNA-based system that integrates long-range transport and information processing. We show that the path of a motor through a network of tracks containing four possible routes can be programmed using instructions that are added externally or carried by the motor itself. When external control is used we find that 87% of the motors follow the correct path, and when internal control is used 71% of the motors follow the correct path. Programmable motion will allow the development of computing networks, molecular systems that can sort and process cargoes according to instructions that they carry, and assembly lines that can be reconfigured dynamically in response to changing demands.
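
    The routing behaviour reported above can be abstracted (purely as an illustration, not the published chemistry) as a motor traversing two consecutive binary junctions, so a two-bit instruction selects one of four routes. If the reported 87% end-to-end success under external control arose from two independent junction decisions, the implied per-junction fidelity would be roughly sqrt(0.87) ≈ 0.93, which is the assumption used below.

```python
import random

def run_motor(instruction, fidelity, rng):
    """Traverse two binary junctions; each instruction bit is obeyed with
    probability `fidelity`, otherwise the motor takes the wrong branch."""
    return tuple(bit if rng.random() < fidelity else 1 - bit
                 for bit in instruction)

rng = random.Random(0)
instruction = (1, 0)                  # selects one of the four routes
trials = 10000
correct = sum(run_motor(instruction, 0.93, rng) == instruction
              for _ in range(trials))
print(correct / trials)               # close to 0.93 ** 2, i.e. about 0.865
```

    The same abstraction explains why internal (cargo-carried) control, with a lower per-junction fidelity, yields the lower 71% overall success: errors compound multiplicatively along the path.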

  17. Computer simulation models as a tool to investigate the role of microRNAs in osteoarthritis

    PubMed Central

    Smith, Graham R.

    2017-01-01

    The aim of this study was to show how computational models can be used to increase our understanding of the role of microRNAs in osteoarthritis (OA) using miR-140 as an example. Bioinformatics analysis and experimental results from the literature were used to create and calibrate models of gene regulatory networks in OA involving miR-140 along with key regulators such as NF-κB, SMAD3, and RUNX2. The individual models were created with the modelling standard, Systems Biology Markup Language, and integrated to examine the overall effect of miR-140 on cartilage homeostasis. Down-regulation of miR-140 may have either detrimental or protective effects for cartilage, indicating that the role of miR-140 is complex. Studies of individual networks in isolation may therefore lead to different conclusions. This indicated the need to combine the five chosen individual networks involving miR-140 into an integrated model. This model suggests that the overall effect of miR-140 is to change the response to an IL-1 stimulus from a prolonged increase in matrix degrading enzymes to a pulse-like response so that cartilage degradation is temporary. Our current model can easily be modified and extended as more experimental data become available about the role of miR-140 in OA. In addition, networks of other microRNAs that are important in OA could be incorporated. A fully integrated model could not only aid our understanding of the mechanisms of microRNAs in ageing cartilage but could also provide a useful tool to investigate the effect of potential interventions to prevent cartilage loss. PMID:29095952
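
    The pulse-versus-sustained distinction described above is characteristic of an incoherent feed-forward/negative-feedback circuit. The sketch below is an illustrative two-variable ODE model, not the paper's calibrated SBML networks: a step "IL-1" stimulus drives enzyme production directly and also induces a repressor standing in for miR-140-dependent feedback, so without the repressor the enzyme stays elevated while with it the response is a transient pulse. All rate constants are invented.

```python
def simulate(feedback, t_end=100.0, dt=0.01):
    """Forward-Euler integration of a toy stimulus -> enzyme circuit,
    optionally with a slower stimulus-induced repressor of production."""
    enzyme, repressor = 0.0, 0.0
    trace = []
    for n in range(int(t_end / dt)):
        stimulus = 1.0 if n * dt > 10.0 else 0.0      # "IL-1" switched on at t = 10
        d_enzyme = stimulus / (1.0 + 5.0 * repressor) - 0.2 * enzyme
        d_repressor = (stimulus if feedback else 0.0) * 0.1 - 0.05 * repressor
        enzyme += d_enzyme * dt
        repressor += d_repressor * dt
        trace.append(enzyme)
    return trace

sustained = simulate(feedback=False)
pulsed = simulate(feedback=True)
print(max(pulsed), pulsed[-1], sustained[-1])   # the feedback case peaks, then decays
```

    Because the repressor accumulates more slowly than the enzyme responds, the enzyme overshoots and then settles far below its feedback-free steady state, which is the qualitative "temporary degradation" behaviour the integrated miR-140 model predicts.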

  18. eLoom and Flatland: specification, simulation and visualization engines for the study of arbitrary hierarchical neural architectures.

    PubMed

    Caudell, Thomas P; Xiao, Yunhai; Healy, Michael J

    2003-01-01

    eLoom is an open source graph simulation software tool, developed at the University of New Mexico (UNM), that enables users to specify and simulate neural network models. Its specification language and libraries enable users to construct and simulate arbitrary, potentially hierarchical network structures on serial and parallel processing systems. In addition, eLoom is integrated with UNM's Flatland, an open source virtual environments development tool, to provide real-time visualizations of the network structure and activity. Visualization is a useful method for understanding both learning and computation in artificial neural networks. Through 3D animated pictorial representations of the state and flow of information in the network, a better understanding of network functionality is achieved. ART-1, LAPART-II, MLP, and SOM neural networks are presented to illustrate eLoom and Flatland's capabilities.

  19. Interfacing with in-Situ Data Networks during the Arctic Boreal Vulnerability Experiment (ABoVE)

    NASA Astrophysics Data System (ADS)

    McInerney, M.; Griffith, P. C.; Duffy, D.; Hoy, E.; Schnase, J. L.; Sinno, S.; Thompson, J. H.

    2014-12-01

    The Arctic Boreal Vulnerability Experiment (ABoVE) is designed to improve understanding of the causes and impacts of ecological changes in Arctic/boreal regions, and will integrate field-based studies, modeling, and data from airborne and satellite remote sensing. ABoVE will result in a fuller understanding of ecosystem vulnerability and resilience to environmental change in the Arctic and boreal regions of western North America, and provide scientific information required to develop options for societal responses to the impacts of these changes. The studies sponsored by NASA during ABoVE will be coordinated with research and in-situ monitoring activities being sponsored by a number of national and international partners. The NASA Center for Climate Simulation at the Goddard Space Flight Center has partnered with the NASA Carbon Cycle & Ecosystems Office to create a science cloud designed for this field campaign - the ABoVE Science Cloud (ASC). The ASC combines high performance computing with emerging technologies to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage with integrated data management, and integration of core variables from in-situ networks identified by the ABoVE Science Definition Team. In this talk, we will present the scientific requirements driving the development of the ABoVE Science Cloud, discuss the necessary interfaces, both computational and human, with in-situ monitoring networks, and show examples of how the ASC is being used to meet the needs of the ABoVE campaign.

  20. [Exploiture and application of an internet-based Computation Platform for Integrative Pharmacology of Traditional Chinese Medicine].

    PubMed

    Xu, Hai-Yu; Liu, Zhen-Ming; Fu, Yan; Zhang, Yan-Qiong; Yu, Jian-Jun; Guo, Fei-Fei; Tang, Shi-Huan; Lv, Chuan-Yu; Su, Jin; Cui, Ru-Yi; Yang, Hong-Jun

    2017-09-01

    Recently, integrative pharmacology (IP) has become a pivotal paradigm for the modernization of traditional Chinese medicine (TCM) and combinatorial drug discovery. It is an interdisciplinary science that establishes the in vitro and in vivo correlation between the absorption, distribution, metabolism, and excretion/pharmacokinetic (ADME/PK) profiles of TCM and the molecular networks of disease by integrating knowledge across multiple disciplines and stages. In the present study, an internet-based Computation Platform for IP of TCM (TCM-IP, www.tcmip.cn) has been established to promote the development of this emerging discipline. TCM big data are an important resource for TCM-IP, including the Chinese Medicine Formula Database, Chinese Medical Herbs Database, Chemical Database of Chinese Medicine, and Target Database for Diseases and Symptoms, among others. Data mining and bioinformatics approaches are the platform's critical technologies, including identification of TCM constituents, ADME prediction, target prediction for TCM constituents, and network construction and analysis. Furthermore, network beautification and individualized design are employed to meet users' requirements. We firmly believe that TCM-IP is a very useful tool for identifying the active constituents of TCM and their potential molecular mechanisms of therapeutic action, and that it can be widely applied in quality evaluation, clinical repositioning, scientific discovery based on original thinking, prescription compatibility, and new TCM drug development. Copyright© by the Chinese Pharmaceutical Association.

  1. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    PubMed

    Li, Qian; Li, Xudong; Li, Canghai; Chen, Lirong; Song, Jun; Tang, Yalin; Xu, Xiaojie

    2011-03-22

Traditional virtual screening focuses on the predicted binding affinity between a drug molecule and a target related to a certain disease rather than on phenotypic data of the drug molecule against the disease system, and is therefore often less effective for discovering drugs used to treat complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods for computationally estimating the overall efficacy of a compound in a complex disease system are needed, given the distinct weights of the different targets in a biological process and the view that partial inhibition of several targets can be more efficient than complete inhibition of a single target. We developed a novel approach that integrates affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. Network efficiency calculations for the human clotting cascade identified factor Xa and thrombin as the two most fragile enzymes, while the catalytic reaction mediated by the complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological events in the human clotting cascade system. Furthermore, the method combining network efficiency with molecular docking scores was applied to estimate the anticoagulant activities of a series of argatroban intermediates and eight natural products, respectively. The good correlation (r = 0.671) between the experimental data and the decrease in network efficiency suggests that the approach could be a promising computational systems biology tool to aid identification of the anticoagulant activities of compounds in drug discovery. 
This article proposes a network-based multi-target computational estimation method for the anticoagulant activities of compounds, combining network efficiency analysis with scoring functions from molecular docking.
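The fragility analysis the abstract describes (ranking nodes by how much their removal degrades network efficiency) can be sketched roughly as follows. This is a toy illustration, not the authors' implementation: the graph is hypothetical, and efficiency is taken as the Latora-Marchiori global efficiency (average of 1/d over ordered node pairs).

```python
from collections import deque

def global_efficiency(adj):
    """Average of 1/d(i,j) over all ordered node pairs (Latora-Marchiori)."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = 0.0
    for src in nodes:
        # BFS shortest-path lengths from src over the unweighted graph
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for d in dist.values() if d > 0)
    return total / (n * (n - 1))

def fragility_ranking(adj):
    """Rank nodes by the efficiency drop caused by deleting each one."""
    base = global_efficiency(adj)
    drops = {}
    for node in adj:
        sub = {u: [v for v in nbrs if v != node]
               for u, nbrs in adj.items() if u != node}
        drops[node] = base - global_efficiency(sub)
    return sorted(drops.items(), key=lambda kv: -kv[1])

# Hypothetical undirected graph: "hub" stands in for a central enzyme.
g = {"A": ["hub"], "B": ["hub"], "hub": ["A", "B", "C"],
     "C": ["hub", "D"], "D": ["C"]}
print(fragility_ranking(g)[0][0])  # the hub ranks as most fragile
```

In the paper this kind of node-deletion analysis, applied to the clotting-cascade network, is what singles out factor Xa and thrombin as the most fragile enzymes.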

  2. A Network-Based Multi-Target Computational Estimation Scheme for Anticoagulant Activities of Compounds

    PubMed Central

    Li, Canghai; Chen, Lirong; Song, Jun; Tang, Yalin; Xu, Xiaojie

    2011-01-01

Background Traditional virtual screening focuses on the predicted binding affinity between a drug molecule and a target related to a certain disease rather than on phenotypic data of the drug molecule against the disease system, and is therefore often less effective for discovering drugs used to treat complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods for computationally estimating the overall efficacy of a compound in a complex disease system are needed, given the distinct weights of the different targets in a biological process and the view that partial inhibition of several targets can be more efficient than complete inhibition of a single target. Methodology We developed a novel approach that integrates affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. Network efficiency calculations for the human clotting cascade identified factor Xa and thrombin as the two most fragile enzymes, while the catalytic reaction mediated by the complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological events in the human clotting cascade system. Furthermore, the method combining network efficiency with molecular docking scores was applied to estimate the anticoagulant activities of a series of argatroban intermediates and eight natural products, respectively. The good correlation (r = 0.671) between the experimental data and the decrease in network efficiency suggests that the approach could be a promising computational systems biology tool to aid identification of the anticoagulant activities of compounds in drug discovery. 
Conclusions This article proposes a network-based multi-target computational estimation method for the anticoagulant activities of compounds, combining network efficiency analysis with scoring functions from molecular docking. PMID:21445339

  3. Integrative models of vascular remodeling during tumor growth

    PubMed Central

    Rieger, Heiko; Welter, Michael

    2015-01-01

Malignant solid tumors recruit the blood vessel network of the host tissue for nutrient supply, continuous growth, and gain of metastatic potential. Angiogenesis (the formation of new blood vessels), vessel cooption (the integration of existing blood vessels into the tumor vasculature), and vessel regression remodel the healthy vascular network into a tumor-specific vasculature that differs in many respects from the hierarchically organized arterio-venous blood vessel network of the host tissue. Integrative models based on detailed experimental data and physical laws implement in silico the complex interplay of molecular pathways, cell proliferation, migration, and death, the tissue microenvironment, mechanical and hydrodynamic forces, and the fine structure of the host tissue vasculature. With the help of computer simulations, high-precision information about blood flow patterns, interstitial fluid flow, drug distribution, and oxygen and nutrient distribution can be obtained, and a plethora of therapeutic protocols can be tested before clinical trials. In this review, we give an overview of the current status of integrative models describing tumor growth, vascular remodeling, blood and interstitial fluid flow, drug delivery, and concomitant transformations of the microenvironment. © 2015 The Authors. WIREs Systems Biology and Medicine published by Wiley Periodicals, Inc. PMID:25808551

  4. Identification of functional modules using network topology and high-throughput data.

    PubMed

    Ulitsky, Igor; Shamir, Ron

    2007-01-26

With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by simultaneously accounting for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it holds promise to be highly useful in the analysis of high-throughput data.
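The two-step idea in this abstract (convert expression profiles to pairwise similarity, then search for connected subnetworks of high similarity) can be sketched with a greedy seed-and-extend heuristic. This is a simplified stand-in, not the authors' algorithm; the gene names, profiles, and the 0.7 admission threshold are all hypothetical.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def grow_module(seed, adj, expr, min_sim=0.7):
    """Greedily extend a connected subnetwork from a seed gene, admitting a
    neighbor only if its mean similarity to current members stays high."""
    module = {seed}
    frontier = set(adj[seed])
    while frontier:
        cand = frontier.pop()
        sims = [pearson(expr[cand], expr[m]) for m in module]
        if sum(sims) / len(sims) >= min_sim:
            module.add(cand)
            frontier |= set(adj[cand]) - module
    return module

# Hypothetical 4-gene network with expression over 4 conditions.
adj = {"g1": ["g2", "g3"], "g2": ["g1"], "g3": ["g1", "g4"], "g4": ["g3"]}
expr = {"g1": [1, 2, 3, 4],
        "g2": [2, 4, 6, 8],            # co-expressed with g1
        "g3": [1.1, 2.2, 2.9, 4.1],    # also similar
        "g4": [4, 1, 3, 2]}            # unrelated; should be excluded
print(sorted(grow_module("g1", adj, expr)))
```

The connectivity constraint is what distinguishes this from plain clustering: g4 is rejected even though it is adjacent to the module, because its expression does not track the members'.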

  5. Systematic identification of an integrative network module during senescence from time-series gene expression.

    PubMed

    Park, Chihyun; Yun, So Jeong; Ryu, Sung Jin; Lee, Soyoung; Lee, Young-Sam; Yoon, Youngmi; Park, Sang Chul

    2017-03-15

Cellular senescence irreversibly arrests the growth of human diploid cells. Recent studies have indicated that senescence is a multi-step, evolving process related to important and complex biological processes. Most studies have analyzed only the genes and functions representing each senescence phase, without considering gene-level interactions and continuously perturbed genes. It is necessary to reveal the genotypic mechanism, inferred from the affected genes and their interactions, underlying the senescence process. We propose a novel computational approach to identify an integrative network that profiles the genotypic signature underlying time-series gene expression data. The most perturbed genes were selected at each time point based on a proposed scoring measure, termed the perturbation score. The selected genes were then integrated with protein-protein interactions to construct time-point-specific networks. From these networks, edges conserved across time points were extracted to form the common network, and a statistical test was performed to demonstrate that this network could explain the phenotypic alteration. As a result, we confirmed that the difference in average perturbation scores of the common network between the two time points could explain the phenotypic alteration. Functional enrichment analysis of the common network likewise showed a high association with the phenotypic alteration. Remarkably, the identified cell-cycle-specific common network played an important role in replicative senescence as a key regulator. Until now, network analysis of time-series gene expression data has focused on how topological structure changes over time; in contrast, we focused on a structure that is conserved while its context changes over time, and showed that it can explain the phenotypic changes. 
We expect the proposed method to help elucidate biological mechanisms not revealed by existing approaches.
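The pipeline described here (score perturbation per time point, keep PPI edges between highly perturbed genes, then intersect edges across time points) can be sketched in a few lines. This is a simplified stand-in for the paper's method: the perturbation score is replaced by an absolute z-score against a baseline, and all genes, edges, and cutoffs are hypothetical.

```python
import statistics

def perturbation_scores(baseline, sample):
    """Stand-in perturbation score: absolute z-score of each gene's
    expression against the baseline distribution."""
    mu = statistics.mean(baseline.values())
    sd = statistics.pstdev(baseline.values()) or 1.0
    return {g: abs((v - mu) / sd) for g, v in sample.items()}

def timepoint_network(scores, ppi_edges, cutoff=1.0):
    """Keep PPI edges whose endpoints both exceed the score cutoff."""
    hot = {g for g, s in scores.items() if s >= cutoff}
    return {e for e in ppi_edges if e[0] in hot and e[1] in hot}

def common_network(networks):
    """Edges conserved across every time-point-specific network."""
    common = set(networks[0])
    for net in networks[1:]:
        common &= set(net)
    return common

# Hypothetical baseline, two time points, and a small PPI edge list.
baseline = {"a": 1.0, "b": 1.0, "c": 1.0, "d": 1.0}
t1 = {"a": 5.0, "b": 5.0, "c": 1.0, "d": 5.0}
t2 = {"a": 5.0, "b": 5.0, "c": 5.0, "d": 1.0}
ppi = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "d")]
nets = [timepoint_network(perturbation_scores(baseline, t), ppi) for t in (t1, t2)]
conserved = common_network(nets)
print(conserved)
```

The edge (a, b) survives both time points and would form the "common network"; edges involving the transiently perturbed genes c and d drop out.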

  6. Increased signaling entropy in cancer requires the scale-free property of protein interaction networks.

    PubMed

    Teschendorff, Andrew E; Banerji, Christopher R S; Severini, Simone; Kuehn, Reimer; Sollich, Peter

    2015-04-28

One of the key characteristics of cancer cells is an increased phenotypic plasticity, driven by underlying genetic and epigenetic perturbations. However, at the systems level it is unclear how these perturbations give rise to the observed increase in plasticity. Elucidating such systems-level principles is key to an improved understanding of cancer. Recently, it has been shown that signaling entropy, an overall measure of signaling pathway promiscuity computable by integrating a sample's gene expression profile with a protein interaction network, correlates with phenotypic plasticity and is increased in cancer compared to normal tissue. Here we develop a computational framework for studying the effects of network perturbations on signaling entropy. We demonstrate that the increased signaling entropy of cancer is driven by two factors: (i) the scale-free (or near scale-free) topology of the interaction network, and (ii) a subtle positive correlation between differential gene expression and node connectivity. Indeed, we show that if protein interaction networks were random graphs described by Poisson degree distributions, cancer would generally not exhibit an increased signaling entropy. In summary, this work exposes a deep connection between cancer, signaling entropy, and interaction network topology.
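The notion of signaling entropy can be illustrated with a heavily simplified sketch: weight each node's outgoing "signaling" probabilities by its neighbors' expression, take the Shannon entropy of that distribution, and average. The paper uses a proper entropy rate over the full network; this unweighted mean, and the toy star graph below, are assumptions for illustration only.

```python
from math import log

def local_entropy(node, adj, expr):
    """Shannon entropy of a node's signaling distribution, where the
    probability of signaling to neighbor j is proportional to expr[j]."""
    weights = [expr[v] for v in adj[node]]
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    return -sum(p * log(p) for p in probs)

def signaling_entropy(adj, expr):
    """Mean local entropy (a simplified stand-in for the entropy rate)."""
    ents = [local_entropy(v, adj, expr) for v in adj if adj[v]]
    return sum(ents) / len(ents)

# Toy star network: uniform expression maximizes promiscuity at the hub,
# while skewed expression channels signaling into one dominant edge.
adj = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
uniform = {"hub": 1.0, "a": 1.0, "b": 1.0, "c": 1.0}
skewed = {"hub": 1.0, "a": 100.0, "b": 1.0, "c": 1.0}
print(signaling_entropy(adj, uniform) > signaling_entropy(adj, skewed))
```

The comparison captures the intuition behind the measure: promiscuous, evenly spread signaling yields higher entropy than signaling committed to a single route.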

  7. Increased signaling entropy in cancer requires the scale-free property of protein interaction networks

    PubMed Central

    Teschendorff, Andrew E.; Banerji, Christopher R. S.; Severini, Simone; Kuehn, Reimer; Sollich, Peter

    2015-01-01

One of the key characteristics of cancer cells is an increased phenotypic plasticity, driven by underlying genetic and epigenetic perturbations. However, at the systems level it is unclear how these perturbations give rise to the observed increase in plasticity. Elucidating such systems-level principles is key to an improved understanding of cancer. Recently, it has been shown that signaling entropy, an overall measure of signaling pathway promiscuity computable by integrating a sample's gene expression profile with a protein interaction network, correlates with phenotypic plasticity and is increased in cancer compared to normal tissue. Here we develop a computational framework for studying the effects of network perturbations on signaling entropy. We demonstrate that the increased signaling entropy of cancer is driven by two factors: (i) the scale-free (or near scale-free) topology of the interaction network, and (ii) a subtle positive correlation between differential gene expression and node connectivity. Indeed, we show that if protein interaction networks were random graphs described by Poisson degree distributions, cancer would generally not exhibit an increased signaling entropy. In summary, this work exposes a deep connection between cancer, signaling entropy, and interaction network topology. PMID:25919796

  8. End-to-end performance measurement of Internet based medical applications.

    PubMed

    Dev, P; Harris, D; Gutierrez, D; Shah, A; Senger, S

    2002-01-01

    We present a method to obtain an end-to-end characterization of the performance of an application over a network. This method is not dependent on any specific application or type of network. The method requires characterization of network parameters, such as latency and packet loss, between the expected server or client endpoints, as well as characterization of the application's constraints on these parameters. A subjective metric is presented that integrates these characterizations and that operates over a wide range of applications and networks. We believe that this method may be of wide applicability as research and educational applications increasingly make use of computation and data servers that are distributed over the Internet.

  9. Chaos in a neural network circuit

    NASA Astrophysics Data System (ADS)

Kepler, Thomas B.; Datt, Sumeet; Meyer, Robert B.; Abbott, L. F.

    1990-12-01

We have constructed a neural network circuit of four clipped, high-gain, integrating operational amplifiers coupled to each other through an array of digitally programmable resistor ladders (MDACs). In addition to fixed-point and cyclic behavior, the circuit exhibits chaotic behavior with complex strange attractors which are approached through period doubling, intermittent attractor expansion, and/or quasiperiodic pathways. Couplings between the nonlinear circuit elements are controlled by a computer which can automatically search through the space of couplings for interesting phenomena. We report some initial statistical results relating the behavior of the network to properties of its coupling matrix. Through these results and further research the circuit should help resolve fundamental issues concerning chaos in neural networks.

  10. A hybrid linear/nonlinear training algorithm for feedforward neural networks.

    PubMed

    McLoone, S; Brown, M D; Irwin, G; Lightbody, A

    1998-01-01

    This paper presents a new hybrid optimization strategy for training feedforward neural networks. The algorithm combines gradient-based optimization of nonlinear weights with singular value decomposition (SVD) computation of linear weights in one integrated routine. It is described for the multilayer perceptron (MLP) and radial basis function (RBF) networks and then extended to the local model network (LMN), a new feedforward structure in which a global nonlinear model is constructed from a set of locally valid submodels. Simulation results are presented demonstrating the superiority of the new hybrid training scheme compared to second-order gradient methods. It is particularly effective for the LMN architecture where the linear to nonlinear parameter ratio is large.
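The hybrid idea in this abstract (optimize the nonlinear parameters iteratively while solving the linear output weights exactly at each step) can be sketched for a tiny two-basis RBF network. This is an illustration under stated assumptions, not the paper's algorithm: a derivative-free coordinate search replaces the gradient step, and the normal equations (specialized to two columns) replace the SVD.

```python
from math import exp

def rbf_hidden(xs, centers, width=1.0):
    """Hidden-layer activations of a Gaussian RBF network."""
    return [[exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centers]
            for x in xs]

def solve_linear_weights(H, y):
    """Exact least-squares output weights for a 2-basis network via the
    normal equations (the role the SVD plays in the paper)."""
    a = sum(h[0] * h[0] for h in H); b = sum(h[0] * h[1] for h in H)
    d = sum(h[1] * h[1] for h in H)
    p = sum(h[0] * t for h, t in zip(H, y))
    q = sum(h[1] * t for h, t in zip(H, y))
    det = a * d - b * b
    return [(d * p - b * q) / det, (a * q - b * p) / det]

def sse(xs, y, centers, w):
    """Sum of squared errors of the network on the training set."""
    H = rbf_hidden(xs, centers)
    return sum((w[0] * h[0] + w[1] * h[1] - t) ** 2 for h, t in zip(H, y))

# Hybrid loop: crude search on the nonlinear centers, exact linear solve
# for the output weights at every step (hypothetical toy problem).
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
y = [exp(-x) for x in xs]          # toy target function
centers = [0.3, 1.7]
for _ in range(20):
    w = solve_linear_weights(rbf_hidden(xs, centers), y)
    for i in range(2):
        for step in (0.05, -0.05):
            trial = centers[:]; trial[i] += step
            wt = solve_linear_weights(rbf_hidden(xs, trial), y)
            if sse(xs, y, trial, wt) < sse(xs, y, centers, w):
                centers, w = trial, wt
final = sse(xs, y, centers, solve_linear_weights(rbf_hidden(xs, centers), y))
```

Because the linear subproblem is solved exactly at every candidate, the search only has to explore the (much smaller) nonlinear parameter space, which is the source of the speedup the paper reports.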

  11. Display integration for ground combat vehicles

    NASA Astrophysics Data System (ADS)

    Busse, David J.

    1998-09-01

The United States Army's requirement to employ high resolution target acquisition sensors and information warfare to increase its dominance over enemy forces has led to the need to integrate advanced display devices into ground combat vehicle crew stations. The Army's force structure requires the integration of advanced displays on both existing and emerging ground combat vehicle systems. The fielding of second generation target acquisition sensors, color digital terrain maps, and high volume digital command and control information networks on these platforms defines display performance requirements. The greatest challenge facing the system integrator is the development and integration of advanced displays that meet operational, vehicle, and human computer interface performance requirements for the ground combat vehicle fleet. This paper addresses those challenges: operational and vehicle performance, non-soldier-centric crew station configurations, display performance limitations related to human computer interfaces and vehicle physical environments, display technology limitations, and the Department of Defense (DOD) acquisition reform initiatives. How the ground combat vehicle Program Manager and system integrator are addressing these challenges is discussed through the integration of displays on fielded, current, and future close combat vehicle applications.

  12. The effective application of a discrete transition model to explore cell-cycle regulation in yeast

    PubMed Central

    2013-01-01

    Background Bench biologists often do not take part in the development of computational models for their systems, and therefore, they frequently employ them as “black-boxes”. Our aim was to construct and test a model that does not depend on the availability of quantitative data, and can be directly used without a need for intensive computational background. Results We present a discrete transition model. We used cell-cycle in budding yeast as a paradigm for a complex network, demonstrating phenomena such as sequential protein expression and activity, and cell-cycle oscillation. The structure of the network was validated by its response to computational perturbations such as mutations, and its response to mating-pheromone or nitrogen depletion. The model has a strong predicative capability, demonstrating how the activity of a specific transcription factor, Hcm1, is regulated, and what determines commitment of cells to enter and complete the cell-cycle. Conclusion The model presented herein is intuitive, yet is expressive enough to elucidate the intrinsic structure and qualitative behavior of large and complex regulatory networks. Moreover our model allowed us to examine multiple hypotheses in a simple and intuitive manner, giving rise to testable predictions. This methodology can be easily integrated as a useful approach for the study of networks, enriching experimental biology with computational insights. PMID:23915717

  13. Distributed wireless sensing for methane leak detection technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Klein, Levente J.; van Kessel, Theodore

Large scale environmental monitoring requires dynamic optimization of data transmission, power management, and distribution of the computational load. In this work, we demonstrate the use of a wireless sensor network for detection of chemical leaks on oil and gas well pads. The sensor network consists of chemi-resistive and wind sensors; it aggregates all the data and transmits it to the cloud for further analytics processing. The sensor network data is integrated with an inversion model to identify leak location and quantify leak rates. We characterize the sensitivity and accuracy of such a system under multiple well-controlled methane release experiments. It is demonstrated that even a 1-hour measurement with 10 sensors localizes leaks to within 1 m and determines the leak rate with an accuracy of 40%. This integrated sensing and analytics solution is currently being refined into a robust system for long-term remote monitoring of methane leaks, generation of alarms, and tracking of regulatory compliance.
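The inversion step mentioned in the abstract (fitting a leak location and rate to sensor readings) can be sketched as a grid search against a forward dispersion model. Everything below is hypothetical: the toy forward model (inverse-square decay attenuated off the downwind axis) merely stands in for a real plume model, and the sensor layout and source are invented.

```python
from math import hypot, exp

def predicted(sensor, source, rate, wind_dir):
    """Toy forward model: concentration decays with distance and is
    attenuated away from the downwind axis (stand-in for a plume model)."""
    dx, dy = sensor[0] - source[0], sensor[1] - source[1]
    d = hypot(dx, dy) or 1e-6
    downwind = (dx * wind_dir[0] + dy * wind_dir[1]) / d  # cos(angle)
    return rate / d ** 2 * exp(-2 * (1 - downwind))

def invert(readings, wind_dir, grid, rates):
    """Grid search over candidate source locations and leak rates,
    minimizing squared residuals against the sensor readings."""
    best, best_err = None, float("inf")
    for src in grid:
        for rate in rates:
            err = sum((predicted(pos, src, rate, wind_dir) - c) ** 2
                      for pos, c in readings)
            if err < best_err:
                best, best_err = (src, rate), err
    return best

# Hypothetical setup: true source at (2, 3), rate 5, wind blowing +x.
wind = (1.0, 0.0)
true_src, true_rate = (2, 3), 5.0
sensors = [(5, 3), (8, 3), (5, 6), (2, 8)]
readings = [(p, predicted(p, true_src, true_rate, wind)) for p in sensors]
grid = [(x, y) for x in range(6) for y in range(6)]
est = invert(readings, wind, grid, [2.5, 5.0, 7.5])
print(est)
```

With noiseless synthetic readings the search recovers the true source and rate exactly; with real sensor noise the residual surface flattens, which is why the paper quotes a 1 m localization and a 40% rate accuracy rather than an exact fit.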

  14. Distributed wireless sensing for fugitive methane leak detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Levente J.; van Kessel, Theodore; Nair, Dhruv

Large scale environmental monitoring requires dynamic optimization of data transmission, power management, and distribution of the computational load. In this work, we demonstrate the use of a wireless sensor network for detection of chemical leaks on oil and gas well pads. The sensor network consists of chemi-resistive and wind sensors; it aggregates all the data and transmits it to the cloud for further analytics processing. The sensor network data is integrated with an inversion model to identify leak location and quantify leak rates. We characterize the sensitivity and accuracy of such a system under multiple well-controlled methane release experiments. It is demonstrated that even a 1-hour measurement with 10 sensors localizes leaks to within 1 m and determines the leak rate with an accuracy of 40%. This integrated sensing and analytics solution is currently being refined into a robust system for long-term remote monitoring of methane leaks, generation of alarms, and tracking of regulatory compliance.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J.P.; Bangs, A.L.; Butler, P.L.

Hetero Helix is a programming environment which simulates shared memory on a heterogeneous network of distributed-memory computers. The machines in the network may vary with respect to their native operating systems and internal representation of numbers. Hetero Helix presents a simple programming model to developers, and also considers the needs of designers, system integrators, and maintainers. The key software technology underlying Hetero Helix is the use of a "compiler" which analyzes the data structures in shared memory and automatically generates code that translates data representations from the format native to each machine into a common format, and vice versa. The design of Hetero Helix was motivated in particular by the requirements of robotics applications. Hetero Helix has been used successfully in an integration effort involving 27 CPUs in a heterogeneous network and a body of software totaling roughly 100,000 lines of code. 25 refs., 6 figs.
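The representation-translation code that Hetero Helix's "compiler" generates automatically can be illustrated with Python's struct module: each machine converts its native values into an agreed wire format (here big-endian, a common convention) and back. The record layout is a hypothetical example.

```python
import struct

# Agreed "network" layout for one shared record: an int32 and a float64,
# big-endian regardless of each machine's native byte order.
WIRE = ">id"

def to_wire(count, value):
    """Translate native values into the common wire representation."""
    return struct.pack(WIRE, count, value)

def from_wire(buf):
    """Translate the common representation back into native values."""
    return struct.unpack(WIRE, buf)

blob = to_wire(42, 3.14)
count, value = from_wire(blob)
```

Because both directions are derived mechanically from the one layout string, a little-endian and a big-endian machine sharing `blob` will reconstruct identical values, which is exactly the property the generated translation code provides.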

  16. Distributed wireless sensing for fugitive methane leak detection

    DOE PAGES

    Klein, Levente J.; van Kessel, Theodore; Nair, Dhruv; ...

    2017-12-11

    Large scale environmental monitoring requires dynamic optimization of data transmission, power management, and distribution of the computational load. In this work, we demonstrate the use of a wireless sensor network for detection of chemical leaks on gas oil well pads. The sensor network consist of chemi-resistive and wind sensors and aggregates all the data and transmits it to the cloud for further analytics processing. The sensor network data is integrated with an inversion model to identify leak location and quantify leak rates. We characterize the sensitivity and accuracy of such system under multiple well controlled methane release experiments. It ismore » demonstrated that even 1 hour measurement with 10 sensors localizes leaks within 1 m and determines leak rate with an accuracy of 40%. This integrated sensing and analytics solution is currently refined to be a robust system for long term remote monitoring of methane leaks, generation of alarms, and tracking regulatory compliance.« less

  17. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auld, Joshua; Hope, Michael; Ley, Hubert

This paper discusses the development of an agent-based modeling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply, and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, an interprocess exchange engine, and a memory allocator, as well as a number of ancillary utilities: a visualization library, a database IO library, and a scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows several aspects of the transportation system that are typically handled by separate stand-alone software applications to be modeled in a high-performance and extensible manner. Integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing issue for transportation modelers, and the integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process for handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on network events such as accidents, congestion, and weather show the potential of the system.
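The discrete event engine at the core of such a kit can be sketched in a few lines with a priority queue: events execute in timestamp order, and handlers may schedule further events. This is a minimal single-threaded illustration of the pattern, not POLARIS's parallel engine; the traveler example is hypothetical.

```python
import heapq
import itertools

class EventEngine:
    """Minimal discrete event engine: events fire in timestamp order,
    and handlers may schedule further events."""
    def __init__(self):
        self.queue = []
        self.now = 0.0
        self.counter = itertools.count()  # tie-breaker for equal times

    def schedule(self, delay, handler, *args):
        heapq.heappush(self.queue,
                       (self.now + delay, next(self.counter), handler, args))

    def run(self, until=float("inf")):
        while self.queue and self.queue[0][0] <= until:
            self.now, _, handler, args = heapq.heappop(self.queue)
            handler(self, *args)

# Hypothetical agent: a traveler departs, then arrives 15 minutes later.
log = []
def depart(engine, who):
    log.append((engine.now, who, "depart"))
    engine.schedule(15, arrive, who)
def arrive(engine, who):
    log.append((engine.now, who, "arrive"))

eng = EventEngine()
eng.schedule(5, depart, "traveler-1")
eng.run()
print(log)
```

The counter tie-breaker keeps same-time events in insertion order and prevents heapq from ever comparing two handler functions, a common pitfall in this pattern.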

  18. Development of Integration Framework for Sensor Network and Satellite Image based on OGC Web Services

    NASA Astrophysics Data System (ADS)

    Ninsawat, Sarawut; Yamamoto, Hirokazu; Kamei, Akihide; Nakamura, Ryosuke; Tsuchida, Satoshi; Maeda, Takahisa

    2010-05-01

With the availability of network-enabled sensing devices, the volume of information being collected by networked sensors has increased dramatically in recent years. Over 100 physical, chemical, and biological properties can be sensed using in-situ or remote sensing technology. A collection of these sensor nodes forms a sensor network, which is easily deployable to provide a high degree of visibility into real-world physical processes as events unfold. A sensor observation network allows diverse types of data to be gathered at greater spatial and temporal resolution through wired or wireless network infrastructure, so real-time or near-real-time data from the network let researchers and decision-makers respond quickly to events. However, in the case of environmental monitoring, the capability to acquire in-situ data periodically is not sufficient on its own; the management and proper utilization of the data also need careful consideration. This requires database and IT solutions that are robust, scalable, and able to interoperate between different, distributed stakeholders to provide lucid, timely, and accurate updates to researchers, planners, and citizens. The GEO (Global Earth Observation) Grid primarily aims at providing an e-Science infrastructure for the earth science community. The GEO Grid is designed to integrate various kinds of earth observation data using grid technology, which is developed for sharing data, storage, and the computational power of high-performance computing, and is accessible as a set of services. A comprehensive web-based system for integrating field sensor data and satellite images based on various open standards of the OGC (Open Geospatial Consortium) specifications has been developed. 
Web Processing Service (WPS), which is most likely the future direction of Web-GIS, performs computation on spatial data from distributed sources and returns the outcome in a standard format. The interoperability capabilities and Service Oriented Architecture (SOA) of web services allow sensor network measurements available from a Sensor Observation Service (SOS) and satellite remote sensing data from a Web Mapping Service (WMS) to be incorporated as distributed data sources for WPS. Various applications have been developed to demonstrate the efficacy of integrating heterogeneous data sources. One example is the validation of the MODIS aerosol products (MOD08_D3, the Level-3 MODIS Atmosphere Daily Global Product) against ground-based measurements from sunphotometers (skyradiometer, Prede POM-02) installed at Phenological Eyes Network (PEN) sites in Japan. Furthermore, a web-based framework for studying the relationship between vegetation indices calculated from MODIS surface reflectance (MOD09GA, the Surface Reflectance Daily L2G Global 1 km and 500 m Product) and Gross Primary Production (GPP) field measurements at flux tower sites in Thailand and Japan has also been developed. The success of both applications will help maximize data utilization and improve information accuracy by validating MODIS satellite products against accurate, high-temporal-resolution field measurements.
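The second application described above, relating a satellite vegetation index to tower GPP, reduces to a small computation once the SOS and WMS layers have delivered the data: derive NDVI from red and near-infrared surface reflectance and correlate it with the field measurements. The five-day series below is entirely made up for illustration.

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index from surface reflectance."""
    return (nir - red) / (nir + red)

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sx * sy)

# Hypothetical 5-day series: MODIS-like band reflectances and tower GPP.
red = [0.08, 0.07, 0.06, 0.05, 0.04]
nir = [0.30, 0.34, 0.38, 0.42, 0.46]
gpp = [4.1, 5.0, 5.8, 6.9, 7.7]   # gC m-2 d-1, invented values
vi = [ndvi(r, n) for r, n in zip(red, nir)]
r = pearson(vi, gpp)
```

In the web-services framework, the same calculation would run inside a WPS process with SOS supplying the GPP series and WMS (or a coverage service) supplying the reflectance bands.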

  19. Generation of computationally predicted Adverse Outcome Pathway networks through integration of publicly available in vivo, in vitro, phenotype, and biological pathway data.

    EPA Science Inventory

    The Adverse Outcome Pathway (AOP) framework is becoming a widely used tool for organizing and summarizing the mechanistic information connecting molecular perturbations by environmental stressors with adverse ecological and human health outcomes. However, the conventional process...

  20. Information Operations and FATA Integration into the National Mainstream

    DTIC Science & Technology

    2012-09-01

[Extraction residue from the report's front matter and table of contents; recoverable section headings include "Computer Network Operations" and "CNO as an IO Core Capability".]
