EMERGING NEUROMORPHIC COMPUTING ARCHITECTURES AND ENABLING...
2010-06-01 (dates covered: APR 2009 – JAN 2010)
The highly cross-disciplinary emerging field of neuromorphic computing architectures for cognitive information processing applications...belief systems, software, computer engineering, etc. In our effort to develop cognitive systems atop a neuromorphic computing architecture, we explored
A learnable parallel processing architecture towards unity of memory and computing
NASA Astrophysics Data System (ADS)
Li, H.; Gao, B.; Chen, Z.; Zhao, Y.; Huang, P.; Ye, H.; Liu, L.; Liu, X.; Kang, J.
2015-08-01
Developing energy-efficient parallel information processing systems beyond the von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need for efficient information processing in data-driven applications such as big data and the Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named “iMemComp”, where memory and logic are unified with single-type devices. Leveraging the nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped “iMemComp” with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such an architecture eliminates the energy-hungry data movement of von Neumann computers. Compared with contemporary silicon technology, adder circuits based on “iMemComp” can improve speed by 76.8% and reduce power dissipation by 60.3%, together with a 700-fold reduction in circuit area.
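The paper's central claim, logic functions learned directly in a resistive crossbar, can be illustrated with a toy software model: treat each cross-point as an adjustable conductance and train one output column with a perceptron-style rule. The sketch below assumes a linearly separable target (2-of-3 majority); it models none of the RS device physics and is not the authors' iMemComp scheme.

```python
import numpy as np

# Toy model of a "learnable" crossbar column: each cross-point holds a
# conductance weight, and the column fires when its summed input current
# exceeds a threshold. NOT the iMemComp device model; illustration only.
rng = np.random.default_rng(0)

def train_crossbar_column(truth_table, epochs=100, lr=0.1):
    """Perceptron-style update of one column's conductances."""
    n_inputs = len(truth_table[0][0])
    g = rng.uniform(0.0, 0.1, size=n_inputs)   # cross-point conductances
    bias = 0.0
    for _ in range(epochs):
        for x, target in truth_table:
            x = np.asarray(x, dtype=float)
            y = float(x @ g + bias > 0.5)      # column output (threshold)
            g += lr * (target - y) * x          # reinforce/depress cross-points
            bias += lr * (target - y)
    return g, bias

# User-defined logic: 2-of-3 majority (linearly separable, so learnable here).
table = [((a, b, c), int(a + b + c >= 2))
         for a in (0, 1) for b in (0, 1) for c in (0, 1)]
g, bias = train_crossbar_column(table)
for x, t in table:
    assert int(np.asarray(x, float) @ g + bias > 0.5) == t
print("learned conductances:", np.round(g, 2), "bias:", round(bias, 2))
```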
Information Technology Architectures. New Opportunities for Partnering, CAUSE94. Track VI.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Eight papers are presented from the 1994 CAUSE conference track on information technology architectures as applied to higher education institutions. The papers include: (1) "Reshaping the Enterprise: Building the Next Generation of Information Systems Through Information Architecture and Processing Reengineering," which notes…
An object-oriented software approach for a distributed human tracking motion system
NASA Astrophysics Data System (ADS)
Micucci, Daniela L.
2003-06-01
Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and inter-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.
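To make the level/association vocabulary concrete, here is a minimal object-oriented sketch of the layered scheme: classes per abstraction level, intra-level processing, and an inter-level association that performs the abstraction step. All class and field names are illustrative inventions, not the paper's actual design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Entity:
    ident: str
    position: tuple          # topological placement ("where")
    timestamp: float         # temporal placement ("when")

@dataclass
class Level:
    name: str                # e.g. "blob" or "person"
    entities: List[Entity] = field(default_factory=list)

    def process(self):
        """Intra-level processing, e.g. filtering or association."""
        return self.entities

class Abstraction:
    """Inter-level association: converts lower-level entities upward."""
    def __init__(self, lower: Level, upper: Level):
        self.lower, self.upper = lower, upper

    def lift(self):
        # A strategy ("why") would decide which lower entities become
        # upper ones; here we trivially promote every entity one level up.
        for e in self.lower.process():
            self.upper.entities.append(
                Entity("up:" + e.ident, e.position, e.timestamp))

blobs, persons = Level("blob"), Level("person")
blobs.entities.append(Entity("b1", (3, 4), 0.0))
Abstraction(blobs, persons).lift()
print([e.ident for e in persons.entities])   # ['up:b1']
```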
Advanced information processing system for advanced launch system: Avionics architecture synthesis
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.
1991-01-01
The Advanced Information Processing System (AIPS) is a fault-tolerant distributed computer system architecture that was developed to meet the real time computational needs of advanced aerospace vehicles. One such vehicle is the Advanced Launch System (ALS) being developed jointly by NASA and the Department of Defense to launch heavy payloads into low earth orbit at one tenth the cost (per pound of payload) of the current launch vehicles. An avionics architecture that utilizes the AIPS hardware and software building blocks was synthesized for ALS. The AIPS for ALS architecture synthesis process starting with the ALS mission requirements and ending with an analysis of the candidate ALS avionics architecture is described.
Reshaping the Enterprise through an Information Architecture and Process Reengineering.
ERIC Educational Resources Information Center
Laudato, Nicholas C.; DeSantis, Dennis J.
1995-01-01
The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…
Information Architecture without Internal Theory: An Inductive Design Process.
ERIC Educational Resources Information Center
Haverty, Marsha
2002-01-01
Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
HYDRA: A Middleware-Oriented Integrated Architecture for e-Procurement in Supply Chains
NASA Astrophysics Data System (ADS)
Alor-Hernandez, Giner; Aguilar-Lasserre, Alberto; Juarez-Martinez, Ulises; Posada-Gomez, Ruben; Cortes-Robles, Guillermo; Garcia-Martinez, Mario Alberto; Gomez-Berbis, Juan Miguel; Rodriguez-Gonzalez, Alejandro
The Service-Oriented Architecture (SOA) development paradigm has emerged to address the critical issues of creating, modifying and extending solutions for business process integration, incorporating process automation and automated exchange of information between organizations. Web services technology follows SOA principles for developing and deploying applications. Moreover, Web services are considered the platform for SOA, for both intra- and inter-enterprise communication. However, an SOA does not incorporate information about occurring events into business processes, even though such events are a central feature of supply chain management. These events and information delivery are addressed in an Event-Driven Architecture (EDA). Taking this into account, we propose a middleware-oriented integrated architecture that offers a brokering service for the procurement of products in a Supply Chain Management (SCM) scenario. As salient contributions, our system provides a hybrid architecture combining features of both SOA and EDA and a set of mechanisms for business process pattern management, monitoring based on UML sequence diagrams, Web services-based management, event publish/subscription and reliable messaging service.
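A hybrid SOA/EDA broker of the kind described can be sketched in a few lines: named services for synchronous request/reply plus topic-based publish/subscribe for supply-chain events. This is a schematic toy, not HYDRA's implementation; all names and the quote/stock scenario are invented.

```python
from collections import defaultdict

class Broker:
    def __init__(self):
        self.services = {}                      # SOA side: name -> callable
        self.subscribers = defaultdict(list)    # EDA side: topic -> handlers

    def register(self, name, fn):
        self.services[name] = fn

    def call(self, name, **kwargs):             # synchronous service call
        return self.services[name](**kwargs)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):            # push event to all subscribers
        for handler in self.subscribers[topic]:
            handler(event)

broker = Broker()
broker.register("quote", lambda item, qty: {"item": item, "qty": qty,
                                            "price": 9.5 * qty})
broker.subscribe("stock.low", lambda ev: print("reorder triggered:", ev))

print(broker.call("quote", item="bolt", qty=100))   # SOA request/reply
broker.publish("stock.low", {"item": "bolt", "level": 12})  # EDA event
```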
Information Interaction: Providing a Framework for Information Architecture.
ERIC Educational Resources Information Center
Toms, Elaine G.
2002-01-01
Discussion of information architecture focuses on a model of information interaction that bridges the gap between human and computer and between information behavior and information retrieval. Illustrates how the process of information interaction is affected by the user, the system, and the content. (Contains 93 references.) (LRW)
The architecture of the management system of complex steganographic information
NASA Astrophysics Data System (ADS)
Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.
2017-01-01
The aim of the study is to create a wide-area information system that allows one to control processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the algorithmic underpinning of the system, classic methods of steganography are used to embed information. Methods of mathematical statistics and computational intelligence are used to identify the embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology to provide service via a web service over the Internet. It is meant to process streams of multimedia data coming from many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; and prevention of information leakage caused by insiders.
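As an example of the "classic methods of steganography" the abstract refers to, here is least-significant-bit (LSB) embedding over a raw byte buffer. This is a minimal generic sketch, with no relation to the system's specific algorithms or its statistical detection machinery.

```python
# Classic LSB steganography: hide each payload bit in the lowest bit of
# one cover byte (e.g. one pixel channel), then read the bits back out.

def embed(cover: bytes, payload: bytes) -> bytes:
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for payload")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit        # overwrite the LSB only
    return bytes(out)

def extract(stego: bytes, n_bytes: int) -> bytes:
    bits = [stego[i] & 1 for i in range(n_bytes * 8)]
    return bytes(sum(bits[8 * j + i] << i for i in range(8))
                 for j in range(n_bytes))

cover = bytes(range(64))                       # stand-in for pixel data
stego = embed(cover, b"key")
assert extract(stego, 3) == b"key"
```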
Hadoop-based implementation of processing medical diagnostic records for visual patient system
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo
2018-03-01
We introduced the Visual Patient (VP) concept and method to visually represent and index a patient's imaging diagnostic records (IDR) at last year's SPIE Medical Imaging conference (SPIE MI 2017); VP enables a doctor to review a large amount of IDR for a patient within a limited appointment slot. Here we present a new approach to designing the data processing architecture of the VP system (VPS), which acquires, processes, and stores various kinds of IDR to build a VP instance for each patient in a hospital environment on a Hadoop distributed processing structure. We designed this system architecture, called the Medical Information Processing System (MIPS), as a combination of the Hadoop batch processing architecture and the Storm stream processing architecture. MIPS implements parallel processing of various kinds of clinical data with high efficiency, drawn from disparate hospital information systems such as PACS, RIS, LIS, and HIS.
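The batch-plus-stream split (Hadoop for complete recomputation, Storm for low-latency increments) can be sketched with a "lambda"-style toy: a batch view over history, a stream delta, and a merge at query time. The record fields below are invented; this stands in for the MIPS design only schematically.

```python
from collections import Counter

def batch_view(records):
    """Full recompute over all historical records (Hadoop-like batch layer)."""
    return Counter(r["modality"] for r in records)

class StreamLayer:
    """Incremental updates for records arriving after the batch run (Storm-like)."""
    def __init__(self):
        self.delta = Counter()
    def on_record(self, record):
        self.delta[record["modality"]] += 1

def query(batch, stream):
    return batch + stream.delta          # merged view served per patient query

history = [{"modality": "CT"}, {"modality": "MR"}, {"modality": "CT"}]
batch = batch_view(history)
live = StreamLayer()
live.on_record({"modality": "US"})
print(query(batch, live))                # Counter({'CT': 2, 'MR': 1, 'US': 1})
```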
CDC WONDER: a cooperative processing architecture for public health.
Friede, A; Rosen, D H; Reid, J A
1994-01-01
CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813
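The cooperative-processing idea, components coupled purely by message passing through a central communications hub, can be sketched as follows. Component names echo the abstract; the message format and the toy mortality table are invented.

```python
class CommunicationsServer:
    """Routes messages between loosely coupled components; components
    never call each other directly, so hardware/software can vary per node."""
    def __init__(self):
        self.routes = {}                  # destination name -> component

    def attach(self, name, component):
        self.routes[name] = component

    def send(self, message):
        return self.routes[message["to"]].receive(message)

class DataServer:
    def __init__(self, tables):
        self.tables = tables
    def receive(self, message):
        return self.tables.get(message["query"], [])

hub = CommunicationsServer()
hub.attach("mortality_db", DataServer({"deaths_1990": [("flu", 113)]}))
# A Remote Client would submit exactly this kind of routed request:
print(hub.send({"to": "mortality_db", "query": "deaths_1990"}))
```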
A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises
NASA Astrophysics Data System (ADS)
Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.
2012-04-01
The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system, one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. The most prominent challenges on this layer include providing a framework for information integration on a syntactic and semantic level, leveraging distributed processing resources for a scalable data processing platform, and automating data processing and decision support workflows.
ERIC Educational Resources Information Center
Fific, Mario; Nosofsky, Robert M.; Townsend, James T.
2008-01-01
A growing methodology, known as the systems factorial technology (SFT), is being developed to diagnose the types of information-processing architectures (serial, parallel, or coactive) and stopping rules (exhaustive or self-terminating) that operate in tasks of multidimensional perception. Whereas most previous applications of SFT have been in…
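SFT's architecture diagnosis rests on the survivor interaction contrast, SIC(t) = [S_LL(t) - S_LH(t)] - [S_HL(t) - S_HH(t)], computed across the four factorial salience (High/Low x High/Low) conditions. The sketch below simulates a parallel-exhaustive model with invented exponential channel rates and checks the predicted all-negative SIC signature; it is an illustration of the statistic, not the authors' experiments.

```python
import numpy as np

def survivor(rts, grid):
    """Empirical survivor function S(t) = P(RT > t) on a time grid."""
    rts = np.sort(rts)
    return 1.0 - np.searchsorted(rts, grid, side="right") / len(rts)

rng = np.random.default_rng(1)
rate = {"H": 0.02, "L": 0.01}              # High salience -> faster channel

def parallel_exhaustive(a, b, n=20000):
    # Exhaustive stopping: respond when BOTH channels finish (max of times).
    return np.maximum(rng.exponential(1 / rate[a], n),
                      rng.exponential(1 / rate[b], n))

grid = np.linspace(0, 400, 200)
S = {c: survivor(parallel_exhaustive(*c), grid) for c in ("HH", "HL", "LH", "LL")}
sic = (S["LL"] - S["LH"]) - (S["HL"] - S["HH"])
# Parallel-exhaustive predicts SIC(t) <= 0 for all t (up to sampling noise).
print("SIC min/max:", sic.min().round(3), sic.max().round(3))
```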
Advanced information processing system: Input/output network management software
NASA Technical Reports Server (NTRS)
Nagle, Gail; Alger, Linda; Kemp, Alexander
1988-01-01
The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.
Technology architecture guidelines for a health care system.
Jones, D T; Duncan, R; Langberg, M L; Shabot, M M
2000-01-01
Although the demand for use of information technology within the healthcare industry is intensifying, relatively little has been written about guidelines to optimize IT investments. A technology architecture is a set of guidelines for technology integration within an enterprise. The architecture is a critical tool in the effort to control information technology (IT) operating costs by constraining the number of technologies supported. A well-designed architecture is also an important aid to integrating disparate applications, data stores and networks. The authors led the development of a thorough, carefully designed technology architecture for a large and rapidly growing health care system. The purpose and design criteria are described, as well as the process for gaining consensus and disseminating the architecture. In addition, the processes for using, maintaining, and handling exceptions are described. The technology architecture is extremely valuable to health care organizations both in controlling costs and promoting integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner
When a requirements engineering effort fails to meet expectations, often times the requirements management tool is blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool.
Image Understanding Architecture
1991-09-01
architecture to support real-time, knowledge-based image understanding, and develop the software support environment that will be needed to utilize... Keywords: Image Understanding Architecture, Knowledge-Based Vision, AI, Real-Time Computer Vision, Software Simulator, Parallel Processor. ...information. In addition to sensory and knowledge-based processing it is useful to introduce a level of symbolic processing. Thus, vision researchers
The NASA Integrated Information Technology Architecture
NASA Technical Reports Server (NTRS)
Baldridge, Tim
1997-01-01
This document defines an Information Technology Architecture for the National Aeronautics and Space Administration (NASA), where Information Technology (IT) refers to the hardware, software, standards, protocols and processes that enable the creation, manipulation, storage, organization and sharing of information. An architecture provides an itemization and definition of these IT structures, a view of the relationship of the structures to each other and, most importantly, an accessible view of the whole. It is a fundamental assumption of this document that a useful, interoperable and affordable IT environment is key to the execution of the core NASA scientific and project competencies and business practices. This Architecture represents the highest level system design and guideline for NASA IT related activities and has been created on the authority of the NASA Chief Information Officer (CIO) and will be maintained under the auspices of that office. It addresses all aspects of general purpose, research, administrative and scientific computing and networking throughout the NASA Agency and is applicable to all NASA administrative offices, projects, field centers and remote sites. Through the establishment of five Objectives and six Principles this Architecture provides a blueprint for all NASA IT service providers: civil service, contractor and outsourcer. The most significant of the Objectives and Principles are the commitment to customer-driven IT implementations and the commitment to a simpler, cost-efficient, standards-based, modular IT infrastructure. In order to ensure that the Architecture is presented and defined in the context of the mission, project and business goals of NASA, this Architecture consists of four layers in which each subsequent layer builds on the previous layer. They are: 1) the Business Architecture: the operational functions of the business, or Enterprise, 2) the Systems Architecture: the specific Enterprise activities within the context of IT systems, 3) the Technical Architecture: a common, vendor-independent framework for design, integration and implementation of IT systems and 4) the Product Architecture: vendor-specific IT solutions. The Systems Architecture is effectively a description of the end-user "requirements". Generalized end-user requirements are discussed and subsequently organized into specific mission and project functions. The Technical Architecture depicts the framework, and relationship, of the specific IT components that enable the end-user functionality as described in the Systems Architecture. The primary components as described in the Technical Architecture are: 1) Applications: Basic Client Component, Object Creation Applications, Collaborative Applications, Object Analysis Applications, 2) Services: Messaging, Information Broker, Collaboration, Distributed Processing, and 3) Infrastructure: Network, Security, Directory, Certificate Management, Enterprise Management and File System. This Architecture also provides specific Implementation Recommendations, the most significant of which is the recognition of IT as core to NASA activities and defines a plan, which is aligned with the NASA strategic planning processes, for keeping the Architecture alive and useful.
A Computerized Architecture Slide Classification for a Small University Collection.
ERIC Educational Resources Information Center
Powell, Richard K.
This paper briefly outlines the process used to organize, classify, and make accessible a collection of architecture slides in the Architecture Resource Center at Andrews University in Michigan. The classification system includes the use of Art and Architecture Thesaurus subject headings, the ERIC (Educational Resources Information Center) concept…
The Perception of Human Resources Enterprise Architecture within the Department of Defense
ERIC Educational Resources Information Center
Delaquis, Richard Serge
2012-01-01
The Clinger Cohen Act of 1996 requires that all major Federal Government Information Technology (IT) systems prepare an Enterprise Architecture prior to IT acquisitions. Enterprise Architecture, like house blueprints, represents the system build, capabilities, processes, and data across the enterprise of IT systems. Enterprise Architecture is used…
Rooney, Kevin K.; Condia, Robert J.; Loschky, Lester C.
2017-01-01
Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one's fist at arm's length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the experience of architecture, which can be tested through future experimentation. PMID:28360867
Pyramidal neurovision architecture for vision machines
NASA Astrophysics Data System (ADS)
Gupta, Madan M.; Knopf, George K.
1993-08-01
The vision system employed by an intelligent robot must be active; active in the sense that it must be capable of selectively acquiring the minimal amount of relevant information for a given task. An efficient active vision system architecture that is based loosely upon the parallel-hierarchical (pyramidal) structure of the biological visual pathway is presented in this paper. Although the computational architecture of the proposed pyramidal neuro-vision system is far less sophisticated than the architecture of the biological visual pathway, it does retain some essential features such as the converging multilayered structure of its biological counterpart. In terms of visual information processing, the neuro-vision system is constructed from a hierarchy of several interactive computational levels, whereupon each level contains one or more nonlinear parallel processors. Computationally efficient vision machines can be developed by utilizing both the parallel and serial information processing techniques within the pyramidal computing architecture. A computer simulation of a pyramidal vision system for active scene surveillance is presented.
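The converging multilayered structure can be illustrated with the standard image-pyramid computation: each level averages 2x2 blocks of the level below, trading resolution for receptive-field size. This is a generic pyramid sketch, not the paper's neuro-vision processors.

```python
import numpy as np

def pyramid(image, levels):
    """Build a converging pyramid by repeated 2x2 block averaging."""
    out = [image]
    for _ in range(levels - 1):
        a = out[-1]
        h, w = (a.shape[0] // 2) * 2, (a.shape[1] // 2) * 2
        a = a[:h, :w]                    # crop to even dimensions
        out.append((a[0::2, 0::2] + a[1::2, 0::2] +
                    a[0::2, 1::2] + a[1::2, 1::2]) / 4.0)
    return out

rng = np.random.default_rng(0)
frame = rng.random((64, 64))             # stand-in for a camera frame
for i, level in enumerate(pyramid(frame, 4)):
    print(f"level {i}: {level.shape}")   # (64,64) -> (32,32) -> (16,16) -> (8,8)
```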
Fiacco, P. A.; Rice, W. H.
1991-01-01
Computerized medical record systems require structured database architectures for information processing. However, the data must be able to be transferred across heterogeneous platforms and software systems. Client-server architecture allows for distributed processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical record system, known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment. Oracle is a powerful relational database management system that incorporates a client-server architecture. This provides both a distributed database and distributed processing, which improves performance. PMID:1807732
Security Risk Assessment Process for UAS in the NAS CNPC Architecture
NASA Technical Reports Server (NTRS)
Iannicca, Dennis C.; Young, Dennis P.; Thadani, Suresh K.; Winter, Gilbert A.
2013-01-01
This informational paper discusses the risk assessment process conducted to analyze Control and Non-Payload Communications (CNPC) architectures for integrating civil Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). The assessment employs the National Institute of Standards and Technology (NIST) Risk Management Framework to identify threats, vulnerabilities, and risks to these architectures and recommends corresponding mitigating security controls. This process builds upon earlier work performed by RTCA Special Committee (SC) 203 and the Federal Aviation Administration (FAA) to roadmap the risk assessment methodology and to identify categories of information security risks that pose a significant impact to aeronautical communications systems. Deviations from the typical process are described as they apply to this aeronautical communications system. Due to the sensitive nature of the information, data resulting from the risk assessment pertaining to threats, vulnerabilities, and risks is beyond the scope of this paper.
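Since the concrete findings are withheld, only the generic shape of a qualitative NIST-style risk scoring step (risk as likelihood times impact on ordinal scales) can be sketched. The three threat entries below are invented placeholders, not results from the assessment.

```python
# Qualitative risk scoring: ordinal likelihood/impact levels multiplied
# into a score used to rank threats and prioritize mitigating controls.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def risk(likelihood: str, impact: str) -> int:
    return LEVELS[likelihood] * LEVELS[impact]

threats = [                                    # hypothetical entries only
    ("C2 link jamming",          "moderate", "high"),
    ("GPS spoofing",             "low",      "high"),
    ("ground station intrusion", "low",      "moderate"),
]

for name, lik, imp in sorted(threats, key=lambda t: -risk(t[1], t[2])):
    print(f"{name:26s} likelihood={lik:9s} impact={imp:9s} "
          f"score={risk(lik, imp)}")
```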
ERIC Educational Resources Information Center
Tadesse, Yohannes
2012-01-01
The importance of information security has made many organizations to invest and utilize effective information security controls within the information systems (IS) architecture. An organization's strategic decisions to secure enterprise-wide services often associated with the overall competitive advantages that are attained through the process of…
López, Diego M; Blobel, Bernd; Gonzalez, Carolina
2010-01-01
Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF had not been demonstrated so far. Through an empirical experiment, the paper demonstrates that by using HIS-DF and HL7 information models, the semantic quality of a HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture has been measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increased completeness of 14.38% and an increased validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development suggests an increased quality of the final HIS, which in turn implies an indirect impact on patient care.
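The two metrics can be made precise with the usual set-based definitions from conceptual-model quality (completeness: how much of the reference domain the model covers; validity: how much of the model belongs to the domain). The statement sets below are invented toys; the paper's 14.38%/16.63% figures come from its own experiment, not from this sketch.

```python
def completeness(model: set, domain: set) -> float:
    """Share of the reference domain's statements covered by the model."""
    return len(model & domain) / len(domain)

def validity(model: set, domain: set) -> float:
    """Share of the model's statements that belong to the domain."""
    return len(model & domain) / len(model)

# Hypothetical reference domain and two candidate architectures.
domain     = {"Patient", "Encounter", "Observation", "Order", "Practitioner"}
arch_rup   = {"Patient", "Encounter", "Billing"}
arch_hisdf = {"Patient", "Encounter", "Observation", "Order"}

for name, m in (("RUP", arch_rup), ("HIS-DF", arch_hisdf)):
    print(name, "completeness:", round(completeness(m, domain), 2),
          "validity:", round(validity(m, domain), 2))
```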
Real-Time Cognitive Computing Architecture for Data Fusion in a Dynamic Environment
NASA Technical Reports Server (NTRS)
Duong, Tuan A.; Duong, Vu A.
2012-01-01
A novel cognitive computing architecture is conceptualized for processing multiple channels of multi-modal sensory data streams simultaneously, and fusing the information in real time to generate intelligent reaction sequences. This unique architecture is capable of assimilating parallel data streams that could be analog, digital, synchronous/asynchronous, and could be programmed to act as a knowledge synthesizer and/or an "intelligent perception" processor. In this architecture, the bio-inspired models of visual pathway and olfactory receptor processing are combined as processing components, to achieve the composite function of "searching for a source of food while avoiding the predator." The architecture is particularly suited for scene analysis from visual data and odorant.
Chen, Elizabeth S.; Maloney, Francine L.; Shilmayster, Eugene; Goldberg, Howard S.
2009-01-01
A systematic and standard process for capturing information within free-text clinical documents could facilitate opportunities for improving quality and safety of patient care, enhancing decision support, and advancing data warehousing across an enterprise setting. At Partners HealthCare System, the Medical Language Processing (MLP) services project was initiated to establish a component-based architectural model and processes to facilitate putting MLP functionality into production for enterprise consumption, promote sharing of components, and encourage reuse. Key objectives included exploring the use of an open-source framework called the Unstructured Information Management Architecture (UIMA) and leveraging existing MLP-related efforts, terminology, and document standards. This paper describes early experiences in defining the infrastructure and standards for extracting, encoding, and structuring clinical observations from a variety of clinical documents to serve enterprise-wide needs. PMID:20351830
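The component-based annotator idea can be sketched in Python: independent annotators read a shared document object and append typed annotations, mirroring UIMA's pipeline style without using its actual (Java) API. The patterns and the sample note are invented.

```python
import re

def section_annotator(doc):
    """Mark clinical section headers in the shared document state."""
    for m in re.finditer(r"(?m)^(MEDICATIONS|ALLERGIES):", doc["text"]):
        doc["annotations"].append(("Section", m.start(), m.group(1)))

def med_annotator(doc):
    """Mark drug-plus-dose mentions (toy vocabulary, invented patterns)."""
    for m in re.finditer(r"\b(aspirin|lisinopril)\b \d+ ?mg",
                         doc["text"], re.I):
        doc["annotations"].append(("Medication", m.start(), m.group(0)))

def run_pipeline(text, components):
    doc = {"text": text, "annotations": []}
    for component in components:     # components are reusable and shareable
        component(doc)
    return doc["annotations"]

note = "MEDICATIONS:\naspirin 81 mg daily\nlisinopril 10 mg\n"
for ann in run_pipeline(note, [section_annotator, med_annotator]):
    print(ann)
```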
The informational architecture of the cell.
Walker, Sara Imari; Kim, Hyunju; Davies, Paul C W
2016-03-13
We compare the informational architecture of biological and random networks to identify informational features that may distinguish biological networks from random. The study presented here focuses on the Boolean network model for regulation of the cell cycle of the fission yeast Schizosaccharomyces pombe. We compare calculated values of local and global information measures for the fission yeast cell cycle to the same measures as applied to two different classes of random networks: Erdős-Rényi and scale-free. We report patterns in local information processing and storage that do indeed distinguish biological from random, associated with control nodes that regulate the function of the fission yeast cell-cycle network. Conversely, we find that integrated information, which serves as a global measure of 'emergent' information processing, does not differ from random for the case presented. We discuss implications for our understanding of the informational architecture of the fission yeast cell-cycle network in particular, and more generally for illuminating any distinctive physics that may be operative in life.
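A toy version of a pairwise "local information" calculation: simulate a small Boolean network and compute same-time mutual information between node trajectories. The 3-node rotation rules below are invented for illustration; the paper's fission-yeast network, measure set, and random-network baselines are far richer.

```python
import itertools, math

def step(state):
    """Invented 3-node Boolean rules (a rotation with negation; period 6)."""
    a, b, c = state
    return (int(not c), a, b)

def trajectory(state, n=50):
    out = [state]
    for _ in range(n):
        out.append(step(out[-1]))
    return out

def mutual_info(xs, ys):
    """Mutual information (bits) between two binary time series."""
    n, mi = len(xs), 0.0
    for x, y in itertools.product((0, 1), repeat=2):
        pxy = sum(1 for a, b in zip(xs, ys) if (a, b) == (x, y)) / n
        px = sum(1 for a in xs if a == x) / n
        py = sum(1 for b in ys if b == y) / n
        if pxy > 0:
            mi += pxy * math.log2(pxy / (px * py))
    return mi

traj = trajectory((1, 0, 0))
series = list(zip(*traj))                    # one series per node
for i, j in itertools.combinations(range(3), 2):
    print(f"I(node{i}; node{j}) = {mutual_info(series[i], series[j]):.3f}")
```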
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loparo, Kenneth; Kolacinski, Richard; Threeanaew, Wanchat
A central goal of the work was to enable both the extraction of all relevant information from sensor data, and the application of information gained from appropriate processing and fusion at the system level to operational control and decision-making at various levels of the control hierarchy through: 1. Exploiting the deep connection between information theory and the thermodynamic formalism, 2. Deployment using distributed intelligent agents with testing and validation in a hardware-in-the-loop simulation environment. Enterprise architectures are the organizing logic for key business processes and IT infrastructure and, while the generality of current definitions provides sufficient flexibility, the current architecture frameworks do not inherently provide the appropriate structure. Of particular concern is that existing architecture frameworks often do not make a distinction between "data" and "information." This work defines an enterprise architecture for health and condition monitoring of power plant equipment and further provides the appropriate foundation for addressing shortcomings in current architecture definition frameworks through the discovery of the information connectivity between the elements of a power generation plant. That is, it identifies the correlative structure between available observation streams using informational measures. The principal focus here is on the implementation and testing of an emergent, agent-based algorithm, based on the foraging behavior of ants, for eliciting this structure, and on measures for characterizing differences between communication topologies. The elicitation algorithms are applied to data streams produced by a detailed numerical simulation of Alstom's 1000 MW ultra-super-critical boiler and steam plant. The elicitation algorithm and topology characterization can be based on different informational metrics for detecting connectivity, e.g. mutual information and linear correlation.
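The ant-foraging elicitation idea can be caricatured in a few lines: agents sample candidate links between observation streams, deposit "pheromone" in proportion to an informational metric (here, linear correlation), and global evaporation prunes weak links, so the surviving topology approximates the plant's information connectivity. All constants, signals, and dynamics below are invented, not the project's algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(500)
load = np.sin(t / 20.0) + 0.1 * rng.standard_normal(500)
streams = {                                   # hypothetical plant signals
    "load": load,
    "steam_temp": 0.8 * load + 0.2 * rng.standard_normal(500),  # coupled
    "cooling_pump": rng.standard_normal(500),                   # unrelated
}

names = list(streams)
pher = {(a, b): 1.0 for i, a in enumerate(names) for b in names[i + 1:]}

for _ in range(200):                          # each iteration: one ant visit
    edge = list(pher)[rng.integers(len(pher))]
    a, b = edge
    c = abs(np.corrcoef(streams[a], streams[b])[0, 1])
    pher[edge] += 2.0 * c                     # deposit scaled by evidence
    for e in pher:
        pher[e] *= 0.99                       # evaporation prunes weak links

for edge, p in sorted(pher.items(), key=lambda kv: -kv[1]):
    print(edge, round(p, 2))                  # coupled pair dominates
```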
Information Architecture for Quality Management Support in Hospitals.
Rocha, Álvaro; Freixo, Jorge
2015-10-01
Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture facilitates the challenge of making more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) lends organizations enhanced knowledge but, above all, favours management. This simplifies the reinvention of processes, the reformulation of procedures, bridging, and cooperation amongst the multiple actors of an organization. In the present investigation we planned the IA for the Quality Management System (QMS) of a hospital, which allowed us to develop and implement QUALITUS, a computer application developed to support Quality Management in a Hospital Unit. This solution translated into significant gains for the Hospital Unit under study, accelerating the quality management process and reducing the tasks, the number of documents, the information to be filled in, and information errors, amongst others.
Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin
2003-09-01
Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill the needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adapts the approach of "4+1" architectural view. In this new architecture, PACS and RIS become one while the user interaction can be automated by customized workflow process. Clinical service applications are implemented as active components. They can be reasonably substituted by applications of local adaptations and can be multiplied for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.
Iannuzzi, David; Grant, Andrew; Corriveau, Hélène; Boissy, Patrick; Michaud, Francois
2016-12-01
The objective of this study was to design an effectively integrated information architecture for a mobile teleoperated robot providing remote assistance in the delivery of home health care. Three role classes were identified related to the deployment of a telerobot, namely, engineer, technology integrator, and health professional. Patients and natural caregivers were indirectly considered, this being a component of future field studies. Interviewing representatives of each class provided the functions, and the information content and flows for each function. Interview transcripts enabled the formulation of UML (Unified Modeling Language) diagrams for feedback from participants. The proposed information architecture was validated with a use-case scenario. The integrated information architecture incorporates progressive design, ergonomic integration, and home care needs from the medical specialist, nursing, physiotherapy, occupational therapy, and social worker care perspectives. The iterative process around the integrated architecture promoted insight among participants. The use-case scenario evaluation showed the design's robustness. A complex innovation such as a telerobot must coherently mesh with health-care service delivery needs. The deployment of an integrated information architecture bridging development with specialist and home care applications is necessary for home care technology innovation. It enables the continuing evolution of the robot and novel health information design within the same integrated architecture, while accounting for the patient's ecological needs.
Investigation of Carbon Fiber Architecture in Braided Composites Using X-Ray CT Inspection
NASA Technical Reports Server (NTRS)
Rhoads, Daniel J.; Miller, Sandi G.; Roberts, Gary D.; Rauser, Richard W.; Golovaty, Dmitry; Wilber, J. Patrick; Espanol, Malena I.
2017-01-01
During the fabrication of braided carbon fiber composite materials, process variations occur which affect the fiber architecture. Quantitative measurements of local and global fiber architecture variations are needed to determine the potential effect of process variations on mechanical properties of the cured composite. Although non-destructive inspection via X-ray CT imaging is a promising approach, difficulties in quantitative analysis of the data arise due to the similar densities of the material constituents. In an effort to gain more quantitative information about features related to fiber architecture, methods have been explored to improve the details that can be captured by X-ray CT imaging. Metal-coated fibers and thin veils are used as inserts to extract detailed information about fiber orientations and inter-ply behavior from X-ray CT images.
Research of Ancient Architectures in Jin-Fen Area Based on GIS&BIM Technology
NASA Astrophysics Data System (ADS)
Jia, Jing; Zheng, Qiuhong; Gao, Huiying; Sun, Hai
2017-05-01
The number of well-preserved ancient buildings located in Shanxi Province, which holds the largest share of ancient architecture in China, is about 18,418, among which 9,053 buildings are wood-frame structures. The value of applying BIM (Building Information Modeling) and GIS (Geographic Information System) is gradually being probed and tested in the corresponding fields of spatial-distribution information management, routine maintenance, special conservation and restoration of ancient architecture, and the evaluation and simulation of related disasters such as earthquakes. The research objects are the ancient architectures in the Jin-Fen area, first investigated by Sicheng LIANG and recorded in his "Chinese ancient architectures survey report". They include the buildings from Sicheng LIANG's investigation, with further adjustments made through the authors' on-site investigation and literature search and collection. During this research, a spatial-distribution Geodatabase of the research objects was established using GIS. A BIM component library for ancient buildings was formed by combining on-site investigation data with precedent classic works, such as "Yingzao Fashi", a treatise on architectural methods of the Song Dynasty, the "Yongle Encyclopedia", and "Gongcheng Zuofa Zeli", case collections of engineering practice by the Ministry of Construction of the Qing Dynasty. A building of the Guangsheng temple in Hongtong county is selected as an example to elaborate the BIM model construction process based on this component library. Building on the foregoing results of spatial distribution data, feature attribute data, 3D graphic information and the parametric building information model, an information management system for ancient architectures in the Jin-Fen area, utilizing GIS & BIM technology, can be constructed to support further research on seismic disaster analysis and seismic performance simulation.
Linking Neural and Symbolic Representation and Processing of Conceptual Structures
van der Velde, Frank; Forth, Jamie; Nazareth, Deniece S.; Wiggins, Geraint A.
2017-01-01
We compare and discuss representations in two cognitive architectures aimed at representing and processing complex conceptual (sentence-like) structures. First is the Neural Blackboard Architecture (NBA), which aims to account for representation and processing of complex and combinatorial conceptual structures in the brain. Second is IDyOT (Information Dynamics of Thinking), which derives sentence-like structures by learning statistical sequential regularities over a suitable corpus. Although IDyOT is designed at a level more abstract than the neural, so it is a model of cognitive function, rather than neural processing, there are strong similarities between the composite structures developed in IDyOT and the NBA. We hypothesize that these similarities form the basis of a combined architecture in which the individual strengths of each architecture are integrated. We outline and discuss the characteristics of this combined architecture, emphasizing the representation and processing of conceptual structures. PMID:28848460
Modeling and Improving Information Flows in the Development of Large Business Applications
NASA Astrophysics Data System (ADS)
Schneider, Kurt; Lübke, Daniel
Designing a good architecture for an application is a wicked problem. Therefore, experience and knowledge are considered crucial for informing work in software architecture. However, many organizations do not pay sufficient attention to experience exploitation and architectural learning. Many users of information systems are not aware of the options and the needs to report problems and requirements. They often do not have time to describe a problem encountered in sufficient detail for developers to remove it. And there may be a lengthy process for providing feedback. Hence, the knowledge about problems and potential solutions is not shared effectively. Architectural knowledge needs to include evaluative feedback as well as decisions and their reasons (rationale).
NASA Astrophysics Data System (ADS)
Prawata, Albertus Galih
2017-11-01
Architectural design, whether in practice or in the design studio, involves many stages. One of them is the early phase of the design process, where architects or designers try to interpret the project brief into a design concept. This paper reports on the procedure of using digital tools in the early design process at an architectural practice in Jakarta. It principally targets the use of BIM and digital modeling to generate information and transform it into conceptual forms, which is not yet common in Indonesian architectural practice. Traditionally, the project brief is transformed into conceptual forms using sketches, drawings, and physical models. The new method using digital tools shows that it is possible to do the same during the initial stage of the design process to create early architectural design forms. Architects' traditional tools and methods are beginning to be replaced effectively by digital tools, which open greater opportunities for innovation.
Reflective Subjects in Kant and Architectural Design Education
ERIC Educational Resources Information Center
Rawes, Peg
2007-01-01
In architectural design education, students develop drawing, conceptual, and critical skills which are informed by their ability to reflect upon the production of ideas in design processes and in the urban, environmental, social, historical, and cultural context that define architecture and the built environment. Reflective actions and thinking…
ERIC Educational Resources Information Center
Kerkiri, Tania
2010-01-01
This paper comprehensively presents the modular architecture of an e-learning platform with a distinctive emphasis on content personalization, combining advantages from semantic web technology, collaborative filtering and recommendation systems. Modules of this architecture handle information about both the domain-specific didactic…
ERIC Educational Resources Information Center
Tambouris, Efthimios; Zotou, Maria; Kalampokis, Evangelos; Tarabanis, Konstantinos
2012-01-01
Enterprise architecture (EA) implementation refers to a set of activities ultimately aiming to align business objectives with information technology infrastructure in an organization. EA implementation is a multidisciplinary, complicated and endless process, hence, calls for adequate education and training programs that will build highly skilled…
ERIC Educational Resources Information Center
Henderson, Rebecca M.; Clark, Kim B.
1990-01-01
Using an empirical study of the semiconductor photolithographic alignment equipment industry, this paper shows that architectural innovations destroy the usefulness of established firms' architectural knowledge. Because this knowledge is embedded in the firms' structure and information-processing procedures, the destruction is hard to detect.…
1998-01-24
the Apparel Manufacturing Architecture (AMA), a generic architecture for an apparel enterprise. ARN-AIMS consists of three modules: Order Processing, Order Tracking, and Shipping & Invoicing. The Order Processing Module is designed to facilitate the entry of customer orders for stock and special
NASA Technical Reports Server (NTRS)
Brock, L. D.; Lala, J.
1986-01-01
The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles. The AIPS architecture also has attributes to enhance system effectiveness such as graceful degradation, growth and change tolerance, integrability, etc. Two key building blocks being developed by the AIPS program are a fault and damage tolerant processor and communication network. A proof-of-concept system is now being built and will be tested to demonstrate the validity and performance of the AIPS concepts.
NASA Astrophysics Data System (ADS)
Bolton, Richard W.; Dewey, Allen; Horstmann, Paul W.; Laurentiev, John
1997-01-01
This paper examines the role virtual enterprises will have in supporting future business engagements and the resulting technology requirements. Two representative end-user scenarios are proposed that define the requirements for 'plug-and-play' information infrastructure frameworks and architectures necessary to enable 'virtual enterprises' in US manufacturing industries. The scenarios provide a high-level 'needs analysis' for identifying key technologies, defining a reference architecture, and developing compliant reference implementations. Virtual enterprises are short-term consortia or alliances of companies formed to address fast-changing opportunities. Members of a virtual enterprise carry out their tasks as if they all worked for a single organization under 'one roof', using 'plug-and-play' information infrastructure frameworks and architectures to access and manage all information needed to support the product cycle. 'Plug-and-play' information infrastructure frameworks and architectures are required to enhance collaboration between companies working together on different aspects of a manufacturing process. This new form of collaborative computing will decrease cycle time and increase responsiveness to change.
Role of System Architecture in Developing New Drafting Tools
NASA Astrophysics Data System (ADS)
Sorguç, Arzu Gönenç
In this study, the impact of information technologies on the architectural design process is discussed. First, the differences/nuances between the concepts of software engineering and system architecture are clarified. Then, the design processes in engineering and in architecture are compared, considering 3-D models as the center of the design process through which the other disciplines engage with the design. It is pointed out that in many high-end engineering applications, 3-D solid models, and consequently the digital mock-up concept, have become common practice, but architecture, one of the important customers of the CAD systems providing these tools, has not yet started to use such 3-D models. It is shown that the reason for this time lag between architecture and engineering lies in the tradition of design attitudes. Therefore, a new design scheme, a meta-model, is proposed to develop an integrated design model centered on the 3-D model, together with a system architecture to achieve the transformation of the architectural design process by replacing 2-D thinking with 3-D thinking. In the proposed system architecture, CAD systems are included and adapted for 3-D architectural design in order to provide interfaces for the integration of all relevant disciplines into the design process. It is also shown that such a change will allow the intelligent or smart building concept to be elaborated in the future.
NASA Astrophysics Data System (ADS)
Gradziński, Piotr
2017-10-01
As the world's climate is changing (inter alia, under the influence of architectural activity), the author attempts to reorient design practice primarily towards using and adapting to climatic conditions. Applying, in the early stages of the architectural design process, Life Cycle Analysis (LCA) and digital analytical BIM (Building Information Modelling) tools defines the overriding requirements that the designer/architect should meet. The first part of the text characterizes the influence of architectural activity (consumption, pollution, waste, etc.) and of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.) understood as direct negative environmental impacts. The second part presents a review of the methods and analytical techniques that counteract these negative influences: first, studying the building through Life Cycle Analysis of the structure (e.g. materials) and functioning (e.g. energy consumption) of the architectural object across the stages before use, during use, and after use; second, using digital analytical tools to run multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building's form. In conclusion, the author's research results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions to be corrected in the design process of the architectural form, minimizing the impact on nature and the environment. The work refers directly to architectural-environmental dimensions, orienting the design process of buildings with respect to broadly understood climatic change.
A Platform Architecture for Sensor Data Processing and Verification in Buildings
ERIC Educational Resources Information Center
Ortiz, Jorge Jose
2013-01-01
This thesis examines the state of the art of building information systems and evaluates their architecture in the context of emerging technologies and applications for deep analysis of the built environment. We observe that modern building information systems are difficult to extend, do not provide general services for application development, do…
Trust-based information system architecture for personal wellness.
Ruotsalainen, Pekka; Nykänen, Pirkko; Seppälä, Antto; Blobel, Bernd
2014-01-01
Modern eHealth, ubiquitous health, and personal wellness systems take place in an unsecure and ubiquitous information space where no predefined trust occurs. This paper presents a novel information model and an architecture for trust-based privacy management of personal health and wellness information in a ubiquitous environment. The architecture enables a person to calculate a dynamic and context-aware trust value for each service provider and to use it to design personal privacy policies for trustworthy use of health and wellness services. For trust calculation, a novel set of measurable, context-aware, and health information-sensitive attributes is developed. The architecture enables a person to manage his or her privacy in a ubiquitous environment by formulating context-aware and service provider-specific policies. Focus groups and information modelling were used for developing a wellness information model. A system analysis method based on sequential steps, which makes it possible to combine the results of the analysis of privacy and trust concerns with the selection of trust and privacy services, was used for the development of the information system architecture. Its services (e.g. trust calculation, decision support, policy management, and policy binding services) and the developed attributes enable a person to define situation-aware policies that regulate the way his or her wellness and health information is processed.
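As an illustration of the kind of computation the abstract describes, the sketch below derives a scalar trust value from measurable provider attributes and maps it to a personal privacy policy. The attribute set, weights, and thresholds are invented for the example; the paper's actual attributes are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class ProviderAttributes:
    """Hypothetical measurable trust attributes, each normalized to [0, 1]."""
    regulatory_compliance: float   # audited conformance to health regulation
    transparency: float            # quality of published data-use policies
    past_behaviour: float          # history of honoring user policies
    context_sensitivity: float     # support for situation-aware handling

# Illustrative weights; a real deployment would calibrate these.
WEIGHTS = {
    "regulatory_compliance": 0.4,
    "transparency": 0.2,
    "past_behaviour": 0.3,
    "context_sensitivity": 0.1,
}

def trust_value(attrs: ProviderAttributes) -> float:
    """Weighted sum of attributes: one simple way to get a scalar trust value."""
    return sum(WEIGHTS[name] * getattr(attrs, name) for name in WEIGHTS)

def allowed_purposes(trust: float) -> list:
    """Map the trust value to a personal privacy policy (thresholds assumed)."""
    if trust >= 0.8:
        return ["primary care", "wellness analytics", "research (pseudonymized)"]
    if trust >= 0.5:
        return ["primary care"]
    return []  # untrusted provider: share nothing

provider = ProviderAttributes(0.9, 0.7, 0.8, 0.6)
t = trust_value(provider)
print(f"trust={t:.2f}, allowed={allowed_purposes(t)}")
```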
A subsumptive, hierarchical, and distributed vision-based architecture for smart robotics.
DeSouza, Guilherme N; Kak, Avinash C
2004-10-01
We present a distributed vision-based architecture for smart robotics that is composed of multiple control loops, each with a specialized level of competence. Our architecture is subsumptive and hierarchical, in the sense that each control loop can add to the competence level of the loops below, and in the sense that the loops can present a coarse-to-fine gradation with respect to vision sensing. At the coarsest level, the processing of sensory information enables a robot to become aware of the approximate location of an object in its field of view. At the finest end, the processing of stereo information enables a robot to determine more precisely the position and orientation of an object in the coordinate frame of the robot. The processing in each module of the control loops is completely independent and can be performed at its own rate. A control arbitrator ranks the results of each loop according to certain confidence indices, which are derived solely from the sensory information. This architecture has clear advantages regarding the overall performance of the system, which is not affected by the "slowest link," and regarding fault tolerance, since faults in one module do not affect the other modules. At this time we are able to demonstrate the utility of the architecture for stereoscopic visual servoing. The architecture has also been applied to mobile robot navigation and can easily be extended to tasks such as "assembly-on-the-fly."
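The arbitration scheme described above, independent loops reporting at their own rates and a control arbitrator ranking results by sensory-derived confidence indices, admits a compact sketch; the loop names, freshness window, and confidence values below are hypothetical.

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class LoopOutput:
    """Result reported by one vision control loop."""
    loop_name: str        # e.g. "coarse-color-blob" or "fine-stereo"
    command: tuple        # motion command proposed by this loop
    confidence: float     # confidence index derived from the sensory data
    timestamp: float      # when the loop produced the result

class Arbitrator:
    """Keep the latest output per loop; act on the most confident fresh one."""

    def __init__(self, max_age_s: float = 0.5):
        self.latest = {}
        self.max_age_s = max_age_s

    def report(self, out: LoopOutput) -> None:
        # Each loop runs at its own rate and overwrites only its own slot,
        # so a slow loop never blocks a fast one.
        self.latest[out.loop_name] = out

    def select(self) -> Optional[LoopOutput]:
        now = time.monotonic()
        fresh = [o for o in self.latest.values()
                 if now - o.timestamp <= self.max_age_s]
        return max(fresh, key=lambda o: o.confidence, default=None)

arb = Arbitrator()
arb.report(LoopOutput("coarse-color-blob", (0.2, 0.0), 0.55, time.monotonic()))
arb.report(LoopOutput("fine-stereo", (0.18, 0.03), 0.90, time.monotonic()))
best = arb.select()
print(best.loop_name if best else "no fresh data")
```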
Trust information-based privacy architecture for ubiquitous health.
Ruotsalainen, Pekka Sakari; Blobel, Bernd; Seppälä, Antto; Nykänen, Pirkko
2013-10-08
Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network takes place in an open and unsecure information space. It is created and managed by the DS, who sets rules that regulate the way personal health information is collected and used. Compared to health care, it is impossible in ubiquitous health to assume the existence of a priori trust between the DS and service providers, or to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations that systems follow often remain unknown. Furthermore, health care-specific regulations do not govern the ways health data is processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service- and system-specific rules that regulate the way subject data is processed. The architecture should provide the DS with reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback on how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. A sequential method that combines methodologies used in system theory, systems engineering, requirements analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. Based on the principles, models, and requirements, architectural components and their interconnections were developed using system analysis. The architecture mimics the way humans use trust information in decision making, and enables the DS to design system-specific privacy policies using computational trust information that is based on systems' measured features. The trust attributes that were developed describe the level at which systems support awareness and transparency, and how they follow general and domain-specific regulations and laws. The monitoring component of the architecture offers dynamic feedback concerning how the system enforces the policies of the DS. The privacy management architecture developed in this study enables the DS to dynamically manage information privacy in ubiquitous health and to define individual policies for all systems, considering their trust values and corresponding attributes. The DS can also set policies for secondary use and reuse of health information. The architecture offers protection against privacy threats existing in ubiquitous environments. Although the architecture is targeted at ubiquitous health, it can easily be modified for other ubiquitous applications.
Trust Information-Based Privacy Architecture for Ubiquitous Health
2013-01-01
Background Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network takes place in an open and unsecure information space. It is created and managed by the DS, who sets rules that regulate the way personal health information is collected and used. Compared to health care, it is impossible in ubiquitous health to assume the existence of a priori trust between the DS and service providers, or to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations that systems follow often remain unknown. Furthermore, health care-specific regulations do not govern the ways health data is processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. Objective The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service- and system-specific rules that regulate the way subject data is processed. The architecture should provide the DS with reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback on how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. Methods A sequential method that combines methodologies used in system theory, systems engineering, requirements analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. Based on the principles, models, and requirements, architectural components and their interconnections were developed using system analysis. Results The architecture mimics the way humans use trust information in decision making, and enables the DS to design system-specific privacy policies using computational trust information that is based on systems' measured features. The trust attributes that were developed describe the level at which systems support awareness and transparency, and how they follow general and domain-specific regulations and laws. The monitoring component of the architecture offers dynamic feedback concerning how the system enforces the policies of the DS. Conclusions The privacy management architecture developed in this study enables the DS to dynamically manage information privacy in ubiquitous health and to define individual policies for all systems, considering their trust values and corresponding attributes. The DS can also set policies for secondary use and reuse of health information. The architecture offers protection against privacy threats existing in ubiquitous environments. Although the architecture is targeted at ubiquitous health, it can easily be modified for other ubiquitous applications. PMID:25099213
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.
1990-01-01
An avionics architecture for the Advanced Launch System (ALS) that uses validated hardware and software building blocks developed under the Advanced Information Processing System program is presented. The AIPS for ALS architecture defined is preliminary, and the reliability requirements can be met by the AIPS hardware and software building blocks built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle cost than simplex avionics built out of Class S parts or other redundant architectures.
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary JO; Quintana, Jorge A.; Soni, Nitin J.
1994-01-01
The NASA Lewis Research Center is developing a multichannel communication signal processing satellite (MCSPS) system which will provide low data rate, direct to user, commercial communications services. The focus of current space segment developments is a flexible, high-throughput, fault tolerant onboard information switching processor. This information switching processor (ISP) is a destination-directed packet switch which performs both space and time switching to route user information among numerous user ground terminals. Through both industry study contracts and in-house investigations, several packet switching architectures were examined. A contention-free approach, the shared memory per beam architecture, was selected for implementation. The shared memory per beam architecture, fault tolerance insertion, implementation, and demonstration plans are described.
All-IP-Ethernet architecture for real-time sensor-fusion processing
NASA Astrophysics Data System (ADS)
Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya
2016-03-01
Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing an information processing system for a Serendipter. The information processing system for Serendipter is a kind of sensor-fusion system, but with far greater difficulties. To fulfill these requirements, we adopt an all-IP based architecture: the all-IP-Ethernet based data processing system consists of (1) sensors/detectors that directly output data as an IP-Ethernet packet stream, (2) aggregation into single Ethernet/TCP/IP streams by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and to a Xeon-based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers and preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves existing problems in constructing large-scale sensor-fusion systems.
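At a vastly smaller scale, the packet-stream shape of this design, detectors emitting data directly as IP packets that a downstream stage consumes, might be sketched as follows; the datagram layout, port, and threshold rule are assumptions for illustration, and the real system's 100 Gbps links and FPGA stages are of course not modeled.

```python
import socket
import struct

PORT = 9000  # hypothetical detector stream port

def preprocess(sensor_id: int, seq: int, value: float) -> None:
    """Stand-in for the FPGA preprocessing / deep-learning stage."""
    if value > 0.99:  # e.g. flag a rare-event candidate
        print(f"candidate: sensor={sensor_id} seq={seq} value={value:.3f}")

def run_receiver() -> None:
    # Each datagram is assumed to be: uint32 sensor_id, uint32 seq,
    # float64 value, packed big-endian (16 bytes total).
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    while True:
        payload, _addr = sock.recvfrom(2048)
        sensor_id, seq, value = struct.unpack("!IId", payload)
        preprocess(sensor_id, seq, value)

if __name__ == "__main__":
    run_receiver()
```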
Doing It Right: 366 answers to computing questions you didn't know you had
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herring, Stuart Davis
Slides include information on: history; version control; version control: branches; version control: Git; releases; requirements; readability; readability: control flow; global variables; architecture; architecture: redundancy; processes; input/output; Unix; etc.
Architecture of next-generation information management systems for digital radiology enterprises
NASA Astrophysics Data System (ADS)
Wong, Stephen T. C.; Wang, Huili; Shen, Weimin; Schmidt, Joachim; Chen, George; Dolan, Tom
2000-05-01
Few information systems today offer a clear and flexible means to define and manage the automated part of radiology processes. None of them provides a coherent and scalable architecture that can easily cope with heterogeneity and the inevitable local adaptation of applications. Most importantly, they often lack a model that can integrate clinical and administrative information to aid better decisions in managing resources, optimizing operations, and improving productivity. Digital radiology enterprises require cost-effective solutions to deliver information to the right person in the right place at the right time. We propose a new architecture of image information management systems for digital radiology enterprises. Such a system is based on the emerging technologies in workflow management, distributed object computing, and Java and Web techniques, as well as Philips' domain knowledge in radiology operations. Our design adopts the '4+1' architectural view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can reasonably be substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the architecture provides powerful query and statistical functions for managing resources and improving productivity in real time. This work will lead to a new direction for image information management in the next millennium. We illustrate the innovative design with implemented examples from a working prototype.
Freight data architecture business process, logical data model, and physical data model.
DOT National Transportation Integrated Search
2014-09-01
This document summarizes the study team's efforts to establish data-sharing partnerships and to relay the lessons learned. In addition, it provides information on a prototype freight data architecture and supporting description and specifications ...
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Building on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies the design requirements defined in DREAMS and incorporates enabling computational technologies.
PDS4 - Some Principles for Agile Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.
2015-12-01
PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money), maximizing their usefulness for current and future scientists. The key principle is architectural: the PDS4 information architecture is developed and maintained independently of the infrastructure's process, application, and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of information for configuring most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also provided for the effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model, and how an information model-driven architecture exhibits characteristics of agile curation, including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.
Choi, Inyoung; Choi, Ran; Lee, Jonghyun
2010-01-01
Objectives The objective of this research is to introduce the unique approach of the Catholic Medical Center (CMC) to integrating network hospitals, together with the organizational and technical methodologies adopted for seamless implementation. Methods The Catholic Medical Center developed a new hospital information system to connect network hospitals and adopted a new information technology architecture which uses a single source for multiple distributed hospital systems. Results The hospital information system of the CMC was developed to integrate network hospitals following new system development principles: one source, one route, and one management. This information architecture has reduced the cost of system development and operation, and has enhanced the efficiency of the management process. Conclusions Integrating network hospitals through an information system was not simple; it was much more complicated than a single-organization implementation. We are still looking for more efficient communication channels and decision-making processes, and also believe that our new system architecture will be able to improve the CMC health care system and provide much better quality of health care service to patients and customers. PMID:21818432
The Action Execution Process Implemented in Different Cognitive Architectures: A Review
NASA Astrophysics Data System (ADS)
Dong, Daqi; Franklin, Stan
2014-12-01
An agent achieves its goals by interacting with its environment, cyclically choosing and executing suitable actions. An action execution process is a reasonable and critical part of an entire cognitive architecture, because the process of generating executable motor commands is not only driven by low-level environmental information, but is also initiated and affected by the agent's high-level mental processes. This review focuses on cognitive models of action, or more specifically, of the action execution process, as implemented in a set of popular cognitive architectures. We examine the representations and procedures inside the action execution process, as well as the cooperation between action execution and other high-level cognitive modules. We finally conclude with some general observations regarding the nature of action execution.
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities related to the definition and creation of telerobotic systems are described. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by system components during their execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning, and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture therefore employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. The filtering mechanism is an intrinsic component integrated with the monitoring architecture, and the architecture supports dynamic and flexible (re)configuration and optimization of event filters via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning, by obtaining debugging and feedback information. Our work makes a twofold contribution: (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems, and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance, and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss the limitations of existing event filtering mechanisms and outline how our architecture improves key aspects of event filtering.
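A minimal sketch of subscription-based event filtering in this spirit: consumers register predicate filters, and only matching events are forwarded, so uninteresting events never reach the management applications. Event fields, filter predicates, and handler behavior are invented for the example; the distributed placement of filters in the actual architecture is not shown.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    source: str                    # component that emitted the event
    kind: str                      # e.g. "error", "latency", "join"
    payload: dict = field(default_factory=dict)

Filter = Callable[[Event], bool]

class FilteringMonitor:
    """Forward an event only to subscribers whose filter matches it."""

    def __init__(self) -> None:
        self.subscriptions = []

    def subscribe(self, flt: Filter, handler: Callable[[Event], None]) -> None:
        # Dynamic (re)configuration: subscriptions can be added at run time.
        self.subscriptions.append((flt, handler))

    def publish(self, event: Event) -> None:
        # Filtering happens here, before any network hop, so uninteresting
        # events never load the management applications.
        for flt, handler in self.subscriptions:
            if flt(event):
                handler(event)

mon = FilteringMonitor()
mon.subscribe(
    lambda e: e.kind == "error" and e.payload.get("severity", 0) >= 3,
    lambda e: print("debugger notified:", e.source, e.payload),
)
mon.publish(Event("audio-relay", "latency", {"ms": 140}))     # filtered out
mon.publish(Event("audio-relay", "error", {"severity": 4}))   # forwarded
```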
An Airborne Onboard Parallel Processing Testbed
NASA Technical Reports Server (NTRS)
Mandl, Daniel J.
2014-01-01
This presentation provides information on the progress of the Intelligent Payload Module (IPM) development effort. In addition, a vision is presented for integration of the IPM architecture with the GeoSocial Application Program Interface (API) architecture to enable efficient distribution of satellite data products.
Learning, memory, and the role of neural network architecture.
Hermundstad, Ann M; Brown, Kevin S; Bassett, Danielle S; Carlson, Jean M
2011-06-01
The performance of information processing systems, from artificial neural networks to natural neuronal ensembles, depends heavily on the underlying system architecture. In this study, we compare the performance of parallel and layered network architectures during sequential tasks that require both acquisition and retention of information, thereby identifying tradeoffs between learning and memory processes. During the task of supervised, sequential function approximation, networks produce and adapt representations of external information. Performance is evaluated by statistically analyzing the error in these representations while varying the initial network state, the structure of the external information, and the time given to learn the information. We link performance to complexity in network architecture by characterizing local error landscape curvature. We find that variations in error landscape structure give rise to tradeoffs in performance; these include the ability of the network to maximize accuracy versus minimize inaccuracy and produce specific versus generalizable representations of information. Parallel networks generate smooth error landscapes with deep, narrow minima, enabling them to find highly specific representations given sufficient time. While accurate, however, these representations are difficult to generalize. In contrast, layered networks generate rough error landscapes with a variety of local minima, allowing them to quickly find coarse representations. Although less accurate, these representations are easily adaptable. The presence of measurable performance tradeoffs in both layered and parallel networks has implications for understanding the behavior of a wide variety of natural and artificial learning systems.
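The parallel-versus-layered contrast can be loosely reproduced with off-the-shelf tools. The sketch below, assuming "parallel" can be approximated by one wide hidden layer and "layered" by several narrow ones, trains both on a toy function approximation task for short and long learning times; it is an analogy to the study's setup, not its actual models or analyses.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Supervised function approximation task (assumed stand-in for the study's).
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=400)

def fit_error(hidden_layer_sizes, max_iter):
    """Train an MLP and return its residual error on held-out points."""
    net = MLPRegressor(hidden_layer_sizes=hidden_layer_sizes,
                       max_iter=max_iter, random_state=0)
    net.fit(X[:300], y[:300])
    pred = net.predict(X[300:])
    return float(np.mean((pred - y[300:]) ** 2))

for iters in (200, 2000):  # short vs long learning time
    wide = fit_error((128,), iters)        # "parallel": one wide layer
    deep = fit_error((16, 16, 16), iters)  # "layered": stacked narrow layers
    print(f"iters={iters}: wide MSE={wide:.4f}  deep MSE={deep:.4f}")
```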
ERIC Educational Resources Information Center
Bae, Kyoung-Il; Kim, Jung-Hyun; Huh, Soon-Young
2003-01-01
Discusses process information sharing among participating organizations in a virtual enterprise and proposes a federated process framework and system architecture that provide a conceptual design for effective implementation of process information sharing supporting the autonomy and agility of the organizations. Develops the framework using an…
Advanced information processing system: Input/output system services
NASA Technical Reports Server (NTRS)
Masotto, Tom; Alger, Linda
1989-01-01
The functional requirements and detailed specifications for the Input/Output (I/O) Systems Services of the Advanced Information Processing System (AIPS) are discussed. The introductory section is provided to outline the overall architecture and functional requirements of the AIPS system. Section 1.1 gives a brief overview of the AIPS architecture as well as a detailed description of the AIPS fault tolerant network architecture, while section 1.2 provides an introduction to the AIPS systems software. Sections 2 and 3 describe the functional requirements and design and detailed specifications of the I/O User Interface and Communications Management modules of the I/O System Services, respectively. Section 4 illustrates the use of the I/O System Services, while Section 5 concludes with a summary of results and suggestions for future work in this area.
A Principled Approach to the Specification of System Architectures for Space Missions
NASA Technical Reports Server (NTRS)
McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad
2015-01-01
Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.
1990-01-25
Ada/Xt Architecture: Design Report - informal technical data (CDRL 01000, document type A005) produced by Kurt Wallnau under STARS contract F19628-88-D-0031, as part of the STARS Prime contract's Process Environment Integration task (UR20).
Layered Architectures for Quantum Computers and Quantum Repeaters
NASA Astrophysics Data System (ADS)
Jones, Nathan C.
This chapter examines how to organize quantum computers and repeaters using a systematic framework known as layered architecture, where machine control is organized in layers associated with specialized tasks. The framework is flexible and could be used for analysis and comparison of quantum information systems. To demonstrate the design principles in practice, we develop architectures for quantum computers and quantum repeaters based on optically controlled quantum dots, showing how a myriad of technologies must operate synchronously to achieve fault-tolerance. Optical control makes information processing in this system very fast, scalable to large problem sizes, and extendable to quantum communication.
Constellation's Command, Control, Communications and Information (C3I) Architecture
NASA Technical Reports Server (NTRS)
Breidenthal, Julian C.
2007-01-01
Operations concepts are highly effective for: 1) Developing consensus; 2) Discovering stakeholder needs, goals, and objectives; 3) Defining the behavior of system components (especially emergent behaviors). An interoperability standard can provide an excellent lever to define the capabilities needed for system evolution. Two categories of architectures are needed in a program of this size: 1) Generic - needed for planning, design, and construction standards; 2) Specific - needed for detailed requirement allocations and interface specs. A wide variety of architectural views are needed to address stakeholder concerns, including: 1) Physical; 2) Information (structure, flow, evolution); 3) Processes (design, manufacturing, operations); 4) Performance; 5) Risk.
Examining the architecture of cellular computing through a comparative study with a computer
Wang, Degeng; Gribskov, Michael
2005-01-01
The computer and the cell both use information embedded in simple coding, the binary software code and the quadruple genomic code, respectively, to support system operations. A comparative examination of their system architecture as well as their information storage and utilization schemes is performed. On top of the code, both systems display a modular, multi-layered architecture, which, in the case of a computer, arises from human engineering efforts through a combination of hardware implementation and software abstraction. Using the computer as a reference system, a simplistic mapping of the architectural components between the two is easily detected. This comparison also reveals that a cell abolishes the software–hardware barrier through genomic encoding for the constituents of the biochemical network, a cell's ‘hardware’ equivalent to the computer central processing unit (CPU). The information loading (gene expression) process acts as a major determinant of the encoded constituent's abundance, which, in turn, often determines the ‘bandwidth’ of a biochemical pathway. Cellular processes are implemented in biochemical pathways in parallel manners. In a computer, on the other hand, the software provides only instructions and data for the CPU. A process represents just sequentially ordered actions by the CPU and only virtual parallelism can be implemented through CPU time-sharing. Whereas process management in a computer may simply mean job scheduling, coordinating pathway bandwidth through the gene expression machinery represents a major process management scheme in a cell. In summary, a cell can be viewed as a super-parallel computer, which computes through controlled hardware composition. While we have, at best, a very fragmented understanding of cellular operation, we have a thorough understanding of the computer throughout the engineering process. The potential utilization of this knowledge to the benefit of systems biology is discussed. PMID:16849179
Examining the architecture of cellular computing through a comparative study with a computer.
Wang, Degeng; Gribskov, Michael
2005-06-22
The computer and the cell both use information embedded in simple coding, the binary software code and the quadruple genomic code, respectively, to support system operations. A comparative examination of their system architecture as well as their information storage and utilization schemes is performed. On top of the code, both systems display a modular, multi-layered architecture, which, in the case of a computer, arises from human engineering efforts through a combination of hardware implementation and software abstraction. Using the computer as a reference system, a simplistic mapping of the architectural components between the two is easily detected. This comparison also reveals that a cell abolishes the software-hardware barrier through genomic encoding for the constituents of the biochemical network, a cell's "hardware" equivalent to the computer central processing unit (CPU). The information loading (gene expression) process acts as a major determinant of the encoded constituent's abundance, which, in turn, often determines the "bandwidth" of a biochemical pathway. Cellular processes are implemented in biochemical pathways in parallel manners. In a computer, on the other hand, the software provides only instructions and data for the CPU. A process represents just sequentially ordered actions by the CPU and only virtual parallelism can be implemented through CPU time-sharing. Whereas process management in a computer may simply mean job scheduling, coordinating pathway bandwidth through the gene expression machinery represents a major process management scheme in a cell. In summary, a cell can be viewed as a super-parallel computer, which computes through controlled hardware composition. While we have, at best, a very fragmented understanding of cellular operation, we have a thorough understanding of the computer throughout the engineering process. The potential utilization of this knowledge to the benefit of systems biology is discussed.
Apfelbaum, Keith S; McMurray, Bob
2015-08-01
Traditional studies of human categorization often treat the processes of encoding features and cues as peripheral to the question of how stimuli are categorized. However, in domains where the features and cues are less transparent, how information is encoded prior to categorization may constrain our understanding of the architecture of categorization. This is particularly true in speech perception, where acoustic cues to phonological categories are ambiguous and influenced by multiple factors. Here, it is crucial to consider the joint contributions of the information in the input and the categorization architecture. We contrasted accounts that argue for raw acoustic information encoding with accounts that posit that cues are encoded relative to expectations, and investigated how two categorization architectures (exemplar models and back-propagation parallel distributed processing models) deal with each kind of information. Relative encoding, akin to predictive coding, is a form of noise reduction, so it can be expected to improve model accuracy; however, like predictive coding, the use of relative encoding in speech perception by humans is controversial, so results are compared to patterns of human performance, rather than on the basis of overall accuracy. We found that, for both classes of models, in the vast majority of parameter settings, relative cues greatly helped the models approximate human performance. This suggests that expectation-relative processing is a crucial precursor step in phoneme categorization, and that understanding the information content is essential to understanding categorization processes.
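The raw-versus-relative encoding contrast can be made concrete with a toy voice-onset-time (VOT) example: subtracting a talker-specific expectation (here, simply the talker's mean cue value) removes talker variance before categorization. All numbers and the expectation model are illustrative assumptions, not the paper's stimuli or models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy speech cue: VOT in ms for /b/ vs /p/, with talker-specific offsets.
n, talkers = 500, 5
offsets = rng.normal(0, 15, talkers)          # each talker shifts all VOTs
talker = rng.integers(0, talkers, n)
category = rng.integers(0, 2, n)              # 0 = /b/, 1 = /p/
vot = np.where(category == 0, 10, 50) + offsets[talker] + rng.normal(0, 8, n)

# Raw encoding: the cue exactly as measured.
raw = vot
# Relative encoding: the cue minus the expectation for that talker
# (predictive-coding-style noise reduction).
talker_mean = np.array([vot[talker == t].mean() for t in range(talkers)])
relative = vot - talker_mean[talker]

def accuracy(cue):
    """Classify by a single threshold halfway between the class means."""
    thresh = (cue[category == 0].mean() + cue[category == 1].mean()) / 2
    return np.mean((cue > thresh) == (category == 1))

print(f"raw: {accuracy(raw):.2%}  relative: {accuracy(relative):.2%}")
```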
Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J
2008-09-01
The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.
Deep Space Network information system architecture study
NASA Technical Reports Server (NTRS)
Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.
1992-01-01
The purpose of this article is to describe an architecture for the DSN information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990's. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies--i.e., computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.
NASA Astrophysics Data System (ADS)
Higashino, Satoru; Kobayashi, Shoei; Yamagami, Tamotsu
2007-06-01
Higher data transfer rates have been demanded of data storage devices along with increasing storage capacity. In order to increase the transfer rate, high-speed data processing techniques are required in read-channel devices. Generally, a parallel architecture is utilized for high-speed digital processing. We have developed a new architecture of Interpolated Timing Recovery (ITR) to achieve a high data transfer rate and a wide capture range in read-channel devices for information storage channels. It facilitates parallel implementation on large-scale integration (LSI) devices.
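The core operation of interpolated timing recovery, resampling an asynchronously sampled waveform at fractional instants, can be sketched with a linear interpolator. A real read-channel ITR adds a timing-error detector and loop filter to estimate the resampling ratio, and parallelizes the computation; here the ratio is given, and all values are illustrative.

```python
import numpy as np

def resample_linear(samples: np.ndarray, ratio: float) -> np.ndarray:
    """Resample `samples` every `ratio` input intervals,
    linearly interpolating between neighboring points."""
    out = []
    t = 0.0
    while t < len(samples) - 1:
        k = int(t)   # integer sample index
        mu = t - k   # fractional interval within [k, k+1)
        out.append((1 - mu) * samples[k] + mu * samples[k + 1])
        t += ratio
    return np.array(out)

# Asynchronous sampling: channel waveform sampled slightly too fast.
t_in = np.arange(0, 40, 0.97)          # actual sample instants
waveform = np.sin(2 * np.pi * t_in / 8.0)

# Recover samples on the nominal 1.0-spaced grid (ratio known here;
# a full ITR loop would estimate it from a timing-error detector).
recovered = resample_linear(waveform, 1.0 / 0.97)
print(recovered[:5].round(3))
```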
Optimal nonlinear information processing capacity in delay-based reservoir computers
NASA Astrophysics Data System (ADS)
Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo
2015-09-01
Reservoir computing is a recently introduced, brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its associated training scheme, but also for the problematic sensitivity of its performance to the architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge for the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.
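A minimal sketch of a delay-based reservoir with virtual nodes and a ridge-regression readout, in the spirit of the devices the paper analyzes: the discrete update rule is a common simplification of the delay dynamics, and every parameter value (node count, scalings, regularization, task) is an assumption chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50                        # virtual nodes along the delay line
mask = rng.uniform(-1, 1, N)  # input mask applied to the virtual nodes
alpha, beta = 0.8, 0.5        # feedback and input scaling (illustrative)

def run_reservoir(u):
    """Drive the delay-based reservoir; return the (T, N) state matrix."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, ut in enumerate(u):
        # Discrete simplification: each virtual node feeds back on itself
        # one delay period later, through the nonlinearity.
        x = np.tanh(alpha * x + beta * mask * ut)
        states[t] = x
    return states

# Toy task: reconstruct the input delayed by 5 steps (a memory task).
T = 1000
u = rng.uniform(-1, 1, T)
target = np.roll(u, 5)

S = run_reservoir(u)
S_tr, S_te = S[200:800], S[800:]
y_tr, y_te = target[200:800], target[800:]

# Ridge-regression readout: the only trained part of a reservoir computer.
lam = 1e-6
W = np.linalg.solve(S_tr.T @ S_tr + lam * np.eye(N), S_tr.T @ y_tr)
mse = float(np.mean((S_te @ W - y_te) ** 2))
print(f"test MSE: {mse:.4f}")
```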
Optimal nonlinear information processing capacity in delay-based reservoir computers.
Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo
2015-09-11
Reservoir computing is a recently introduced, brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its associated training scheme, but also for the problematic sensitivity of its performance to the architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge for the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.
Optimal nonlinear information processing capacity in delay-based reservoir computers
Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo
2015-01-01
Reservoir computing is a recently introduced, brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its associated training scheme, but also for the problematic sensitivity of its performance to the architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge for the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature. PMID:26358528
An Agile Enterprise Regulation Architecture for Health Information Security Management
Chen, Ying-Pei; Hsieh, Sung-Huai; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie
2010-01-01
Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital. PMID:20815748
An agile enterprise regulation architecture for health information security management.
Chen, Ying-Pei; Hsieh, Sung-Huai; Cheng, Po-Hsun; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie
2010-09-01
Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital.
Development of Multi-slice Analytical Tool to Support BIM-based Design Process
NASA Astrophysics Data System (ADS)
Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.
2017-03-01
This paper describes the on-going development of a computational tool to analyse architectural and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architectural and interior space is experienced as a dynamic entity whose spatial properties may vary from one part of the space to another, so the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices, each with certain properties, becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting design development processes that apply BIM, particularly the design of architectural and interior spaces that are experienced as continuous spaces. It allows the identification of how spatial properties change dynamically throughout the space and the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and avoiding the unnecessary costs that are often caused by failure to identify problems during design development stages.
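The multi-slice representation suggests a simple data structure: a sequence of slices along a space's main axis, each carrying spatial properties derived from the BIM export, which can then be scanned for positions where the space stops performing as intended. Field names, property names, and threshold values below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Slice:
    """One cross-sectional slice of a space, cut along its main axis."""
    position_m: float        # distance along the axis where the slice is cut
    width_m: float           # clear width at this slice (from the BIM export)
    ceiling_height_m: float  # clear height at this slice
    daylight_factor: float   # e.g. simulated daylighting value, in percent

def flag_problems(slices, min_width=1.2, min_daylight=2.0):
    """Scan the slice series and report where properties fall below limits,
    i.e. where the 'continuous space' stops performing as intended."""
    problems = []
    for s in slices:
        if s.width_m < min_width:
            problems.append((s.position_m, "corridor too narrow"))
        if s.daylight_factor < min_daylight:
            problems.append((s.position_m, "insufficient daylight"))
    return problems

# Slices as they might be exported from a BIM model, every 0.5 m.
corridor = [
    Slice(0.0, 1.8, 2.7, 3.1),
    Slice(0.5, 1.6, 2.7, 2.4),
    Slice(1.0, 1.1, 2.4, 1.5),   # pinch point with poor daylight
    Slice(1.5, 1.7, 2.7, 2.8),
]
for pos, msg in flag_problems(corridor):
    print(f"at {pos:.1f} m: {msg}")
```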
Alor-Hernández, Giner; Sánchez-Cervantes, José Luis; Juárez-Martínez, Ulises; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Aguilar-Laserre, Alberto
2012-03-01
Emergency healthcare is one of the emerging application domains for information services, and it requires highly multimodal information services. The time consumed by the pre-hospital emergency process is critical; therefore, minimizing the time required to provide primary care and consultation to patients is one of the crucial factors in improving healthcare delivery in emergency situations. In this sense, dynamic location of medical entities is a complex process that takes time and can be critical when a person requires medical attention. This work presents a multimodal location-based system for locating and assigning medical entities called ITOHealth. ITOHealth provides a multimodal, middleware-oriented integrated architecture using a service-oriented architecture in order to provide information about medical entities on mobile devices and web browsers, with enriched interfaces providing multimodality support. ITOHealth's multimodality is based on the use of Microsoft Agent characters, the integration of natural-language voice with the characters, and multi-language and multi-character support, providing an advantage for users with visual impairments.
Enterprise application architecture development based on DoDAF and TOGAF
NASA Astrophysics Data System (ADS)
Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng
2017-05-01
For the purpose of supporting the design and analysis of enterprise application architecture, here we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on the architecture content framework (ACF), the DoDAF metamodel (DM2) and the Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationships among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.
2006-07-21
This fundamental architecture can be illustrated as a bow-tie architecture, which is claimed to be the fundamental architecture of robust systems. The Gene Ontology's biological process hierarchy was used to annotate functional categories for each gene. References cited include Kitano, H., "Cancer robustness: tumour tactics," Nature 426, 125 (2003).
2008-07-21
This fundamental architecture can be illustrated as a bow-tie architecture, which is claimed to be the fundamental architecture of robust systems. The Gene Ontology's biological process hierarchy was used to annotate functional categories for each gene. References cited include Kitano, H., "Cancer robustness: tumour tactics," Nature 426, 125 (2003).
Description and Simulation of a Fast Packet Switch Architecture for Communication Satellites
NASA Technical Reports Server (NTRS)
Quintana, Jorge A.; Lizanich, Paul J.
1995-01-01
The NASA Lewis Research Center has been developing the architecture for a multichannel communications signal processing satellite (MCSPS) as part of a flexible, low-cost meshed-VSAT (very small aperture terminal) network. The MCSPS architecture is based on a multifrequency, time-division-multiple-access (MF-TDMA) uplink and a time-division multiplex (TDM) downlink. There are eight uplink MF-TDMA beams, and eight downlink TDM beams, with eight downlink dwells per beam. The information-switching processor, which decodes, stores, and transmits each packet of user data to the appropriate downlink dwell onboard the satellite, has been fully described by using VHSIC (Very High Speed Integrated-Circuit) Hardware Description Language (VHDL). This VHDL code, which was developed in-house to simulate the information switching processor, showed that the architecture is both feasible and viable. This paper describes a shared-memory-per-beam architecture, its VHDL implementation, and the simulation efforts.
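A minimal software sketch of the shared-memory-per-beam idea described above, assuming a toy packet format with beam and dwell fields; the real design is implemented in VHDL, so this Python fragment only mirrors the routing logic, not the hardware.

    from collections import defaultdict, deque

    N_BEAMS, N_DWELLS = 8, 8

    # One shared memory pool per downlink beam, partitioned by dwell.
    beam_memory = {b: defaultdict(deque) for b in range(N_BEAMS)}

    def switch(packet):
        """Store a decoded uplink packet under its destination beam/dwell."""
        b, d = packet["beam"], packet["dwell"]
        assert 0 <= b < N_BEAMS and 0 <= d < N_DWELLS
        beam_memory[b][d].append(packet["payload"])

    def transmit(beam):
        """Drain one TDM downlink frame, visiting the beam's dwells in order."""
        for d in range(N_DWELLS):
            while beam_memory[beam][d]:
                yield d, beam_memory[beam][d].popleft()

    switch({"beam": 2, "dwell": 5, "payload": b"user data"})
    for dwell, payload in transmit(2):
        print(f"beam 2, dwell {dwell}: {payload!r}")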
NASA Astrophysics Data System (ADS)
Cui, Gaoying; Fan, Jie; Qin, Yuchen; Wang, Dong; Chen, Guangyan
2017-05-01
In order to promote the effective use of demand-side response resources, promote interaction between supply and demand, enhance the level of customer service, and achieve overall utilization of energy, this paper briefly explains the background and significance of designing a demand response information platform and the current state of development at home and abroad; analyses the new requirements for electricity demand response arising from the application of Internet and big data technology; designs the demand response information platform architecture, constructs the demand response system, and analyses the process of formulating demand response strategies and implementing their intelligent execution; studies applications that combine big data, Internet, and demand response technology; and finally designs the implementation of the demand response information platform from the perspectives of information interaction architecture, control architecture and function design, illustrating the feasibility of the proposed platform design scheme to a certain extent.
Mechanisms for Human Spatial Competence
NASA Astrophysics Data System (ADS)
Gunzelmann, Glenn; Lyon, Don R.
Research spanning decades has generated a long list of phenomena associated with human spatial information processing. Additionally, a number of theories have been proposed about the representation, organization and processing of spatial information by humans. This paper presents a broad account of human spatial competence, integrated with the ACT-R cognitive architecture. Using a cognitive architecture grounds the research in a validated theory of human cognition, enhancing the plausibility of the overall account. This work posits a close link of aspects of spatial information processing to vision and motor planning, and integrates theoretical perspectives that have been proposed over the history of research in this area. In addition, the account is supported by evidence from neuropsychological investigations of human spatial ability. The mechanisms provide a means of accounting for a broad range of phenomena described in the experimental literature.
Automated monitoring of medical protocols: a secure and distributed architecture.
Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F
2003-03-01
The control of the right application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully, or semi, autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, and so specialized domain agents are independent of negotiation processes and autonomous system agents perform monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
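The following is a minimal sketch of the monitoring idea under stated assumptions: the protocol steps, agent names, and patient data are all invented, and a real system would run the agents as independent processes negotiating over messages rather than making direct calls.

    # Toy multi-agent protocol monitor: each protocol step asks a domain
    # agent for a clinical item and checks the reply against a rule.
    class DatabaseBrokerAgent:
        def __init__(self, records):
            self.records = records  # stand-in for the clinical history store

        def query(self, patient, item):
            return self.records.get(patient, {}).get(item)

    protocol = [
        # (description, item to fetch, predicate the value must satisfy)
        ("creatinine checked before contrast", "creatinine",
         lambda v: v is not None),
        ("creatinine below threshold", "creatinine",
         lambda v: v is not None and v < 1.5),
    ]

    broker = DatabaseBrokerAgent({"pat-1": {"creatinine": 1.2}})

    def monitor(patient):
        for description, item, ok in protocol:
            value = broker.query(patient, item)
            status = "OK" if ok(value) else "VIOLATION"
            print(f"{description}: {status} (value={value})")

    monitor("pat-1")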
Functional Network Architecture of Reading-Related Regions across Development
ERIC Educational Resources Information Center
Vogel, Alecia C.; Church, Jessica A.; Power, Jonathan D.; Miezin, Fran M.; Petersen, Steven E.; Schlaggar, Bradley L.
2013-01-01
Reading requires coordinated neural processing across a large number of brain regions. Studying relationships between reading-related regions informs the specificity of information processing performed in each region. Here, regions of interest were defined from a meta-analysis of reading studies, including a developmental study. Relationships…
A broadband multimedia TeleLearning system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Ruiping; Karmouch, A.
1996-12-31
In this paper we discuss a broadband multimedia TeleLearning system under development in the Multimedia Information Research Laboratory at the University of Ottawa. The system aims at providing a seamless environment for TeleLearning using the latest telecommunication and multimedia information processing technology. It basically consists of a media production center, a courseware author site, a courseware database, a courseware user site, and an on-line facilitator site. All these components are distributed over an ATM network and work together to offer a multimedia interactive courseware service. An MHEG-based model is exploited in designing the system architecture to achieve real-time, interactive, and reusable information interchange across heterogeneous platforms. The system architecture, courseware processing strategies, and courseware document models are presented.
Predefined three tier business intelligence architecture in healthcare enterprise.
Wang, Meimei
2013-04-01
Business Intelligence (BI) has attracted extensive attention and widespread use in gathering, processing and analyzing data and in providing enterprise users with a methodology for making decisions. Departing from traditional BI architecture, this paper proposes a new BI architecture, a Top-Down Scalable BI architecture with a defining mechanism for enterprise decision-making solutions, and aims at establishing a rapid, consistent, and scalable BI mechanism for multiple applications on multiple platforms. The two opposite information flows in our BI architecture offer the merits of maintaining a high-level organizational perspective and making full use of existing resources. We also introduce the avg-bed-waiting-time factor to evaluate hospital care capacity.
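The abstract does not define the avg-bed-waiting-time factor precisely; one plausible reading, sketched here under that assumption, averages the interval between a bed request and the bed assignment (the timestamps below are illustrative only):

    from datetime import datetime

    # (request time, assignment time) pairs -- illustrative data only.
    waits = [
        (datetime(2013, 4, 1, 9, 0), datetime(2013, 4, 1, 11, 30)),
        (datetime(2013, 4, 1, 10, 15), datetime(2013, 4, 1, 12, 0)),
        (datetime(2013, 4, 2, 8, 40), datetime(2013, 4, 2, 9, 10)),
    ]

    def avg_bed_waiting_hours(pairs):
        total = sum((assigned - requested).total_seconds()
                    for requested, assigned in pairs)
        return total / len(pairs) / 3600.0

    print(f"avg bed waiting time: {avg_bed_waiting_hours(waits):.2f} h")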
Systems Architecture for a Nationwide Healthcare System.
Abin, Jorge; Nemeth, Horacio; Friedmann, Ignacio
2015-01-01
To provide Internet technology support at a national level, the Nationwide Integrated Healthcare System in Uruguay requires a model of Information Systems Architecture. This system has multiple healthcare providers (public and private) and a strong component of supplementary services. Thus, the data processing system should have an architecture that considers this fact, while integrating the central services provided by the Ministry of Public Health. The national electronic health record, as well as other related data processing systems, should be based on this architecture. The architecture model described here conceptualizes a federated framework of electronic health record systems, according to the IHE affinity model, HL7 standards, local standards on interoperability and security, as well as technical advice provided by AGESIC. It is the outcome of research done by AGESIC and the Systems Integration Laboratory (LINS) on the development and use of the e-Government Platform since 2008, as well as research done by the Salud.uy team since 2013.
Advanced Computing Architectures for Cognitive Processing
2009-07-01
Cognitive processing would be helpful for Air Force systems acquisition. Specific cognitive processing approaches addressed herein include the global information grid. (Front-matter figures include "Logic diagram: smart block-based neuron" and "Naive Grid Potential Kernel".)
NASA Astrophysics Data System (ADS)
Levchenko, N. G.; Glushkov, S. V.; Sobolevskaya, E. Yu; Orlov, A. P.
2018-05-01
The method of modeling the transport and logistics process using fuzzy neural network technologies is considered. The analysis of the implemented fuzzy neural network model of the information management system for transnational multimodal transportation showed the expediency of applying this method to the management of transport and logistics processes in Arctic and Subarctic conditions. The modular architecture of this model can be expanded by incorporating additional modules, since working conditions in the Arctic and Subarctic will present ever more realistic tasks. The architecture allows the information management system to be extended without affecting the system or the method itself. The model has a wide range of possible applications, including: analysis of the situation and behavior of interacting elements; dynamic monitoring and diagnostics of management processes; simulation of real events and processes; and prediction and prevention of critical situations.
Workflow management systems in radiology
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim
1998-07-01
In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely redesign their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the redefinition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasize the separation of workflow management systems and application systems, and examine the consequences that arise for the architecture of workflow-oriented information systems, including an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
Visual search, visual streams, and visual architectures.
Green, M
1991-10-01
Most psychological, physiological, and computational models of early vision suggest that retinal information is divided into a parallel set of feature modules. The dominant theories of visual search assume that these modules form a "blackboard" architecture: a set of independent representations that communicate only through a central processor. A review of research shows that blackboard-based theories, such as feature-integration theory, cannot easily explain the existing data. The experimental evidence is more consistent with a "network" architecture, which stresses that: (1) feature modules are directly connected to one another, (2) features and their locations are represented together, (3) feature detection and integration are not distinct processing stages, and (4) no executive control process, such as focal attention, is needed to integrate features. Attention is not a spotlight that synthesizes objects from raw features. Instead, it is better to conceptualize attention as an aperture which masks irrelevant visual information.
Takeda, Shuntaro; Furusawa, Akira
2017-09-22
We propose a scalable scheme for optical quantum computing using measurement-induced continuous-variable quantum gates in a loop-based architecture. Here, time-bin-encoded quantum information in a single spatial mode is deterministically processed in a nested loop by an electrically programmable gate sequence. This architecture can process any input state and an arbitrary number of modes with almost minimum resources, and offers a universal gate set for both qubits and continuous variables. Furthermore, quantum computing can be performed fault tolerantly by a known scheme for encoding a qubit in an infinite-dimensional Hilbert space of a single light mode.
NASA Astrophysics Data System (ADS)
Takeda, Shuntaro; Furusawa, Akira
2017-09-01
We propose a scalable scheme for optical quantum computing using measurement-induced continuous-variable quantum gates in a loop-based architecture. Here, time-bin-encoded quantum information in a single spatial mode is deterministically processed in a nested loop by an electrically programmable gate sequence. This architecture can process any input state and an arbitrary number of modes with almost minimum resources, and offers a universal gate set for both qubits and continuous variables. Furthermore, quantum computing can be performed fault tolerantly by a known scheme for encoding a qubit in an infinite-dimensional Hilbert space of a single light mode.
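As a toy numerical analogue of the loop architecture described above (single-mode Gaussian gates only, ignoring measurement-induced nonlinearity, loss, and the optical hardware itself), each round trip in the sketch below applies one electrically programmed 2x2 symplectic gate to the quadrature vector (x, p) of every time-bin mode:

    import math

    def rotation(theta):  # phase-space rotation
        c, s = math.cos(theta), math.sin(theta)
        return [[c, -s], [s, c]]

    def squeeze(r):       # single-mode squeezing
        return [[math.exp(-r), 0.0], [0.0, math.exp(r)]]

    def apply(gate, xp):
        x, p = xp
        return (gate[0][0] * x + gate[0][1] * p,
                gate[1][0] * x + gate[1][1] * p)

    # One quadrature vector per time bin travelling around the loop.
    time_bins = [(1.0, 0.0), (0.0, 1.0), (0.5, -0.5)]

    # Electrically programmed sequence: one gate per round trip.
    program = [rotation(math.pi / 4), squeeze(0.3), rotation(-math.pi / 2)]

    for gate in program:
        time_bins = [apply(gate, xp) for xp in time_bins]

    for i, (x, p) in enumerate(time_bins):
        print(f"bin {i}: x={x:+.3f}, p={p:+.3f}")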
NASA Technical Reports Server (NTRS)
Bonanne, Kevin H.
2011-01-01
Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.
Deep Space Network information system architecture study
NASA Technical Reports Server (NTRS)
Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.
1992-01-01
The purpose of this article is to describe an architecture for the Deep Space Network (DSN) information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990s. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies, such as the following: computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.
Modeling interdependencies between business and communication processes in hospitals.
Brigl, Birgit; Wendt, Thomas; Winter, Alfred
2003-01-01
The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach that facilitates the representation and analysis of business processes and the resulting communication processes between application components, along with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.
Silicon photonics for neuromorphic information processing
NASA Astrophysics Data System (ADS)
Bienstman, Peter; Dambre, Joni; Katumba, Andrew; Freiberger, Matthias; Laporte, Floris; Lugnan, Alessio
2018-02-01
We present our latest results on silicon photonics neuromorphic information processing based, among other techniques, on reservoir computing. We will discuss aspects like scalability, novel architectures for enhanced power efficiency, and all-optical readout. Additionally, we will touch upon new machine learning techniques to operate these integrated readouts. Finally, we will show how these systems can be used for high-speed, low-power information processing for applications like recognition of biological cells.
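Reservoir computing, mentioned above, can be illustrated in software even though the paper's reservoirs are photonic: the sketch below trains only a linear readout on top of a fixed random recurrent network for an invented delay-recall task (all sizes, parameters, and the task itself are assumptions, not the authors' setup).

    import numpy as np

    rng = np.random.default_rng(1)

    # Tiny echo state network: fixed random reservoir, trained readout.
    N, T = 100, 500
    W_in = rng.uniform(-0.5, 0.5, (N, 1))
    W = rng.uniform(-0.5, 0.5, (N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

    u = rng.uniform(-1, 1, (T, 1))       # input stream
    y_target = np.roll(u[:, 0], 3)       # task: recall the input 3 steps back

    x = np.zeros(N)
    states = np.zeros((T, N))
    for t in range(T):
        x = np.tanh(W @ x + W_in @ u[t])
        states[t] = x

    # Linear readout fitted by least squares; first 50 steps discarded
    # as washout of the initial reservoir state.
    w_out, *_ = np.linalg.lstsq(states[50:], y_target[50:], rcond=None)
    pred = states[50:] @ w_out
    print("readout MSE:", float(np.mean((pred - y_target[50:]) ** 2)))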
EHR standards--A comparative study.
Blobel, Bernd; Pharow, Peter
2006-01-01
For ensuring quality and efficiency of patient care, the care paradigm moves from organization-centered through process-controlled towards personal care. Such a health system paradigm change leads to new paradigms for analyzing, designing, implementing and deploying supporting health information systems, including EHR systems as the core application in a distributed eHealth environment. The paper defines the architectural paradigm for future-proof EHR systems. It compares advanced EHR architectures, referencing them against the Generic Component Model. The paper also introduces the evolving paradigm of autonomous computing for self-organizing health information systems.
Recent advances in nuclear magnetic resonance quantum information processing.
Criger, Ben; Passante, Gina; Park, Daniel; Laflamme, Raymond
2012-10-13
Quantum information processors have the potential to drastically change the way we communicate and process information. Nuclear magnetic resonance (NMR) has been one of the first experimental implementations of quantum information processing (QIP) and continues to be an excellent testbed to develop new QIP techniques. We review the recent progress made in NMR QIP, focusing on decoupling, pulse engineering and indirect nuclear control. These advances have enhanced the capabilities of NMR QIP, and have useful applications in both traditional NMR and other QIP architectures.
Interaction between Task Oriented and Affective Information Processing in Cognitive Robotics
NASA Astrophysics Data System (ADS)
Haazebroek, Pascal; van Dantzig, Saskia; Hommel, Bernhard
There is an increasing interest in endowing robots with emotions. Robot control, however, is still often very task oriented. We present a cognitive architecture that allows the combination of, and interaction between, task representations and affective information processing. Our model is validated by comparing simulation results with empirical data from experimental psychology.
Business Value of Information Sharing and the Role of Emerging Technologies
ERIC Educational Resources Information Center
Kumar, Sanjeev
2009-01-01
Information Technology has brought significant benefits to organizations by allowing greater information sharing within and across firm boundaries leading to performance improvements. Emerging technologies such as Service Oriented Architecture (SOA) and Web2.0 have transformed the volume and process of information sharing. However, a comprehensive…
Information fusion via isocortex-based Area 37 modeling
NASA Astrophysics Data System (ADS)
Peterson, James K.
2004-08-01
A simplified model of information processing in the brain can be constructed using primary sensory input from two modalities (auditory and visual) and recurrent connections to the limbic subsystem. Information fusion would then occur in Area 37 of the temporal cortex. The creation of meta concepts from the low order primary inputs is managed by models of isocortex processing. Isocortex algorithms are used to model parietal (auditory), occipital (visual), temporal (polymodal fusion) cortex and the limbic system. Each of these four modules is constructed out of five cortical stacks in which each stack consists of three vertically oriented six layer isocortex models. The input to output training of each cortical model uses the OCOS (on center - off surround) and FFP (folded feedback pathway) circuitry of (Grossberg, 1) which is inherently a recurrent network type of learning characterized by the identification of perceptual groups. Models of this sort are thus closely related to cognitive models as it is difficult to divorce the sensory processing subsystems from the higher level processing in the associative cortex. The overall software architecture presented is biologically based and is presented as a potential architectural prototype for the development of novel sensory fusion strategies. The algorithms are motivated to some degree by specific data from projects on musical composition and autonomous fine art painting programs, but only in the sense that these projects use two specific types of auditory and visual cortex data. Hence, the architectures are presented for an artificial information processing system which utilizes two disparate sensory sources. The exact nature of the two primary sensory input streams is irrelevant.
Do Intelligent Robots Need Emotion?
Pessoa, Luiz
2017-11-01
What is the place of emotion in intelligent robots? Researchers have advocated the inclusion of some emotion-related components in the information-processing architecture of autonomous agents. It is argued here that emotion needs to be merged with all aspects of the architecture: cognitive-emotional integration should be a key design principle.
A Model Based Framework for Semantic Interpretation of Architectural Construction Drawings
ERIC Educational Resources Information Center
Babalola, Olubi Oluyomi
2011-01-01
The study addresses the automated translation of architectural drawings from 2D Computer Aided Drafting (CAD) data into a Building Information Model (BIM), with emphasis on the nature, possible role, and limitations of a drafting language Knowledge Representation (KR) on the problem and process. The central idea is that CAD to BIM translation is a…
The Future of Architecture Collaborative Information Sharing: DoDAF Version 2.03 Updates
2012-04-30
Tools surveyed (tool name, vendor, primary notation) include: Salamander; Select Solution Factory (Select Business Solutions; BPMN, UML); SimonTool (Simon Labs); SimProcess (CACI; BPMN); System Architecture Management...for DoDAF (Mega; UML); Metastorm ProVision (Metastorm; BPMN); Naval Simulation System - 4 Aces (METRON); NetViz (CA); OPNET (OPNET).
NASA Astrophysics Data System (ADS)
Sliva, Amy L.; Gorman, Joe; Voshell, Martin; Tittle, James; Bowman, Christopher
2016-05-01
The Dual Node Decision Wheels (DNDW) architecture concept was previously described as a novel approach toward integrating analytic and decision-making processes in joint human/automation systems in highly complex sociotechnical settings. In this paper, we extend the DNDW construct with a description of components in this framework, combining structures of the Dual Node Network (DNN) for Information Fusion and Resource Management with extensions on Rasmussen's Decision Ladder (DL) to provide guidance on constructing information systems that better serve decision-making support requirements. The DNN takes a component-centered approach to system design, decomposing each asset in terms of data inputs and outputs according to their roles and interactions in a fusion network. However, to ensure relevancy to and organizational fitment within command and control (C2) processes, principles from cognitive systems engineering emphasize that system design must take a human-centered systems view, integrating information needs and decision making requirements to drive the architecture design and capabilities of network assets. In the current work, we present an approach for structuring and assessing DNDW systems that uses a unique hybrid DNN top-down system design with a human-centered process design, combining DNN node decomposition with artifacts from cognitive analysis (i.e., system abstraction decomposition models, decision ladders) to provide work domain and task-level insights at different levels in an example intelligence, surveillance, and reconnaissance (ISR) system setting. This DNDW structure will ensure not only that the information fusion technologies and processes are structured effectively, but that the resulting information products will align with the requirements of human decision makers and be adaptable to different work settings.
Advanced information processing system: Inter-computer communication services
NASA Technical Reports Server (NTRS)
Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.
1991-01-01
The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.
NASA Astrophysics Data System (ADS)
Garcia-Gonzalez, Juan P.; Gacitua-Decar, Geronica; Pahl, Claus
Providing mobility to participants in business processes is an increasing trend in the banking sector. Independence from a physical place to interact with clients, while being able to use the information managed in the banking applications, is one of the benefits of mobile business processes. Challenges arising from this approach include dealing with occasionally connected communication; security issues regarding the exposure of internal information on devices that could be lost; and restrictions on the capacity of mobile devices. This paper presents our experience in implementing a service-based architecture solution to extend centralised resources from a financial institution to a mobile platform.
Vadillo, Miguel A; Luque, David
2013-03-01
Previous research on causal learning has usually made strong claims about the relative complexity and temporal priority of some processes over others based on evidence about dissociations between several types of judgments. In particular, it has been argued that the dissociation between causal judgments and trial-type frequency information is incompatible with the general cognitive architecture proposed by associative models. In contrast with this view, we conduct an associative analysis of this process showing that this need not be the case. We conclude that any attempt to gain a better insight on the cognitive architecture involved in contingency learning cannot rely solely on data about these dissociations.
Zhou, Li; Friedman, Carol; Parsons, Simon; Hripcsak, George
2005-01-01
Exploring temporal information in narrative Electronic Medical Records (EMRs) is essential and challenging. We propose an architecture for an integrated approach to process temporal information in clinical narrative reports. The goal is to initiate and build a foundation that supports applications which assist healthcare practice and research by including the ability to determine the time of clinical events (e.g., past vs. present). Key components include: (1) a temporal constraint structure for temporal expressions and the development of an associated tagger; (2) a Natural Language Processing (NLP) system for encoding and extracting medical events and associating them with formalized temporal data; (3) a post-processor, with a knowledge-based subsystem to help discover implicit information, that resolves temporal expressions and deals with issues such as granularity and vagueness; and (4) a reasoning mechanism which models clinical reports as Simple Temporal Problems (STPs).
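A minimal sketch of two of the components named above, under loud assumptions: the regex stands in for the temporal-expression tagger and the single-edge Simple Temporal Problem stands in for the reasoning mechanism; neither reflects the authors' actual implementation, and the report text and event names are invented.

    import re

    # Step 1 (toy tagger): find relative temporal expressions in a report.
    text = "Chest pain started 3 days ago; admitted yesterday."
    pattern = re.compile(r"(\d+)\s+days?\s+ago|yesterday")
    offsets_days = []
    for m in pattern.finditer(text):
        offsets_days.append(int(m.group(1)) if m.group(1) else 1)
    print("extracted offsets (days before report):", offsets_days)

    # Step 2 (toy STP): events as nodes, edges carry [lo, hi] bounds in days.
    # Invented constraint: admission occurs 1-5 days after pain onset.
    stp = {("pain_onset", "admission"): (1, 5)}

    def consistent(t_onset, t_admit):
        lo, hi = stp[("pain_onset", "admission")]
        return lo <= (t_admit - t_onset) <= hi

    # Event times reconstructed from the offsets (report day = day 0).
    print("consistent:", consistent(-offsets_days[0], -offsets_days[1]))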
NASA Astrophysics Data System (ADS)
Hodijah, A.; Sundari, S.; Nugraha, A. C.
2018-05-01
As a local government agency performing public services, the General Government Office already utilizes the Reporting Information System of Local Government Implementation (E-LPPD). However, E-LPPD has upgrade limitations: its integration processes cannot accommodate the General Government Office's needs in achieving Good Government Governance (GGG), while success stories show that the ultimate goal of e-government implementation requires good governance practices. Citizens now demand public services comparable to those of the private sector, which calls for service innovation that utilizes the legacy system in a service-based e-government implementation, with Service Oriented Architecture (SOA) used to redefine business processes as a set of IT-enabled services and the Enterprise Architecture of the Open Group Architecture Framework (TOGAF) used as a comprehensive approach to redefining business processes as service innovation towards GGG. This paper takes as a case study the Performance Evaluation of Local Government Implementation (EKPPD) system at the General Government Office. The results show that TOGAF guides the development of integrated business processes for the EKPPD system that fit good governance practices to attain GGG, with SOA methodology as the technical approach.
Marceglia, S; Fontelo, P; Rossi, E; Ackerman, M J
2015-01-01
Mobile health Applications (mHealth Apps) are opening the way to patients' responsible and active involvement with their own healthcare management. However, apart from Apps allowing patient's access to their electronic health records (EHRs), mHealth Apps are currently developed as dedicated "island systems". Although much work has been done on patient's access to EHRs, transfer of information from mHealth Apps to EHR systems is still low. This study proposes a standards-based architecture that can be adopted by mHealth Apps to exchange information with EHRs to support better quality of care. Following the definition of requirements for the EHR/mHealth App information exchange recently proposed, and after reviewing current standards, we designed the architecture for EHR/mHealth App integration. Then, as a case study, we modeled a system based on the proposed architecture aimed to support home monitoring for congestive heart failure patients. We simulated such process using, on the EHR side, OpenMRS, an open source longitudinal EHR and, on the mHealth App side, the iOS platform. The integration architecture was based on the bi-directional exchange of standard documents (clinical document architecture rel2 - CDA2). In the process, the clinician "prescribes" the home monitoring procedures by creating a CDA2 prescription in the EHR that is sent, encrypted and de-identified, to the mHealth App to create the monitoring calendar. At the scheduled time, the App alerts the patient to start the monitoring. After the measurements are done, the App generates a structured CDA2-compliant monitoring report and sends it to the EHR, thus avoiding local storage. The proposed architecture, even if validated only in a simulation environment, represents a step forward in the integration of personal mHealth Apps into the larger health-IT ecosystem, allowing the bi-directional data exchange between patients and healthcare professionals, supporting the patient's engagement in self-management and self-care.
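A minimal sketch of the App-side report-generation step described above, assuming a heavily simplified CDA-flavoured XML; a real CDA R2 document requires HL7 namespaces, templateIds, and coded vocabularies (e.g., LOINC), and the patient identifier and measurement names here are invented.

    import xml.etree.ElementTree as ET
    from datetime import date

    def monitoring_report(patient_id, measurements):
        # Simplified, CDA-flavoured structure: a real CDA R2 document needs
        # HL7 namespaces, templateIds and coded vocabularies.
        doc = ET.Element("ClinicalDocument")
        ET.SubElement(doc, "effectiveTime", value=date.today().isoformat())
        ET.SubElement(doc, "recordTarget", extension=patient_id)
        body = ET.SubElement(doc, "component")
        for name, value, unit in measurements:
            obs = ET.SubElement(body, "observation")
            ET.SubElement(obs, "code", displayName=name)
            ET.SubElement(obs, "value", value=str(value), unit=unit)
        return ET.tostring(doc, encoding="unicode")

    # De-identified payload the App side would send back to the EHR.
    print(monitoring_report("anon-042", [("body weight", 82.5, "kg"),
                                         ("heart rate", 71, "/min")]))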
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by HIS-DF and supported by HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.
2013-09-01
Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides an efficient, flexible and reliable implementation to encounter different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing in a Service Oriented Architecture (SOA) platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow for acquiring and processing remotely sensed data, detecting fire and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are represented using web-based processing of remote sensing imagery utilizing MODIS data. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigation of the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
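As a hedged illustration of invoking one WPS process in such a chain, the sketch below posts a WPS 1.0.0 Execute request over plain HTTP; the endpoint URL, the process identifier ExtractFireEvents, and the input name modis_scene are placeholders, not the paper's actual service.

    import requests  # third-party: pip install requests

    WPS_URL = "https://example.org/wps"  # placeholder endpoint

    # WPS 1.0.0 Execute request for a hypothetical fire-extraction process.
    execute_xml = """<?xml version="1.0" encoding="UTF-8"?>
    <wps:Execute service="WPS" version="1.0.0"
        xmlns:wps="http://www.opengis.net/wps/1.0.0"
        xmlns:ows="http://www.opengis.net/ows/1.1">
      <ows:Identifier>ExtractFireEvents</ows:Identifier>
      <wps:DataInputs>
        <wps:Input>
          <ows:Identifier>modis_scene</ows:Identifier>
          <wps:Data><wps:LiteralData>MOD14.A2013252</wps:LiteralData></wps:Data>
        </wps:Input>
      </wps:DataInputs>
    </wps:Execute>"""

    response = requests.post(WPS_URL, data=execute_xml.encode("utf-8"),
                             headers={"Content-Type": "text/xml"})
    print(response.status_code, response.text[:200])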
A unified architecture for biomedical search engines based on semantic web technologies.
Jalali, Vahid; Matash Borujerdi, Mohammad Reza
2011-04-01
There has been huge growth in the volume of published biomedical research in recent years. Many medical search engines are designed and developed to address the ever-growing information needs of biomedical experts and curators. Significant progress has been made in utilizing the knowledge embedded in medical ontologies and controlled vocabularies to assist these engines. However, the lack of a common architecture for the utilized ontologies and the overall retrieval process hampers evaluating different search engines and interoperability between them under unified conditions. In this paper, a unified architecture for medical search engines is introduced. The proposed model contains standard schemas, declared in semantic web languages, for the ontologies and documents used by search engines. Unified models for the annotation and retrieval processes are other parts of the introduced architecture. A sample search engine is also designed and implemented based on the proposed architecture. The search engine is evaluated using two test collections and results are reported in terms of precision vs. recall and mean average precision for the different approaches used by this search engine.
Eidels, Ami; Houpt, Joseph W.; Altieri, Nicholas; Pei, Lei; Townsend, James T.
2011-01-01
Systems Factorial Technology is a powerful framework for investigating the fundamental properties of human information processing such as architecture (i.e., serial or parallel processing) and capacity (how processing efficiency is affected by increased workload). The Survivor Interaction Contrast (SIC) and the Capacity Coefficient are effective measures in determining these underlying properties, based on response-time data. Each of the different architectures, under the assumption of independent processing, predicts a specific form of the SIC along with some range of capacity. In this study, we explored SIC predictions of discrete-state (Markov process) and continuous-state (Linear Dynamic) models that allow for certain types of cross-channel interaction. The interaction can be facilitatory or inhibitory: one channel can either facilitate, or slow down processing in its counterpart. Despite the relative generality of these models, the combination of the architecture-oriented plus the capacity oriented analyses provide for precise identification of the underlying system.
Eidels, Ami; Houpt, Joseph W; Altieri, Nicholas; Pei, Lei; Townsend, James T
2011-04-01
Systems Factorial Technology is a powerful framework for investigating the fundamental properties of human information processing such as architecture (i.e., serial or parallel processing) and capacity (how processing efficiency is affected by increased workload). The Survivor Interaction Contrast (SIC) and the Capacity Coefficient are effective measures in determining these underlying properties, based on response-time data. Each of the different architectures, under the assumption of independent processing, predicts a specific form of the SIC along with some range of capacity. In this study, we explored SIC predictions of discrete-state (Markov process) and continuous-state (Linear Dynamic) models that allow for certain types of cross-channel interaction. The interaction can be facilitatory or inhibitory: one channel can either facilitate, or slow down processing in its counterpart. Despite the relative generality of these models, the combination of the architecture-oriented plus the capacity oriented analyses provide for precise identification of the underlying system.
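The two measures named above can be written out concretely as standardly defined in Systems Factorial Technology: SIC(t) = S_LL(t) - S_LH(t) - S_HL(t) + S_HH(t) on survivor functions, and, for an OR design, C(t) = H_AB(t) / (H_A(t) + H_B(t)) with cumulative hazard H(t) = -log S(t). The sketch below computes both from simulated response times; real analyses would of course use observed RTs, and the OR-design form of the capacity coefficient is an assumption here.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated response times (ms) for the double-factorial conditions
    # (low/high salience on two channels) and for the capacity design
    # (each channel alone vs. both together). Illustrative only.
    rt = {c: rng.gamma(k, 60.0, 500)
          for c, k in [("LL", 8.0), ("LH", 7.0), ("HL", 7.0), ("HH", 6.0)]}
    alone_a = rng.gamma(6.5, 60.0, 500)
    alone_b = rng.gamma(6.5, 60.0, 500)
    together = rng.gamma(5.5, 60.0, 500)

    t = np.linspace(100, 1000, 10)  # time grid (ms)

    def S(x):
        """Empirical survivor function S(t) = P(RT > t) on the grid t."""
        return (np.asarray(x)[:, None] > t).mean(axis=0)

    # Survivor interaction contrast.
    sic = S(rt["LL"]) - S(rt["LH"]) - S(rt["HL"]) + S(rt["HH"])

    # Capacity coefficient (OR design) via cumulative hazards.
    H = lambda x: -np.log(np.clip(S(x), 1e-9, 1 - 1e-9))
    capacity = H(together) / (H(alone_a) + H(alone_b))

    print("SIC(t):", np.round(sic, 3))
    print("C(t):  ", np.round(capacity, 3))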
NASA Technical Reports Server (NTRS)
Dehghani, Navid; Tankenson, Michael
2006-01-01
This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities that will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed as a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry-standard messaging bus is used to transfer information among system components. Components generate standard messages, which are used to capture system information as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation of many mission operations processes.
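A tiny in-process stand-in for the event-driven pattern described above; the topic name and message fields are invented, and MPCS itself uses an industry-standard messaging bus with Java components rather than this Python toy.

    from collections import defaultdict

    class MessageBus:
        """Tiny in-process stand-in for an industry-standard message bus."""
        def __init__(self):
            self.subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self.subscribers[topic].append(handler)

        def publish(self, topic, message):
            for handler in self.subscribers[topic]:
                handler(message)

    bus = MessageBus()

    # Downlink component: publishes a standard message per telemetry frame.
    def downlink(frame):
        bus.publish("telemetry.frame", {"apid": frame["apid"],
                                        "data": frame["data"]})

    # Other components react to events instead of polling.
    bus.subscribe("telemetry.frame", lambda m: print("archive:", m["apid"]))
    bus.subscribe("telemetry.frame", lambda m: print("display:", m["data"]))

    downlink({"apid": 101, "data": b"\x01\x02"})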
Inter-computer communication architecture for a mixed redundancy distributed system
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Adams, Stuart J.
1987-01-01
The triply redundant intercomputer network for the Advanced Information Processing System (AIPS), an architecture developed to serve as the core avionics system for a broad range of aerospace vehicles, is discussed. The AIPS intercomputer network provides a high-speed, Byzantine-fault-resilient communication service between processing sites, even in the presence of arbitrary failures of simplex and duplex processing sites on the IC network. The IC network contention poll has evolved from the Laning Poll. An analysis of the failure modes and effects and a simulation of the AIPS contention poll demonstrate the robustness of the system.
Battlefield Object Control via Internet Architecture
2002-01-01
superiority is the best way to reach the goal of competition superiority. Using information technology (IT) in data processing, including computer hardware ... technologies: Global Positioning System (GPS), Geographic Information System (GIS), Battlefield Information Transmission System (BITS), and Intelligent ... operational environment. Keywords: C4ISR Systems, Information Superiority, Battlefield Objects, Computer-Aided Prototyping System (CAPS), IP-based
Integrated Air Surveillance Concept of Operations
2011-11-01
information, intelligence, weather data, and other situational awareness-related information. 4.2.4 Shared Services: Automated processing of sensor and ... other surveillance information will occur through shared services, accessible through an enterprise network infrastructure, that provide for collecting ... also be provided, such as information discovery and translation. The IS architecture effort will identify specific shared services.
Early Influences on Brain Architecture: An Interview with Neuroscientist Eric Knudsen. Perspectives
ERIC Educational Resources Information Center
National Scientific Council on the Developing Child, 2006
2006-01-01
Early experience has a powerful and lasting influence on how the brain develops. The physical and chemical conditions that encourage the building of a strong, adaptive brain architecture are present early in life. As brains age, a number of changes lock in the ways information is processed, making it more difficult for the brain to change to other…
NASA Astrophysics Data System (ADS)
Fiorani, D.; Acierno, M.
2017-05-01
The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have adapted BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible knowledge representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalisation and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalised by the model. A special focus is directed at decay analysis and surface conservation projects.
A resilient and secure software platform and architecture for distributed spacecraft
NASA Astrophysics Data System (ADS)
Otte, William R.; Dubey, Abhishek; Karsai, Gabor
2014-06-01
A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, and information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in itself. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objective of this layer.
Rule-based graph theory to enable exploration of the space system architecture design space
NASA Astrophysics Data System (ADS)
Arney, Dale Curtis
The primary goal of this research is to improve upon system architecture modeling in order to enable the exploration of design space options. A system architecture is the description of the functional and physical allocation of elements and the relationships, interactions, and interfaces between those elements necessary to satisfy a set of constraints and requirements. The functional allocation defines the functions that each system (element) performs, and the physical allocation defines the systems required to meet those functions. Trading the functionality between systems leads to the architecture-level design space that is available to the system architect. The research presents a methodology that enables the modeling of complex space system architectures using a mathematical framework. To accomplish the goal of improved architecture modeling, the framework meets five goals: technical credibility, adaptability, flexibility, intuitiveness, and exhaustiveness. The framework is technically credible, in that it produces an accurate and complete representation of the system architecture under consideration. The framework is adaptable, in that it provides the ability to create user-specified locations, steady states, and functions. The framework is flexible, in that it allows the user to model system architectures to multiple destinations without changing the underlying framework. The framework is intuitive for user input while still creating a comprehensive mathematical representation that maintains the necessary information to completely model complex system architectures. Finally, the framework is exhaustive, in that it provides the ability to explore the entire system architecture design space. After an extensive search of the literature, graph theory presents a valuable mechanism for representing the flow of information or vehicles within a simple mathematical framework. Graph theory has been used in developing mathematical models of many transportation and network flow problems in the past, where nodes represent physical locations and edges represent the means by which information or vehicles travel between those locations. In space system architecting, expressing the physical locations (low-Earth orbit, low-lunar orbit, etc.) and steady states (interplanetary trajectory) as nodes and the different means of moving between the nodes (propulsive maneuvers, etc.) as edges formulates a mathematical representation of this design space. The selection of a given system architecture using graph theory entails defining the paths that the systems take through the space system architecture graph. A path through the graph is defined as a list of edges that are traversed, which in turn defines functions performed by the system. A structure to compactly represent this information is a matrix, called the system map, in which the column indices are associated with the systems that exist and row indices are associated with the edges, or functions, to which each system has access. Several contributions have been added to the state of the art in space system architecture analysis. The framework adds the capability to rapidly explore the design space without the need to limit trade options or the need for user interaction during the exploration process. The unique mathematical representation of a system architecture, through the use of the adjacency, incidence, and system map matrices, enables automated design space exploration using stochastic optimization processes. 
The innovative rule-based graph traversal algorithm ensures functional feasibility of each system architecture that is analyzed, and the automatic generation of the system hierarchy eliminates the need for the user to manually determine the relationships between systems during or before the design space exploration process. Finally, the rapid evaluation of system architectures for various mission types enables analysis of the system architecture design space for multiple destinations within an evolutionary exploration program. (Abstract shortened by UMI.).
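A toy version of the encoding described above, with invented nodes, edges, and systems: locations and steady states are nodes, functions (edges) connect them, and a system map records which system covers which function so that a rule check can test the functional feasibility of a candidate path.

    # Nodes: locations / steady states; edges: functions (e.g., burns).
    edges = {
        "e1": ("Earth_surface", "LEO"),
        "e2": ("LEO", "trans_lunar"),
        "e3": ("trans_lunar", "LLO"),
        "e4": ("LLO", "lunar_surface"),
    }

    # System map: which system is allocated to which function (edge).
    system_map = {
        "launch_vehicle": {"e1"},
        "transfer_stage": {"e2", "e3"},
        "lander": {"e4"},
    }

    def feasible(path):
        """Rule checks: edges chain head-to-tail, each covered by a system."""
        covered = set().union(*system_map.values())
        chained = all(edges[a][1] == edges[b][0]
                      for a, b in zip(path, path[1:]))
        return chained and all(e in covered for e in path)

    path = ["e1", "e2", "e3", "e4"]  # Earth surface -> lunar surface
    print("functionally feasible:", feasible(path))
    for e in path:
        src, dst = edges[e]
        owner = next(s for s, es in system_map.items() if e in es)
        print(f"{src} -> {dst} via {owner}")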
Power System Information Delivering System Based on Distributed Object
NASA Astrophysics Data System (ADS)
Tanaka, Tatsuji; Tsuchiya, Takehiko; Tamura, Setsuo; Seki, Tomomichi; Kubota, Kenji
In recent years, there have been remarkable improvements in computer performance and in the development of computer networking and distributed information processing technologies. Moreover, deregulation is starting and will spread within the electric power industry in Japan. Consequently, power suppliers are required to supply low-cost power with high-quality services to customers. Corresponding to these movements, the authors have proposed the SCOPE (System Configuration Of PowEr control system) architecture for distributed EMS/SCADA (Energy Management Systems / Supervisory Control and Data Acquisition) systems based on distributed object technology, which offers the flexibility and expandability to adapt to those movements. In this paper, the authors introduce a prototype of the power system information delivering system, which was developed based on the SCOPE architecture. This paper describes the architecture and the evaluation results of this prototype system. The power system information delivering system supplies useful power system information, such as electric power failures, to customers using Internet and distributed object technology. This system is a new type of SCADA system that monitors failures of the power transmission and distribution systems in a way integrated with geographic information.
NASA Astrophysics Data System (ADS)
Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois
2017-04-01
Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models covering the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations of disparate information that do not at first appear intuitively obvious: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with simulations of geological processes. The interoperability model describes the information, communication (conversations), and interactions (choreographies) of all participants involved, as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services communicating over service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services built upon specific adapters. This approach replaces tight coupling at the data level with a flexible dependency on loosely coupled services. The main component of the interoperability model is the comprehensive semantic description of the information, business logic, and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge by applying domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts, for example, would be interested in geological models or borehole measurements at a certain depth, on the basis of which seismic profiles can be correlated and verified. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It must be emphasised that this approach is scalable to the greatest possible extent: all information necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or the resources (service endpoints) from which the information can be retrieved. The entire infrastructure communication is subject to a broker-based business logic integration platform in which the information exchanged between the participants involved is managed on the basis of standardised dictionaries, repositories, and registries. This approach also enables the development of Systems-of-Systems (SoS), which allow autonomous, large-scale, concurrent, and distributed systems to cooperate as a collective in a common environment.
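As an illustration of the message-driven, loosely coupled integration described above, here is a minimal publish/subscribe broker sketch; this is our assumption of how such a platform behaves, not EPOS code, and the topic and payload names are invented.

# Minimal publish/subscribe broker sketch illustrating loosely coupled,
# message-driven integration. Topic and payload names are hypothetical.
from collections import defaultdict
from typing import Callable

class Broker:
    def __init__(self):
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # Publishers never address consumers directly; the broker routes
        # events, so components stay loosely coupled.
        for handler in self._subs[topic]:
            handler(message)

broker = Broker()
broker.subscribe("seismic.event", lambda m: print("fusion service got:", m))
broker.publish("seismic.event", {"station": "GE.APE", "magnitude": 4.1})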
Marshall Application Realignment System (MARS) Architecture
NASA Technical Reports Server (NTRS)
Belshe, Andrea; Sutton, Mandy
2010-01-01
The Marshall Application Realignment System (MARS) Architecture project was established to meet the certification requirements of the Department of Defense Architecture Framework (DoDAF) V2.0 Federal Enterprise Architecture Certification (FEAC) Institute program and to provide added value to the Marshall Space Flight Center (MSFC) Application Portfolio Management process. The MARS Architecture aims to: (1) address the NASA MSFC Chief Information Officer (CIO) strategic initiative to improve Application Portfolio Management (APM) by optimizing investments and improving portfolio performance, and (2) develop a decision-aiding capability by which applications registered within the MSFC application portfolio can be analyzed and considered for retirement or decommission. The MARS Architecture describes a to-be target capability that supports application portfolio analysis against scoring measures (based on value) and overall portfolio performance objectives (based on enterprise needs and policies). This scoring and decision-aiding capability supports the process by which MSFC application investments are realigned or retired from the application portfolio. The MARS Architecture is a multi-phase effort to: (1) conduct strategic architecture planning and knowledge development based on the DoDAF V2.0 six-step methodology, (2) describe one architecture through multiple viewpoints, (3) conduct portfolio analyses based on a defined operational concept, and (4) enable a new capability to support the MSFC enterprise IT management mission, vision, and goals. This report documents Phase 1 (Strategy and Design), which includes discovery, planning, and development of initial architecture viewpoints. Phase 2 will move forward the process of building the architecture, widening the scope to include application realignment (in addition to application retirement), and validating the underlying architecture logic before moving into Phase 3. The MARS Architecture key stakeholders are most interested in Phase 3 because this is where the data analysis, scoring, and recommendation capability is realized. Stakeholders want to see the benefits derived from reducing the steady-state application base and identify opportunities for portfolio performance improvement and application realignment.
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Preheim, Larry E.
1990-01-01
Data systems requirements in the Earth Observing System (EOS) Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.
Advanced Information Processing System - Fault detection and error handling
NASA Technical Reports Server (NTRS)
Lala, J. H.
1985-01-01
The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles, including tactical and transport aircraft, and manned and autonomous spacecraft. A proof-of-concept (POC) system is now in the detailed design and fabrication phase. This paper gives an overview of a preliminary fault detection and error handling philosophy in AIPS.
Mecklinger, Axel; Kriukova, Olga; Mühlmann, Heiner; Grunwald, Thomas
2014-01-01
Visual object identification is modulated by perceptual experience. In a cross-cultural ERP study, we investigated whether cultural expertise determines how buildings that rank high or low according to the Western architectural decorum are perceived. Two groups of German and Chinese participants performed an object classification task in which high- and low-ranking Western buildings had to be discriminated from everyday objects. ERP results indicate that an early stage of visual object identification (i.e., object model selection) is facilitated for high-ranking buildings for the German participants only. At a later stage of object identification, in which object knowledge is complemented by information from semantic and episodic long-term memory, no ERP evidence for cultural differences was obtained. These results suggest that the identification of architectural ranking is modulated by culturally specific expertise with Western-style architecture already at an early processing stage.
OSD CALS Architecture Master Plan Study. Concept Paper. Configuration Management. Volume 28
DOT National Transportation Integrated Search
1989-10-01
The mission of CALS is to enhance operational readiness of DoD weapon systems through application of information technology to the management of technical information. CALS will automate the current paper-intensive processes involved in weapon system...
The Architectural and Interior Design Planning Process.
ERIC Educational Resources Information Center
Cohen, Elaine
1994-01-01
Explains the planning process in designing effective library facilities and discusses library building requirements that result from electronic information technologies. Highlights include historical structures; Americans with Disabilities Act; resource allocation; electrical power; interior spaces; lighting; design development; the roles of…
A Proposed Information Architecture for Telehealth System Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.
1999-04-20
We propose an object-oriented information architecture for telemedicine systems that promotes secure "plug-and-play" interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a "lego-like" fashion to achieve the desired device or system functionality. Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to the set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.
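The "black box" component idea lends itself to a short sketch. The following is a hypothetical illustration, not the proposed reference architecture itself: every component implements one narrow interface, so components compose "lego-like" without knowing each other's internals.

# Hypothetical sketch of black-box components behind one shared interface.
from abc import ABC, abstractmethod

class TelehealthComponent(ABC):
    @abstractmethod
    def handle(self, message: dict) -> dict:
        """Consume a standardized message and return a standardized reply."""

class VitalsMonitor(TelehealthComponent):
    def handle(self, message: dict) -> dict:
        return {"type": "vitals", "heart_rate_bpm": 72}

class RecordStore(TelehealthComponent):
    def handle(self, message: dict) -> dict:
        return {"type": "ack", "stored": message}

# A pipeline composes components "lego-like": each output feeds the next.
def pipeline(components, message):
    for component in components:
        message = component.handle(message)
    return message

print(pipeline([VitalsMonitor(), RecordStore()], {"type": "poll"}))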
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based, and service-oriented as well as secure and safe. To enable semantic interoperability, a unified process for defining and implementing the architecture, i.e. the structure and functions of the cooperating systems' components, as well as the approach to knowledge representation, i.e. the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts, and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to be the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 and EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach to semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing convergence between the competing approaches.
An Extensible Information Grid for Risk Management
NASA Technical Reports Server (NTRS)
Maluf, David A.; Bell, David G.
2003-01-01
This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
Parallel Architectures for Planetary Exploration Requirements (PAPER)
NASA Technical Reports Server (NTRS)
Cezzar, Ruknet; Sen, Ranjan K.
1989-01-01
The Parallel Architectures for Planetary Exploration Requirements (PAPER) project is essentially research oriented towards technology insertion issues for NASA's unmanned planetary probes. It was initiated to complement and augment the long-term efforts for space exploration, with particular reference to NASA Langley Research Center's (LaRC) research needs for planetary exploration missions of the mid and late 1990s. The requirements for space missions as given in the somewhat dated Advanced Information Processing Systems (AIPS) requirements document are contrasted with the new requirements from JPL/Caltech involving sensor data capture and scene analysis. It is shown that more stringent requirements have arisen as a result of technological advancements. Two possible architectures, the AIPS Proof of Concept (POC) configuration and the MAX fault-tolerant dataflow multiprocessor, were evaluated. The main observation was that the AIPS design is biased towards fault tolerance and may not be an ideal architecture for planetary and deep space probes due to its high cost and complexity. The MAX concept appears to be a promising candidate, although more detailed information is required. The feasibility of adding neural computation capability to this architecture needs to be studied. Key impact issues for the architectural design of computing systems meant for planetary missions were also identified.
Issues in Defining Software Architectures in a GIS Environment
NASA Technical Reports Server (NTRS)
Acosta, Jesus; Alvorado, Lori
1997-01-01
The primary mission of the Pan-American Center for Earth and Environmental Studies (PACES) is to advance the research areas that are relevant to NASA's Mission to Planet Earth program. One of the activities at PACES is the establishment of a repository for geographical, geological, and environmental information that covers various regions of Mexico and the southwest region of the U.S. and that is acquired from NASA and other sources through remote sensing, ground studies, or paper-based maps. The center will be providing access to this information for other government entities in the U.S. and Mexico, and for research groups from universities, national laboratories, and industry. Geographical Information Systems (GIS) provide the means to manage, manipulate, analyze, and display the geographically referenced information that will be managed by PACES. Excellent off-the-shelf software exists for a complete GIS, as well as software for storing and managing spatial databases, processing images, networking, and viewing maps with layered information. This allows the user flexibility in combining systems to create a GIS or in mixing these software packages with custom-built application programs. Software architectural languages provide the ability to specify the computational components and the interactions among them, an important topic in the domain of GIS because of the need to integrate numerous software packages. This paper discusses the characteristics that architectural languages address with respect to the data that must be communicated between software systems and components when systems interact. The paper presents background on GIS in section 2. Section 3 gives an overview of software architecture and architectural languages. Section 4 suggests issues that may be of concern when defining the software architecture of a GIS. The last section discusses the future research effort and finishes with a summary.
FIA: An Open Forensic Integration Architecture for Composing Digital Evidence
NASA Astrophysics Data System (ADS)
Raghavan, Sriram; Clark, Andrew; Mohay, George
The analysis and value of digital evidence in an investigation have been a domain of discourse in the digital forensic community for several years. While many works have considered different approaches to modeling digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cyber crimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. We present the Forensic Integration Architecture (FIA), which provides a framework for abstracting the evidence source and storage format information from digital evidence and explores the concept of integrating evidence information from multiple sources. FIA identifies evidence information from multiple sources in a way that enables an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology-independent approach. It is also open and extensible, making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value FIA brings to the field.
Chakraborty, Chiranjib; Sarkar, Bimal Kumar; Patel, Pratiksha; Agoramoorthy, Govindasamy
2012-01-01
In this paper, Shannon information theory has been applied to elaborate cell signaling. It is proposed that the cellular network architecture involves four components: a source (DNA), a transmitter (mRNA), a receiver (protein), and a destination (another protein). The message transmits from the source (DNA) to the transmitter (mRNA) and then passes through a noisy channel, finally reaching the receiver (protein); the protein synthesis process is here considered the noisy channel. Ultimately, the signal is transmitted from the receiver to the destination (another protein). The genome network architecture elements were compared with the genetic alphabet L = {A, C, G, T} in a biophysical model based on Shannon information theory. The study found that the channel capacity is maximal for zero error (σ = 0), at which point the transition matrix becomes a unit matrix of rank 4. As σ increases the transition matrix becomes erroneous, and at σ = 1 the channel capacity reaches a local maximum of 0.415. A minimum exists at σ = 0.75, where all transition probabilities equal 0.25, uncertainty is maximal, and the channel capacity falls to zero.
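The quoted capacities can be reproduced under one simple assumption: a symmetric 4-ary channel in which the correct base arrives with probability 1 - σ and errors are split equally over the other three bases. The check below is ours, not the authors' code, but it recovers all three reported values.

# Back-of-the-envelope check of the reported capacities, assuming a
# symmetric 4-ary channel (our assumption, not the paper's code): the
# correct base arrives with probability 1 - sigma and errors split
# equally over the other three bases.
from math import log2

def capacity(sigma: float) -> float:
    row = [1 - sigma, sigma / 3, sigma / 3, sigma / 3]
    h = -sum(p * log2(p) for p in row if p > 0)  # entropy of a row
    return log2(4) - h                           # C = log2(K) - H for symmetric channels

print(round(capacity(0.0), 3))   # 2.0   -> maximum at sigma = 0
print(round(capacity(0.75), 3))  # 0.0   -> minimum where all entries are 0.25
print(round(capacity(1.0), 3))   # 0.415 -> local maximum at sigma = 1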
Semantic Analysis of Military Relevant Texts for Intelligence Purposes
2011-06-01
Topic 8: Architectures, Technologies, and Tools; Topic 4: Information and Knowledge Exploitation; Topic 3: Information and Knowledge... Fraunhofer Institute for Communication, Information Processing and Ergonomics FKIE; Dr. Matthias Hecking, Sandra Noubours (point of contact...
Application of Advanced Multi-Core Processor Technologies to Oceanographic Research
2013-09-30
[Table residue: candidate processors compared include STM32, NXP LPC series, Microchip PIC32/dsPIC (> 500 mW; < 5 W), ARM Cortex, TI OMAP, TI Sitara, Broadcom BCM2835, and FPGAs.] ...state-of-the-art information processing architectures. OBJECTIVES: Next-generation processor architectures (multi-core, multi-threaded) hold the
Historic Properties Report. Indiana Army Ammunition Plant, Charleston, Indiana.
1984-08-01
Overview: A combined architectural, historical, and technological overview was prepared from information developed from the documentary research and the...architectural, historical, and technological resources identified on DARCOM installations nationwide, four criteria were developed to help determine the...standard industrial design, embodying a technology developed by du Pont in the mid-1920s. In the du Pont process, liquid ammonia was vaporized and mixed with
NASA Technical Reports Server (NTRS)
Estefan, Jeff A.; Giovannoni, Brian J.
2014-01-01
The Advanced Multi-Mission Operations System (AMMOS) is NASA's premier space mission operations product line offering for use in deep-space robotic and astrophysics missions. The general approach to AMMOS modernization over the course of its 29-year history exemplifies a continual, evolutionary approach, with periods of sponsor investment peaks and valleys in between. Today, the Multimission Ground Systems and Services (MGSS) office, the program office that manages the AMMOS for NASA, actively pursues modernization initiatives and continues to evolve the AMMOS by incorporating enhanced capabilities and newer technologies into its end-user tool and service offerings. Despite the myriad modernization investments made over the evolutionary course of the AMMOS, pain points remain. These pain points, based on interviews with numerous flight project mission operations personnel, fall principally into two major categories: 1) information-related issues, and 2) process-related issues. By information-related issues, we mean pain points associated with the management and flow of MOS data across the various system interfaces. By process-related issues, we mean pain points associated with the MOS activities performed by mission operators (i.e., humans) and the supporting software infrastructure used in support of those activities. In this paper, three foundational concepts (Timeline, Closed Loop Control, and Separation of Concerns) collectively form the basis for expressing a set of core architectural tenets that provides a multifaceted approach to AMMOS system architecture modernization intended to address the information- and process-related issues. Each of these architectural tenets is explored further in this paper. Ultimately, we envision the application of these core tenets resulting in a unified vision of a future-state architecture for the AMMOS, one intended to yield a highly adaptable, highly efficient, and highly cost-effective set of multimission MOS products and services.
An open architecture for medical image workstation
NASA Astrophysics Data System (ADS)
Liang, Liang; Hu, Zhiqiang; Wang, Xiangyun
2005-04-01
Dealing with the difficulties of integrating various medical image viewing and processing technologies with a variety of clinical and departmental information systems and, at the same time, overcoming the performance constraints in transferring and processing large-scale and ever-increasing image data in the healthcare enterprise, we design and implement a flexible, usable, and high-performance architecture for medical image workstations. This architecture is not developed for radiology only, but for any workstation in any application environment that may need medical image retrieval, viewing, and post-processing. The architecture contains an infrastructure named Memory PACS and different kinds of image applications built on it. The Memory PACS is in charge of image data caching, pre-fetching, and management. It provides image applications with high-speed image data access and very reliable DICOM network I/O. For the image applications, we use dynamic component technology to separate the performance-constrained modules from the flexibility-constrained modules so that different image viewing or processing technologies can be developed and maintained independently. We also develop a weakly coupled collaboration service through which these image applications can communicate with each other or with third-party applications. We applied this architecture in developing our product line, and it works well. In our clinical sites, this architecture is applied not only in the Radiology Department but also in Ultrasound, Surgery, Clinics, and the Consultation Center. Given that each department concerned has its particular requirements and business routines, and that they all have different image processing technologies and image display devices, our workstations are still able to maintain high performance and high usability.
Using enterprise architecture artefacts in an organisation
NASA Astrophysics Data System (ADS)
Niemi, Eetu; Pekkola, Samuli
2017-03-01
As a tool for management and planning, Enterprise Architecture (EA) can potentially align an organisation's business processes, information, information systems, and technology towards a common goal, and supply the information required on this journey. However, an explicit view of why, how, when, and by whom EA artefacts are used in order to realise EA's full potential has not been defined. Utilising the features of information systems use studies and data from a case study with 14 EA stakeholder interviews, we identify and describe 15 EA artefact use situations that are then reflected against the related literature. Their analysis enriches understanding of what EA artefacts are, how and why they are used, and when they are used, and results in a theoretical framework for understanding their use in general.
An eConsent-based System Architecture Supporting Cooperation in Integrated Healthcare Networks.
Bergmann, Joachim; Bott, Oliver J; Hoffmann, Ina; Pretschner, Dietrich P
2005-01-01
The economic need for efficient healthcare leads to cooperative shared care networks. A virtual electronic health record is required that integrates patient-related information but reflects the distributed infrastructure and restricts access to those health professionals involved in the care process. Our work aims at the specification and development of a system architecture fulfilling these requirements, to be used in concrete regional pilot studies. Methodical analysis and specification were performed in a healthcare network using the formal method and modelling tool MOSAIK-M. The complexity of the application field was reduced by focusing on the scenario of thyroid disease care, which still includes varied interdisciplinary cooperation. The result is an architecture for a secure distributed electronic health record for integrated care networks, specified in terms of a MOSAIK-M-based system model. The architecture proposes business processes, application services, and a sophisticated security concept, providing a platform for distributed, document-based, patient-centred, and secure cooperation. A corresponding system prototype has been developed for pilot studies, using advanced application server technologies. The architecture combines consolidated patient-centred document management with a decentralized system structure without the need for replication management. An eConsent-based approach ensures that access to the distributed health record remains under the control of the patient. The proposed architecture replaces message-based communication approaches, because it implements a virtual health record providing complete and current information. Acceptance of the new communication services depends on compatibility with the clinical routine. Unique and cross-institutional identification of a patient is also a challenge, but will lose significance as common patient cards become established.
ERIC Educational Resources Information Center
Parker, Marilyn M.
1993-01-01
Discusses what Office Information Systems and other Information Technology organizations, in concert with the business organizations they support, must do to exploit the opportunities and support the transition to the next generation enterprise: its business processes, its organizations and architectures, and its strategies. (Author/JOW)
VASSAR: Value assessment of system architectures using rules
NASA Astrophysics Data System (ADS)
Selva, D.; Crawley, E. F.
A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best suited to evaluating the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating the performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of: a) quantitative and semi-qualitative data; and b) objective and subjective information. Current computational tools are poorly suited to these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures in the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment for System Architectures using Rules). The major underlying assumption of the VASSAR methodology is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements. Hence, for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern matching process in which capabilities match requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
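A toy sketch of the matching idea follows; this is our illustration, not the VASSAR implementation. Merit is scored by matching an architecture's capabilities against weighted stakeholder requirements, the pattern a rule-based expert system would encode as rules. The requirements, weights, and scoring rule are invented.

# Toy capability-vs-requirement matching (invented example, not VASSAR code).
requirements = [
    {"need": "soil_moisture_measurement", "weight": 0.5},
    {"need": "daily_revisit", "weight": 0.3},
    {"need": "data_latency_under_3h", "weight": 0.2},
]

architecture = {"capabilities": {"soil_moisture_measurement", "daily_revisit"}}

def score(arch: dict, reqs: list) -> float:
    # Rule: a requirement contributes its full weight when the architecture's
    # capabilities satisfy it, else nothing. A real RBES would add partial-
    # satisfaction rules and qualitative credit.
    return sum(r["weight"] for r in reqs if r["need"] in arch["capabilities"])

print(score(architecture, requirements))  # 0.8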
NASA Astrophysics Data System (ADS)
Arestova, M. L.; Bykovskii, A. Yu
1995-10-01
An architecture is proposed for a specialised optoelectronic multivalued logic processor based on the Allen—Givone algebra. The processor is intended for multiparametric processing of data arriving from a large number of sensors or for tackling spectral analysis tasks. The processor architecture makes it possible to obtain an approximate general estimate of the state of an object being diagnosed on a p-level scale. Optoelectronic systems are proposed for MAXIMUM, MINIMUM, and LITERAL logic gates, based on optical-frequency encoding of logic levels. Corresponding logic gates form a complete set of logic functions in the Allen—Givone algebra.
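For concreteness, the three gate types can be sketched with the standard definitions of p-valued logic: MAX, MIN, and a window LITERAL that outputs the top level inside an interval and zero outside. This sketch is our illustration of the algebra, not the optoelectronic design; the choice p = 4 is arbitrary.

# Standard p-level logic gates (illustrative sketch, not the optical design).
P = 4  # logic levels 0..P-1; arbitrary example

def max_gate(x: int, y: int) -> int:
    return max(x, y)

def min_gate(x: int, y: int) -> int:
    return min(x, y)

def literal(x: int, a: int, b: int) -> int:
    # Window literal: full level inside [a, b], zero outside.
    return P - 1 if a <= x <= b else 0

# A function in this algebra is a MAX of MINs of literals, e.g.:
f = lambda x, y: max_gate(min_gate(literal(x, 1, 2), literal(y, 0, 1)), 2)
print(f(2, 1), f(3, 3))  # 3 2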
NASA Technical Reports Server (NTRS)
Murray, N. D.
1985-01-01
Current technology projections indicate a lack of availability of special-purpose computing for Space Station applications. Potential functions for video image special-purpose processing are being investigated, such as smoothing, enhancement, restoration and filtering, data compression, feature extraction, object detection and identification, pixel interpolation/extrapolation, spectral estimation and factorization, and vision synthesis. Architectural approaches are also being identified, and a conceptual design generated. Computationally simple algorithms will be researched and their image/vision effectiveness determined. Suitable algorithms will be implemented in an overall architectural approach that will provide image/vision processing at video rates that is flexible, selectable, and programmable. Information is given in the form of charts, diagrams, and outlines.
NASA Technical Reports Server (NTRS)
Premkumar, A. B.; Purviance, J. E.
1990-01-01
A simplified model of the SAR imaging problem is presented. The model is based on the geometry of the SAR system. Using this model, an expression for the entire phase history of the received SAR signal is formulated. From the phase history, it is shown that the range and azimuth coordinates of a point-target image can be obtained by processing the phase information during the intrapulse and interpulse periods, respectively. An architecture for a VLSI implementation of the SAR signal processor is presented which generates images in real time. The architecture uses a small number of chips, a new correlation processor, and an efficient azimuth correlation process.
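The correlation processing at the heart of such an architecture can be illustrated in software. The sketch below is a generic FFT-based matched filter (range compression), not the paper's VLSI design; the chirp parameters and target delay are invented.

# Generic range-compression sketch (not the paper's VLSI design): correlate
# the received pulse with the transmitted chirp replica via FFTs, the same
# correlation operation the hardware implements.
import numpy as np

fs, T, B = 1e6, 1e-3, 100e3                  # sample rate, pulse length, bandwidth
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)  # linear-FM transmit replica

delay = 300                                  # hypothetical target delay in samples
rx = np.zeros(2048, dtype=complex)
rx[delay:delay + chirp.size] = chirp         # noise-free echo for clarity

# Matched filter: multiply by the conjugate replica spectrum.
n = rx.size
compressed = np.fft.ifft(np.fft.fft(rx, n) * np.conj(np.fft.fft(chirp, n)))
print(int(np.argmax(np.abs(compressed))))    # 300 -> recovered range bin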
Information architecture for a patient-specific dashboard in head and neck tumor boards.
Oeser, Alexander; Gaebel, Jan; Dietz, Andreas; Wiegand, Susanne; Oeltze-Jafra, Steffen
2018-03-28
Overcoming the flaws of current data management conditions in head and neck oncology could enable integrated information systems specifically tailored to the needs of medical experts in a tumor board meeting. Clinical dashboards are a promising method to assist various aspects of the decision-making process in such cognitively demanding scenarios. However, in order to provide extensive and intuitive assistance to the participating physicians, the design and development of such a system have to be user-centric. To accomplish this task, conceptual methods need to be performed prior to the technical development and integration stages. We have conducted a qualitative survey including eight clinical experts with different levels of expertise in the field of head and neck oncology. According to the principles of information architecture, the survey focused on the identification and causal interconnection of necessary metrics for information assessment in the tumor board. Based on the feedback by the clinical experts, we have constructed a detailed map of the required information items for a tumor board dashboard in head and neck oncology. Furthermore, we have identified three distinct groups of metrics (patient, disease and therapy metrics) as well as specific recommendations for their structural and graphical implementation. By using the information architecture, we were able to gather valuable feedback about the requirements and cognitive processes of the tumor board members. Those insights have helped us to develop a dashboard application that closely adapts to the specified needs and characteristics, and thus is primarily user-centric.
NASA Astrophysics Data System (ADS)
Beardsley, Sara; Stochetti, Alejandro; Cerone, Marc
2018-03-01
Akhmat Tower is a 435 m supertall building designed by Adrian Smith + Gordon Gill Architecture. It is currently under construction in the city of Grozny, in the Chechen Republic, in the North Caucasus region of Russia. The tower was designed through a collaborative process by a multi-disciplinary architectural and engineering team based primarily in the United States and Russia. During this process, the designers considered many factors including, most importantly, the cultural and historical context, the structural requirements given the high seismicity of the region, and the client's programmatic needs. The resulting crystalline-shaped tower is both an aesthetic statement and a performative architectural solution that will be a new landmark for Chechnya. "The Design of Akhmat Tower" describes the design process in detail, including structural considerations, exterior wall design, building program, interior design, the tuned mass damper, and the use of building information modeling.
Photonic Quantum Networks formed from NV− centers
Nemoto, Kae; Trupke, Michael; Devitt, Simon J.; Scharfenberger, Burkhard; Buczak, Kathrin; Schmiedmayer, Jörg; Munro, William J.
2016-01-01
In this article we present a simple repeater scheme based on the negatively charged nitrogen-vacancy centre in diamond. Each repeater node is built from modules comprising an optical cavity containing a single NV−, with one nuclear spin from 15N as quantum memory. The module uses only deterministic processes and interactions to achieve high-fidelity operations (>99%), and modules are connected by optical fiber. In the repeater node architecture, the photonic processes between modules can in principle be deterministic; however, current limitations on optical components make these processes probabilistic but heralded. Our resource-modest repeater architecture contains two modules at each node, and the repeater nodes are then connected by entangled photon pairs. We discuss the performance of such a quantum repeater network with modest resources and then incorporate more resource-intensive strategies step by step. Our architecture should allow large-scale quantum information networks with existing or near-future technology. PMID:27215433
Demand Activated Manufacturing Architecture (DAMA) model for supply chain collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
CHAPMAN,LEON D.; PETERSEN,MARJORIE B.
The Demand Activated Manufacturing Architecture (DAMA) project, during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors), has developed an inter-enterprise architecture and collaborative model for supply chains. This model will enable improved collaborative business across any supply chain. The DAMA Model for Supply Chain Collaboration is a high-level model for collaboration to achieve Demand Activated Manufacturing. The five major elements of the architecture to support collaboration are (1) activity or process, (2) information, (3) application, (4) data, and (5) infrastructure. These five elements are tied to the application of the DAMA architecture to three phases of collaboration: prepare, pilot, and scale. There are six collaborative activities that may be employed in this model: (1) Develop Business Planning Agreements, (2) Define Products, (3) Forecast and Plan Capacity Commitments, (4) Schedule Product and Product Delivery, (5) Expedite Production and Delivery Exceptions, and (6) Populate Supply Chain Utility. The Supply Chain Utility is a set of applications implemented to support collaborative product definition, forecast visibility, planning, scheduling, and execution. The DAMA architecture and model will be presented along with the process for implementing this DAMA model.
A biopolymer transistor: electrical amplification by microtubules.
Priel, Avner; Ramos, Arnolt J; Tuszynski, Jack A; Cantiello, Horacio F
2006-06-15
Microtubules (MTs) are important cytoskeletal structures engaged in a number of specific cellular activities, including vesicular traffic, cell cyto-architecture and motility, cell division, and information processing within neuronal processes. MTs have also been implicated in higher neuronal functions, including memory and the emergence of "consciousness". How MTs handle and process electrical information, however, is heretofore unknown. Here we show new electrodynamic properties of MTs. Isolated, taxol-stabilized MTs behave as biomolecular transistors capable of amplifying electrical information. Electrical amplification by MTs can lead to the enhancement of dynamic information, and processivity in neurons can be conceptualized as an "ionic-based" transistor, which may affect, among other known functions, neuronal computational capabilities.
Information Processing in Cognition Process and New Artificial Intelligent Systems
NASA Astrophysics Data System (ADS)
Zheng, Nanning; Xue, Jianru
In this chapter, we discuss, in depth, visual information processing and a new artificial intelligent (AI) system that is based upon cognitive mechanisms. The relationship between a general model of intelligent systems and cognitive mechanisms is described, and in particular we explore visual information processing with selective attention. We also discuss a methodology for studying the new AI system and propose some important basic research issues that have emerged in the intersecting fields of cognitive science and information science. To this end, a new scheme for associative memory and a new architecture for an AI system with attractors of chaos are addressed.
Computer Sciences and Data Systems, volume 2
NASA Technical Reports Server (NTRS)
1987-01-01
Topics addressed include: data storage; information network architecture; VHSIC technology; fiber optics; laser applications; distributed processing; spaceborne optical disk controller; massively parallel processors; and advanced digital SAR processors.
Health level 7 development framework for medication administration.
Kim, Hwa Sun; Cho, Hune
2009-01-01
We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the
Using a virtual world for robot planning
NASA Astrophysics Data System (ADS)
Benjamin, D. Paul; Monaco, John V.; Lin, Yixia; Funk, Christopher; Lyons, Damian
2012-06-01
We are building a robot cognitive architecture that constructs a real-time virtual copy of itself and its environment, including people, and uses the model to process perceptual information and to plan its movements. This paper describes the structure of this architecture. The software components of this architecture include PhysX for the virtual world, OpenCV and the Point Cloud Library for visual processing, and the Soar cognitive architecture that controls the perceptual processing and task planning. The RS (Robot Schemas) language is implemented in Soar, providing the ability to reason about concurrency and time. This Soar/RS component controls visual processing, deciding which objects and dynamics to render into PhysX, and the degree of detail required for the task. As the robot runs, its virtual model diverges from physical reality, and errors grow. The Match-Mediated Difference component monitors these errors by comparing the visual data with corresponding data from virtual cameras, and notifies Soar/RS of significant differences, e.g. a new object that appears, or an object that changes direction unexpectedly. Soar/RS can then run PhysX much faster than real-time and search among possible future world paths to plan the robot's actions. We report experimental results in indoor environments.
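The Match-Mediated Difference component can be sketched as a simple divergence monitor. This is our illustrative reconstruction, not the authors' implementation; the feature representation, threshold, and numbers are assumptions.

# Illustrative divergence monitor in the spirit of Match-Mediated Difference:
# compare features from the physical camera with features from the matching
# virtual camera and flag significant disagreement. Numbers are invented.
import numpy as np

def match_mediated_difference(real_feats: np.ndarray,
                              virtual_feats: np.ndarray,
                              threshold: float = 0.5) -> bool:
    """Return True when the virtual model has diverged enough to matter."""
    error = np.linalg.norm(real_feats - virtual_feats, axis=1).mean()
    return error > threshold

real = np.array([[1.0, 2.0], [3.0, 4.1]])     # observed object positions
virtual = np.array([[1.0, 2.0], [3.0, 5.5]])  # positions predicted by the virtual world
if match_mediated_difference(real, virtual):
    print("notify planner: re-render object 1 and replan")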
A Role for Semantic Web Technologies in Patient Record Data Collection
NASA Astrophysics Data System (ADS)
Ogbuji, Chimezie
Business Process Management Systems (BPMS) are a component of the stack of Web standards that comprise Service Oriented Architecture (SOA). Such systems are representative of the architectural framework of modern information systems built in an enterprise intranet and are in contrast to systems built for deployment on the larger World Wide Web. The REST architectural style is an emerging style for building loosely coupled systems based purely on the native HTTP protocol. It is a coordinated set of architectural constraints with a goal to minimize latency, maximize the independence and scalability of distributed components, and facilitate the use of intermediary processors. Within the development community for distributed, Web-based systems, there has been a debate regarding the merits of both approaches. In some cases, there are legitimate concerns about the differences in both architectural styles. In other cases, the contention seems to be based on concerns that are marginal at best. In this chapter, we will attempt to contribute to this debate by focusing on a specific, deployed use case that emphasizes the role of the Semantic Web, a simple Web application architecture that leverages the use of declarative XML processing, and the needs of a workflow system. The use case involves orchestrating a work process associated with the data entry of structured patient record content into a research registry at the Cleveland Clinic's Clinical Investigation department in the Heart and Vascular Institute.
A New Method for Conceptual Modelling of Information Systems
NASA Astrophysics Data System (ADS)
Gustas, Remigijus; Gustiene, Prima
Service architecture is not necessarily bound to the technical aspects of information system development. It can be defined by using conceptual models that are independent of any implementation technology. Unfortunately, the conventional information system analysis and design methods cover just a part of required modelling notations for engineering of service architectures. They do not provide effective support to maintain semantic integrity between business processes and data. Service orientation is a paradigm that can be applied for conceptual modelling of information systems. The concept of service is rather well understood in different domains. It can be applied equally well for conceptualization of organizational and technical information system components. This chapter concentrates on analysis of the differences between service-oriented modelling and object-oriented modelling. Service-oriented method is used for semantic integration of information system static and dynamic aspects.
Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Daniel G.
In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design, the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different/finer granularity, facilitates monitoring of processes at the modular and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of the SMR units and manages plant processes. The information processed at the supervisory level provides operators the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault tolerance of the supervisory control architecture, the network that supports it, and the impact of fault tolerance on multi-unit SMR plant control have been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, and data collection and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors. To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: I) development of a supervisory digital I&C system; II) fault tolerance of the supervisory control architecture; III) automated decision making and online monitoring.
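The unit-to-supervisor information flow described above can be sketched in a few lines. This is an illustrative toy, not the project's I&C software; the class names, power figures, and load-balancing policy are invented.

# Toy hierarchical supervisory control: unit controllers report local state
# upward; the supervisor aggregates and pushes setpoints back down.
class UnitController:
    def __init__(self, name: str, power_mw: float):
        self.name, self.power_mw = name, power_mw

    def report(self) -> dict:
        return {"unit": self.name, "power_mw": self.power_mw}

    def apply_setpoint(self, power_mw: float) -> None:
        self.power_mw = power_mw

class Supervisor:
    def __init__(self, units):
        self.units = units

    def balance(self, plant_demand_mw: float) -> None:
        # Trivial invented policy: split demand equally across units.
        share = plant_demand_mw / len(self.units)
        for unit in self.units:
            unit.apply_setpoint(share)

units = [UnitController(f"SMR-{i}", 50.0) for i in range(4)]
Supervisor(units).balance(plant_demand_mw=160.0)
print([u.report() for u in units])  # each unit now set to 40 MW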
NASA Technical Reports Server (NTRS)
Bhasin, Kul; Hayden, Jeffrey L.
2005-01-01
For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the communications and networking capabilities and technologies needed for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration in the vicinity of Earth, the Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into the interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.
Timeline-Based Mission Operations Architecture: An Overview
NASA Technical Reports Server (NTRS)
Chung, Seung H.; Bindschadler, Duane L.
2012-01-01
Some of the challenges in developing a mission operations system and operating a mission can be traced back to the challenge of integrating a mission operations system from its many components and to the challenge of maintaining consistent and accountable information throughout the operations processes. An important contributing factor to both of these challenges is the file-centric nature of today's systems. In this paper, we provide an overview of these challenges and argue the need to move toward an information-centric mission operations system. We propose an information representation called Timeline as an approach to enable such a move, and we provide an overview of a Timeline-based Mission Operations System architecture.
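A minimal sketch of what an information-centric Timeline record might look like follows; the fields and examples are our assumptions, not the proposed standard. The point is that operations products become time-ordered entries in a shared store rather than files.

# Hypothetical Timeline record: operations products become timestamped
# entries in a shared, queryable store instead of exchanged files.
from dataclasses import dataclass, field

@dataclass(order=True)
class TimelineEntry:
    start_utc: str
    name: str = field(compare=False)
    value: object = field(compare=False)

timeline = sorted([
    TimelineEntry("2012-03-01T10:05:00Z", "downlink_window", "DSS-25"),
    TimelineEntry("2012-03-01T09:00:00Z", "slew_start", {"target": "Mars"}),
])
print([e.name for e in timeline])  # entries ordered by time, not by file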
Project Integration Architecture: Application Architecture
NASA Technical Reports Server (NTRS)
Jones, William Henry
2005-01-01
The Project Integration Architecture (PIA) implements a flexible, object-oriented, wrapping architecture which encapsulates all of the information associated with engineering applications. The architecture allows the progress of a project to be tracked and documented in its entirety. Additionally, by bringing all of the information sources and sinks of a project into a single architectural space, the ability to transport information between those applications is enabled.
Distributed and parallel approach for handle and perform huge datasets
NASA Astrophysics Data System (ADS)
Konopko, Joanna
2015-12-01
Big Data refers to dynamic, large, and disparate volumes of data coming from many different sources (tools, machines, sensors, mobile devices) that are uncorrelated with each other. It requires new, innovative, and scalable technology to collect, host, and analytically process the vast amount of data. A proper architecture for systems that process huge data sets is needed. In this paper, a comparison of distributed and parallel system architectures is presented using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture approach is also proposed that could be used to solve the analyzed problem of storing and processing Big Data.
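To ground the comparison, here is a minimal in-process word-count sketch of the MapReduce paradigm; a real Hadoop job distributes the same map, shuffle, and reduce phases across nodes, while a parallel DBMS would express the equivalent as a GROUP BY over partitioned tables.

# Minimal in-process MapReduce word count (illustrative sketch only).
from collections import defaultdict

def map_phase(records):
    for line in records:
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

data = ["big data big volume", "parallel data processing"]
print(reduce_phase(shuffle(map_phase(data))))
# {'big': 2, 'data': 2, 'volume': 1, 'parallel': 1, 'processing': 1}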
Medical Data Architecture (MDA) Project Status
NASA Technical Reports Server (NTRS)
Krihak, M.; Middour, C.; Gurram, M.; Wolfe, S.; Marker, N.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.
2018-01-01
The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) element in mitigating the risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: we do not have the capability to comprehensively process medically relevant information to support medical operations during exploration missions. This gap identifies that current in-flight medical data management comprises a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more medically autonomous crew than the current paradigm. The medical system requirements are being developed in parallel with the exploration mission architecture and vehicle design. ExMC has recognized that in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight into medical conditions. This medical data architecture will provide the functionality necessary to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products supported by current prototype development will directly inform exploration medical system requirements.
Advanced and secure architectural EHR approaches.
Blobel, Bernd
2006-01-01
Electronic Health Records (EHRs) provided as lifelong patient records are advancing towards core applications of distributed and co-operating health information systems and health networks. For meeting the challenge of scalable, flexible, portable, secure EHR systems, the underlying EHR architecture must be based on the component paradigm and model driven, separating platform-independent and platform-specific models. To allow manageable models, real systems must be decomposed and simplified. The resulting modelling approach has to follow the ISO Reference Model for Open Distributed Processing (RM-ODP). The ISO RM-ODP describes any system component from different perspectives. Platform-independent perspectives comprise the enterprise view (business processes, policies, scenarios, use cases), the information view (classes and associations) and the computational view (composition and decomposition), whereas platform-specific perspectives concern the engineering view (physical distribution and realisation) and the technology view (implementation details from protocols up to education and training) on system components. Those views have to be established for components reflecting aspects of all domains involved in healthcare environments, including administrative, legal, medical and technical domains. Thus, security-related component models reflecting all of the views mentioned have to be established for enabling both application and communication security services as an integral part of the system's architecture. Besides decomposition and simplification of systems regarding the different viewpoints on their components, different levels of system granularity can be defined, hiding internals or focusing on properties of basic components to form a more complex structure. The resulting models describe both the structure and the behaviour of component-based systems. The described approach has been deployed in different projects defining EHR systems and their underlying architectural principles. In that context, the Australian GEHR project, the openEHR initiative and the revision of CEN ENV 13606 "Electronic Health Record communication", all based on Archetypes, but also the HL7 version 3 activities, are discussed in some detail. The latter include the HL7 RIM, the HL7 Development Framework, HL7's Clinical Document Architecture (CDA), as well as the set of models from use cases, activity diagrams and sequence diagrams up to Domain Information Models (D-MIMs) and their building blocks, Common Message Element Types (CMETs), constraining models to their underlying concepts. A future-proof EHR architecture, as an open, user-centric, user-friendly, flexible, scalable and portable core application in health information systems and health networks, has to follow advanced architectural paradigms.
Introduction to Message-Bus Architectures for Space Systems
NASA Technical Reports Server (NTRS)
Smith, Dan; Gregory, Brian
2005-01-01
This course presents technical and programmatic information on the development of message-based architectures for space mission ground and flight software systems. Message-based architecture approaches provide many significant advantages over the more traditional socket-based one-of-a-kind integrated system development approaches. The course provides an overview of publish/subscribe concepts, the use of common isolation layer API's, approaches to message standardization, and other technical topics. Several examples of currently operational systems are discussed and possible changes to the system development process are presented. Benefits and lessons learned will be discussed and time for questions and answers will be provided.
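As a rough illustration of the publish/subscribe concept the course covers, the sketch below implements a minimal in-process message bus in Python. Real ground-system buses are middleware products accessed through isolation-layer APIs; the class and topic names here are invented.

    # A minimal in-process publish/subscribe bus; an illustrative sketch
    # of the pattern, not any specific ground-system middleware.
    from collections import defaultdict

    class MessageBus:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, callback):
            # Components register interest in a topic, not in each other.
            self._subscribers[topic].append(callback)

        def publish(self, topic, message):
            # The publisher never knows who (if anyone) is listening.
            for callback in self._subscribers[topic]:
                callback(message)

    bus = MessageBus()
    bus.subscribe("telemetry.thermal", lambda m: print("archiver got", m))
    bus.subscribe("telemetry.thermal", lambda m: print("display got", m))
    bus.publish("telemetry.thermal", {"sensor": "T1", "kelvin": 293.4})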
Information Quality Evaluation of C2 Systems at Architecture Level
2014-06-01
Capability evaluation of C2 systems at the architecture level becomes necessary and important for improving system capability at the architecture design stage. This paper proposes a method for information quality evaluation of C2 systems at the architecture level, based on architecture models of C2 systems, which can help to identify key factors impacting information quality and improve system capability at the architecture design stage. First, the information quality model is…
New scaling relation for information transfer in biological networks
Kim, Hyunju; Davies, Paul; Walker, Sara Imari
2015-01-01
We quantify characteristics of the informational architecture of two representative biological networks: the Boolean network model for the cell-cycle regulatory network of the fission yeast Schizosaccharomyces pombe (Davidich et al. 2008 PLoS ONE 3, e1672 (doi:10.1371/journal.pone.0001672)) and that of the budding yeast Saccharomyces cerevisiae (Li et al. 2004 Proc. Natl Acad. Sci. USA 101, 4781–4786 (doi:10.1073/pnas.0305937101)). We compare our results for these biological networks with the same analysis performed on ensembles of two different types of random networks: Erdös–Rényi and scale-free. We show that both biological networks share features in common that are not shared by either random network ensemble. In particular, the biological networks in our study process more information than the random networks on average. Both biological networks also exhibit a scaling relation in information transferred between nodes that distinguishes them from random, where the biological networks stand out as distinct even when compared with random networks that share important topological properties, such as degree distribution, with the biological network. We show that the most biologically distinct regime of this scaling relation is associated with a subset of control nodes that regulate the dynamics and function of each respective biological network. Information processing in biological networks is therefore interpreted as an emergent property of topology (causal structure) and dynamics (function). Our results demonstrate quantitatively how the informational architecture of biologically evolved networks can distinguish them from other classes of network architecture that do not share the same informational properties. PMID:26701883
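The dynamics being analyzed are those of synchronous Boolean networks. The following toy sketch shows how such a network is updated; the three-node rules are invented for illustration and are not the empirically derived yeast cell-cycle rules from the paper.

    # Synchronous update of a toy Boolean network; a sketch of the kind
    # of dynamics analyzed in the paper (the real yeast models have ~10
    # nodes with empirically derived rules, not reproduced here).
    def step(state, rules):
        # Each node's next value is a Boolean function of the current state.
        return {node: rule(state) for node, rule in rules.items()}

    rules = {
        "A": lambda s: not s["C"],          # A is repressed by C
        "B": lambda s: s["A"],              # B is activated by A
        "C": lambda s: s["A"] and s["B"],   # C needs both A and B
    }

    state = {"A": True, "B": False, "C": False}
    for t in range(4):
        print(t, state)
        state = step(state, rules)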
Tello-Leal, Edgar; Chiotti, Omar; Villarreal, Pablo David
2012-12-01
The paper presents a methodology that follows a top-down approach based on a Model-Driven Architecture for integrating and coordinating healthcare services through cross-organizational processes to enable organizations providing high quality healthcare services and continuous process improvements. The methodology provides a modeling language that enables organizations conceptualizing an integration agreement, and identifying and designing cross-organizational process models. These models are used for the automatic generation of: the private view of processes each organization should perform to fulfill its role in cross-organizational processes, and Colored Petri Net specifications to implement these processes. A multi-agent system platform provides agents able to interpret Colored Petri-Nets to enable the communication between the Healthcare Information Systems for executing the cross-organizational processes. Clinical documents are defined using the HL7 Clinical Document Architecture. This methodology guarantees that important requirements for healthcare services integration and coordination are fulfilled: interoperability between heterogeneous Healthcare Information Systems; ability to cope with changes in cross-organizational processes; guarantee of alignment between the integrated healthcare service solution defined at the organizational level and the solution defined at technological level; and the distributed execution of cross-organizational processes keeping the organizations autonomy.
Development of Targeting UAVs Using Electric Helicopters and Yamaha RMAX
2007-05-17
…including the QNX real-time operating system. The video overlay board is useful to display the onboard camera's image with important information such as… …fully utilizing the built-in multi-processing architecture with inter-process synchronization and communication…
ERIC Educational Resources Information Center
Draze, Dianne; Palouda, Annelise
This book presents information about 10 areas of design, with the main emphasis on graphic design. One section presents the creative problem solving process and provides practice in using this process to solve design problems. Students are given a glimpse of other areas of design, including fashion, industrial, architectural, decorative,…
Artificial Neural Networks for Processing Graphs with Application to Image Understanding: A Survey
NASA Astrophysics Data System (ADS)
Bianchini, Monica; Scarselli, Franco
In graphical pattern recognition, each data item is represented as an arrangement of elements that encodes both the properties of each element and the relations among them. Hence, patterns are modelled as labelled graphs where, in general, labels can be attached to both nodes and edges. Artificial neural networks able to process graphs are a powerful tool for addressing a great variety of real-world problems where the information is naturally organized in entities and relationships among entities; in fact, they have been widely used in computer vision, for instance in logo recognition, in similarity retrieval, and for object detection. In this chapter, we propose a survey of neural network models able to process structured information, with a particular focus on those architectures tailored to address image understanding applications. Starting from the original recursive model (RNNs), we subsequently present different ways to represent images - by trees, forests of trees, multiresolution trees, directed acyclic graphs with labelled edges, general graphs - and, correspondingly, neural network architectures appropriate to process such structures.
Fontelo, P.; Rossi, E.; Ackerman, MJ
2015-01-01
Background: Mobile health applications (mHealth Apps) are opening the way to patients' responsible and active involvement with their own healthcare management. However, apart from Apps allowing patients' access to their electronic health records (EHRs), mHealth Apps are currently developed as dedicated "island systems". Objective: Although much work has been done on patients' access to EHRs, transfer of information from mHealth Apps to EHR systems is still low. This study proposes a standards-based architecture that can be adopted by mHealth Apps to exchange information with EHRs to support better quality of care. Methods: Following the definition of requirements for the EHR/mHealth App information exchange recently proposed, and after reviewing current standards, we designed the architecture for EHR/mHealth App integration. Then, as a case study, we modeled a system based on the proposed architecture aimed to support home monitoring for congestive heart failure patients. We simulated such a process using, on the EHR side, OpenMRS, an open source longitudinal EHR, and, on the mHealth App side, the iOS platform. Results: The integration architecture was based on the bi-directional exchange of standard documents (Clinical Document Architecture rel. 2, CDA2). In the process, the clinician "prescribes" the home monitoring procedures by creating a CDA2 prescription in the EHR that is sent, encrypted and de-identified, to the mHealth App to create the monitoring calendar. At the scheduled time, the App alerts the patient to start the monitoring. After the measurements are done, the App generates a structured CDA2-compliant monitoring report and sends it to the EHR, thus avoiding local storage. Conclusions: The proposed architecture, even if validated only in a simulation environment, represents a step forward in the integration of personal mHealth Apps into the larger health-IT ecosystem, allowing the bi-directional data exchange between patients and healthcare professionals, supporting the patient's engagement in self-management and self-care. PMID:26448794
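As a loose illustration of the App-side reporting step, the sketch below assembles a simplified CDA-like XML document in Python. A real CDA2 document requires the full HL7 schema, header templates, and coded vocabularies, all omitted here; only the two LOINC observation codes are real, and every other name is invented.

    # Sketch of the App-side step that wraps home measurements in a
    # CDA-like XML report; NOT a schema-valid HL7 CDA R2 document.
    import xml.etree.ElementTree as ET

    def build_monitoring_report(patient_id, observations):
        doc = ET.Element("ClinicalDocument")        # simplified root
        ET.SubElement(doc, "recordTarget", id=patient_id)
        body = ET.SubElement(doc, "component")
        for code, value, unit in observations:
            # One simplified <observation> per measurement.
            ET.SubElement(body, "observation",
                          code=code, value=str(value), unit=unit)
        return ET.tostring(doc, encoding="unicode")

    report = build_monitoring_report(
        "patient-42",
        [("8867-4", 72, "bpm"),       # LOINC code for heart rate
         ("29463-7", 81.5, "kg")])    # LOINC code for body weight
    print(report)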
Local, regional and national interoperability in hospital-level systems architecture.
Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J
2007-01-01
Interoperability of applications in health care is faced with various needs by patients, health professionals, organizations and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but need approaches which combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications on a local hospital level. We discuss systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities are identified, with medium-term focus which acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot entirely rely on common standards and frameworks but it must be locally adapted and complemented.
Monfort, M; Martin, S A; Frederickson, W
1990-02-01
A total of 1023 college students were assessed for hemispheric brain dominance using a paper-and-pencil test, the Human Information Processing Survey. Analysis of scores of students majoring in Advertising, Interior Design, Music, Journalism, Art, Oral Communication, and Architecture suggested a preference for right-hemispheric processing, while scores of students majoring in Accounting, Management, Finance, Computer Science, Mathematics, Nursing, Funeral Service, Criminal Justice, and Elementary Education suggested a preference for left-hemispheric strategies for processing information. The differential effects of hemispheric processing in an educational system emphasizing the left-hemispheric activities of structured logic and sequential processing suggest repression of the intellectual development of those students who may be genetically predisposed to right-hemispheric processing.
Geology and Design: Formal and Rational Connections
NASA Astrophysics Data System (ADS)
Eriksson, S. C.; Brewer, J.
2016-12-01
Geological forms and the manmade environment have always been inextricably linked. From the time that Upper Paleolithic man created drawings in the Lascaux Caves in the southwest of France, geology has provided a critical and dramatic foil for human creativity. This inspiration has manifested itself in many different ways, and the history of architecture is rife with examples of geologically derived buildings. During the early 20th Century, German Expressionist art and architecture were heavily influenced by the natural and often translucent quality of minerals. Architects like Bruno Taut drew and built crystalline forms that would go on to inspire the more restrained Bauhaus movement. Even within the context of contemporary architecture, geology has been a fertile source of inspiration. Architectural practices across the globe leverage the rationality and grounding found in geology to inform a process that is otherwise dominated by computer-driven parametric design. The connection between advanced design technology and beautifully realized natural geological forms ensures that geology will be a relevant source of architectural inspiration well into the 21st century. The sometimes hidden relationship of geology to the various sub-disciplines of Design such as Architecture, Interiors, Landscape Architecture, and Historic Preservation is explored in relation to curriculum and the practice of design. Topics such as materials, form, history, the cultural and physical landscape, natural hazards, and global design enrich and inform curriculum across the college. Commonly, these help define place-based education.
Papale, Paolo; Chiesi, Leonardo; Rampinini, Alessandra C; Pietrini, Pietro; Ricciardi, Emiliano
2016-01-01
In the last decades, the rapid growth of functional brain imaging methodologies has allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting 'neuro-architecture' as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. More precisely, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations in regard to people-environment relationships, and even provide empirical foundations for a renewed evidence-based design theory.
E-Governance and Service Oriented Computing Architecture Model
NASA Astrophysics Data System (ADS)
Tejasvee, Sanjay; Sarangdevot, S. S.
2010-11-01
E-Governance is the effective application of information and communication technology (ICT) in government processes to accomplish safe and reliable information lifecycle management. The lifecycle of information involves various processes such as capturing, preserving, manipulating and delivering information. E-Governance is meant to transform governance so that it serves citizens better, in a manner that is transparent, reliable, participatory and accountable. The purpose of this paper is to propose an e-governance model focused on a Service Oriented Computing Architecture (SOCA) that combines the information and services provided by the government, supports innovation, identifies the way to deliver services optimally to citizens, and can be implemented in a transparent and accountable practice. The paper also focuses on the E-government Service Manager as an essential element of the service-oriented computing model, providing a dynamically extensible structural design in which every area or branch can bring in innovative services. The heart of this paper is a conceptual model that enables e-government communication between trade and business, citizens and government, and autonomous bodies.
A novel architecture for information retrieval system based on semantic web
NASA Astrophysics Data System (ADS)
Zhang, Hui
2011-12-01
Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so that the web now faces a new challenge of information overload. The challenge before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats: they are suitable for presentation, but machines cannot understand the meaning of the documents. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines. It provides new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that, when there is not enough knowledge in the information retrieval system, the system will return a large number of meaningless results to users due to the huge amount of matching information. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
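A minimal sketch of that routing idea, assuming a toy triple store and invented concept names: an inference check decides whether the query can be answered semantically, otherwise it falls back to keyword search.

    # Toy query router: semantic search over RDF-like triples when the
    # knowledge base covers the query, keyword search otherwise.
    # All data and concept names are invented for illustration.
    ONTOLOGY = {("Beijing", "capitalOf", "China"),
                ("Paris", "capitalOf", "France")}

    def semantic_search(country):
        # Answer from the triple store when the relation is known.
        return [s for s, r, o in ONTOLOGY
                if r == "capitalOf" and o == country]

    def keyword_search(query):
        return ["<ranked documents matching '%s'>" % query]

    def answer(query):
        # Crude "inference engine": route to semantic search only when
        # the query matches a relation the knowledge base actually covers.
        if query.startswith("capital of "):
            hits = semantic_search(query.removeprefix("capital of "))
            if hits:
                return hits
        return keyword_search(query)

    print(answer("capital of France"))   # -> ['Paris']
    print(answer("cheap flights"))       # falls back to keyword search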
NASA Integrated Model Centric Architecture (NIMA) Model Use and Re-Use
NASA Technical Reports Server (NTRS)
Conroy, Mike; Mazzone, Rebecca; Lin, Wei
2012-01-01
This whitepaper accepts the goals, needs and objectives of NASA's Integrated Model-centric Architecture (NIMA); adds experience and expertise from the Constellation program as well as NASA's architecture development efforts; and provides suggested concepts, practices and norms that nurture and enable model use and re-use across programs, projects and other complex endeavors. Key components include the ability to effectively move relevant information through a large community, process patterns that support model reuse, and the identification of the necessary meta-information (e.g., history, credibility, and provenance) to safely use and re-use that information. In order to successfully use and re-use models and simulations we must define and meet key organizational and structural needs: 1. We must understand and acknowledge all the roles and players involved from the initial need identification through to the final product, as well as how they change across the lifecycle. 2. We must create the necessary structural elements to store and share NIMA-enabled information throughout the Program or Project lifecycle. 3. We must create the necessary organizational processes to stand up and execute a NIMA-enabled Program or Project throughout its lifecycle. NASA must meet all three of these needs to successfully use and re-use models. The ability to reuse models is a key component of NIMA, and the capabilities inherent in NIMA are key to accomplishing NASA's space exploration goals.
A Reference Architecture for Space Information Management
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.
2006-01-01
We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
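For readers unfamiliar with Petri net execution semantics, the sketch below implements a minimal place/transition net in Python. XML nets extend this picture by attaching XML documents and filter schemas to places and transitions; the two-step order process here is invented for illustration.

    # Minimal place/transition Petri net executor; a sketch of the firing
    # semantics underlying XML nets, without their XML-document tokens.
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)      # place -> token count
            self.transitions = {}             # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) > 0 for p in inputs)

        def fire(self, name):
            inputs, outputs = self.transitions[name]
            if not self.enabled(name):
                raise RuntimeError("transition not enabled: " + name)
            for p in inputs:
                self.marking[p] -= 1          # consume a token per input arc
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    # Order handling as a two-step business process.
    net = PetriNet({"order_received": 1})
    net.add_transition("check", ["order_received"], ["order_checked"])
    net.add_transition("ship", ["order_checked"], ["order_shipped"])
    net.fire("check")
    net.fire("ship")
    print(net.marking)   # order_shipped now holds the single token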
Architectural Improvements and New Processing Tools for the Open XAL Online Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M
The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.
Assured Mission Support Space Architecture (AMSSA) study
NASA Technical Reports Server (NTRS)
Hamon, Rob
1993-01-01
The assured mission support space architecture (AMSSA) study was conducted with the overall goal of developing a long-term requirements-driven integrated space architecture to provide responsive and sustained space support to the combatant commands. Although derivation of an architecture was the focus of the study, there are three significant products from the effort. The first is a philosophy that defines the necessary attributes for the development and operation of space systems to ensure an integrated, interoperable architecture that, by design, provides a high degree of combat utility. The second is the architecture itself; based on an interoperable system-of-systems strategy, it reflects a long-range goal for space that will evolve as user requirements adapt to a changing world environment. The third product is the framework of a process that, when fully developed, will provide essential information to key decision makers for space systems acquisition in order to achieve the AMSSA goal. It is a categorical imperative that military space planners develop space systems that will act as true force multipliers. AMSSA provides the philosophy, process, and architecture that, when integrated with the DOD requirements and acquisition procedures, can yield an assured mission support capability from space to the combatant commanders. An important feature of the AMSSA initiative is the participation by every organization that has a role or interest in space systems development and operation. With continued community involvement, the concept of the AMSSA will become a reality. In summary, AMSSA offers a better way to think about space (philosophy) that can lead to the effective utilization of limited resources (process) with an infrastructure designed to meet the future space needs (architecture) of our combat forces.
Another HISA--the new standard: health informatics--service architecture.
Klein, Gunnar O; Sottile, Pier Angelo; Endsleff, Frederik
2007-01-01
In addition to the meaning as Health Informatics Society of Australia, HISA is the acronym used for the new European Standard: Health Informatics - Service Architecture. This EN 12967 standard has been developed by CEN - the federation of 29 national standards bodies in Europe. This standard defines the essential elements of a Service Oriented Architecture and a methodology for localization particularly useful for large healthcare organizations. It is based on the Open Distributed Processing (ODP) framework from ISO 10746 and contains the following parts: Part 1: Enterprise viewpoint. Part 2: Information viewpoint. Part 3: Computational viewpoint. This standard is now also the starting point for the consideration for an International standard in ISO/TC 215. The basic principles with a set of health specific middleware services as a common platform for various applications for regional health information systems, or large integrated hospital information systems, are well established following a previous prestandard. Examples of large scale deployments in Sweden, Denmark and Italy are described.
A secure and efficiently searchable health information architecture.
Yasnoff, William A
2016-06-01
Patient-centric repositories of health records are an important component of health information infrastructure. However, patient information in a single repository is potentially vulnerable to loss of the entire dataset from a single unauthorized intrusion. A new health record storage architecture, the personal grid, eliminates this risk by separately storing and encrypting each person's record. The tradeoff for this improved security is that a personal grid repository must be sequentially searched, since each record must be individually accessed and decrypted. To allow reasonable search times for large numbers of records, parallel processing with hundreds (or even thousands) of on-demand virtual servers (now available in cloud computing environments) is used. Estimated search times for a 10 million record personal grid using 500 servers vary from 7 to 33 minutes depending on the complexity of the query. Since extremely rapid searching is not a critical requirement of health information infrastructure, the personal grid may provide a practical and useful alternative architecture that eliminates the large-scale security vulnerabilities of traditional databases by sacrificing unnecessary searching speed. Copyright © 2016 Elsevier Inc. All rights reserved.
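The following sketch illustrates the personal-grid tradeoff under stated assumptions: each record is encrypted under its own key, so a query must decrypt records one at a time, which in turn parallelizes trivially across workers. The XOR cipher is a dependency-free stand-in for real encryption such as AES, and the process pool stands in for on-demand cloud servers.

    # Per-record encryption plus embarrassingly parallel search; a toy
    # sketch of the personal-grid idea. XOR is NOT real encryption and
    # is used only to keep the sketch dependency-free.
    import os
    from concurrent.futures import ProcessPoolExecutor

    def xor_cipher(data: bytes, key: bytes) -> bytes:
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def make_record(text):
        key = os.urandom(16)                   # one key per person
        return key, xor_cipher(text.encode(), key)

    def matches(args):
        key, ciphertext, term = args
        # Each record must be individually decrypted before searching.
        return term in xor_cipher(ciphertext, key).decode()

    if __name__ == "__main__":
        records = [make_record(t) for t in
                   ["dx: hypertension", "dx: diabetes", "dx: asthma"]]
        with ProcessPoolExecutor() as pool:    # stands in for cloud servers
            hits = list(pool.map(matches,
                                 [(k, c, "diabetes") for k, c in records]))
        print(hits)   # -> [False, True, False]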
NASA Technical Reports Server (NTRS)
Lum, Henry, Jr.
1988-01-01
Information on systems autonomy is given in viewgraph form. Information is given on space systems integration, intelligent autonomous systems, automated systems for in-flight mission operations, the Systems Autonomy Demonstration Project on the Space Station Thermal Control System, the architecture of an autonomous intelligent system, artificial intelligence research issues, machine learning, and real-time image processing.
Identity and Access Management: Technological Implementation of Policy
ERIC Educational Resources Information Center
von Munkwitz-Smith, Jeff; West, Ann
2004-01-01
Navigating the multiple processes for accessing ever-multiplying campus information systems can be a daunting task for students, faculty, and staff. This article provides a brief overview of Identity and Access Management Services. The authors review key characteristics and components of this new information architecture and address the issue of…
2013-09-01
…processes used in space system acquisitions, simply implementing a data exchange specification would not fundamentally improve how information is… …and manage the configuration of all critical program models, processes, and tools used throughout the DoD. Second, mandate a data exchange…
NASA Technical Reports Server (NTRS)
Schutte, P. C.; Abbott, K. H.
1986-01-01
Real-time onboard fault monitoring and diagnosis for aircraft applications, whether performed by the human pilot or by automation, presents many difficult problems. Quick response to failures may be critical, the pilot often must compensate for the failure while diagnosing it, his information about the state of the aircraft is often incomplete, and the behavior of the aircraft changes as the effect of the failure propagates through the system. A research effort was initiated to identify guidelines for automation of onboard fault monitoring and diagnosis and associated crew interfaces. The effort began by determining the flight crew's information requirements for fault monitoring and diagnosis and the various reasoning strategies they use. Based on this information, a conceptual architecture was developed for the fault monitoring and diagnosis process. This architecture represents an approach and a framework which, once incorporated with the necessary detail and knowledge, can be a fully operational fault monitoring and diagnosis system, as well as providing the basis for comparison of this approach to other fault monitoring and diagnosis concepts. The architecture encompasses all aspects of the aircraft's operation, including navigation, guidance and controls, and subsystem status. The portion of the architecture that encompasses subsystem monitoring and diagnosis was implemented for an aircraft turbofan engine to explore and demonstrate the AI concepts involved. This paper describes the architecture and the implementation for the engine subsystem.
Visual information processing II; Proceedings of the Meeting, Orlando, FL, Apr. 14-16, 1993
NASA Technical Reports Server (NTRS)
Huck, Friedrich O. (Editor); Juday, Richard D. (Editor)
1993-01-01
Various papers on visual information processing are presented. Individual topics addressed include: aliasing as noise, satellite image processing using a hammering neural network, edge-detection method using visual perception, adaptive vector median filters, design of a reading test for low vision, image warping, spatial transformation architectures, automatic image-enhancement method, redundancy reduction in image coding, lossless gray-scale image compression by predictive GDF, information efficiency in visual communication, optimizing JPEG quantization matrices for different applications, use of forward error correction to maintain image fidelity, and the effect of Peano scanning on image compression. Also discussed are: computer vision for autonomous robotics in space, optical processor for zero-crossing edge detection, fractal-based image edge detection, simulation of the neon spreading effect by bandpass filtering, wavelet transform (WT) on parallel SIMD architectures, nonseparable 2D wavelet image representation, adaptive image halftoning based on WT, wavelet analysis of global warming, use of the WT for signal detection, perfect reconstruction two-channel rational filter banks, N-wavelet coding for pattern classification, simulation of images of natural objects, and number-theoretic coding for iconic systems.
NASA Astrophysics Data System (ADS)
Zhang, L.; Cong, Y.; Wu, C.; Bai, C.; Wu, C.
2017-08-01
The recording of Architectural heritage information is the foundation of research, conservation, management, and the display of architectural heritage. In other words, the recording of architectural heritage information supports heritage research, conservation, management and architectural heritage display. What information do we record and collect and what technology do we use for information recording? How do we determine the level of accuracy required when recording architectural information? What method do we use for information recording? These questions should be addressed in relation to the nature of the particular heritage site and the specific conditions for the conservation work. In recent years, with the rapid development of information acquisition technology such as Close Range Photogrammetry, 3D Laser Scanning as well as high speed and high precision Aerial Photogrammetry, many Chinese universities, research institutes and heritage management bureaux have purchased considerable equipment for information recording. However, the lack of understanding of both the nature of architectural heritage and the purpose for which the information is being collected has led to several problems. For example: some institutions when recording architectural heritage information aim solely at high accuracy. Some consider that advanced measuring methods must automatically replace traditional measuring methods. Information collection becomes the purpose, rather than the means, of architectural heritage conservation. Addressing these issues, this paper briefly reviews the history of architectural heritage information recording at the Summer Palace (Yihe Yuan, first built in 1750), Beijing. Using the recording practices at the Summer Palace during the past ten years as examples, we illustrate our achievements and lessons in recording architectural heritage information with regard to the following aspects: (buildings') ideal status desired, (buildings') current status, structural distortion analysis, display, statue restoration and thematic research. Three points will be highlighted in our discussion: 1. Understanding of the heritage is more important than the particular technology used: Architectural heritage information collection and recording are based on an understanding of the value and nature of the architectural heritage. Understanding is the purpose, whereas information collection and recording are the means. 2. Demand determines technology: Collecting and recording architectural heritage information is to serve the needs of heritage research, conservation, management and display. These different needs determine the different technologies that we use. 3. Set the level of accuracy appropriately: For information recording, high accuracy is not the key criterion; rather an appropriate level of accuracy is key. There is considerable deviation between the nominal accuracy of any instrument and the accuracy of any particular measurement.
E-health and healthcare enterprise information system leveraging service-oriented architecture.
Hsieh, Sung-Huai; Hsieh, Sheau-Ling; Cheng, Po-Hsun; Lai, Feipei
2012-04-01
To present the successful experiences of an integrated, collaborative, distributed, large-scale enterprise healthcare information system over a wired and wireless infrastructure in National Taiwan University Hospital (NTUH). In order to smoothly and sequentially transfer from the complex relations among the old (legacy) systems to the new-generation enterprise healthcare information system, we adopted the multitier framework based on service-oriented architecture to integrate the heterogeneous systems as well as to interoperate among many other components and multiple databases. We also present mechanisms of a logical layer reusability approach and data (message) exchange flow via Health Level 7 (HL7) middleware, DICOM standard, and the Integrating the Healthcare Enterprise workflow. The architecture and protocols of the NTUH enterprise healthcare information system, especially in the Inpatient Information System (IIS), are discussed in detail. The NTUH Inpatient Healthcare Information System is designed and deployed on service-oriented architecture middleware frameworks. The mechanisms of integration as well as interoperability among the components and the multiple databases apply the HL7 standards for data exchanges, which are embedded in XML formats, and Microsoft .NET Web services to integrate heterogeneous platforms. The preliminary performance of the current operation IIS is evaluated and analyzed to verify the efficiency and effectiveness of the designed architecture; it shows reliability and robustness in the highly demanding traffic environment of NTUH. The newly developed NTUH IIS provides an open and flexible environment not only to share medical information easily among other branch hospitals, but also to reduce the cost of maintenance. The HL7 message standard is widely adopted to cover all data exchanges in the system. All services are independent modules that enable the system to be deployed and configured to the highest degree of flexibility. Furthermore, we can conclude that the multitier Inpatient Healthcare Information System has been designed successfully and in a collaborative manner, based on the index of performance evaluations, central processing unit, and memory utilizations.
Information processing using a single dynamical node as complex system
Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.
2011-01-01
Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
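A toy numerical sketch of the delay-based reservoir idea: a single nonlinear node with delayed feedback yields a set of "virtual" nodes sampled along the delay line, whose states serve as features for a linear readout. The parameters, masking scheme, and simplified update rule below are illustrative, not those of the paper's electronic implementation.

    # Toy delay-based reservoir: one nonlinear (tanh) node whose delayed
    # feedback provides virtual nodes; states become readout features.
    import numpy as np

    def delay_reservoir(inputs, n_virtual=50, eta=0.5, gamma=0.05):
        delay_line = np.zeros(n_virtual)       # states of the virtual nodes
        states = []
        for u in inputs:
            new_line = np.empty(n_virtual)
            for i in range(n_virtual):
                # Each virtual node mixes its delayed state with a
                # mask-weighted copy of the input (a crude input mask).
                mask = 1.0 if i % 2 == 0 else -1.0
                new_line[i] = np.tanh(eta * delay_line[i] + gamma * mask * u)
            delay_line = new_line
            states.append(delay_line.copy())
        return np.array(states)                # rows feed a linear readout

    X = delay_reservoir(np.sin(np.linspace(0, 8 * np.pi, 200)))
    print(X.shape)    # (200, 50): 50 virtual-node features per time step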
Enterprise Information Architecture for Mission Development
NASA Technical Reports Server (NTRS)
Dutra, Jayne
2007-01-01
This slide presentation reviews the concept of an information architecture to assist in mission development. The integrated information architecture will create a unified view of the information using metadata and values (i.e., a taxonomy).
NASA Astrophysics Data System (ADS)
Strotov, Valery V.; Taganov, Alexander I.; Konkin, Yuriy V.; Kolesenkov, Aleksandr N.
2017-10-01
The task of processing and analyzing Earth remote sensing data on board an ultra-small spacecraft is a pressing one, given the significant energy expenditure of data transfer and the low performance of onboard computers. This raises the issue of effective and reliable storage of the general information flow obtained from onboard data collection systems, including Earth remote sensing data, in a specialized database. The paper considers peculiarities of database management system operation with a multilevel memory structure. For data storage, a format has been developed that describes the physical structure of the database and contains the parameters required for loading information. Such a structure reduces the memory occupied by the database because key values do not have to be stored separately. The paper shows the architecture of a relational database management system oriented toward embedding into the onboard software of an ultra-small spacecraft. A database for storing different kinds of information, including Earth remote sensing data, can be developed by means of such a database management system for subsequent processing. The suggested database management system architecture places low requirements on the computing power and memory resources available on board an ultra-small spacecraft. Data integrity is ensured during input and modification of the structured information.
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon
2010-01-01
This slide presentation reviews the evolution of risk management (RM) at NASA. The aim of the RM approach at NASA is to promote an approach that is heuristic, proactive, and coherent across all of NASA. Risk Informed Decision Making (RIDM) is a decision making process that uses a diverse set of performance measures along with other considerations within a deliberative process to inform decision making. RIDM is invoked for key decisions such as architecture and design decisions, make-buy decisions, and budget reallocation. The presentation reviews the RIDM process and how it relates to the Continuous Risk Management (CRM) process.
Cavity-Mediated Coherent Coupling between Distant Quantum Dots
NASA Astrophysics Data System (ADS)
Nicolí, Giorgio; Ferguson, Michael Sven; Rössler, Clemens; Wolfertz, Alexander; Blatter, Gianni; Ihn, Thomas; Ensslin, Klaus; Reichl, Christian; Wegscheider, Werner; Zilberberg, Oded
2018-06-01
Scalable architectures for quantum information technologies require one to selectively couple long-distance qubits while suppressing environmental noise and cross talk. In semiconductor materials, the coherent coupling of a single spin on a quantum dot to a cavity hosting fermionic modes offers a new solution to this technological challenge. Here, we demonstrate coherent coupling between two spatially separated quantum dots using an electronic cavity design that takes advantage of whispering-gallery modes in a two-dimensional electron gas. The cavity-mediated, long-distance coupling effectively minimizes undesirable direct cross talk between the dots and defines a scalable architecture for all-electronic semiconductor-based quantum information processing.
The Deep Space Network information system in the year 2000
NASA Technical Reports Server (NTRS)
Markley, R. W.; Beswick, C. A.
1992-01-01
The Deep Space Network (DSN), the largest, most sensitive scientific communications and radio navigation network in the world, is considered. Focus is made on the telemetry processing, monitor and control, and ground data transport architectures of the DSN ground information system envisioned for the year 2000. The telemetry architecture will be unified from the front-end area to the end user. It will provide highly automated monitor and control of the DSN, automated configuration of support activities, and a vastly improved human interface. Automated decision support systems will be in place for DSN resource management, performance analysis, fault diagnosis, and contingency management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, D.N.
1997-02-01
The Information Engineering thrust area develops information technology to support the programmatic needs of Lawrence Livermore National Laboratory's Engineering Directorate. Progress in five programmatic areas is described in separate reports contained herein. These are entitled: Three-dimensional Object Creation, Manipulation, and Transport; Zephyr: A Secure Internet-Based Process to Streamline Engineering Procurements; Subcarrier Multiplexing: Optical Network Demonstrations; Parallel Optical Interconnect Technology Demonstration; and Intelligent Automation Architecture.
Complete all-optical processing polarization-based binary logic gates and optical processors.
Zaghloul, Y A; Zaghloul, A R M
2006-10-16
We present a complete all-optical-processing polarization-based binary-logic system, by which any logic gate or processor can be implemented. Following the new polarization-based logic presented in [Opt. Express 14, 7253 (2006)], we develop a new parallel processing technique that allows for the creation of all-optical-processing gates that produce a unique output (either logic 1 or 0) only once in a truth table, and those that do not. This representation allows for the implementation of simple unforced OR, AND, XOR, XNOR, inverter, and, more importantly, NAND and NOR gates that can be used independently to represent any Boolean expression or function. In addition, the concept of a generalized gate is presented, which opens the door for reconfigurable optical processors and programmable optical logic gates. Furthermore, the new design is completely compatible with the old one presented in [Opt. Express 14, 7253 (2006)] and with current semiconductor-based devices. The gates can be cascaded, where the information is always on the laser beam. The polarization of the beam, and not its intensity, carries the information. The new methodology allows for the creation of multiple-input-multiple-output processors that implement, by themselves, any Boolean function, such as specialized or non-specialized microprocessors. Three all-optical architectures are presented: an orthoparallel optical logic architecture for all known and unknown binary gates, a single-branch architecture for only XOR and XNOR gates, and the railroad (RR) architecture for polarization optical processors (POP). All the control inputs are applied simultaneously, leading to a single time lag, which makes for a very fast and glitch-immune POP. A simple and easy-to-follow step-by-step algorithm is provided for the POP, and design reduction methodologies are briefly discussed. The algorithm lends itself systematically to software programming and computer-assisted design. As examples, designs of all binary gates, multiple-input gates, and sequential and non-sequential Boolean expressions are presented and discussed. The operation of each design is simply understood by a bullet train traveling at the speed of light on a railroad system preconditioned by the crossover states predetermined by the control inputs. The presented designs allow for optical processing of the information, eliminating the need to convert it, back and forth, to an electronic signal for processing purposes. All gates with a truth table, including for example Fredkin, Toffoli, testable reversible logic, and threshold logic gates, can be designed and implemented using the railroad architecture. That includes any future gates not known today. Those designs and the quantum gates are not discussed in this paper.
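The underlying principle, information carried by polarization rather than intensity, can be illustrated with ordinary Jones calculus. In the sketch below, logic 0 and 1 ride on horizontal and vertical polarization and a half-wave plate at 45 degrees acts as an all-optical NOT; this encoding is a textbook illustration, not the paper's specific gate layout.

    # Jones-calculus sketch of polarization-encoded logic: a half-wave
    # plate at 45 degrees swaps H and V polarization, i.e. an optical NOT.
    import numpy as np

    H = np.array([1.0, 0.0])          # horizontal polarization -> logic 0
    V = np.array([0.0, 1.0])          # vertical polarization   -> logic 1

    # Jones matrix of a half-wave plate with fast axis at 45 degrees
    # (up to a global phase).
    HWP45 = np.array([[0.0, 1.0],
                      [1.0, 0.0]])

    def as_bit(jones):
        # Read the polarization state back out as a logic value.
        return int(abs(jones[1]) > abs(jones[0]))

    for beam in (H, V):
        out = HWP45 @ beam                      # beam traverses the plate
        print(as_bit(beam), "->", as_bit(out))  # 0 -> 1, 1 -> 0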
Project Integration Architecture: Architectural Overview
NASA Technical Reports Server (NTRS)
Jones, William Henry
2001-01-01
The Project Integration Architecture (PIA) implements a flexible, object-oriented, wrapping architecture which encapsulates all of the information associated with engineering applications. The architecture allows the progress of a project to be tracked and documented in its entirety. By being a single, self-revealing architecture, the ability to develop single tools, for example a single graphical user interface, to span all applications is enabled. Additionally, by bringing all of the information sources and sinks of a project into a single architectural space, the ability to transport information between those applications becomes possible. Object-encapsulation further allows information to become, in a sense, self-aware, knowing things such as its own dimensionality and providing functionality appropriate to its kind.
Framewise phoneme classification with bidirectional LSTM and other neural network architectures.
Graves, Alex; Schmidhuber, Jürgen
2005-01-01
In this paper, we present bidirectional Long Short Term Memory (LSTM) networks, and a modified, full gradient version of the LSTM learning algorithm. We evaluate Bidirectional LSTM (BLSTM) and several other network architectures on the benchmark task of framewise phoneme classification, using the TIMIT database. Our main findings are that bidirectional networks outperform unidirectional ones, and Long Short Term Memory (LSTM) is much faster and also more accurate than both standard Recurrent Neural Nets (RNNs) and time-windowed Multilayer Perceptrons (MLPs). Our results support the view that contextual information is crucial to speech processing, and suggest that BLSTM is an effective architecture with which to exploit it.
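A minimal framewise BLSTM tagger in PyTorch, sketching the setup the paper evaluates (one phoneme label per input frame). The layer sizes, feature count, and phoneme inventory below are illustrative placeholders, not the paper's exact TIMIT configuration.

    # Framewise phoneme classification with a bidirectional LSTM; a
    # minimal PyTorch sketch with placeholder sizes.
    import torch
    import torch.nn as nn

    class FramewiseBLSTM(nn.Module):
        def __init__(self, n_features=26, n_hidden=93, n_phonemes=61):
            super().__init__()
            self.blstm = nn.LSTM(n_features, n_hidden,
                                 batch_first=True, bidirectional=True)
            # Forward and backward states are concatenated: 2 * n_hidden.
            self.out = nn.Linear(2 * n_hidden, n_phonemes)

        def forward(self, frames):              # (batch, time, features)
            hidden, _ = self.blstm(frames)
            return self.out(hidden)             # one logit vector per frame

    model = FramewiseBLSTM()
    frames = torch.randn(8, 100, 26)            # 8 utterances, 100 frames
    logits = model(frames)
    print(logits.shape)                          # torch.Size([8, 100, 61])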
NASA Technical Reports Server (NTRS)
Harper, R. E.; Alger, L. S.; Babikyan, C. A.; Butler, B. P.; Friend, S. A.; Ganska, R. J.; Lala, J. H.; Masotto, T. K.; Meyer, A. J.; Morton, D. P.
1992-01-01
Described here is the Army Fault Tolerant Architecture (AFTA) hardware architecture and components and the operating system. The architectural and operational theory of the AFTA Fault Tolerant Data Bus is discussed. The test and maintenance strategy developed for use in fielded AFTA installations is presented. An approach to be used in reducing the probability of AFTA failure due to common mode faults is described. Analytical models for AFTA performance, reliability, availability, life cycle cost, weight, power, and volume are developed. An approach is presented for using VHSIC Hardware Description Language (VHDL) to describe and design AFTA's developmental hardware. A plan is described for verifying and validating key AFTA concepts during the Dem/Val phase. Analytical models and partial mission requirements are used to generate AFTA configurations for the TF/TA/NOE and Ground Vehicle missions.
McMurray, Bob
2014-01-01
Traditional studies of human categorization often treat the processes of encoding features and cues as peripheral to the question of how stimuli are categorized. However, in domains where the features and cues are less transparent, how information is encoded prior to categorization may constrain our understanding of the architecture of categorization. This is particularly true in speech perception, where acoustic cues to phonological categories are ambiguous and influenced by multiple factors. Here, it is crucial to consider the joint contributions of the information in the input and the categorization architecture. We contrasted accounts that argue for raw acoustic information encoding with accounts that posit that cues are encoded relative to expectations, and investigated how two categorization architectures—exemplar models and back-propagation parallel distributed processing models—deal with each kind of information. Relative encoding, akin to predictive coding, is a form of noise reduction, so it can be expected to improve model accuracy; however, like predictive coding, the use of relative encoding in speech perception by humans is controversial, so results are compared to patterns of human performance, rather than on the basis of overall accuracy. We found that, for both classes of models, in the vast majority of parameter settings, relative cues greatly helped the models approximate human performance. This suggests that expectation-relative processing is a crucial precursor step in phoneme categorization, and that understanding the information content is essential to understanding categorization processes. PMID:25475048
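A small sketch of what relative cue encoding means in practice, with invented numbers: each cue value is re-expressed as its deviation from a contextual expectation (here, simply the talker's mean), which removes talker baselines in the spirit of predictive-coding-style noise reduction.

    # Relative (expectation-referenced) encoding of an acoustic cue;
    # the voice-onset-time values below are invented for illustration.
    import statistics

    # Voice-onset-time measurements (ms) from two hypothetical talkers.
    vot = {"talker_a": [10, 15, 55, 60], "talker_b": [25, 30, 70, 75]}

    def relative_encode(samples):
        # The expectation here is just the talker's mean; richer models
        # would also regress out speaking rate, vowel context, etc.
        expected = statistics.mean(samples)
        return [cue - expected for cue in samples]

    for talker, samples in vot.items():
        print(talker, relative_encode(samples))
    # Both talkers now yield comparable negative (voiced-like) and
    # positive (voiceless-like) values despite different baselines.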
Real-time traffic sign detection and recognition
NASA Astrophysics Data System (ADS)
Herbschleb, Ernst; de With, Peter H. N.
2009-01-01
The continuous growth of imaging databases increasingly requires analysis tools for the extraction of features. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images at resolutions up to 4,800x2,400 pixels. Because of the size of the database, high reliability as well as high throughput is required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining both color and specific spatial information. The first stage contains an area-limitation step which is performance critical for both the detection rate and the overall processing time. The second stage locates suggestions for traffic signs using recently published feature processing. The third stage contains a validation step to enhance the reliability of the algorithm. During this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput for line-of-sight images of 800x600 pixels is 35 Hz, and for panorama images it is 4 Hz. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.
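The performance-critical first stage can be pictured as cheap color thresholding that limits the search area before the costlier shape and recognition stages run. The sketch below is an invented illustration of such an area-limitation step, not the authors' actual algorithm or thresholds.

    # Stage-1 "area limitation" sketch: mark red-dominant pixels as
    # candidate traffic-sign regions. Thresholds are invented.
    import numpy as np

    def red_candidate_mask(image_rgb):
        r = image_rgb[..., 0].astype(int)
        g = image_rgb[..., 1].astype(int)
        b = image_rgb[..., 2].astype(int)
        # A pixel is a candidate when red clearly dominates both
        # of the other channels.
        return (r > 100) & (r > g + 40) & (r > b + 40)

    image = np.zeros((600, 800, 3), dtype=np.uint8)
    image[200:260, 300:360] = (200, 30, 30)    # synthetic red "sign"
    mask = red_candidate_mask(image)
    print(mask.sum(), "candidate pixels")      # only the sign area fires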
A hybrid silicon membrane spatial light modulator for optical information processing
NASA Technical Reports Server (NTRS)
Pape, D. R.; Hornbeck, L. J.
1984-01-01
A new two-dimensional, fast, analog, electrically addressable, silicon-based membrane spatial light modulator (SLM) was developed for optical information processing applications. Coherent light reflected from the mirror elements is phase modulated, producing an optical Fourier transform of an analog signal input to the device. The deformable mirror device (DMD) architecture and operating parameters related to this application are presented. A model is developed that describes the optical Fourier transform properties of the DMD.
NASA Astrophysics Data System (ADS)
Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.
2015-06-01
The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.
Clinical results of HIS, RIS, PACS integration using data integration CASE tools
NASA Astrophysics Data System (ADS)
Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.
1995-05-01
Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy to perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middleware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database systems and communication protocols.
Akbarzadeh, Rosa; Yousefi, Azizeh-Mitra
2014-08-01
Tissue engineering makes use of 3D scaffolds to sustain three-dimensional growth of cells and guide new tissue formation. To meet the multiple requirements for regeneration of biological tissues and organs, a wide range of scaffold fabrication techniques have been developed, aiming to produce porous constructs with the desired pore size range and pore morphology. Among different scaffold fabrication techniques, thermally induced phase separation (TIPS) method has been widely used in recent years because of its potential to produce highly porous scaffolds with interconnected pore morphology. The scaffold architecture can be closely controlled by adjusting the process parameters, including polymer type and concentration, solvent composition, quenching temperature and time, coarsening process, and incorporation of inorganic particles. The objective of this review is to provide information pertaining to the effect of these parameters on the architecture and properties of the scaffolds fabricated by the TIPS technique. © 2014 Wiley Periodicals, Inc.
Empowerment of Patients with Hypertension through BPM, IoT and Remote Sensing.
Ruiz-Fernández, Daniel; Marcos-Jorquera, Diego; Gilart-Iglesias, Virgilio; Vives-Boix, Víctor; Ramírez-Navarro, Javier
2017-10-04
Hypertension affects one in five adults worldwide. Healthcare processes require interdisciplinary cooperation and coordination between medical teams, clinical processes, and patients. The lack of patient empowerment and adherence to treatment makes it necessary to integrate patients, data-collecting devices, and clinical processes. For this reason, in this paper we propose a model based on the Business Process Management paradigm, together with a group of technologies, techniques, and IT principles that increase the benefits of the paradigm. To realize the proposed model, the clinical process of hypertension is analyzed with the objective of detecting weaknesses and improving the process. Once the process is analyzed, an architecture that joins health devices and environmental sensors with an information system is developed. To test the architecture, a web system connected to health monitors and environmental sensors, together with a mobile app, has been implemented.
Empowerment of Patients with Hypertension through BPM, IoT and Remote Sensing
Ramírez-Navarro, Javier
2017-01-01
Hypertension affects one in five adults worldwide. Healthcare processes require interdisciplinary cooperation and coordination between medical teams, clinical processes, and patients. The lack of patient empowerment and adherence to treatment makes it necessary to integrate patients, data-collecting devices, and clinical processes. For this reason, in this paper we propose a model based on the Business Process Management paradigm, together with a group of technologies, techniques, and IT principles that increase the benefits of the paradigm. To realize the proposed model, the clinical process of hypertension is analyzed with the objective of detecting weaknesses and improving the process. Once the process is analyzed, an architecture that joins health devices and environmental sensors with an information system is developed. To test the architecture, a web system connected to health monitors and environmental sensors, together with a mobile app, has been implemented. PMID:28976940
Information systems in healthcare - state and steps towards sustainability.
Lenz, R
2009-01-01
To identify core challenges and first steps on the way to sustainable information systems in healthcare. Recent articles on healthcare information technology and related articles from Medical Informatics and Computer Science were reviewed and analyzed. Core challenges that could not be solved over the years are identified. The two core problem areas are process integration, meaning to effectively embed IT systems into routine workflows, and systems integration, meaning to reduce the effort of interconnecting independently developed IT components. Standards for systems integration have improved considerably, but their usefulness is limited where system evolution is needed. Sustainable healthcare information systems should be based on system architectures that support system evolution and avoid costly system replacements every five to ten years. Some basic principles for the design of such systems are separation of concerns, loose coupling, deferred systems design, and service-oriented architectures.
An optoelectronic system for fringe pattern analysis
NASA Astrophysics Data System (ADS)
Sciammarella, C. A.; Ahmadshahi, M.
A system capable of retrieving and processing information recorded in fringe patterns is reported. The principal components are described as well as the architecture in which they are assembled. An example of application is given.
The Multimission Image Processing Laboratory's virtual frame buffer interface
NASA Technical Reports Server (NTRS)
Wolfe, T.
1984-01-01
Large image processing systems use multiple frame buffers with differing architectures and vendor-supplied interfaces. This variety of architectures and interfaces creates software development, maintenance, and portability problems for application programs. Several machine-independent graphics standards such as ANSI Core and GKS are available, but none of them are adequate for image processing. Therefore, the Multimission Image Processing Laboratory project has implemented a programmer-level virtual frame buffer interface. This interface makes all frame buffers appear as a generic frame buffer with a specified set of characteristics. This document defines the virtual frame buffer interface and provides information such as FORTRAN subroutine definitions, frame buffer characteristics, sample programs, etc. It is intended to be used by application programmers and system programmers who are adding new frame buffers to a system.
Malinin, Laura H
2015-01-01
Memoirs by eminently creative people often describe architectural spaces and qualities they believe instrumental to their creativity. However, places designed to encourage creativity have had mixed results, with some found to decrease creative productivity for users. This may be due, in part, to the lack of a suitable empirical theory or model to guide design strategies. Relationships between creative cognition and features of the physical environment remain largely uninvestigated in the scientific literature, despite general agreement among researchers that human cognition is physically and socially situated. This paper investigates what role architectural settings may play in creative processes by examining documented first-person and biographical accounts of creativity with respect to three central theories of situated cognition. First, the embodied thesis argues that cognition encompasses both the mind and the body. Second, the embedded thesis maintains that people exploit features of the physical and social environment to increase their cognitive capabilities. Third, the enaction thesis describes cognition as dependent upon a person's interactions with the world. Common themes inform three propositions, illustrated in a new theoretical framework describing relationships between people and their architectural settings with respect to different cognitive processes of creativity. The framework is intended as a starting point toward an ecological model of creativity, which may be used to guide future creative process research and architectural design strategies to support user creative productivity.
Malinin, Laura H.
2016-01-01
Memoirs by eminently creative people often describe architectural spaces and qualities they believe instrumental to their creativity. However, places designed to encourage creativity have had mixed results, with some found to decrease creative productivity for users. This may be due, in part, to the lack of a suitable empirical theory or model to guide design strategies. Relationships between creative cognition and features of the physical environment remain largely uninvestigated in the scientific literature, despite general agreement among researchers that human cognition is physically and socially situated. This paper investigates what role architectural settings may play in creative processes by examining documented first-person and biographical accounts of creativity with respect to three central theories of situated cognition. First, the embodied thesis argues that cognition encompasses both the mind and the body. Second, the embedded thesis maintains that people exploit features of the physical and social environment to increase their cognitive capabilities. Third, the enaction thesis describes cognition as dependent upon a person’s interactions with the world. Common themes inform three propositions, illustrated in a new theoretical framework describing relationships between people and their architectural settings with respect to different cognitive processes of creativity. The framework is intended as a starting point toward an ecological model of creativity, which may be used to guide future creative process research and architectural design strategies to support user creative productivity. PMID:26779087
ELISA, a demonstrator environment for information systems architecture design
NASA Technical Reports Server (NTRS)
Panem, Chantal
1994-01-01
This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs in common with software developers: sharing of a common database, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to dynamically simulate their systems as early as possible. Software development environments, methods, and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life-cycle activity. In late 1992, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e., a sufficient basis for demonstrations, evaluation, and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the support of Commercial Off-The-Shelf (COTS) tools. ELISA was delivered to CNES in January 1994 and is currently used for demonstrations and evaluations on real projects (e.g., the SPOT4 Satellite Control Center). New evolutions are under way.
Cortical network architecture for context processing in primate brain
Chao, Zenas C; Nagasaka, Yasuo; Fujii, Naotaka
2015-01-01
Context is information linked to a situation that can guide behavior. In the brain, context is encoded by sensory processing and can later be retrieved from memory. How context is communicated within the cortical network in sensory and mnemonic forms is unknown due to the lack of methods for high-resolution, brain-wide neuronal recording and analysis. Here, we report the comprehensive architecture of a cortical network for context processing. Using hemisphere-wide, high-density electrocorticography, we measured large-scale neuronal activity from monkeys observing videos of agents interacting in situations with different contexts. We extracted five context-related network structures including a bottom-up network during encoding and, seconds later, cue-dependent retrieval of the same network with the opposite top-down connectivity. These findings show that context is represented in the cortical network as distributed communication structures with dynamic information flows. This study provides a general methodology for recording and analyzing cortical network neuronal communication during cognition. DOI: http://dx.doi.org/10.7554/eLife.06121.001 PMID:26416139
Fabry-Perot confocal resonator optical associative memory
NASA Astrophysics Data System (ADS)
Burns, Thomas J.; Rogers, Steven K.; Vogel, George A.
1993-03-01
A unique optical associative memory architecture is presented that combines the optical processing environment of a Fabry-Perot confocal resonator with the dynamic storage and recall properties of volume holograms. The confocal resonator reduces the size and complexity of previous associative memory architectures by folding a large number of discrete optical components into an integrated, compact optical processing environment. Experimental results demonstrate the system is capable of recalling a complete object from memory when presented with partial information about the object. A Fourier optics model of the system's operation shows it implements a spatially continuous version of a discrete, binary Hopfield neural network associative memory.
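The discrete, binary Hopfield memory that the resonator is said to implement in spatially continuous form can be sketched in a few lines of numpy; the patterns and sizes below are invented for illustration.

    import numpy as np

    patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                         [1, 1, 1, 1, -1, -1, -1, -1]])
    n = patterns.shape[1]
    W = (patterns.T @ patterns) / n       # Hebbian outer-product storage
    np.fill_diagonal(W, 0)                # no self-connections

    probe = patterns[0].copy()
    probe[:2] = -probe[:2]                # corrupt: only partial information remains
    state = probe.astype(float)
    for _ in range(10):                   # iterate until the network settles
        state = np.sign(W @ state)
    print("recovered pattern 0:", np.array_equal(state, patterns[0]))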
Information Model Translation to Support a Wider Science Community
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Daniel; Ritschel, Bernd; Hardman, Sean; Joyner, Ronald
2014-05-01
The Planetary Data System (PDS), NASA's long-term archive for solar system exploration data, has just released PDS4, a modernization of the PDS architecture, data standards, and technical infrastructure. This next-generation system positions the PDS to meet the demands of the coming decade, including big data, international cooperation, distributed nodes, and multiple ways of analysing and interpreting data. It also addresses three fundamental project goals: providing more efficient data delivery by data providers to the PDS, enabling a stable, long-term usable planetary science data archive, and enabling services for the data consumer to find, access, and use the data they require in contemporary data formats. The PDS4 information architecture is used to describe all PDS data using a common model. Captured in an ontology modeling tool, it supports a hierarchy of data dictionaries built to the ISO/IEC 11179 standard and is designed to increase flexibility, enable complex searches at the product level, and promote interoperability that facilitates data sharing both nationally and internationally. A PDS4 information architecture design requirement stipulates that the content of the information model must be translatable to external data definition languages such as XML Schema, XMI/XML, and RDF/XML. To support Semantic Web standards, we are now mapping the model contents into RDF/XML for SPARQL-capable databases. We are also building a terminological ontology to support virtually unified data retrieval and access. This paper will provide an overview of the PDS4 information architecture, focusing on its domain information model and how the translation and mapping are being accomplished.
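A hedged sketch of what such a translation step can look like using rdflib; the namespace and class names below are invented placeholders, not the actual PDS4 vocabulary.

    from rdflib import Graph, Literal, Namespace, RDF, RDFS

    PDS = Namespace("http://example.org/pds4/")   # placeholder namespace
    g = Graph()
    g.bind("pds", PDS)

    # One model element rendered as RDF triples (terms are illustrative only).
    g.add((PDS.Product_Observational, RDF.type, RDFS.Class))
    g.add((PDS.Product_Observational, RDFS.label, Literal("Observational Product")))
    g.add((PDS.has_target, RDFS.domain, PDS.Product_Observational))

    # RDF/XML output of the kind a SPARQL-capable store could ingest.
    print(g.serialize(format="xml"))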
A semantically-aided architecture for a web-based monitoring system for carotid atherosclerosis.
Kolias, Vassileios D; Stamou, Giorgos; Golemati, Spyretta; Stoitsis, Giannis; Gkekas, Christos D; Liapis, Christos D; Nikita, Konstantina S
2015-08-01
Carotid atherosclerosis is a multifactorial disease and its clinical diagnosis depends on the evaluation of heterogeneous clinical data, such as imaging exams, biochemical tests and the patient's clinical history. The lack of interoperability between Health Information Systems (HIS) does not allow the physicians to acquire all the necessary data for the diagnostic process. In this paper, a semantically-aided architecture is proposed for a web-based monitoring system for carotid atherosclerosis that is able to gather and unify heterogeneous data with the use of an ontology and to create a common interface for data access, enhancing the interoperability of HIS. The architecture is based on an application ontology of carotid atherosclerosis that is used to (a) integrate heterogeneous data sources on the basis of semantic representation and ontological reasoning and (b) access the critical information using SPARQL query rewriting and ontology-based data access services. The architecture was tested over a carotid atherosclerosis dataset consisting of the imaging exams and the clinical profiles of 233 patients, using a set of complex queries constructed by the physicians. The proposed architecture was evaluated with respect to the complexity of the queries that the physicians could make and the retrieval speed. The proposed architecture gave promising results in terms of interoperability, ontology-based integration of heterogeneous data sources, and expanded query and retrieval capabilities in HIS.
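The kind of query such an ontology-based data access layer answers can be illustrated with rdflib and SPARQL; the ontology terms and the single data record below are invented for this sketch.

    from rdflib import Graph, Literal, Namespace

    CA = Namespace("http://example.org/carotid#")   # placeholder ontology namespace
    g = Graph()
    g.add((CA.patient42, CA.hasImagingExam, CA.exam1))
    g.add((CA.exam1, CA.degreeOfStenosis, Literal(78)))

    query = """
    PREFIX ca: <http://example.org/carotid#>
    SELECT ?patient ?stenosis WHERE {
      ?patient ca:hasImagingExam ?exam .
      ?exam ca:degreeOfStenosis ?stenosis .
      FILTER (?stenosis > 70)
    }
    """
    for patient, stenosis in g.query(query):
        print(patient, stenosis)   # physicians' complex criteria compose the same way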
Scalable and Resilient Middleware to Handle Information Exchange during Environment Crisis
NASA Astrophysics Data System (ADS)
Tao, R.; Poslad, S.; Moßgraber, J.; Middleton, S.; Hammitzsch, M.
2012-04-01
The EU FP7 TRIDEC project focuses on enabling real-time, intelligent information management of collaborative, complex, critical decision processes for earth management. A key challenge is to promote a communication infrastructure that facilitates interoperable environment information services during environment events and crises such as tsunamis and drilling incidents, during which increasing volumes and dimensionality of disparate information sources, both sensor-based and human-based, arise and need to be managed. Such a system needs to support: scalable, distributed messaging; asynchronous messaging; open messaging that can handle changing clients, such as new and retired automated systems and human information sources coming online or going offline; flexible data filtering; and heterogeneous access networks (e.g., GSM, WLAN and LAN). In addition, the system needs to be resilient to ICT system problems, e.g., failures, degradation and overloads, during environment events. There are several system middleware choices for TRIDEC based upon a Service-Oriented Architecture (SOA), an Event-Driven Architecture (EDA), Cloud computing, and an Enterprise Service Bus (ESB). In an SOA, everything is a service (e.g., data access, processing and exchange); clients can request services on demand or subscribe to services registered by providers, and interaction is more often synchronous. In an EDA system, events that represent significant changes in state can be processed individually, as streams, or in more complex combinations. Cloud computing is a virtualized, interoperable and elastic resource-allocation model. An ESB, a fundamental component for enterprise messaging, supports synchronous and asynchronous message exchange models and has inbuilt resilience against ICT failure. Our middleware proposal is an ESB-based hybrid architecture model: an SOA extension supports more synchronous workflows; the EDA assists the ESB in handling more complex event processing; Cloud computing can be used to increase and decrease the ESB resources on demand. To reify this hybrid ESB-centric architecture, we will adopt two complementary approaches: an open-source one to improve scalability and resilience, and a commercial one for ultra-fast messaging, with a bridge between the two to support interoperability. In TRIDEC, to manage such a hybrid messaging system, overlay and underlay management techniques will be adopted. The managers (both global and local) will collect, store and update status information (e.g., CPU utilization, free space, number of clients) and balance usage, throughput and delays to improve resilience and scalability. The expected resilience improvements include dynamic failover, self-healing, pre-emptive load balancing and bottleneck prediction, while the expected scalability improvements include capacity estimation, an HTTP bridge, and automatic configuration and reconfiguration (e.g., adding or deleting clients and servers).
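The open, asynchronous messaging style at the heart of the proposal can be sketched as a minimal topic-based publish/subscribe bus; this is an illustration of the pattern only, not TRIDEC project code.

    from collections import defaultdict
    from typing import Callable

    class MessageBus:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
            self._subscribers[topic].append(handler)   # clients may join at any time

        def unsubscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
            self._subscribers[topic].remove(handler)   # ... or retire gracefully

        def publish(self, topic: str, event: dict) -> None:
            for handler in list(self._subscribers[topic]):
                handler(event)   # fan-out; the sender never names its receivers

    bus = MessageBus()
    bus.subscribe("sensor.sea_level", lambda e: print("warning centre got:", e))
    bus.publish("sensor.sea_level", {"station": "buoy-42", "delta_cm": 35})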
Hybridization of Architectural Styles for Integrated Enterprise Information Systems
NASA Astrophysics Data System (ADS)
Bagusyte, Lina; Lupeikiene, Audrone
Current enterprise systems engineering theory does not provide adequate support for the development of information systems on demand; more precisely, such support is still taking shape. This chapter proposes the main architectural decisions that underlie the design of integrated enterprise information systems. It argues for extending service-oriented architecture by merging it with the component-based paradigm at the design stage and using connectors of different architectural styles. The suitability of the general-purpose language SysML for modeling integrated enterprise information system architectures is described, and supporting arguments are presented.
Research of Manufacture Time Management System Based on PLM
NASA Astrophysics Data System (ADS)
Jing, Ni; Juan, Zhu; Liangwei, Zhong
This system targets the machine shops of manufacturing enterprises: it analyzes their business needs and builds a plant management information system for manufacture-time information management covering the manufacturing process. Combining Web technology with an Excel VBA development approach, it constructs a hybrid-model, PLM-based framework for a workshop manufacture-time management information system, and discusses the functionality of the system architecture and the database structure.
An Architecture for Integrated Regional Health Telematics Networks
2001-10-25
that enables informed citizens to have an impact on the healthcare system and to be more concerned and care for their own health. The current...resource, educational, integrated electronic health record (I-EHR), and added-value services [2]. These classes of telematic services are applica...cally distributed clinical information systems. 5) Finally, added-value services (e.g. image processing, information indexing, data pre-fetching
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-19
... Buildings Service; Information Collection; Art-in-Architecture Program National Artist Registry (GSA Form... regarding Art-in-Architecture Program National Artist Registry (GSA Form 7437). The Art-in-Architecture...-Architecture & Fine Arts Division (PCAC), 1800 F Street NW., Room 3305, Washington, DC 20405, at telephone (202...
Information Architecture: Looking Ahead.
ERIC Educational Resources Information Center
Rosenfeld, Louis
2002-01-01
Considers the future of the field of information architecture. Highlights include a comparison with the growth of the field of professional management; the design of information systems since the Web; more demanding users; the need for an interdisciplinary approach; and how to define information architecture. (LRW)
Respecting Relations: Memory Access and Antecedent Retrieval in Incremental Sentence Processing
ERIC Educational Resources Information Center
Kush, Dave W.
2013-01-01
This dissertation uses the processing of anaphoric relations to probe how linguistic information is encoded in and retrieved from memory during real-time sentence comprehension. More specifically, the dissertation attempts to resolve a tension between the demands of a linguistic processor implemented in a general-purpose cognitive architecture and…
Goldman-Rakic, P S
1996-10-29
The functional architecture of prefrontal cortex is central to our understanding of human mentation and cognitive prowess. This region of the brain is often treated as an undifferentiated structure, on the one hand, or as a mosaic of psychological faculties, on the other. This paper focuses on the working memory processor as a specialization of prefrontal cortex and argues that the different areas within prefrontal cortex represent iterations of this function for different information domains, including spatial cognition, object cognition and additionally, in humans, semantic processing. According to this parallel processing architecture, the 'central executive' could be considered an emergent property of multiple domain-specific processors operating interactively. These processors are specializations of different prefrontal cortical areas, each interconnected both with the domain-relevant long-term storage sites in posterior regions of the cortex and with appropriate output pathways.
NASA Technical Reports Server (NTRS)
1985-01-01
The second task in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make key design/programmatic decisions. This volume identifies the preferred options in the technology category and characterizes these options with respect to performance attributes, constraints, cost, and risk. The technology category includes advanced materials, processes, and techniques that can be used to enhance the implementation of SSDS design structures. The specific areas discussed are mass storage, including space and ground on-line storage and off-line storage; man/machine interface; data processing hardware, including flight computers and advanced/fault-tolerant computer architectures; and software, including data compression algorithms, on-board high-level languages, and software tools. Also discussed are artificial intelligence applications and hard-wired communications.
Modulation of the brain's functional network architecture in the transition from wake to sleep
Larson-Prior, Linda J.; Power, Jonathan D.; Vincent, Justin L.; Nolan, Tracy S.; Coalson, Rebecca S.; Zempel, John; Snyder, Abraham Z.; Schlaggar, Bradley L.; Raichle, Marcus E.; Petersen, Steven E.
2013-01-01
The transition from quiet wakeful rest to sleep represents a period over which attention to the external environment fades. Neuroimaging methodologies have provided much information on the shift in neural activity patterns in sleep, but the dynamic restructuring of human brain networks in the transitional period from wake to sleep remains poorly understood. Analysis of electrophysiological measures and functional network connectivity of these early transitional states shows subtle shifts in network architecture that are consistent with reduced external attentiveness and increased internal and self-referential processing. Further, descent to sleep is accompanied by the loss of connectivity in anterior and posterior portions of the default-mode network and more locally organized global network architecture. These data clarify the complex and dynamic nature of the transitional period between wake and sleep and suggest the need for more studies investigating the dynamics of these processes. PMID:21854969
GEARS: An Enterprise Architecture Based On Common Ground Services
NASA Astrophysics Data System (ADS)
Petersen, S.
2014-12-01
Earth observation satellites collect a broad variety of data used in applications that range from weather forecasting to climate monitoring. Within NOAA the National Environmental Satellite Data and Information Service (NESDIS) supports these applications by operating satellites in both geosynchronous and polar orbits. Traditionally NESDIS has acquired and operated its satellites as stand-alone systems with their own command and control, mission management, processing, and distribution systems. As the volume, velocity, veracity, and variety of sensor data and products produced by these systems continue to increase, NESDIS is migrating to a new concept of operation in which it will operate and sustain the ground infrastructure as an integrated Enterprise. Based on a series of common ground services, the Ground Enterprise Architecture System (GEARS) approach promises greater agility, flexibility, and efficiency at reduced cost. This talk describes the new architecture and associated development activities, and presents the results of initial efforts to improve product processing and distribution.
WaveJava: Wavelet-based network computing
NASA Astrophysics Data System (ADS)
Ma, Kun; Jiao, Licheng; Shi, Zhuoer
1997-04-01
Wavelet theory is powerful, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At distributed sites around the net, these data packets undergo matching or recognition processing in parallel. The results are fed back to determine the next operation, so more robust results can be obtained quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system. It also fits other net-based multimedia information processing applications, such as network libraries, remote teaching, and filmless picture archiving and communications.
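The multi-resolution packet idea generalizes across languages; the numpy sketch below shows a one-level Haar wavelet split, where the coarse half can travel in the first packet and the detail half refines it later. This is a language-neutral sketch of the transform, not the WaveJava library itself.

    import numpy as np

    def haar_split(signal: np.ndarray):
        """One Haar analysis step: coarse approximation plus detail coefficients."""
        even, odd = signal[0::2], signal[1::2]
        return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

    def haar_merge(approx: np.ndarray, detail: np.ndarray) -> np.ndarray:
        """Inverse step: reconstruct the finer-resolution signal."""
        out = np.empty(2 * len(approx))
        out[0::2] = (approx + detail) / np.sqrt(2)
        out[1::2] = (approx - detail) / np.sqrt(2)
        return out

    x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
    coarse, detail = haar_split(x)    # 'coarse' is the first packet on the wire
    print(np.allclose(haar_merge(coarse, detail), x))   # details restore the rest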
Eye-fixation behavior, lexical storage, and visual word recognition in a split processing model.
Shillcock, R; Ellison, T M; Monaghan, P
2000-10-01
Some of the implications of a model of visual word recognition in which processing is conditioned by the anatomical splitting of the visual field between the two hemispheres of the brain are explored. The authors investigate the optimal processing of visually presented words within such an architecture, and, for a realistically sized lexicon of English, characterize a computationally optimal fixation point in reading. They demonstrate that this approach motivates a range of behavior observed in reading isolated words and text, including the optimal viewing position and its relationship with the preferred viewing location, the failure to fixate smaller words, asymmetries in hemisphere-specific processing, and the priority given to the exterior letters of words. The authors also show that split architectures facilitate the uptake of all the letter-position information necessary for efficient word recognition and that this information may be less specific than is normally assumed. A split model of word recognition captures a range of behavior in reading that is greater than that covered by existing models of visual word recognition.
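A toy scoring of fixation positions conveys the intuition, though it is far simpler than the authors' analysis: letters left of fixation project to one hemisphere and letters to the right project to the other, and before the two are integrated, each hemifield narrows the lexicon independently. The lexicon and the ambiguity measure below are invented.

    lexicon = ["trick", "track", "truck", "brick", "bride", "pride", "crick"]

    def split_ambiguity(word: str, fix: int) -> int:
        """Product of per-hemifield candidate counts before integration."""
        left, right = word[:fix], word[fix:]
        same_len = [w for w in lexicon if len(w) == len(word)]
        n_left = sum(w.startswith(left) for w in same_len)   # one hemisphere's view
        n_right = sum(w.endswith(right) for w in same_len)   # the other's view
        return n_left * n_right

    word = "trick"
    scores = {fix: split_ambiguity(word, fix) for fix in range(1, len(word))}
    print(scores)   # the fixation with the lowest score is 'optimal' in this toy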
NASA Astrophysics Data System (ADS)
Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun
2014-08-01
To address information fusion, process integration, and collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform based on product data and business processes, adopting modern manufacturing, information, and management techniques. The architecture and system integration of the digital management platform are discussed in this paper. The digital management platform can realize information sharing and interaction across the information flow, control flow, and value stream throughout the life-cycle, starting from user needs, and it can also enhance the process control, collaborative research, and service capability for ultra-precision optical elements.
Modeling aspects of human memory for scientific study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caudell, Thomas P.; Watson, Patrick; McDaniel, Mark A.
Working with leading experts in the fields of cognitive neuroscience and computational intelligence, SNL has developed a computational architecture that represents neurocognitive mechanisms associated with how humans remember experiences in their past. The architecture represents how knowledge is organized and updated through information from individual experiences (episodes) via the cortical-hippocampal declarative memory system. We compared the simulated behavioral characteristics with those of humans measured under well-established experimental standards, controlling for unmodeled aspects of human processing, such as perception. We used this knowledge to create robust simulations of human memory behaviors that should help move the scientific community closer to understanding how humans remember information. These behaviors were experimentally validated against actual human subjects, and the results were published. An important outcome of the validation process will be the joining of specific experimental testing procedures from the field of neuroscience with computational representations from the field of cognitive modeling and simulation.
2015-12-01
response time requirements and in additional calibration requirements for DCFM that may create unexpected latency and latency jitter that can...manage the flight path of the aircraft. For more information about sensor correlation and fusion processes, the Air University New World Vistas ...request/reply actions. We specify its latency as a minimum and maximum of 300 ms. SADataServiceProtocol: an abstraction of the SA data service as a
Electronic processing of informed consents in a global pharmaceutical company environment.
Vishnyakova, Dina; Gobeill, Julien; Oezdemir-Zaech, Fatma; Kreim, Olivier; Vachon, Therese; Clade, Thierry; Haenning, Xavier; Mikhailov, Dmitri; Ruch, Patrick
2014-01-01
We present an electronic capture tool to process informed consents, which must be recorded when running a clinical trial. This tool aims at extracting information expressing the duration of the consent given by the patient to authorize the exploitation of biomarker-related information collected during clinical trials. The system integrates a language detection module (LDM) to route a document to the appropriate information extraction module (IEM). The IEM is based on language-specific sets of linguistic rules for the identification of relevant textual facts. The achieved accuracy of both the LDM and the IEM is 99%. The architecture of the system is described in detail.
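The LDM-to-IEM routing shape can be sketched in a few lines; the stopword lists, the rule stubs, and the function names below are stand-ins for illustration, not the authors' 99%-accurate system.

    STOPWORDS = {
        "en": {"the", "and", "of", "consent", "study"},
        "fr": {"le", "et", "de", "consentement", "étude"},
    }

    RULES = {   # language-specific extraction stubs standing in for real rule sets
        "en": lambda text: "10 years" if "10 years" in text else None,
        "fr": lambda text: "10 ans" if "10 ans" in text else None,
    }

    def detect_language(text: str) -> str:
        """LDM: pick the language whose stopwords overlap the document most."""
        words = set(text.lower().split())
        return max(STOPWORDS, key=lambda lang: len(words & STOPWORDS[lang]))

    def extract_consent_duration(text: str):
        lang = detect_language(text)          # the LDM routes the document ...
        return lang, RULES[lang](text)        # ... to the matching IEM

    print(extract_consent_duration("The patient grants consent for 10 years."))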
Developing the architecture for the Climate Information Portal for Copernicus
NASA Astrophysics Data System (ADS)
Som de Cerff, Wim; Thijsse, Peter; Plieger, Maarten; Pascoe, Stephen; Jukes, Martin; Leadbetter, Adam; Goosen, Hasse; de Vreede, Ernst
2015-04-01
Climate change is impacting the environment, society and policy decisions. Information about climate change is available from many sources, but not all of them are reliable. The CLIPC project is developing a portal to provide a single point of access for authoritative scientific information on climate change. This ambitious objective is made possible through the Copernicus Earth Observation Programme for Europe, which will deliver a new generation of environmental measurements of climate quality. The data about the physical environment used to inform climate change policy and adaptation measures come from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses (syntheses of all available observations constrained with numerical weather prediction systems). These data categories are managed by different communities; CLIPC will provide a single point of access for the whole range of data. Information on data value and limitations will be provided as part of a knowledge base of authoritative climate information. The impacts of climate change on society will generally reflect a range of different environmental and climate system changes, and different sectors and actors within society will react differently to these changes. The CLIPC portal will provide a number of indicators showing impacts on specific sectors, generated using a range of factors selected through structured expert consultation. It will also, as part of the transformation services, allow users to explore the consequences of using different combinations of driving factors which they consider to be of particular relevance to their work or life. The portal will provide information on the scientific quality and pitfalls of such transformations to prevent misleading usage of the results. The CLIPC project will not be able to process a comprehensive range of climate change impacts on the physical environment and society, but will develop an end-to-end processing chain (indicator toolkit), from comprehensive information on the climate state through to highly aggregated decision-relevant products. This processing chain will be demonstrated within three thematic areas: water, rural and urban. Indicators of climate change and climate change impact will be provided, and a toolkit to update and post-process the collection of indicators will be integrated into the portal. For the indicators, three levels (Tiers) have been loosely defined: Tier 1: fields summarising properties of the climate system, e.g. temperature change; Tier 2: expressed in terms of environmental properties outside the climate system, e.g. flooding change; Tier 3: expressed in terms of social and economic impact. For the architecture, CLIPC has two interlocked themes: 1. Harmonised access to climate datasets derived from models, observations and re-analyses; 2. A climate impact toolkit to evaluate, rank and aggregate indicators. For development of the CLIPC architecture an Agile 'storyline' approach is taken. The storyline is a real-world use case and consists of producing a Tier 3 indicator (Urban Heat Vulnerability) and making it available through the CLIPC infrastructure for a user group. In this way architecture concepts can be directly tested and improved. Also, the produced indicator can be shown to users to refine requirements.
Main components of the CLIPC architecture are 1) Data discovery and access, 2) Data processing, 3) Data visualization, 4) Knowledge base and 5) User Management. The main challenge of the Data discovery and access component is to provide harmonized access to various sources of climate data (ngEO, EMODNET/SeaDataNet, ESGF, MyOcean). The discovery service concept will be provided using a CLIPC data and data product catalogue and via a structured data search on selected infrastructures, using NERC vocabulary services and mappings. Data processing will be provided using OGC WPS services, linking to and re-using existing processing services from climate4impact.eu. The processing services will allow users to calculate climate impact indicators (Tiers 1, 2 and 3). Processing wizards will guide users in computing indicators. The PyWPS framework will be used. The CLIPC portal will have its own central viewing service, using OGC standards for interoperability. For the WMS server side the ADAGUC framework will be used. For Tier 3, specifically tailored visualisations will be developed: Tier 3 indicators can be complicated to build and can require manual work from specialists to provide meaningful results before they can be published as, e.g., interactive maps. The CLIPC knowledge base is a set of services that supply explanatory information to users working with CLIPC services. It is structured around 1) a catalogue, containing ISO-standardized metadata, citations, background information and links to data; 2) commentary information, e.g. FAQs, annotation URLs, version information, disclaimers; 3) technical documents, e.g. on using vocabularies and mappings; 4) glossaries, adding and re-using existing glossaries from e.g. EUPORIAS/IS-ENES and IPCC; 5) literature references. CLIPC will have a very lightweight user management system, placing as few barriers before the user as possible. We will make use of OpenID, accepting identities from selected OpenID providers such as Google and ESGF. In the presentation we will show the storyline implementation: the first results of the Tier 3 indicator, the architecture in development, and the lessons learned.
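A minimal example of the kind of Tier 1 indicator the toolkit would wrap as an OGC WPS process (e.g., via PyWPS): counting days per year above a temperature threshold. The synthetic data and the threshold are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(7)
    # Synthetic one-year daily maximum temperature series in degrees C.
    daily_tmax = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, 365)) \
                 + rng.normal(0, 3, 365)

    def heat_days(tmax: np.ndarray, threshold: float = 25.0) -> int:
        """Tier 1 indicator: number of days with tmax above the threshold."""
        return int((tmax > threshold).sum())

    print("days above 25 degC:", heat_days(daily_tmax))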
Design and Decorative Art in Shaping of Architectural Environment Image
NASA Astrophysics Data System (ADS)
Shabalina, N. M.
2017-11-01
The relevance of the topic is determined by the dynamic development of a promising branch, architectural environment design, which requires consideration of the morphology and typology of this art form on the one hand, and of the specificity of the architectural environment's artistic image on the other. The intensive development of innovative computer technologies and materials in modern engineering, together with improved forms of information communication, has led to new methods in design and construction, which in turn require additional methods of content and context analysis for the integrated assessment of socially significant architectural environments. In modern culture, correlative processes are steadily developing that lead to a new understanding of the interaction of architecture, decorative art and design. Their rapprochement at the morphological level has been noted, which makes it possible to reveal a specific method of synthesis and similarity. The architecture of postmodern styles is distinguished by its bionic form, becoming an interactive part of society and approaching painting, sculpture and design in its structural qualities. In the modern world, these processes acquire multi-valued semantic nuances, expand the importance of associativity and dynamic processuality in the perception of environmental objects, and demand the development of new approaches to the assessment of the architectural design environment. Within the framework of the universal paradigm of modern times, the concept of the world develops as a set of systems that live according to the laws of self-organization.
Lewis, Richard L; Shvartsman, Michael; Singh, Satinder
2013-07-01
We explore the idea that eye-movement strategies in reading are precisely adapted to the joint constraints of task structure, task payoff, and processing architecture. We present a model of saccadic control that separates a parametric control policy space from a parametric machine architecture, the latter based on a small set of assumptions derived from research on eye movements in reading (Engbert, Nuthmann, Richter, & Kliegl, 2005; Reichle, Warren, & McConnell, 2009). The eye-control model is embedded in a decision architecture (a machine and policy space) that is capable of performing a simple linguistic task integrating information across saccades. Model predictions are derived by jointly optimizing the control of eye movements and task decisions under payoffs that quantitatively express different desired speed-accuracy trade-offs. The model yields distinct eye-movement predictions for the same task under different payoffs, including single-fixation durations, frequency effects, accuracy effects, and list position effects, and their modulation by task payoff. The predictions are compared to, and found to accord with, eye-movement data obtained from human participants performing the same task under the same payoffs, but they are found not to accord as well when the assumptions concerning payoff optimization and processing architecture are varied. These results extend work on rational analysis of oculomotor control and adaptation of reading strategy (Bicknell & Levy; McConkie, Rayner, & Wilson, 1973; Norris, 2009; Wotschack, 2009) by providing evidence for adaptation at low levels of saccadic control that is shaped by quantitatively varying task demands and the dynamics of processing architecture. Copyright © 2013 Cognitive Science Society, Inc.
Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning
2007-01-01
The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information that underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A substantial share of the budget of large-scale clinical studies often goes to data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use-case scenarios. The architecture contains four layers: data storage and access, decentralized at their production source; a connector as a proxy between the CIS and the external world; an information mediator as a data access point; and the client side. The proposed design will be implemented at six clinical centers participating in the @neurIST project as part of a larger system on data integration and reuse for aneurysm treatment.
Optical resonators and neural networks
NASA Astrophysics Data System (ADS)
Anderson, Dana Z.
1986-08-01
It may be possible to implement neural network models using continuous-field optical architectures. These devices offer the inherent parallelism of propagating waves and an information density in principle dictated by the wavelength of light and the quality of the bulk optical elements. Few components are needed to construct a relatively large equivalent network. Various associative memories based on optical resonators have been demonstrated in the literature; a ring resonator design is discussed in detail here. Information is stored in a holographic medium and recalled through a competitive process in the gain medium supplying energy to the ring resonator. The resonator memory is the first realized example of a neural network function implemented with this kind of architecture.
New Course Design: Classification Schemes and Information Architecture.
ERIC Educational Resources Information Center
Weinberg, Bella Hass
2002-01-01
Describes a course developed at St. John's University (New York) in the Division of Library and Information Science that relates traditional classification schemes to information architecture and Web sites. Highlights include functional aspects of information architecture, that is, the way content is structured; assignments; student reactions; and…
ERIC Educational Resources Information Center
Suwal, Sunil; Singh, Vishal
2018-01-01
Building Information Modelling (BIM) tools and processes are increasingly adopted and implemented in the construction industry. Consequently, BIM education is considered increasingly important in Architecture, Engineering and Construction (AEC) education. While most of the research and literature on BIM education in engineering studies has focused…
Assessing and Managing Quality of Information Assurance
2010-11-01
such as firewalls, antivirus scanning tools and mechanisms for user authentication and authorization. Advanced mission-critical systems often...imply increased risk to DoD information systems. The Process and Organizational Maturity (POM) class focuses on the maturity of the software and...include architectural quality. Common Weakness Enumeration (CWE) is a recent example that highlights the connection between software quality and
Multimedia And Internetworking Architecture Infrastructure On Interactive E-Learning System
NASA Astrophysics Data System (ADS)
Indah, K. A. T.; Sukarata, G.
2018-01-01
Interactive e-learning is a distance learning method that uses information technology and electronic or computer systems as the means of a learning system for the teaching and learning process, implemented without direct face-to-face contact between teacher and student. A strong dependence on emerging technologies greatly influences the way the architecture is designed to produce a powerful interactive e-learning network. In this paper, an architectural model is analyzed in which learning can be done interactively, involving many participants (N-way synchronized distance learning), using video conferencing technology. A broadband Internet network and multicast techniques are also used so that bandwidth usage can be made efficient.
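The bandwidth argument for multicast is that one stream reaches N participants without N unicast copies. A receiver joins a group with the standard socket API, as in this sketch; the group address and port are placeholders.

    import socket
    import struct

    GROUP, PORT = "239.1.1.1", 5004   # administratively scoped multicast address

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    sock.settimeout(5.0)

    # Ask the kernel to join the group on the default interface.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    data, addr = sock.recvfrom(2048)  # every receiver sees the same single stream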
Signaling Architectures that Transmit Unidirectional Information Despite Retroactivity.
Shah, Rushina; Del Vecchio, Domitilla
2017-08-08
A signaling pathway transmits information from an upstream system to downstream systems, ideally in a unidirectional fashion. A key obstacle to unidirectional transmission is retroactivity, the additional reaction flux that affects a system once its species interact with those of downstream systems. This raises the fundamental question of whether signaling pathways have developed specialized architectures that overcome retroactivity and transmit unidirectional signals. Here, we propose a general procedure based on mathematical analysis that provides an answer to this question. Using this procedure, we analyze the ability of a variety of signaling architectures to transmit one-way (from upstream to downstream) signals, as key biological parameters are tuned. We find that single stage phosphorylation and phosphotransfer systems that transmit signals from a kinase show a stringent design tradeoff that hampers their ability to overcome retroactivity. Interestingly, cascades of these architectures, which are highly represented in nature, can overcome this tradeoff and thus enable unidirectional transmission. By contrast, phosphotransfer systems, and single and double phosphorylation cycles that transmit signals from a substrate, are unable to mitigate retroactivity effects, even when cascaded, and hence are not well suited for unidirectional information transmission. These results are largely independent of the specific reaction-rate constant values, and depend on the topology of the architectures. Our results therefore identify signaling architectures that, allowing unidirectional transmission of signals, embody modular processes that conserve their input/output behavior across multiple contexts. These findings can be used to decompose natural signal transduction networks into modules, and at the same time, they establish a library of devices that can be used in synthetic biology to facilitate modular circuit design. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
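The retroactivity effect itself is easy to reproduce in a minimal ODE sketch: a phosphorylation cycle tracks a time-varying input, and binding of its output X* to a downstream target adds a reaction flux that attenuates the response. All rate constants and concentrations below are invented, and this is a generic cycle, not one of the paper's analyzed architectures.

    import numpy as np
    from scipy.integrate import solve_ivp

    k_on, k_off = 10.0, 10.0        # downstream binding (the retroactivity source)
    k_phos, k_dephos = 1.0, 1.0     # cycle rates, input-modulated

    def cycle(t, y, p_total):
        x_star, c = y                       # free X* and the X*:target complex C
        x = 1.0 - x_star - c                # conservation of total X (normalized)
        u = 1.0 + 0.5 * np.sin(t)           # time-varying upstream input
        dc = k_on * x_star * (p_total - c) - k_off * c
        dx_star = k_phos * u * x - k_dephos * x_star - dc
        return [dx_star, dc]

    t_eval = np.linspace(0, 20, 400)
    isolated = solve_ivp(cycle, (0, 20), [0.0, 0.0], args=(0.0,), t_eval=t_eval)
    loaded = solve_ivp(cycle, (0, 20), [0.0, 0.0], args=(2.0,), t_eval=t_eval)
    print("peak X*, no load  :", isolated.y[0].max())
    print("peak X*, with load:", loaded.y[0].max())   # attenuated by retroactivity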
Conceptual information processing: A robust approach to KBS-DBMS integration
NASA Technical Reports Server (NTRS)
Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond
1987-01-01
Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept-based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.
On-chip visual perception of motion: a bio-inspired connectionist model on FPGA.
Torres-Huitzil, César; Girau, Bernard; Castellanos-Sánchez, Claudio
2005-01-01
Visual motion provides useful information for understanding the dynamics of a scene, allowing intelligent systems to interact with their environment. Motion computation is usually constrained by real-time requirements that call for the design and implementation of specific hardware architectures. In this paper, the design of a hardware architecture for a bio-inspired neural model for motion estimation is presented. The motion estimation is based on a strongly localized bio-inspired connectionist model with a particular adaptation of spatio-temporal Gabor-like filtering. The architecture consists of three main modules that perform spatial, temporal, and excitatory-inhibitory connectionist processing. The biomimetic architecture is modeled, simulated, and validated in VHDL. The synthesis results on a Field Programmable Gate Array (FPGA) device show that real-time performance is potentially achievable at an affordable silicon area.
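The spatial half of Gabor-like filtering is compact enough to sketch in numpy; the kernel parameters are invented, and the paper's FPGA design is a spatio-temporal, connectionist variant that this software sketch only gestures at.

    import numpy as np
    from scipy.signal import convolve2d

    def gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.5):
        """2-D Gabor filter: an oriented sinusoid under a Gaussian envelope."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along orientation
        env = np.exp(-(x**2 + y**2) / (2 * sigma**2))
        return env * np.cos(2 * np.pi * xr / wavelength)

    frame = np.random.rand(64, 64)            # stand-in for one video frame
    response = convolve2d(frame, gabor_kernel(theta=np.pi / 4), mode="same")
    print(response.shape)                      # one orientation channel's output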
Architectural design of the science complex at Elizabeth City State University
NASA Technical Reports Server (NTRS)
Jahromi, Soheila
1993-01-01
This paper gives an overall view of the architectural design process and its elements in taking an idea from conception to execution. The project presented is an example of this process. Once the need for a new structure is established, an architect studies the requirements, opinions and limits in creating a structure that people will exist in, move through, and use. Elements in designing a building include factors such as volume and surface, light and form, changes of scale and view, movement and stasis. Other factors are functions and the physical conditions of construction. Based on experience, intuition, and boundaries, an architect will utilize all elements in creating a new building. In general, the design process begins with studying the spatial needs, which develop into an architectural program. A comprehensive and accurate architectural program is essential for having a successful building. Even the most attractive building, if it does not meet the functional needs of its users, has failed at the primary reason for its existence. To have a good program an architect must have a full understanding of the daily functions that will take place in the building. The architectural program, along with site characteristics, is among the important guidelines in studying the form, adjacencies, and circulation for the structure itself and also in relation to the adjacent structures. Conceptual studies are part of the schematic design, which is the first milestone in the design process. The other reference points are design development and construction documents. At each milestone, review and coordination with all the consultants are established, and the user's input is essential in refining the project. In the design development phase, conceptual diagrams take shape, and architectural, structural, mechanical, and electrical systems are developed. The final phase, construction documents, conveys all the information required to construct the building. The design process and elements described were applied in the following project.
An Information Technology Architecture for Pharmaceutical Research and Development
Klingler, Daniel E.; Jaffe, Marvin E.
1990-01-01
Rationale for and development of an information technology architecture are presented. The architectural approach described produces a technology environment that is integrating, flexible, robust, productive, and future-oriented. Issues accompanying architecture development and potential impediments to success are discussed.
NASA Astrophysics Data System (ADS)
Koshkina, S.; Ostrinskaya, L.
2018-04-01
An information model for “key” quality indicators of goods has been developed. The model is based on an assessment of the existing state of standardization and of product labeling quality. In the authors' opinion, the proposed “key” indicators are the most significant for purchasing decision making. Customers will be able to use this model through their mobile devices. The developed model makes it possible to decompose existing processes into data flows and to reveal the levels of possible architectural solutions. In-depth analysis of the presented information model's decomposition levels will make it possible, in further research, to determine the stages of its improvement and to reveal additional indicators of goods quality that are of interest to customers. Examining the architectural solutions for the functioning of the customer's information environment when integrating existing databases will allow us to determine the boundaries of the model's flexibility and customizability.
Adaptive neuro-heuristic hybrid model for fruit peel defects detection.
Woźniak, Marcin; Połap, Dawid
2018-02-01
The fusion of machine learning methods benefits decision support systems: composing approaches makes it possible to combine the most effective features of each into one solution. In this article we present an adaptive method based on the fusion of a proposed novel neural architecture and heuristic search into one co-working solution. The proposed neural network architecture adapts to the processed input while co-working with a heuristic method used to precisely detect areas of interest. Input images are first decomposed into segments. This makes processing easier, since on smaller images (the decomposed segments) the developed Adaptive Artificial Neural Network (AANN) processes less information, which makes the numerical calculations more precise. For each segment, a descriptor vector is composed and presented to the proposed AANN architecture. Evaluation is run adaptively, with the AANN adapting its architecture to the inputs and their features. After evaluation, selected segments are forwarded to the heuristic search, which detects areas of interest. As a result, the system returns the image with pixels located over peel damages. Experimental results on the developed solution are discussed and compared with other commonly used methods to validate the efficacy of the proposed fusion and its impact on the system structure, the training process, and the classification results. Copyright © 2017 Elsevier Ltd. All rights reserved.
Fuselets: an agent based architecture for fusion of heterogeneous information and data
NASA Astrophysics Data System (ADS)
Beyerer, Jürgen; Heizmann, Michael; Sander, Jennifer
2006-04-01
A new architecture for fusing information and data from heterogeneous sources is proposed. The approach takes criminalistics as a model. In analogy to the work of detectives, who attempt to investigate crimes, software agents are initiated that pursue clues and try to consolidate or dismiss hypotheses. Like their human counterparts, they can, if questions beyond their competence arise, consult expert agents. Within the context of a certain task, region, and time interval, specialized operations are applied to each relevant information source, e.g. IMINT, SIGINT, ACINT,..., HUMINT, databases etc., in order to establish hit lists of first clues. Each clue is described by its pertaining facts, uncertainties, and dependencies in the form of a local degree-of-belief (DoB) distribution in a Bayesian sense. For each clue an agent is initiated which cooperates with other agents and experts. Expert agents help make use of the different information sources: consultations of experts, who are capable of accessing certain information sources, result in changes of the DoB of the pertaining clue. According to the significance of the concentration of their DoB distributions, clues are abandoned or pursued further to formulate task-specific hypotheses. Communication between the agents serves to find out whether different clues belong to the same cause and thus can be put together. At the end of the investigation process, the different hypotheses are evaluated by a jury and a final report is created that constitutes the fusion result. The proposed approach avoids calculating global DoB distributions by adopting a local Bayesian approximation and thus substantially reduces the complexity of the exact problem. Different information sources are transformed into DoB distributions using the maximum entropy paradigm, with known facts treated as constraints. Nominal, ordinal and cardinal quantities can be treated equally within this framework. The architecture is scalable by tailoring the number of agents to the available computer resources, the priority of tasks, and the maximum duration of the fusion process. Furthermore, the architecture allows cooperative work of human and automated agents and experts, as long as not all subtasks can be accomplished automatically.
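The local degree-of-belief bookkeeping can be sketched in a few lines. In the hypothetical Python fragment below, a clue holds a discrete DoB over hypotheses, an expert consultation enters as a likelihood, and a simple concentration measure decides whether to pursue the clue; the hypotheses and all numbers are invented for illustration.

    # Hedged sketch of the DoB update described above: a clue keeps a
    # discrete distribution over hypotheses and an expert consultation
    # enters as a likelihood. Values are illustrative.

    def consult_expert(dob, likelihood):
        posterior = {h: dob[h] * likelihood.get(h, 1.0) for h in dob}
        z = sum(posterior.values())
        return {h: p / z for h, p in posterior.items()}

    def concentration(dob):
        # simple significance measure: mass on the best hypothesis
        return max(dob.values())

    clue = {"hypothesis_A": 0.5, "hypothesis_B": 0.5}
    clue = consult_expert(clue, {"hypothesis_A": 0.9, "hypothesis_B": 0.2})
    print(clue, "pursue further" if concentration(clue) > 0.7 else "keep open")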
Supporting diagnosis and treatment in medical care based on Big Data processing.
Lupşe, Oana-Sorina; Crişan-Vida, Mihaela; Stoicu-Tivadar, Lăcrămioara; Bernard, Elena
2014-01-01
With information and data in all domains growing every day, it is difficult to manage and extract useful knowledge for specific situations. This paper presents an integrated system architecture to support the activity of Ob-Gyn departments, with further developments in using new technology for Big Data processing - using Google BigQuery - in the medical domain. The data collected and processed with Google BigQuery come from different sources: two Obstetrics & Gynaecology Departments, the TreatSuggest application - an application for suggesting treatments - and a home foetal surveillance system. Data are uploaded into Google BigQuery from Bega Hospital Timişoara, Romania. The analysed data are useful for medical staff, researchers and statisticians in the public health domain. The current work describes the technological architecture and its processing possibilities, which future work will evaluate against quality criteria to show that it leads to a better decision process in diagnosis and public health.
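For readers unfamiliar with the workflow, a minimal query against such a repository might look like the sketch below; the dataset, table, and column names are hypothetical, and running it requires an installed and authenticated google-cloud-bigquery client.

    # Hedged sketch of the kind of aggregation run on Google BigQuery;
    # the table and columns are hypothetical, not the paper's schema.
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
        SELECT diagnosis, COUNT(*) AS n_cases
        FROM `hospital_dataset.obgyn_admissions`   -- hypothetical table
        GROUP BY diagnosis
        ORDER BY n_cases DESC
        LIMIT 10
    """
    for row in client.query(sql).result():
        print(row.diagnosis, row.n_cases)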
A new flight control and management system architecture and configuration
NASA Astrophysics Data System (ADS)
Kong, Fan-e.; Chen, Zongji
2006-11-01
The advanced fighter should possess capabilities such as supersonic cruising, stealth, agility, STOVL (Short Take-Off Vertical Landing), and powerful communication and information processing. For this purpose, it is not enough to improve only the aerodynamic and propulsion systems; it is also necessary to enhance the control system. A complete flight control system provides not only autopilot, auto-throttle and control augmentation, but also management of the given mission. The F-22 and JSF possess considerably outstanding flight control systems built on the Pave Pillar and Pave Pace avionics architectures, but their control architecture is not sufficiently integrated. The main purpose of this paper is to build a novel fighter control system architecture. A control system constructed on this architecture should be well integrated, inexpensive, fault-tolerant, safe, reliable and effective, and it will take charge of both flight control and mission management. Starting from this purpose, the paper proceeds as follows: First, based on human nervous control, a three-level hierarchical control architecture is proposed. At the top of the architecture, the decision level is in charge of decision-making. In the middle, the organization & coordination level schedules resources, monitors the states of the fighter, switches control modes, etc. At the bottom is the execution level, which performs the concrete actuation and measurement. Then, according to their functions and resources, all tasks involving flight control and mission management are assigned to the appropriate level. Finally, in order to validate the three-level architecture, a physical configuration is also shown. The configuration is distributed and applies recent advances from the information technology industry such as line-replaceable modules and cluster technology.
Velsko, Stephan; Bates, Thomas
2016-01-01
Despite numerous calls for improvement, the US biosurveillance enterprise remains a patchwork of uncoordinated systems that fail to take advantage of the rapid progress in information processing, communication, and analytics made in the past decade. By synthesizing components from the extensive biosurveillance literature, we propose a conceptual framework for a national biosurveillance architecture and provide suggestions for implementation. The framework differs from the current federal biosurveillance development pathway in that it is not focused on systems useful for "situational awareness" but is instead focused on the long-term goal of having true warning capabilities. Therefore, a guiding design objective is the ability to digitally detect emerging threats that span jurisdictional boundaries, because attempting to solve the most challenging biosurveillance problem first provides the strongest foundation to meet simpler surveillance objectives. Core components of the vision are: (1) a whole-of-government approach to support currently disparate federal surveillance efforts that have a common data need, including those for food safety, vaccine and medical product safety, and infectious disease surveillance; (2) an information architecture that enables secure national access to electronic health records, yet does not require that data be sent to a centralized location for surveillance analysis; (3) an inference architecture that leverages advances in "big data" analytics and learning inference engines-a significant departure from the statistical process control paradigm that underpins nearly all current syndromic surveillance systems; and (4) an organizational architecture with a governance model aimed at establishing national biosurveillance as a critical part of the US national infrastructure. Although it will take many years to implement, and a national campaign of education and debate to acquire public buy-in for such a comprehensive system, the potential benefits warrant increased consideration by the US government.
Connecting Architecture and Implementation
NASA Astrophysics Data System (ADS)
Buchgeher, Georg; Weinreich, Rainer
Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.
Parallel Ada benchmarks for the SVMS
NASA Technical Reports Server (NTRS)
Collard, Philippe E.
1990-01-01
The use of the parallel processing paradigm to design and develop faster and more reliable computers appears to clearly mark the future of information processing. NASA started the development of such an architecture: the Spaceborne VHSIC Multi-processor System (SVMS). Ada will be one of the languages used to program the SVMS. One of the unique characteristics of Ada is that it supports parallel processing at the language level through its tasking constructs. It is important for the SVMS project team to assess how efficiently the SVMS architecture will be implemented, as well as how efficiently the Ada environment will be ported to the SVMS. AUTOCLASS II, a Bayesian classifier written in Common Lisp, was selected as one of the benchmarks for SVMS configurations. The purpose of the R and D effort was to provide the SVMS project team with a version of AUTOCLASS II, written in Ada, that would make use of Ada tasking constructs as much as possible so as to constitute a suitable benchmark. Additionally, a set of programs was developed to measure Ada tasking efficiency on parallel architectures and to determine the critical parameters influencing tasking efficiency. All this was designed to provide the SVMS project team with a set of suitable tools for the development of the SVMS architecture.
Advanced Information Processing System (AIPS)
NASA Technical Reports Server (NTRS)
Pitts, Felix L.
1993-01-01
Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description is given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.
An application of business process method to the clinical efficiency of hospital.
Leu, Jun-Der; Huang, Yu-Tsung
2011-06-01
The concept of Total Quality Management (TQM) has come to be applied in healthcare over the last few years. The process management category in the Baldrige Health Care Criteria for Performance Excellence model is designed to evaluate the quality of medical services. However, a systematic approach for implementation support is necessary to achieve excellence in the healthcare business process. The Architecture of Integrated Information Systems (ARIS) is a business process architecture developed by IDS Scheer AG that has been applied in a variety of industrial applications. It starts with a business strategy to identify the core and support processes, and encompasses the whole life-cycle range, from business process design to information system deployment, which is compatible with the concept of the healthcare performance excellence criteria. In this research, we apply the basic ARIS framework to optimize the clinical processes of an emergency department in a mid-size hospital with 300 clinical beds, while considering the characteristics of the healthcare organization. Implementation of the case is described, and 16 months of clinical data are then collected and used to study the performance and feasibility of the method. The experience gleaned in this case study can be used as a reference for mid-size hospitals with similar business models.
NASA Astrophysics Data System (ADS)
Duda, James L.; Mulligan, Joseph; Valenti, James; Wenkel, Michael
2005-01-01
A key feature of the National Polar-orbiting Operational Environmental Satellite System (NPOESS) is Northrop Grumman Space Technology's patent-pending data routing and retrieval architecture called SafetyNet™. The SafetyNet™ ground system architecture for NPOESS, combined with the Interface Data Processing Segment (IDPS), will provide low data latency and high data availability to its customers. NPOESS will cut the time between observation and delivery by a factor of four compared with today's space-based weather systems, the Defense Meteorological Satellite Program (DMSP) and NOAA's Polar-orbiting Operational Environmental Satellites (POES). SafetyNet™ will be a key element of the NPOESS architecture, delivering near real-time data over commercial telecommunications networks. Scattered around the globe, the 15 unmanned ground receptors are linked by fiber-optic systems to four central data processing centers in the U.S. known as Weather Centrals. The National Environmental Satellite, Data and Information Service; the Air Force Weather Agency; the Fleet Numerical Meteorology and Oceanography Center; and the Naval Oceanographic Office operate the Centrals. In addition, this ground system architecture will have unused capacity and an infrastructure that can accommodate additional users.
Ando, S; Sekine, S; Mita, M; Katsuo, S
1989-12-15
An architecture and algorithms for matrix multiplication using optical flip-flops (OFFs) in optical processors are proposed based on residue arithmetic. The proposed system is capable of processing all elements of the matrices in parallel, utilizing the information retrieving ability of optical Fourier processors. The employment of OFFs enables bidirectional data flow, leading to a simpler architecture, and the contribution of residue-to-decimal (or residue-to-binary) conversion to operation time can be largely reduced by processing all elements in parallel. The calculated characteristics of operation time suggest a promising use of the system in real-time 2-D linear transforms.
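The residue-arithmetic scheme is easy to mimic in software. The sketch below is a numerical analogue, not the optical implementation: matrices are multiplied independently in each residue channel, which is where the element-parallelism lives, and the decimal result is recovered with the Chinese Remainder Theorem; the moduli are illustrative.

    import numpy as np
    from math import prod

    # Pairwise-coprime moduli (illustrative); their product bounds results.
    MODULI = (5, 7, 9, 11)

    def crt(residues):
        # Chinese Remainder Theorem: residues back to a decimal integer
        M = prod(MODULI)
        x = 0
        for r, m in zip(residues, MODULI):
            Mi = M // m
            x += r * Mi * pow(Mi, -1, m)   # modular inverse (Python 3.8+)
        return x % M

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])
    # parallel residue channels, one matrix product per modulus
    channels = [(A % m) @ (B % m) % m for m in MODULI]
    # elementwise residue-to-decimal conversion
    C = np.vectorize(lambda *rs: crt(rs))(*channels)
    assert (C == A @ B).all()
    print(C)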
Reference Architecture Model Enabling Standards Interoperability.
Blobel, Bernd
2017-01-01
Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, but also the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential and even defining part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.
McEwan, Reed; Melton, Genevieve B; Knoll, Benjamin C; Wang, Yan; Hultman, Gretchen; Dale, Justin L; Meyer, Tim; Pakhomov, Serguei V
2016-01-01
Many design considerations must be addressed in order to provide researchers with full-text and semantic search of unstructured healthcare data such as clinical notes and reports. Institutions looking to provide this functionality must also address the big data aspects of their unstructured corpora. Because these systems are complex and demand a non-trivial investment, there is an incentive to make the system capable of servicing future needs as well, further complicating the design. We present architectural best practices as lessons learned in the design and implementation of NLP-PIER (Patient Information Extraction for Research), a scalable, extensible, and secure system for processing, indexing, and searching clinical notes at the University of Minnesota.
Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.
2018-01-01
The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
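The dispatch logic at the heart of such a framework can be summarized in a short sketch. The Python below is a hypothetical stand-in for the Data Cruncher's routing step, not GPIES code: new files are routed by type to a matching reduction pipeline and the products are indexed.

    # Hypothetical sketch of automated pipeline dispatch; names invented.
    from dataclasses import dataclass

    @dataclass
    class RawFile:
        name: str
        kind: str   # 'spec', 'pol', or 'cal'

    def reduce_spectroscopy(f): return f.name + ".spec_reduced"
    def reduce_polarimetry(f): return f.name + ".pol_reduced"
    def reduce_calibration(f): return f.name + ".cal_product"

    PIPELINES = {"spec": reduce_spectroscopy,
                 "pol": reduce_polarimetry,
                 "cal": reduce_calibration}
    index = []   # stand-in for the database index

    def on_new_file(f: RawFile):
        product = PIPELINES[f.kind](f)   # no human intervention
        index.append(product)
        return product

    print(on_new_file(RawFile("gpies_0001.fits", "spec")), len(index))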
39 CFR 501.7 - Postage Evidencing System requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
Specifies that Postage Evidencing Systems must meet the Performance Criteria for Information-Based Indicia and Security Architecture for Open IBI Postage Evidencing Systems or the Performance Criteria for Information-Based Indicia and Security Architecture for Closed IBI Postage Evidencing Systems.
39 CFR 501.7 - Postage Evidencing System requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
Specifies that Postage Evidencing Systems must meet the Performance Criteria for Information-Based Indicia and Security Architecture for Open IBI Postage Evidencing Systems or the Performance Criteria for Information-Based Indicia and Security Architecture for Closed IBI Postage Evidencing Systems.
Bioinspired decision architectures containing host and microbiome processing units.
Heyde, K C; Gallagher, P W; Ruder, W C
2016-09-27
Biomimetic robots have been used to explore and explain natural phenomena ranging from the coordination of ants to the locomotion of lizards. Here, we developed a series of decision architectures inspired by the information exchange between a host organism and its microbiome. We first modeled the biochemical exchanges of a population of synthetically engineered E. coli. We then built a physical, differential-drive robot that contained an integrated, onboard computer vision system. A relay was established between the simulated population of cells and the robot's microcontroller. By placing the robot within a target-containing two-dimensional arena, we explored how different aspects of the simulated cells and the robot's microcontroller could be integrated to form hybrid decision architectures. We found that distinct decision architectures allow us to develop models of computation with specific strengths such as runtime efficiency or minimal memory allocation. Taken together, our hybrid decision architectures provide a new strategy for developing bioinspired control systems that integrate both living and nonliving components.
Papale, Paolo; Chiesi, Leonardo; Rampinini, Alessandra C.; Pietrini, Pietro; Ricciardi, Emiliano
2016-01-01
In recent decades, the rapid growth of functional brain imaging methodologies has allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting ‘neuro-architecture’ as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. More precisely, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations in regard to people–environment relationships, and even provide empirical foundations for a renewed evidence-based design theory. PMID:27375542
Space Communications Technology Conference: Onboard Processing and Switching
NASA Technical Reports Server (NTRS)
1991-01-01
Papers and presentations from the conference are presented. The topics covered include the following: satellite network architecture, network control and protocols, fault tolerance and autonomy, multichannel demultiplexing and demodulation, information switching and routing, modulation and coding, and planned satellite communications systems.
Using architectures for semantic interoperability to create journal clubs for emergency response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, James E; Collins, Linn M; Martinez, Mark L B
2009-01-01
In certain types of 'slow burn' emergencies, careful accumulation and evaluation of information can offer a crucial advantage. The SARS outbreak in the first decade of the 21st century was such an event, and ad hoc journal clubs played a critical role in assisting scientific and technical responders in identifying and developing various strategies for halting what could have become a dangerous pandemic. This research-in-progress paper describes a process for leveraging emerging semantic web and digital library architectures and standards to (1) create a focused collection of bibliographic metadata, (2) extract semantic information, (3) convert it to the Resource Description Framework/Extensible Markup Language (RDF/XML), and (4) integrate it so that scientific and technical responders can share and explore critical information in the collections.
Quantum information processing with trapped ions
NASA Astrophysics Data System (ADS)
Gaebler, John
2013-03-01
Trapped ions are one promising architecture for scalable quantum information processing. Ion qubits are held in multizone traps created from segmented arrays of electrodes and transported between trap zones using time-varying electric potentials applied to the electrodes. Quantum information is stored in the ions' internal hyperfine states, and quantum gates to manipulate the internal states and create entanglement are performed with laser beams and microwaves. Recently we have made progress in speeding up the ion transport and cooling processes that were the limiting tasks for the operation speed in previous experiments. We are also exploring improved two-qubit gates and new methods for creating ion entanglement. This work was supported by IARPA, ARO contract No. EAO139840, ONR and the NIST Quantum Information Program.
Glaser, John P
2008-01-01
Partners Healthcare, and its affiliated hospitals, have a long track record of accomplishments in clinical information systems implementations and research. Seven ideas have shaped the information systems strategies and tactics at Partners; centrality of processes, organizational partnerships, progressive incrementalism, agility, architecture, embedded research, and engage the field. This article reviews the ideas and discusses the rationale and steps taken to put the ideas into practice.
Memristive crypto primitive for building highly secure physical unclonable functions
NASA Astrophysics Data System (ADS)
Gao, Yansong; Ranasinghe, Damith C.; Al-Sarawi, Said F.; Kavehei, Omid; Abbott, Derek
2015-08-01
Physical unclonable functions (PUFs) exploit the intrinsic complexity and irreproducibility of physical systems to generate secret information. The advantage is that PUFs have the potential to provide fundamentally higher security than traditional cryptographic methods by preventing the cloning of devices and the extraction of secret keys. Most PUF designs focus on exploiting process variations in Complementary Metal Oxide Semiconductor (CMOS) technology. In recent years, progress in nanoelectronic devices such as memristors has demonstrated the prevalence of process variations in scaling electronics down to the nano region. In this paper, we exploit the extremely large information density available in nanocrossbar architectures and the significant resistance variations of memristors to develop an on-chip memristive device based strong PUF (mrSPUF). Our novel architecture demonstrates desirable characteristics of PUFs, including uniqueness, reliability, and large number of challenge-response pairs (CRPs) and desirable characteristics of strong PUFs. More significantly, in contrast to most existing PUFs, our PUF can act as a reconfigurable PUF (rPUF) without additional hardware and is of benefit to applications needing revocation or update of secure key information.
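A software caricature of the challenge-response idea is given below: process variation is modeled as random per-device resistances in a crossbar, a challenge selects device subsets, and the response bit compares their aggregate resistances. This illustrates the CRP concept under assumed parameters; it is not the mrSPUF circuit.

    import numpy as np

    # Illustrative sketch (not the mrSPUF design): a fixed seed stands in
    # for one "chip" whose device resistances were set by process variation.
    rng = np.random.default_rng(seed=1)
    crossbar = rng.lognormal(mean=0.0, sigma=0.3, size=(64, 64))

    def response(challenge_bits):
        # the challenge splits rows into two sets; the response bit
        # compares the total resistance along the two selections
        rows = np.flatnonzero(challenge_bits)
        others = np.flatnonzero(1 - challenge_bits)
        return int(crossbar[rows].sum() > crossbar[others].sum())

    challenge = rng.integers(0, 2, size=64)
    print("response bit:", response(challenge))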
Optimal causal inference: estimating stored information and approximating causal architecture.
Still, Susanne; Crutchfield, James P; Ellison, Christopher J
2010-09-01
We introduce an approach to inferring the causal architecture of stochastic dynamical systems that extends rate-distortion theory to use causal shielding--a natural principle of learning. We study two distinct cases of causal inference: optimal causal filtering and optimal causal estimation. Filtering corresponds to the ideal case in which the probability distribution of measurement sequences is known, giving a principled method to approximate a system's causal structure at a desired level of representation. We show that in the limit in which a model-complexity constraint is relaxed, filtering finds the exact causal architecture of a stochastic dynamical system, known as the causal-state partition. From this, one can estimate the amount of historical information the process stores. More generally, causal filtering finds a graded model-complexity hierarchy of approximations to the causal architecture. Abrupt changes in the hierarchy, as a function of approximation, capture distinct scales of structural organization. For nonideal cases with finite data, we show how the correct number of the underlying causal states can be found by optimal causal estimation. A previously derived model-complexity control term allows us to correct for the effect of statistical fluctuations in probability estimates and thereby avoid overfitting.
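A toy version of the causal-state construction can be coded directly: group histories that predict approximately the same next-symbol distribution. The Python sketch below does this for a binary sequence; the history length, tolerance, and test process are all assumptions for illustration.

    import random
    from collections import Counter, defaultdict

    def causal_states(seq, L=3, tol=0.1):
        # estimate next-symbol distributions conditioned on length-L histories
        futures = defaultdict(Counter)
        for i in range(L, len(seq)):
            futures[seq[i - L:i]][seq[i]] += 1
        dists = {h: {s: n / sum(c.values()) for s, n in c.items()}
                 for h, c in futures.items()}
        # merge histories whose predictive distributions agree within tol
        states = []   # list of (representative distribution, member histories)
        for h, d in dists.items():
            for rep, members in states:
                if all(abs(d.get(s, 0) - rep.get(s, 0)) < tol for s in "01"):
                    members.append(h)
                    break
            else:
                states.append((d, [h]))
        return states

    # test data: a sequence in which 1s come in pairs (illustrative process)
    random.seed(0)
    seq = "".join("11" if random.random() < 0.5 else "0" for _ in range(2000))
    for dist, hists in causal_states(seq):
        print(sorted(hists), {k: round(v, 2) for k, v in dist.items()})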
NASA Astrophysics Data System (ADS)
Javidi, Bahram
The present conference discusses topics in the fields of neural networks, acoustooptic signal processing, pattern recognition, phase-only processing, nonlinear signal processing, image processing, optical computing, and optical information processing. Attention is given to the optical implementation of an inner-product neural associative memory, optoelectronic associative recall via motionless-head/parallel-readout optical disk, a compact real-time acoustooptic image correlator, a multidimensional synthetic estimation filter, and a light-efficient joint transform optical correlator. Also discussed are a high-resolution spatial light modulator, compact real-time interferometric Fourier-transform processors, a fast decorrelation algorithm for permutation arrays, the optical interconnection of optical modules, and carry-free optical binary adders.
NASA Astrophysics Data System (ADS)
Murphy, M.; Corns, A.; Cahill, J.; Eliashvili, K.; Chenau, A.; Pybus, C.; Shaw, R.; Devlin, G.; Deevy, A.; Truong-Hong, L.
2017-08-01
Cultural heritage researchers have recently begun applying Building Information Modelling (BIM) to historic buildings. The model is comprised of intelligent objects with semantic attributes which represent the elements of a building structure and are organised within a 3D virtual environment. Case studies in Ireland are used to test and develop suitable systems for (a) data capture/digital surveying/processing, (b) developing a library of architectural components, and (c) mapping these architectural components onto the laser scan or digital survey to relate the intelligent virtual representation to the historic structure (HBIM). While BIM platforms have the potential to create a virtual and intelligent representation of a building, their full exploitation and use is restricted to a narrow set of expert users with access to costly hardware, software and skills. The testing of open BIM approaches, in particular IFCs, and the use of game engine platforms is a fundamental component for developing much wider dissemination. The semantically enriched model can be transferred into a web-based game engine platform.
SANDS: an architecture for clinical decision support in a National Health Information Network.
Wright, Adam; Sittig, Dean F
2007-10-11
A new architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support) is introduced and its performance evaluated. The architecture provides a method for performing clinical decision support across a network, as in a health information exchange. Using the prototype we demonstrated that, first, a number of useful types of decision support can be carried out using our architecture; and, second, that the architecture exhibits desirable reliability and performance characteristics.
A Proposed Information Architecture for Telehealth System Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warren, S.; Craft, R.L.; Parks, R.C.
1999-04-07
Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2013-02-01
This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed, conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
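GDL itself is specific to ArchiCAD, but the procedural idea translates. Below is a hedged Python analogue in which a façade combines parametric window objects according to simple bay and proportion rules, with user parameters controlling the configuration; the dimensions and rules are invented, not the authors' pattern-book values.

    # Python analogue of a procedural, parametric facade (illustrative only).
    from dataclasses import dataclass

    @dataclass
    class Window:
        x: float
        y: float
        w: float
        h: float

    def facade(width, storey_h, storeys, bays):
        # place one window per bay per storey using assumed proportions
        bay_w = width / bays
        win_w, win_h = 0.45 * bay_w, 0.6 * storey_h
        objs = []
        for s in range(storeys):
            for b in range(bays):
                objs.append(Window(x=b * bay_w + (bay_w - win_w) / 2,
                                   y=s * storey_h + 0.25 * storey_h,
                                   w=win_w, h=win_h))
        return objs

    for w in facade(width=12.0, storey_h=3.5, storeys=2, bays=4)[:3]:
        print(w)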
Ontology-Based Architecture for Intelligent Transportation Systems Using a Traffic Sensor Network.
Fernandez, Susel; Hadfi, Rafik; Ito, Takayuki; Marsa-Maestre, Ivan; Velasco, Juan R
2016-08-15
Intelligent transportation systems are a set of technological solutions used to improve the performance and safety of road transportation. A crucial element for the success of these systems is the exchange of information, not only between vehicles, but also among other components in the road infrastructure through different applications. One of the most important information sources in this kind of systems is sensors. Sensors can be within vehicles or as part of the infrastructure, such as bridges, roads or traffic signs. Sensors can provide information related to weather conditions and traffic situation, which is useful to improve the driving process. To facilitate the exchange of information between the different applications that use sensor data, a common framework of knowledge is needed to allow interoperability. In this paper an ontology-driven architecture to improve the driving environment through a traffic sensor network is proposed. The system performs different tasks automatically to increase driver safety and comfort using the information provided by the sensors.
JPL Project Information Management: A Continuum Back to the Future
NASA Technical Reports Server (NTRS)
Reiz, Julie M.
2009-01-01
This slide presentation reviews the practices and architecture that support information management at JPL. This practice has allowed concurrent use and reuse of information by primary and secondary users. The use of this practice is illustrated in the evolution of the Mars rovers, from Mars Pathfinder to the development of the Mars Science Laboratory. The recognition of the importance of information management during all phases of a project life cycle has resulted in the design of an information system that includes metadata, has reduced the risk of information loss through the use of in-process appraisal, and has shaped projects' appreciation for capturing and managing the information of one project for reuse by future projects as a natural outgrowth of the process. This process has also helped connect geographically dispersed partners into a team through shared information, common tools and collaboration.
Information Processing Approaches to Cognitive Development
1988-07-01
Craik, F.I.M., & Lockhart, R.S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11. ...task at both levels of performance, then one would, in both cases, postulate systems that had the ability to process symbols at the microscopic level... the late 1960s and early 70s (cf. Atkinson & Shiffrin, 1968; Craik & Lockhart, 1972; Norman, Rumelhart, & LNR, 1975). This architecture is comprised of several...
Scalable Visual Analytics of Massive Textual Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.
2007-04-01
This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
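The parallelization pattern relied on here is a classic shard-and-merge. A generic Python stand-in (not the engine's actual code) follows: documents are sharded across worker processes, term statistics are computed locally, and the partial results are merged.

    from collections import Counter
    from multiprocessing import Pool

    def shard_stats(docs):
        # local term statistics for one shard of the corpus
        c = Counter()
        for d in docs:
            c.update(d.lower().split())
        return c

    if __name__ == "__main__":
        corpus = ["Gene expression in tumors", "expression atlas of genes"] * 1000
        shards = [corpus[i::4] for i in range(4)]       # 4 workers
        with Pool(4) as pool:
            partials = pool.map(shard_stats, shards)
        total = sum(partials, Counter())                # merge step
        print(total.most_common(3))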
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in the common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
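One concrete step in such an extraction can be sketched directly against BPMN's XML serialization: pull candidate verb and business-object terms from task names. The namespace URI below is the standard BPMN model namespace; the verb/object splitting heuristic is an assumption for illustration.

    import xml.etree.ElementTree as ET

    BPMN_NS = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"
    xml = """<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL">
      <process id="p1">
        <task name="Approve purchase order"/>
        <task name="Reject purchase order"/>
      </process>
    </definitions>"""

    root = ET.fromstring(xml)
    vocabulary = set()
    for task in root.iter(BPMN_NS + "task"):
        # naive heuristic: first word = action term, rest = business object
        verb, *rest = task.get("name").split()
        vocabulary.add(verb.lower())
        vocabulary.add(" ".join(rest).lower())
    print(sorted(vocabulary))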
Design distributed simulation platform for vehicle management system
NASA Astrophysics Data System (ADS)
Wen, Zhaodong; Wang, Zhanlin; Qiu, Lihua
2006-11-01
Next-generation military aircraft place high performance demands on the airborne management system. General-purpose modules, data integration, high-speed data buses and so on are needed to share and manage the information of the subsystems efficiently. The subsystems include the flight control system, propulsion system, hydraulic power system, environmental control system, fuel management system, electrical power system and so on. The architecture changes from unattached or mixed to integrated: the whole airborne system is managed as one system, so the physical devices are distributed but the system information is integrated and shared. The processing functions of each subsystem are integrated (including general processing modules and dynamic reconfiguration); furthermore, the sensors and the signal processing functions are shared. This also lays a foundation for shared power. We establish a distributed vehicle management system using the 1553B bus and distributed processors, which provides a validation platform for research on integrated airborne system management. This paper establishes the Vehicle Management System (VMS) simulation platform, discusses the software and hardware configuration, and analyzes the communication and fault-tolerance methods.
SEA ARCHER Distributed Aviation Platform
2001-12-01
manual processes, but should also improve decision support functions through advanced modeling and simulation. SEA ARCHER’s information architecture...this payload model was the SH-60 for which accurate weights were attained. Weights for the Marine STOVL version of the JSF were also attained, and
Gratton, Gabriele
2018-03-01
Here, I propose a view of the architecture of the human information processing system, and of how it can be adapted to changing task demands (which is the hallmark of cognitive control). This view is informed by an interpretation of brain activity as reflecting the excitability level of neural representations, encoding not only stimuli and temporal contexts, but also action plans and task goals. The proposed cognitive architecture includes three types of circuits: open circuits, involved in feed-forward processing such as that connecting stimuli with responses and characterized by brief, transient brain activity; and two types of closed circuits, positive feedback circuits (characterized by sustained, high-frequency oscillatory activity), which help select and maintain representations, and negative feedback circuits (characterized by brief, low-frequency oscillatory bursts), which are instead associated with changes in representations. Feed-forward activity is primarily responsible for the spread of activation along the information processing system. Oscillatory activity, instead, controls this spread. Sustained oscillatory activity due to both local cortical circuits (gamma) and longer corticothalamic circuits (alpha and beta) allows for the selection of individuated representations. Through the interaction of these circuits, it also allows for the preservation of representations across different temporal spans (sensory and working memory) and their spread across the brain. In contrast, brief bursts of oscillatory activity, generated by novel and/or conflicting information, lead to the interruption of sustained oscillatory activity and promote the generation of new representations. I discuss how this framework can account for a number of psychological and behavioral phenomena. © 2017 Society for Psychophysiological Research.
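The contrast between open and closed circuits in this account can be caricatured numerically: a leaky feed-forward unit responds transiently to a pulse, while a unit whose positive feedback nearly cancels its leak sustains activity after the pulse ends. The sketch below uses illustrative gains and time constants, not parameters from the article.

    import numpy as np

    # Toy sketch: open (feed-forward) vs. closed (positive feedback) circuit.
    T, dt, tau, gain = 200, 1.0, 10.0, 0.099   # leak 1/tau = 0.1; gain ~ leak
    pulse = np.zeros(T)
    pulse[20:30] = 1.0
    ff = np.zeros(T)     # open circuit: leak only, transient response
    fb = np.zeros(T)     # closed circuit: leak nearly cancelled by feedback
    for t in range(1, T):
        ff[t] = ff[t-1] + dt * (-ff[t-1] / tau + pulse[t])
        fb[t] = fb[t-1] + dt * (-fb[t-1] / tau + gain * fb[t-1] + pulse[t])
    print(f"t=100: feed-forward {ff[100]:.4f} (decayed), "
          f"feedback {fb[100]:.4f} (sustained)")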
The Information Science Experiment System - The computer for science experiments in space
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.; Husson, Charles
1989-01-01
The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.
The Architecture of Information at Plateau Beaubourg
ERIC Educational Resources Information Center
Branda, Ewan Edward
2012-01-01
During the course of the 1960s, computers and information networks made their appearance in the public imagination. To architects on the cusp of architecture's postmodern turn, information technology offered new forms, metaphors, and techniques by which modern architecture's technological and utopian basis could be reasserted. Yet by the end of…
NASA Technical Reports Server (NTRS)
Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven
1994-01-01
The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap of the Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and mission effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable service in flight control, fusion of sensor data, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs, the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by Charles Stark Draper Labs (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. This document contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, architecture, hardware design, operating systems design, systems performance measurements and analytical models.
Evolution of Bow-Tie Architectures in Biology
Friedlander, Tamar; Mayo, Avraham E.; Tlusty, Tsvi; Alon, Uri
2015-01-01
Bow-tie or hourglass structure is a common architectural feature found in many biological systems. A bow-tie in a multi-layered structure occurs when intermediate layers have far fewer components than the input and output layers. Examples include metabolism, where a handful of building blocks mediate between multiple input nutrients and multiple output biomass components, and signaling networks, where information from numerous receptor types passes through a small set of signaling pathways to regulate multiple output genes. Little is known, however, about how bow-tie architectures evolve. Here, we address the evolution of bow-tie architectures using simulations of multi-layered systems evolving to fulfill a given input-output goal. We find that bow-ties spontaneously evolve when the information in the evolutionary goal can be compressed. Mathematically speaking, bow-ties evolve when the rank of the input-output matrix describing the evolutionary goal is deficient. The maximal compression possible (the rank of the goal) determines the size of the narrowest part of the network, that is, the bow-tie. A further requirement is that a process is active to reduce the number of links in the network, such as product-rule mutations; otherwise a non-bow-tie solution is found in the evolutionary simulations. This offers a mechanism to understand a common architectural principle of biological systems, and a way to quantitate the effective rank of the goals under which they evolved. PMID:25798588
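To make the rank argument concrete, here is a minimal numerical sketch (hypothetical matrices, not the authors' simulation code): a goal matrix that factors through a narrow intermediate layer is rank-deficient, and its rank bounds how narrow the evolved waist can be.

```python
import numpy as np

# A hypothetical input-output goal: 8 inputs map to 8 outputs, but the
# mapping factors through a 2-dimensional intermediate representation,
# so the goal matrix has rank 2 (it is "compressible").
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 2))   # compress: inputs -> narrow waist
B = rng.standard_normal((2, 8))   # expand: waist -> outputs
goal = A @ B                      # 8x8 goal matrix of rank 2

rank = np.linalg.matrix_rank(goal)
print("goal matrix rank:", rank)  # -> 2: the narrowest layer of an
                                  # evolved bow-tie need only be this wide

# Singular values confirm only two significant components survive.
print(np.round(np.linalg.svd(goal, compute_uv=False), 3))
```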
NASA Astrophysics Data System (ADS)
Ghasemi, S.; Khorasani, K.
2015-10-01
In this paper, the problem of fault detection and isolation (FDI) of the attitude control subsystem (ACS) of spacecraft formation flying systems is considered. For developing the FDI schemes, an extended Kalman filter (EKF) is utilised, which belongs to a class of nonlinear state estimation methods. Three architectures, namely centralised, decentralised, and semi-decentralised, are considered and the corresponding FDI strategies are designed and constructed. Appropriate residual generation techniques and threshold selection criteria are proposed for these architectures. The capabilities of the proposed architectures for accomplishing the FDI tasks are studied through extensive numerical simulations for a team of four satellites in formation flight. Using a confusion matrix evaluation criterion, it is shown that the centralised architecture can achieve the most reliable results relative to the semi-decentralised and decentralised architectures, at the expense of requiring a centralised processing module with access to the entire team information set. On the other hand, the semi-decentralised performance is close to that of the centralised scheme without relying on the availability of the entire team information set. Furthermore, the results confirm that FDI in formations with angular velocity measurement sensors achieves higher levels of accuracy, true-faulty classification, and precision, along with lower levels of false-healthy misclassification, as compared to formations that utilise attitude measurement sensors.
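The core of residual-based FDI is monitoring the filter innovation against a threshold. Below is a deliberately simplified scalar Kalman-filter sketch of that idea; the paper uses a full EKF on attitude dynamics, and the noise levels, fault size, and threshold here are invented.

```python
import numpy as np

# Highly simplified sketch of residual-based fault detection
# (scalar Kalman filter; values are illustrative only).
rng = np.random.default_rng(5)
q, r = 1e-4, 0.05           # process / measurement noise variances
x_hat, p = 0.0, 1.0
threshold = 3.0             # flag residuals beyond 3 sigma

x_true = 0.0
for t in range(200):
    x_true += rng.normal(0, q ** 0.5)
    bias = 1.0 if t >= 100 else 0.0          # sensor fault injected at t=100
    z = x_true + bias + rng.normal(0, r ** 0.5)

    # Kalman predict/update with residual (innovation) monitoring.
    p += q
    s = p + r                                # innovation variance
    residual = z - x_hat
    if abs(residual) / s ** 0.5 > threshold:
        print(f"t={t}: fault flagged (residual {residual:+.2f})")
        break
    k = p / s
    x_hat += k * residual
    p *= (1 - k)
```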
Research and development of service robot platform based on artificial psychology
NASA Astrophysics Data System (ADS)
Zhang, Xueyuan; Wang, Zhiliang; Wang, Fenhua; Nagai, Masatake
2007-12-01
Some related work on the control architecture of robot systems is briefly summarized. Based on this discussion, this paper proposes a control architecture for service robots based on artificial psychology. In this control architecture, the robot obtains its cognition of the environment through sensors; this input is then handled by an intelligent model, an affective model and a learning model, and the robot finally expresses its reaction to outside stimulation through its behavior. For a better understanding of the architecture, its hierarchical structure is also discussed. The control system of the robot can be divided into five layers, namely the physical layer, the drives layer, the information-processing and behavior-programming layer, the application layer, and the system inspection and control layer. This paper shows how to achieve system integration in terms of hardware modules, software interfaces and fault diagnosis. The embedded system GENE-8310 is selected as the PC platform of the robot APROS-I, and its primary storage medium is a CF card. The arms and body of the robot comprise 13 motors and some connecting fittings. In addition, the robot has a head with emotional facial expressions, and the head has 13 DOFs. The emotional and intelligent model is one of the most important parts of human-machine interaction. In order to better simulate human emotion, an emotional interaction model for the robot is proposed according to Maslow's theory of need levels and Simonov's theory of mood information. This architecture has already been used in our intelligent service robot.
Wright, Adam; Sittig, Dean F
2008-12-01
In this paper, we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. The SANDS architecture for decision support has several significant advantages over other architectures for clinical decision support. The most salient of these are:
Fast associative memory + slow neural circuitry = the computational model of the brain.
NASA Astrophysics Data System (ADS)
Berkovich, Simon; Berkovich, Efraim; Lapir, Gennady
1997-08-01
We propose a computational model of the brain based on a fast associative memory and relatively slow neural processors. In this model, processing time is expensive but memory access is not, and therefore most algorithmic tasks would be accomplished by using large look-up tables as opposed to calculating. The essential feature of an associative memory in this context (characteristic of a holographic type of memory) is that it works without an explicit mechanism for the resolution of multiple responses. As a result, the slow neuronal processing elements, overwhelmed by the flow of information, operate as a set of templates for ranking the retrieved information. This structure addresses the primary controversy in brain architecture: distributed organization of memory vs. localization of processing centers. This computational model offers an intriguing explanation of many of the paradoxical features of brain architecture, such as the integration of sensors (through a DMA mechanism), subliminal perception, universality of software, interrupts, fault tolerance, and certain bizarre possibilities for rapid arithmetic. In conventional computer science, this type of computational model has not attracted attention, as it goes against the technological grain by using a working memory that is faster than the processing elements.
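The memory-over-computation trade the authors describe has an everyday programming analogue, sketched below: a function answered from a precomputed table rather than calculated on demand. This is a sketch of the general idea only, with an arbitrary table resolution.

```python
import math

# Illustrative sketch (not the authors' model): when memory lookups are
# cheap and computation is expensive, a function can be answered from a
# precomputed table rather than calculated on demand.
RESOLUTION = 10_000
SIN_TABLE = [math.sin(2 * math.pi * i / RESOLUTION) for i in range(RESOLUTION)]

def sin_lookup(x: float) -> float:
    """Approximate sin(x) with a table access instead of a computation."""
    i = round((x % (2 * math.pi)) / (2 * math.pi) * RESOLUTION) % RESOLUTION
    return SIN_TABLE[i]

print(sin_lookup(1.0), math.sin(1.0))  # table answer vs. computed answer
```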
Real-time information management environment (RIME)
NASA Astrophysics Data System (ADS)
DeCleene, Brian T.; Griffin, Sean; Matchett, Garry; Niejadlik, Richard
2000-08-01
Whereas data mining and exploitation improve the quality and quantity of information available to the user, there remains a mission requirement to assist the end user in managing access to this information and to ensure that the appropriate information is delivered to the right user in time to make decisions and take action. This paper discusses TASC's federated architecture for next-generation information management, contrasts the approach with emerging technologies, and quantifies the performance gains. This architecture and implementation, known as the Real-time Information Management Environment (RIME), is based on two key concepts: information utility and content-based channelization. The introduction of utility allows users to express the importance and delivery requirements of their information needs in the context of their mission. Rather than competing for resources on a first-come/first-served basis, the infrastructure employs these utility functions to dynamically react to unanticipated loading by optimizing the delivered information utility. Furthermore, commanders' resource policies shape these functions to ensure that resources are allocated according to military doctrine. Using information about the desired content, channelization identifies opportunities to aggregate users onto shared channels, reducing redundant transmissions. Hence, channelization increases the information throughput of the system and balances sender/receiver processing load.
Project Integration Architecture
NASA Technical Reports Server (NTRS)
Jones, William Henry
2008-01-01
The Project Integration Architecture (PIA) is a distributed, object-oriented, conceptual, software framework for the generation, organization, publication, integration, and consumption of all information involved in any complex technological process in a manner that is intelligible to both computers and humans. In the development of PIA, it was recognized that in order to provide a single computational environment in which all information associated with any given complex technological process could be viewed, reviewed, manipulated, and shared, it is necessary to formulate all the elements of such a process on the most fundamental level. In this formulation, any such element is regarded as being composed of any or all of three parts: input information, some transformation of that input information, and some useful output information. Another fundamental principle of PIA is the assumption that no consumer of information, whether human or computer, can be assumed to have any useful foreknowledge of an element presented to it. Consequently, a PIA-compliant computing system is required to be ready to respond to any questions, posed by the consumer, concerning the nature of the proffered element. In colloquial terms, a PIA-compliant system must be prepared to provide all the information needed to place the element in context. To satisfy this requirement, PIA extends the previously established object-oriented-programming concept of self-revelation and applies it on a grand scale. To enable pervasive use of self-revelation, PIA exploits another previously established object-oriented-programming concept, that of semantic infusion through class derivation. By means of self-revelation and semantic infusion through class derivation, a consumer of information can inquire about the contents of all information entities (e.g., databases and software) and can interact appropriately with those entities. Other key features of PIA are listed.
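Self-revelation through class derivation maps naturally onto introspection over a shared base class. The sketch below is purely illustrative (PIA is not specified in Python, and the class names are invented); it shows how a consumer with no foreknowledge of an element can still ask it what it is and what it contains.

```python
# Illustrative sketch only: PIA itself is not publicly specified in Python.
# "Self-revelation" -- any element can be asked what it is and what it
# contains -- modelled with a base class every element derives from.

class PiaElement:
    """Base class: every element can reveal its nature on request."""
    def reveal(self) -> dict:
        return {
            "kind": type(self).__name__,
            "bases": [b.__name__ for b in type(self).__mro__[1:-1]],
            "contents": vars(self),
        }

class InputInformation(PiaElement):
    def __init__(self, source, value):
        self.source, self.value = source, value

class Transformation(PiaElement):
    def __init__(self, name, inputs):
        self.name, self.inputs = name, inputs

# A consumer with no foreknowledge can still interrogate the element.
elem = Transformation("fft", [InputInformation("sensor-1", 3.5)])
print(elem.reveal())
```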
Parallel photonic information processing at gigabyte per second data rates using transient states
NASA Astrophysics Data System (ADS)
Brunner, Daniel; Soriano, Miguel C.; Mirasso, Claudio R.; Fischer, Ingo
2013-01-01
The increasing demands on information processing require novel computational concepts and true parallelism. Nevertheless, hardware realizations of unconventional computing approaches never exceeded a marginal existence. While the application of optics in super-computing receives reawakened interest, new concepts, partly neuro-inspired, are being considered and developed. Here we experimentally demonstrate the potential of a simple photonic architecture to process information at unprecedented data rates, implementing a learning-based approach. A semiconductor laser subject to delayed self-feedback and optical data injection is employed to solve computationally hard tasks. We demonstrate simultaneous spoken digit and speaker recognition and chaotic time-series prediction at data rates beyond 1 GByte/s. We identify all digits with very low classification errors and perform chaotic time-series prediction with 10% error. Our approach bridges the areas of photonic information processing, cognitive and information science.
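The learning-based approach used here is reservoir computing: the laser's transient dynamics act as a fixed nonlinear reservoir, and only a linear readout is trained. A purely software analogue of that training scheme is sketched below; the network size, task, and parameters are assumptions, not the photonic setup.

```python
import numpy as np

# Software analogue of reservoir computing: fixed nonlinear dynamics,
# trained linear readout. Toy one-step-prediction task.
rng = np.random.default_rng(1)
n_res, n_steps = 100, 2000
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale for stable dynamics
w_in = rng.standard_normal(n_res)

u = np.sin(np.arange(n_steps + 1) * 0.2)       # input signal
target = u[1:]                                  # task: predict next sample

x = np.zeros(n_res)
states = np.empty((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W @ x + w_in * u[t])           # fixed nonlinear "reservoir"
    states[t] = x

# Train only the linear readout (ridge regression).
ridge = 1e-6
w_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ target)
pred = states @ w_out
print("NMSE:", np.mean((pred - target) ** 2) / np.var(target))
```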
Specification, Design, and Analysis of Advanced HUMS Architectures
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
2004-01-01
During the two-year project period, we have worked on several aspects of domain-specific architectures for HUMS. In particular, we looked at using a scenario-based approach for the design and defined a language for describing such architectures. The language is now being used in all aspects of our HUMS design. In particular, we have made contributions in the following areas. 1) We have employed scenarios in the development of HUMS in three main areas: (a) to improve reusability by using scenarios as a library indexing tool and as a domain analysis tool; (b) to improve maintainability by recording design rationales from two perspectives, the problem domain and the solution domain; (c) to evaluate the software architecture. 2) We have defined a new architectural language called HADL, or HUMS Architectural Definition Language. It is a customized version of xArch/xADL. It is based on XML and, hence, is easily portable from domain to domain, application to application, and machine to machine. Specifications written in HADL can be easily read and parsed using currently available XML parsers. Thus, there is no need to develop a plethora of software to support HADL. 3) We have developed an automated design process that involves two main techniques: (a) selection of solutions from a large space of designs; (b) synthesis of designs. However, the automation process is not an absolute Artificial Intelligence (AI) approach, though it uses a knowledge-based system that epitomizes a specific HUMS domain. The process uses a database of solutions as an aid to solve the problems rather than creating a new design in the literal sense. Since searching is adopted as the main technique, the challenges involved are: (a) to minimize the effort in searching the database, where a very large number of possibilities exist; (b) to develop representations that could conveniently allow us to depict design knowledge evolved over many years; (c) to capture the required information that aids the automation process.
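Because HADL is XML-based, ordinary XML tooling suffices, as the abstract notes. The fragment below is hypothetical (the real HADL schema is not reproduced here); it only demonstrates parsing an architecture description with Python's standard library.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment in the spirit of an XML-based ADL; it is not
# the actual HADL schema, only a demonstration that off-the-shelf
# parsers are enough to read such specifications.
hadl = """
<architecture name="hums-demo">
  <component id="sensor" type="vibration"/>
  <component id="monitor" type="health-assessment"/>
  <connector from="sensor" to="monitor"/>
</architecture>
"""

root = ET.fromstring(hadl)
for comp in root.findall("component"):
    print(comp.get("id"), "->", comp.get("type"))
for conn in root.findall("connector"):
    print("link:", conn.get("from"), "->", conn.get("to"))
```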
Representations and processes of human spatial competence.
Gunzelmann, Glenn; Lyon, Don R
2011-10-01
This article presents an approach to understanding human spatial competence that focuses on the representations and processes of spatial cognition and how they are integrated with cognition more generally. The foundational theoretical argument for this research is that spatial information processing is central to cognition more generally, in the sense that it is brought to bear ubiquitously to improve the adaptivity and effectiveness of perception, cognitive processing, and motor action. We describe research spanning multiple levels of complexity to understand both the detailed mechanisms of spatial cognition, and how they are utilized in complex, naturalistic tasks. In the process, we discuss the critical role of cognitive architectures in developing a consistent account that spans this breadth, and we note some areas in which the current version of a popular architecture, ACT-R, may need to be augmented. Finally, we suggest a framework for understanding the representations and processes of spatial competence and their role in human cognition generally. Copyright © 2011 Cognitive Science Society, Inc.
A SOA-Based Solution to Monitor Vaccination Coverage Among HIV-Infected Patients in Liguria.
Giannini, Barbara; Gazzarata, Roberta; Sticchi, Laura; Giacomini, Mauro
2016-01-01
Vaccination in HIV-infected patients constitutes an essential tool in the prevention of the most common infectious diseases. The Ligurian Vaccination in HIV Program is a proposed vaccination schedule specifically dedicated to this risk group. Selective strategies are proposed within this program, employing ICT (Information and Communication Technology) tools to identify this susceptible target group, to monitor immunization coverage over time, and to manage failures and defaulting. The proposal is to connect an immunization registry system to an existing regional platform that allows clinical data re-use among several medical structures, in order to completely manage the vaccination process. This architecture will adopt a Service Oriented Architecture (SOA) approach and standard HSSP (Health Services Specification Program) interfaces to support interoperability. According to the presented solution, vaccination administration information retrieved from the immunization registry will be structured according to the specifications of the immunization section of the HL7 (Health Level 7) CCD (Continuity of Care Document) document. Immunization coverage will be evaluated through continuous monitoring of serology and antibody titers gathered from the hospital LIS (Laboratory Information System), structured into an HL7 Version 3 (v3) Clinical Document Architecture Release 2 (CDA R2).
An Information Architect's View of Earth Observations for Disaster Risk Management
NASA Astrophysics Data System (ADS)
Moe, K.; Evans, J. D.; Cappelaere, P. G.; Frye, S. W.; Mandl, D.; Dobbs, K. E.
2014-12-01
Satellite observations play a significant role in supporting disaster response and risk management; however, data complexity is a barrier to broader use, especially by the public. In December 2013, the Committee on Earth Observation Satellites Working Group on Information Systems and Services documented a high-level reference model for the use of Earth observation satellites and associated products to support disaster risk management within the Global Earth Observation System of Systems context. The enterprise architecture identified the important role of user access to all key functions supporting situational awareness and decision-making. This paper focuses on the need to develop actionable information products from these Earth observations to simplify the discovery, access and use of tailored products. To this end, our team has developed an Open GeoSocial API proof-of-concept for GEOSS. We envision public access to mobile apps available on smart phones using common browsers, where users can set up a profile and specify a region of interest for monitoring events such as floods and landslides. Susceptibility information and weather forecasts concerning flood risks can be accessed. Users can generate geo-located information and photos of local events, and these can be shared on social media. The information architecture can address usability challenges to transform sensor data into actionable information, based on the terminology of the emergency management community responsible for informing the public. This paper describes the approach to collecting relevant material from the disasters and risk management community to address end-user needs for information. The resulting information architecture addresses the structural design of the shared information in the disasters and risk management enterprise. Key challenges are organizing and labeling information to support both online user communities and machine-to-machine processing for automated product generation.
Creating ISO/EN 13606 archetypes based on clinical information needs.
Rinner, Christoph; Kohler, Michael; Hübner-Bloder, Gudrun; Saboor, Samrend; Ammenwerth, Elske; Duftschmid, Georg
2011-01-01
Archetypes model individual EHR contents and form the basis of the dual-model approach used in the ISO/EN 13606 EHR architecture. We present an approach to creating archetypes using an iterative development process. It includes the automated generation of electronic case report forms from archetypes. We evaluated our approach by developing 128 archetypes, which represent 446 clinical information items from the diabetes domain.
Amplifying genetic logic gates.
Bonnet, Jerome; Yin, Peter; Ortiz, Monica E; Subsoontorn, Pakpoom; Endy, Drew
2013-05-03
Organisms must process information encoded via developmental and environmental signals to survive and reproduce. Researchers have also engineered synthetic genetic logic to realize simpler, independent control of biological processes. We developed a three-terminal device architecture, termed the transcriptor, that uses bacteriophage serine integrases to control the flow of RNA polymerase along DNA. Integrase-mediated inversion or deletion of DNA encoding transcription terminators or a promoter modulates transcription rates. We realized permanent amplifying AND, NAND, OR, XOR, NOR, and XNOR gates actuated across common control signal ranges and sequential logic supporting autonomous cell-cell communication of DNA encoding distinct logic-gate states. The single-layer digital logic architecture developed here enables engineering of amplifying logic gates to control transcription rates within and across diverse organisms.
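As a way to see how integrase-mediated DNA rearrangements implement logic, here is a toy boolean abstraction in Python. It is a sketch of the concept only, not the actual genetic designs reported in the paper: integrase signals flip or delete DNA elements, and the output is whether RNA polymerase can transcribe.

```python
# Boolean abstraction of the transcriptor concept (a sketch, not the
# genetic designs themselves): integrase signals change DNA state, and
# transcription (the output) depends on the resulting state.

def and_gate(int_a: bool, int_b: bool) -> bool:
    # Two terminators block RNA polymerase; each integrase removes one,
    # so transcription flows only when both signals are present.
    terminator_1_present = not int_a
    terminator_2_present = not int_b
    return not (terminator_1_present or terminator_2_present)

def xor_gate(int_a: bool, int_b: bool) -> bool:
    # A promoter that starts in the non-transcribing orientation is
    # flipped by either integrase; two flips restore the off state.
    flips = int(int_a) + int(int_b)
    return flips % 2 == 1

for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", and_gate(a, b), "XOR:", xor_gate(a, b))
```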
McEwan, Reed; Melton, Genevieve B.; Knoll, Benjamin C.; Wang, Yan; Hultman, Gretchen; Dale, Justin L.; Meyer, Tim; Pakhomov, Serguei V.
2016-01-01
Many design considerations must be addressed in order to provide researchers with full-text and semantic search of unstructured healthcare data such as clinical notes and reports. Institutions looking to provide this functionality must also address the big-data aspects of their unstructured corpora. Because these systems are complex and demand a non-trivial investment, there is an incentive to make the system capable of servicing future needs as well, further complicating the design. We present architectural best practices as lessons learned in the design and implementation of NLP-PIER (Patient Information Extraction for Research), a scalable, extensible, and secure system for processing, indexing, and searching clinical notes at the University of Minnesota. PMID:27570663
Mission-Oriented Sensor Arrays and UAVs - a Case Study on Environmental Monitoring
NASA Astrophysics Data System (ADS)
Figueira, N. M.; Freire, I. L.; Trindade, O.; Simões, E.
2015-08-01
This paper presents a new concept of UAV mission design in geomatics, applied to the generation of thematic maps for a multitude of civilian and military applications. We discuss the architecture of Mission-Oriented Sensor Arrays (MOSA), proposed in Figueira et al. (2013), aimed at splitting and decoupling the mission-oriented part of the system (non-safety-critical hardware and software) from the aircraft control systems (safety-critical). As a case study, we present an environmental monitoring application for the automatic generation of thematic maps to track gunshot activity in conservation areas. The MOSA modeled for this application integrates information from a thermal camera and an on-the-ground microphone array. The use of microphone array technology is of particular interest in this paper. These arrays allow estimation of the direction-of-arrival (DOA) of the incoming sound waves. Information about events of interest is obtained by fusing the data provided by the microphone array, relayed through the UAV, with information from the thermal image processing. Preliminary results show the feasibility of the on-the-ground sound processing array and of the simulation of the main processing module, to be embedded into a UAV in future work. The main contributions of this paper are the proposed MOSA system, including its concepts, models and architecture.
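The DOA estimate from a microphone pair comes from the time difference of arrival (TDOA) between channels. A minimal cross-correlation sketch is given below; the spacing, sample rate, and synthetic signal are invented for illustration, not the MOSA field configuration.

```python
import numpy as np

# Sketch of direction-of-arrival estimation from a two-microphone pair
# via time-difference-of-arrival (illustrative geometry and rates).
fs = 48_000          # sample rate (Hz)
d = 0.5              # microphone spacing (m)
c = 343.0            # speed of sound (m/s)

rng = np.random.default_rng(2)
sig = rng.standard_normal(4096)          # stand-in for a gunshot waveform
true_delay = 30                          # inter-microphone delay (samples)
mic1 = sig
mic2 = np.roll(sig, true_delay)          # second channel hears it later

# The cross-correlation peak recovers the inter-microphone delay.
corr = np.correlate(mic2, mic1, mode="full")
lag = corr.argmax() - (len(sig) - 1)

tau = lag / fs                            # delay in seconds
angle = np.degrees(np.arcsin(np.clip(tau * c / d, -1, 1)))
print(f"estimated delay {lag} samples, DOA ~ {angle:.1f} deg")
```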
ERIC Educational Resources Information Center
Santos, Thomas W.
2008-01-01
This article describes New York City. It presents information about its history, immigration process, geography, architecture, rivers, bridges, famous buildings and parks, famous neighborhoods, arts and entertainment, and tourist attractions and activities. The article also provides useful websites about New York City. It ends with a text about…
DOT National Transportation Integrated Search
1999-07-01
This report presents an examination of the process used in preparing electronic credentials for commercial vehicle operations in Kentucky, Maryland, and Virginia. It describes the experience of using the Commercial Vehicle Information Systems & Networ...
A Biologically Plausible Architecture for Shape Recognition
2006-06-30
Concept of Integrated Information Systems of Rail Transport
NASA Astrophysics Data System (ADS)
Siergiejczyk, Mirosław; Gago, Stanisław
This paper presents the need to create integrated information systems for rail transport and their links with other means of public transportation. It discusses the IT standards expected to underpin such integrated systems. The main tasks of centralized information systems are also presented, together with the concept of their architecture, their business processes and implementation, and the proposed measures to secure data. A method is proposed for implementing a system to inform participants of rail transport under Polish conditions.
The population health record: concepts, definition, design, and implementation.
Friedman, Daniel J; Parrish, R Gibson
2010-01-01
In 1997, the American Medical Informatics Association proposed a US information strategy that included a population health record (PopHR). Despite subsequent progress on the conceptualization, development, and implementation of electronic health records and personal health records, minimal progress has occurred on the PopHR. Adapting International Organization for Standardization electronic health record standards, we define the PopHR as a repository of statistics, measures, and indicators regarding the state of and influences on the health of a defined population, in computer-processable form, stored and transmitted securely, and accessible by multiple authorized users. The PopHR is based upon an explicit population health framework and a standardized logical information model. PopHR purpose and uses, content and content sources, functionalities, business objectives, information architecture, and system architecture are described. Barriers to implementation, enabling factors, and a three-stage implementation strategy are delineated.
Computational intelligence and neuromorphic computing potential for cybersecurity applications
NASA Astrophysics Data System (ADS)
Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.
2013-05-01
In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. Therefore, it is believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.
Industrial implementation of spatial variability control by real-time SPC
NASA Astrophysics Data System (ADS)
Roule, O.; Pasqualini, F.; Borde, M.
2016-10-01
Advanced technology nodes require more and more information to get the wafer process well set up. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the wafer fab to automatically adjust and tune wafer processing, based on a large amount of process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring the spatial variability under control, in real time, through our SPC (Statistical Process Control) system. This paper will outline the architecture of an integrated process control system for shape monitoring in 3D, implemented in the wafer fab.
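At its core, SPC flags a measurement stream whose statistics drift outside control limits derived from historical behavior. A minimal X-bar-chart sketch follows; the lot counts, site counts, and numbers are invented, not fab data.

```python
import numpy as np

# Minimal SPC sketch (illustrative values, not fab data): an X-bar chart
# flags a lot whose mean drifts outside limits derived from history.
rng = np.random.default_rng(3)
history = rng.normal(loc=50.0, scale=0.4, size=(200, 9))  # 200 lots x 9 sites

grand_mean = history.mean()
sigma = history.mean(axis=1).std(ddof=1)    # spread of historical lot means
ucl, lcl = grand_mean + 3 * sigma, grand_mean - 3 * sigma

new_lot = rng.normal(loc=50.6, scale=0.4, size=9)   # a drifted lot
xbar = new_lot.mean()
print(f"lot mean {xbar:.2f}, limits [{lcl:.2f}, {ucl:.2f}]",
      "-> OUT OF CONTROL" if not (lcl <= xbar <= ucl) else "-> in control")
```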
All-memristive neuromorphic computing with level-tuned neurons
NASA Astrophysics Data System (ADS)
Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos
2016-09-01
In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.
All-memristive neuromorphic computing with level-tuned neurons.
Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos
2016-09-02
In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.
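The two ingredients named here, integrate-and-fire neurons and STDP synapses, can be sketched in a few lines of software. The toy model below is only a conceptual analogue of the learning rule; it does not model phase-change-memristor physics, and the rates and constants are invented.

```python
import numpy as np

# Toy sketch of an integrate-and-fire neuron with STDP synapses
# (conceptual analogue only; not the memristor hardware).
rng = np.random.default_rng(4)
n_in, n_steps = 50, 1000
w = rng.uniform(0.2, 0.8, n_in)     # synaptic weights ("memristor states")
v, v_th, decay = 0.0, 5.0, 0.9
a_plus, a_minus, tau = 0.02, 0.02, 20.0
last_pre = np.full(n_in, -np.inf)
last_post = -np.inf

for t in range(n_steps):
    pre = rng.random(n_in) < 0.05            # Poisson-like input spikes
    last_pre[pre] = t
    v = decay * v + w @ pre                  # leaky integration
    if v >= v_th:                            # postsynaptic spike
        v = 0.0
        last_post = t
        # potentiate synapses whose input spiked shortly before the output
        w += a_plus * np.exp(-(t - last_pre) / tau)
    elif pre.any():
        # depress synapses whose input arrives after the last output spike
        w[pre] -= a_minus * np.exp(-(t - last_post) / tau)
    w = np.clip(w, 0.0, 1.0)

print("mean weight after learning:", w.mean().round(3))
```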
Identifying elements of the health care environment that contribute to wayfinding.
Pati, Debajyoti; Harvey, Thomas E; Willis, Douglas A; Pati, Sipra
2015-01-01
Identify aspects of the physical environment that inform wayfinding for visitors. Compare and contrast the identified elements in frequency of use. Gain an understanding of the role the different elements and attributes play in the wayfinding process. Wayfinding by patients and visitors is a documented problem in healthcare facilities. The few studies that have been conducted have identified some of the environmental elements that influence wayfinding. Moreover, literature comparing different design strategies is absent. Currently there is limited knowledge to inform the prioritization of strategies to optimize wayfinding within a capital budget. A multi-method, non-experimental, qualitative, exploratory study design was adopted. The study was conducted in a large, acute care facility in Texas. Ten healthy adults in five age groups, representing both sexes, participated in the study as simulated visitors. Data collection included (a) verbal protocols during navigation; (b) a questionnaire; and (c) verbal directions from hospital employees. Data were collected during Fall 2013. Physical design elements contributing to wayfinding include signs, architectural features, maps, interior elements (artwork, display boards, information counters, etc.), functional clusters, interior element pairing, structural elements, and furniture. The information is used in different ways - some for primary navigational information, some for supporting navigational information, and some as familiarity markers. The physical environment has a critical role in aiding navigation in healthcare facilities. Architectural features are the top contributor in the domain of architecture. Artwork (painting, sculpture, etc.) is the top contributor in the domain of interior design. © The Author(s) 2015.
The automated data processing architecture for the GPI Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce
2017-09-01
The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
A self-scaling, distributed information architecture for public health, research, and clinical care.
McMurry, Andrew J; Gilbert, Clint A; Reis, Ben Y; Chueh, Henry C; Kohane, Isaac S; Mandl, Kenneth D
2007-01-01
This study sought to define a scalable architecture to support the National Health Information Network (NHIN). This architecture must concurrently support a wide range of public health, research, and clinical care activities. The architecture fulfils five desiderata: (1) adopt a distributed approach to data storage to protect privacy, (2) enable strong institutional autonomy to engender participation, (3) provide oversight and transparency to ensure patient trust, (4) allow variable levels of access according to investigator needs and institutional policies, (5) define a self-scaling architecture that encourages voluntary regional collaborations that coalesce to form a nationwide network. Our model has been validated by a large-scale, multi-institution study involving seven medical centers for cancer research. It is the basis of one of four open architectures developed under funding from the Office of the National Coordinator of Health Information Technology, fulfilling the biosurveillance use case defined by the American Health Information Community. The model supports broad applicability for regional and national clinical information exchanges. This model shows the feasibility of an architecture wherein the requirements of care providers, investigators, and public health authorities are served by a distributed model that grants autonomy, protects privacy, and promotes participation.
Sensor-based architecture for medical imaging workflow analysis.
Silva, Luís A Bastião; Campos, Samuel; Costa, Carlos; Oliveira, José Luis
2014-08-01
The growing use of computer systems in medical institutions has been generating a tremendous quantity of data. While these data have a critical role in assisting physicians in clinical practice, the information that can be extracted goes far beyond this utilization. This article proposes a platform capable of assembling multiple data sources within a medical imaging laboratory through a network of intelligent sensors. The proposed integration framework follows a SOA hybrid architecture based on an information sensor network, capable of collecting information from several sources in medical imaging laboratories. Currently, the system supports three types of sensors: DICOM repository meta-data, network workflows and examination reports. Each sensor is responsible for converting unstructured information from data sources into a common format that is then semantically indexed in the framework engine. The platform was deployed in the cardiology department of a central hospital, allowing the identification of process characteristics and user behaviours that were unknown before this solution was deployed.
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
This paper presents an overview of the application of the Space Generic Open Avionics Architecture (SGOAA) to the Space Shuttle Data Processing System (DPS) architecture design. This application has been performed to validate the SGOAA and its potential use in flight-critical systems. The paper summarizes key elements of the Space Shuttle avionics architecture, data processing system requirements and software architecture as currently implemented. It then summarizes the SGOAA architecture and describes a tailoring of the SGOAA to the Space Shuttle. The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, a six-class model of interfaces, and functional subsystem architectures for data services and operations control capabilities. It has been proposed as an avionics architecture standard with the National Aeronautics and Space Administration (NASA), through its Strategic Avionics Technology Working Group, and is being considered by the Society of Automotive Engineers (SAE) as an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division of JSC by the Lockheed Engineering and Sciences Company, Houston, Texas.
NASA Astrophysics Data System (ADS)
Curiac, Daniel-Ioan; Pachia, Mihai
2015-05-01
Information security represents the cornerstone of every data processing system that resides in an organisation's trusted network, implementing all necessary protocols, mechanisms and policies to be one step ahead of possible threats. Starting from the need to strengthen the set of security services, in this article we introduce a new and innovative process named controlled information destruction (CID) that is meant to secure sensitive data that are no longer needed for the organisation's future purposes but would be very damaging if revealed. The disposal of this type of data has to be controlled carefully in order to delete not only the information itself but also all its splinters spread throughout the network, thus denying any possibility of recovering the information after its alleged destruction. This process leads to a modified model of information assurance and also reconfigures the architecture of any information security management system. The scheme we envisioned relies on a reshaped information lifecycle, which reveals the impact of the CID procedure directly upon the information states.
Reference architecture of application services for personal wellbeing information management.
Tuomainen, Mika; Mykkänen, Juha
2011-01-01
Personal information management has been proposed as an important enabler for individual empowerment concerning citizens' wellbeing and health information. In the MyWellbeing project in Finland, a strictly citizen-driven concept of "Coper" and related architectural and functional guidelines have been specified. We present a reference architecture and a set of identified application services to support personal wellbeing information management. In addition, the related standards and developments are discussed.
Block Architecture Problem with Depth First Search Solution and Its Application
NASA Astrophysics Data System (ADS)
Rahim, Robbi; Abdullah, Dahlan; Simarmata, Janner; Pranolo, Andri; Saleh Ahmar, Ansari; Hidayat, Rahmat; Napitupulu, Darmawan; Nurdiyanto, Heri; Febriadi, Bayu; Zamzami, Z.
2018-01-01
Searching is a common task performed by many computer users. The Raita algorithm is one algorithm that can be used to match and find information in accordance with the patterns entered. The Raita algorithm was applied to a file search application written in the Java programming language; testing showed that the application searches files quickly, returns accurate results, and supports many data types.
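For reference, the Raita algorithm is a Boyer-Moore-Horspool variant that compares the last, first, and middle pattern characters before the remainder. The paper's implementation is in Java; the sketch below is an illustrative Python port.

```python
# Sketch of Raita pattern matching (the algorithm named in the abstract;
# the paper's own implementation is Java, so this port is illustrative).
def raita_search(text: str, pattern: str) -> list[int]:
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return []
    # Horspool-style bad-character shift table.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    first, middle, last = pattern[0], pattern[m // 2], pattern[-1]
    hits, i = [], 0
    while i <= n - m:
        window = text[i:i + m]
        # Raita's ordering: last char, first char, middle char, then rest.
        if (window[-1] == last and window[0] == first
                and window[m // 2] == middle and window == pattern):
            hits.append(i)
        i += shift.get(window[-1], m)
    return hits

print(raita_search("the rain in spain stays mainly", "ain"))  # [5, 14, 25]
```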
Fault-Tolerant Signal Processing Architectures with Distributed Error Control.
1985-01-01
A Kinder and Gentler Transformation?
ERIC Educational Resources Information Center
Katz, Richard N.; Goldstein, Larry; Dobbin, Gregory
2001-01-01
Discusses the shifting focus regarding information technology in higher education from technology itself toward the people and business processes connected with it. Describes the University of California's efforts toward a new business architecture, and an Educause-sponsored forum on e-business discussing the same themes. Offers discussion of the…
Internet Architecture: Lessons Learned and Looking Forward
2006-12-01
Geoffrey G. Xie; Department of Computer Science, Naval Postgraduate School; April 2006.
An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures
NASA Astrophysics Data System (ADS)
Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.
2009-07-01
A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.
Flexible medical image management using service-oriented architecture.
Shaham, Oded; Melament, Alex; Barak-Corren, Yuval; Kostirev, Igor; Shmueli, Noam; Peres, Yardena
2012-01-01
Management of medical images increasingly involves the need for integration with a variety of information systems. To address this need, we developed Content Management Offering (CMO), a platform for medical image management supporting interoperability through compliance with standards. CMO is based on the principles of service-oriented architecture, implemented with emphasis on three areas: clarity of business process definition, consolidation of service configuration management, and system scalability. Owing to the flexibility of this platform, a small team is able to accommodate requirements of customers varying in scale and in business needs. We describe two deployments of CMO, highlighting the platform's value to customers. CMO represents a flexible approach to medical image management, which can be applied to a variety of information technology challenges in healthcare and life sciences organizations.
Development of a graphical user interface for the global land information system (GLIS)
Alstad, Susan R.; Jackson, David A.
1993-01-01
The process of developing a Motif graphical user interface for the Global Land Information System (GLIS) involved incorporating user requirements, in-house visual and functional design requirements, and Open Software Foundation (OSF) Motif style guide standards. Motif user interface windows were developed, and the software supporting the Motif window functions was written in the C programming language. The GLIS architecture was modified to support multiple servers and remote handlers running the X Window System by forming a network of servers and handlers connected by TCP/IP communications. In April 1993, prior to release, the GLIS graphical user interface and system architecture modifications were tested by developers and users located at the EROS Data Center and 11 beta test sites across the country.
A novel software architecture for the provision of context-aware semantic transport information.
Moreno, Asier; Perallos, Asier; López-de-Ipiña, Diego; Onieva, Enrique; Salaberria, Itziar; Masegosa, Antonio D
2015-05-26
The effectiveness of Intelligent Transportation Systems depends largely on the ability to integrate information from diverse sources and on the suitability of this information for the specific user. This paper describes a new approach for the management and exchange of this information, related to multimodal transportation. A novel software architecture is presented, with particular emphasis on the design of the data model and the enablement of services for information retrieval, thereby obtaining a semantic model for the representation of transport information. The publication of transport data as semantic information is established through the development of a Multimodal Transport Ontology (MTO) and the design of a distributed architecture allowing dynamic integration of transport data. The advantages afforded by the proposed system through the use of Linked Open Data and a distributed architecture are stated, and the system is compared with other existing solutions. The adequacy of the information generated with regard to the specific user's context is also addressed. Finally, a working solution of a semantic trip planner using actual transport data and running on the proposed architecture is presented, as a demonstration and validation of the system.
Standardizing the information architecture for spacecraft operations
NASA Technical Reports Server (NTRS)
Easton, C. R.
1994-01-01
This paper presents an information architecture developed for the Space Station Freedom as a model from which to derive an information architecture standard for advanced spacecraft. The information architecture provides a way of making information available across a program, and among programs, assuming that the information will be in a variety of local formats, structures and representations. It provides a format that can be expanded to define all of the physical and logical elements that make up a program, add definitions as required, and import definitions from prior programs to a new program. It allows a spacecraft and its control center to work in different representations and formats, with the potential for supporting existing spacecraft from new control centers. It supports a common view of data and control of all spacecraft, regardless of their own internal view of their data and control characteristics, and of their communications standards, protocols and formats. This information architecture is central to standardizing spacecraft operations, in that it provides a basis for information transfer and translation, such that diverse spacecraft can be monitored and controlled in a common way.
Jupiter Europa Orbiter Architecture Definition Process
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Shishko, Robert
2011-01-01
The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.
NASA Astrophysics Data System (ADS)
Baik, A.; Yaagoubi, R.; Boehm, J.
2015-08-01
This work outlines a new approach for the integration of 3D Building Information Modelling and 3D Geographic Information Systems (GIS) to provide semantically rich models and to draw on the benefits of both systems to help document and analyse cultural heritage sites. Our proposed framework is based on the Jeddah Historical Building Information Modelling process (JHBIM). This JHBIM consists of a Hijazi Architectural Objects Library (HAOL) that supports a higher level of detail (LoD) while decreasing the modelling time. The Hijazi Architectural Objects Library has been modelled based on Islamic historical manuscripts and Hijazi architectural pattern books. Moreover, the HAOL is implemented using BIM software called Autodesk Revit. However, it is known that this BIM environment still has some limitations with non-standard architectural objects. Hence, we propose to integrate the developed 3D JHBIM with 3D GIS for more advanced analysis. To do so, the JHBIM database is exported and semantically enriched with non-architectural information that is necessary for the restoration and preservation of historical monuments. After that, this database is integrated with the 3D model in the 3D GIS solution. At the end of this paper, we illustrate our proposed framework by applying it to a historical building in Old Jeddah, the Nasif Historical House, chosen as a test case for the project. First, the building is scanned using a Terrestrial Laser Scanner (TLS) and close-range photogrammetry. Then, the 3D JHBIM based on the HAOL is designed on the Revit platform. Finally, this model is integrated into a 3D GIS solution through Autodesk InfraWorks. The analysis presented in this research highlights the importance of such integration, especially for operational decisions and for sharing historical knowledge about Jeddah Historical City.
NASA Technical Reports Server (NTRS)
Laurini, Kathleen C.; Hufenbach, Bernhard; Junichiro, Kawaguchi; Piedboeuf, Jean-Claude; Schade, Britta; Lorenzoni, Andrea; Curtis, Jeremy; Hae-Dong, Kim
2010-01-01
The International Space Exploration Coordination Group (ISECG) was established in response to The Global Exploration Strategy: The Framework for Coordination, developed by fourteen space agencies and released in May 2007. Several ISECG participating space agencies have been studying concepts for human exploration of the moon that allow individual and collective goals and objectives to be met. This 18-month study activity culminated in the development of the ISECG Reference Architecture for Human Lunar Exploration. The reference architecture is a series of elements delivered over time in a flexible and evolvable campaign. This paper will describe the reference architecture and how it will inform near-term and long-term programmatic planning within interested agencies. The reference architecture is intended to serve as a global point-of-departure conceptual architecture that enables individual agency investments in technology development and demonstration, International Space Station research and technology demonstration, terrestrial analog studies, and robotic precursor missions to contribute towards the eventual implementation of a human lunar exploration scenario which reflects the concepts and priorities established to date. It also serves to create opportunities for partnerships that will support the evolution of this concept and its eventual realization. The ISECG Reference Architecture for Human Lunar Exploration (commonly referred to as the lunar gPoD) reflects the agencies' commitment to finding an effective balance between conducting important scientific investigations of and from the moon and demonstrating and mastering the technologies and capabilities to send humans farther into the Solar System. The lunar gPoD begins with a robust robotic precursor phase that demonstrates technologies and capabilities considered important for the success of the campaign. Robotic missions will inform the human missions and buy down risks. Human exploration will start with a thorough scientific investigation of the polar region while allowing the ability to demonstrate and validate the systems needed to take humans on more ambitious lunar exploration excursions. The ISECG Reference Architecture for Human Lunar Exploration serves as a model for future cooperation and is documented in a summary report and a comprehensive document that also describes the collaborative international process that led to its development. ISECG plans to continue with architecture studies such as this to examine an open transportation architecture and other destinations, with expanded participation from ISECG agencies, as it works to inform international partnerships and advance the Global Exploration Strategy.
Updated Mars Mission Architectures Featuring Nuclear Thermal Propulsion
NASA Technical Reports Server (NTRS)
Rodriguez, Mitchell A.; Percy, Thomas K.
2017-01-01
Nuclear thermal propulsion (NTP) can potentially enable routine human exploration of Mars and the solar system. By using nuclear fission instead of a chemical combustion process, with hydrogen as the propellant, NTP systems promise rocket efficiencies roughly twice those of the best chemical rocket engines currently available. The most recent major Mars architecture study featuring NTP was the Design Reference Architecture 5.0 (DRA 5.0), performed in 2009. Currently, the predominant transportation options being considered are solar electric propulsion (SEP) and chemical propulsion; however, given NTP's capabilities, an updated architectural analysis is needed. This paper provides a top-level overview of several different architectures featuring updated NTP performance data. New architectures presented include a proposed update to the DRA 5.0 as well as an investigation of architectures based on the current Evolvable Mars Campaign, which is the focus of NASA's current analyses for the Journey to Mars. The architectures investigated leverage the latest information on NTP performance and design considerations and address new support elements not available at the time of DRA 5.0, most notably the Orion crew module and the Space Launch System (SLS). The paper provides a top-level quantitative comparison of key performance metrics as well as a qualitative discussion of improvements and key challenges still to be addressed. Preliminary results indicate that the updated NTP architectures can significantly reduce campaign mass and, consequently, the costs for assembly and the number of launches.
An Assessment of Behavioral Dynamic Information Processing Measures in Audiovisual Speech Perception
Altieri, Nicholas; Townsend, James T.
2011-01-01
Research has shown that visual speech perception can assist accuracy in identification of spoken words. However, little is known about the dynamics of the processing mechanisms involved in audiovisual integration. In particular, architecture and capacity, measured using response time methodologies, have not been investigated. An issue related to architecture concerns whether the auditory and visual sources of the speech signal are integrated “early” or “late.” We propose that “early” integration most naturally corresponds to coactive processing whereas “late” integration corresponds to separate decisions parallel processing. We implemented the double factorial paradigm in two studies. First, we carried out a pilot study using a two-alternative forced-choice discrimination task to assess architecture, decision rule, and provide a preliminary assessment of capacity (integration efficiency). Next, Experiment 1 was designed to specifically assess audiovisual integration efficiency in an ecologically valid way by including lower auditory S/N ratios and a larger response set size. Results from the pilot study support a separate decisions parallel, late integration model. Results from both studies showed that capacity was severely limited for high auditory signal-to-noise ratios. However, Experiment 1 demonstrated that capacity improved as the auditory signal became more degraded. This evidence strongly suggests that integration efficiency is vitally affected by the S/N ratio. PMID:21980314
NASA Astrophysics Data System (ADS)
Maiti, Anup Kumar; Nath Roy, Jitendra; Mukhopadhyay, Sourangshu
2007-08-01
In the field of optical computing and parallel information processing, several number systems have been used for different arithmetic and algebraic operations. An efficient conversion scheme from one number system to another is therefore very important. The modified trinary number (MTN) system has already taken on a significant role in carry- and borrow-free arithmetic operations. In this communication, we propose a tree-net architecture based all-optical scheme for converting a binary number to its MTN form. An optical switch using nonlinear material (NLM) plays an important role in the scheme.
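The paper's all-optical MTN encoding is not reproduced here; as a software analogue only, the sketch below converts an integer into balanced trinary, a common signed trinary scheme with digit set {-1, 0, 1} that supports carry-free arithmetic tricks (whether MTN uses exactly this digit set is an assumption):

```python
def to_balanced_ternary(n):
    """Return balanced-ternary digits of n (least significant first),
    using the digit set {-1, 0, 1}."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 0:
            digits.append(0)
        elif r == 1:
            digits.append(1)
            n -= 1
        else:            # remainder 2 encodes as digit -1 with a carry
            digits.append(-1)
            n += 1
        n //= 3
    return digits

print(to_balanced_ternary(11))  # [-1, 1, 1], since 9 + 3 - 1 = 11
```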
NASA Astrophysics Data System (ADS)
Suarez, Hernan; Zhang, Yan R.
2015-05-01
New radar applications need to perform complex algorithms and process large quantities of data to generate useful information for the users. This situation has motivated the search for better processing solutions that include low-power high-performance processors, efficient algorithms, and high-speed interfaces. In this work, a hardware implementation of adaptive pulse compression for real-time transceiver optimization is presented, based on a System-on-Chip architecture for Xilinx devices. This study also evaluates the performance of dedicated coprocessors as hardware accelerator units to speed up and improve the computation of compute-intensive tasks such as matrix multiplication and matrix inversion, which are essential for solving the covariance matrix. The tradeoffs between latency and hardware utilization are also presented. Moreover, the system architecture takes advantage of the embedded processor, which is interconnected with the logic resources through high-performance AXI buses, to perform floating-point operations, control the processing blocks, and communicate with an external PC through a customized software interface. The overall system functionality is demonstrated and tested for real-time operations using a Ku-band testbed together with a low-cost channel emulator for different types of waveforms.
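As a minimal numerical sketch of the two compute kernels named above (a sample covariance built by matrix multiplication, then a linear solve standing in for explicit matrix inversion); the array sizes, complex data, and diagonal loading term are illustrative assumptions, not the system's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
# 64 complex snapshots of a 16-channel receiver (invented dimensions)
x = rng.standard_normal((64, 16)) + 1j * rng.standard_normal((64, 16))

R = x.conj().T @ x / x.shape[0]      # sample covariance: the matrix-multiply kernel
R += 1e-3 * np.eye(16)               # diagonal loading keeps R well-conditioned

s = np.ones(16, dtype=complex)       # placeholder steering/waveform vector
w = np.linalg.solve(R, s)            # the "matrix inversion" kernel: w = R^{-1} s
print(w.shape)                       # (16,)
```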
SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.
Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani
2016-01-01
Monitoring life-long diseases requires continuous measurement and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur, to expedite taking necessary actions and thus prevent severe consequences. Our architecture system is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture, so as to be easy to deploy and integrate, and supported by Cloud infrastructure and services to allow high scalability and availability of processes and of the data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed architecture's objectives, including resource awareness, smart data integration and visualization, cost reduction, and performance guarantee. Copyright © 2015 Elsevier Ltd. All rights reserved.
Service-Oriented Architecture Approach to MAGTF Logistics Support Systems
2013-09-01
NASA Astrophysics Data System (ADS)
Armstrong, Michael James
Increases in power demands and changes in the design practices of overall equipment manufacturers have led to a new paradigm in vehicle systems definition. The development of unique power systems architectures is of increasing importance to overall platform feasibility and must be pursued early in the aircraft design process. Many vehicle systems architecture trades must be conducted concurrently with platform definition. With the increased complexity introduced during conceptual design, accurate predictions of unit-level sizing requirements must be made. Architecture-specific emergent requirements must be identified, which arise from the complex integrated effect of unit behaviors. Off-nominal operating scenarios present sizing-critical requirements to the aircraft vehicle systems. These requirements are architecture-specific and emergent. Standard heuristically defined failure mitigation is sufficient for sizing traditional and evolutionary architectures. However, architecture concepts that vary significantly in structure and composition require that unique failure mitigation strategies be defined for accurate estimation of unit-level requirements. Identifying these off-nominal emergent operational requirements requires extensions to traditional safety and reliability tools and the systematic identification of optimal performance degradation strategies. Discrete operational constraints posed by traditional Functional Hazard Assessment (FHA) are replaced by continuous relationships between function loss and operational hazard. These relationships pose the objective function for hazard minimization. Load-shedding optimization is performed for all statistically significant failures by varying the allocation of functional capability throughout the vehicle systems architecture. Expressing hazards, and thereby reliability requirements, as continuous relationships with the magnitude and duration of functional failure requires augmenting the traditional means of system safety assessment (SSA). The traditional two-state, discrete system reliability assessment proves insufficient. Reliability is therefore handled in an analog fashion: as a function of failure magnitude and failure duration. A series of metrics is introduced that characterizes system performance in terms of analog hazard probabilities. These include analog and cumulative system and functional risk, hazard correlation, and extensions to the traditional component importance metrics. Continuous FHA, load-shedding optimization, and analog SSA constitute the SONOMA process (Systematic Off-Nominal Requirements Analysis). Analog system safety metrics inform both architecture optimization (changes in unit-level capability and reliability) and architecture augmentation (changes in architecture structure and composition). This process was applied to two vehicle systems concepts (conventional and 'more-electric') with loss/hazard relationships of varying degrees of fidelity. Application of this process shows that the traditional assumptions regarding the structure of the function-loss versus hazard relationship apply undue design bias to functions and components during exploratory design. This bias is illustrated in terms of inaccurate estimations of system- and function-level risk and unit-level importance. It was also shown that off-nominal emergent requirements must be defined specifically for each architecture concept.
Quantitative comparisons of architecture-specific off-nominal performance were obtained, providing evidence of the need for accurate definition of load-shedding strategies during architecture exploratory design. Formally expressing performance degradation strategies in terms of the minimization of a continuous hazard space enhances the system architect's ability to accurately predict sizing-critical emergent requirements concurrently with architecture definition. Furthermore, the methods and frameworks generated here provide a structured and flexible means for eliciting these architecture-specific requirements during the performance of architecture trades.
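As a toy illustration of load-shedding optimization against a continuous hazard objective (the function names, power demands, and hazard weights below are invented for the example, and exhaustive enumeration stands in for the optimization machinery of the SONOMA process):

```python
import itertools

# Hypothetical aircraft functions: (power demand in kW, hazard weight per kW shed)
functions = {"flight_controls": (10.0, 100.0),
             "avionics": (5.0, 30.0),
             "galley": (4.0, 0.5),
             "cabin_lighting": (2.0, 1.0)}

def best_shed(available_kw):
    """Enumerate shed sets; pick the one that fits the power budget with
    minimum total hazard (a continuous analogue of a discrete FHA table)."""
    best = None
    for r in range(len(functions) + 1):
        for shed in itertools.combinations(functions, r):
            kept = [f for f in functions if f not in shed]
            demand = sum(functions[f][0] for f in kept)
            hazard = sum(functions[f][0] * functions[f][1] for f in shed)
            if demand <= available_kw and (best is None or hazard < best[0]):
                best = (hazard, shed)
    return best

print(best_shed(16.0))  # after a partial failure: (4.0, ('galley', 'cabin_lighting'))
```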
Optimization of Wireless Transceivers under Processing Energy Constraints
NASA Astrophysics Data System (ADS)
Wang, Gaojian; Ascheid, Gerd; Wang, Yanlu; Hanay, Oner; Negra, Renato; Herrmann, Matthias; Wehn, Norbert
2017-09-01
The focus of this article is on achieving maximum data rates under a processing energy constraint. For a given amount of processing energy per information bit, the overall power consumption increases with the data rate. When targeting data rates beyond 100 Gb/s, the system's overall power consumption soon exceeds the power that can be dissipated without forced cooling. To achieve a maximum data rate under this power constraint, the processing energy per information bit must be minimized. Therefore, in this article, suitable processing-efficient transmission schemes together with energy-efficient architectures and their implementations are investigated in a true cross-layer approach. Target use cases are short-range wireless transmitters working at carrier frequencies around 60 GHz with bandwidths between 1 GHz and 10 GHz.
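The constraint can be made concrete with a one-line calculation: with a dissipation budget P and processing energy per bit E, the achievable rate is bounded by P/E. A small sketch with illustrative numbers (the 1 W budget and 10 pJ/bit figure are assumptions for the example, not values from the article):

```python
def max_rate_bps(power_budget_w, energy_per_bit_j):
    # rate [bit/s] such that rate * energy_per_bit equals the power budget
    return power_budget_w / energy_per_bit_j

print(max_rate_bps(1.0, 10e-12))   # 1 W at 10 pJ/bit -> 1e+11 bit/s = 100 Gb/s
```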
Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process
NASA Technical Reports Server (NTRS)
Racette, Paul
2010-01-01
Characterization of non-stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non-stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non-stationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.
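A toy numerical analogue of the ensemble idea (the process, noise level, and ensemble size are invented for illustration): mixing calibrated noise into a measurement yields an ensemble whose time-resolved moments of a non-stationary process can be estimated directly, rather than from a single series:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 500)
signal = (1.0 + t) * np.sin(2 * np.pi * 3 * t)   # toy non-stationary process

# mix calibrated noise (known sigma = 0.2) to form a 200-member ensemble
ensemble = signal + 0.2 * rng.standard_normal((200, t.size))

mean_t = ensemble.mean(axis=0)    # time-resolved first moment
var_t = ensemble.var(axis=0)      # time-resolved second moment
print(mean_t.shape, var_t.shape)  # (500,) (500,)
```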
Architecture of COOPTO Remote Voting Solution
NASA Astrophysics Data System (ADS)
Silhavy, Radek; Silhavy, Petr; Prokopova, Zdenka
This contribution focuses on the investigation of a remote electronic voting system named COOPTO. Research into the suitability of electronic voting solutions is driven by the need to improve the election process. COOPTO is based on a detailed investigation of the voting process and its implementation using modern information and communication technology. COOPTO allows voters who are not in their election district to participate in the democratic process. The aim of this contribution is to describe the results of the development of the COOPTO solution.
Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective
NASA Astrophysics Data System (ADS)
Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.
1997-01-01
Recent developments have made it possible to interoperate complex business applications at much lower costs. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that utilizes information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions: one can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.
Shim, Vickie B; Hunter, Peter J; Pivonka, Peter; Fernandez, Justin W
2011-12-01
The initiation of osteoarthritis (OA) has been linked to the onset and progression of pathologic mechanisms at the cartilage-bone interface. Most importantly, this degenerative disease involves cross-talk between the cartilage and subchondral bone environments, so an informative model should contain the complete complex. In order to evaluate this process, we have developed a multiscale model using the open-source ontologies developed for the Physiome Project with cartilage and bone descriptions at the cellular, micro, and macro levels. In this way, we can effectively model the influence of whole body loadings at the macro level and the influence of bone organization and architecture at the micro level, and have cell level processes that determine bone and cartilage remodeling. Cell information is then passed up the spatial scales to modify micro architecture and provide a macro spatial characterization of cartilage inflammation. We evaluate the framework by linking a common knee injury (anterior cruciate ligament deficiency) to proinflammatory mediators as a possible pathway to initiate OA. This framework provides a "virtual bone-cartilage" tool for evaluating hypotheses, treatment effects, and disease onset to inform and strengthen clinical studies.
Spin transport in epitaxial graphene
NASA Astrophysics Data System (ADS)
Tbd, -
2014-03-01
Spintronics is a paradigm that uses spin as the information vector in fast and ultra-low-power non-volatile devices such as the new STT-MRAM. Beyond its widely deployed application in data storage, it aims at providing more complex architectures and a powerful beyond-CMOS solution for information processing. The recent discovery of graphene has opened novel exciting opportunities in terms of functionalities and performance for spintronic devices. We will present experimental results allowing us to assess the potential of graphene for spintronics. We will show that unprecedentedly efficient spin information transport can occur in epitaxial graphene, leading to large spin signals and macroscopic spin diffusion lengths (~100 microns), a key enabler for the advent of envisioned beyond-CMOS spin-based logic architectures. We will also show how the device behavior is well explained within the framework of the Valet-Fert drift-diffusion equations. Furthermore, we will show that a thin graphene passivation layer can prevent the oxidation of a ferromagnet, enabling its use in novel humid/ambient low-cost processes for spintronic devices, while keeping its highly surface-sensitive spin current polarizer/analyzer behavior and adding a new enhanced spin-filtering property. These different experiments unveil promising uses of graphene for spintronics.
Business Process Aware IS Change Management in SMEs
NASA Astrophysics Data System (ADS)
Makna, Janis
Changes in the business process usually require changes in the computer-supported information system and, vice versa, changes in the information system almost always cause at least some changes in the business process. In many situations it is not even possible to detect which of these changes are causes and which are effects. Nevertheless, it is possible to identify a set of changes that usually happen together when one element of the set changes its state. These sets of changes may be used as patterns for situation analysis, to anticipate the full range of activities needed to bring the business process and/or information system back to a stable state after it is lost because of changes in one of the elements. Knowledge about change patterns makes it possible to manage information system changes even when business process models and the information systems architecture are not neatly documented, as is the case in many SMEs. Using change patterns it is possible to know whether changes in information systems are to be expected and how changes in information system activities, data and users will impact different aspects of the business process supported by the information system.
Analysis of fault-tolerant neurocontrol architectures
NASA Technical Reports Server (NTRS)
Troudet, T.; Merrill, W.
1992-01-01
The fault tolerance of analog parallel distributed implementations of a multivariable aircraft neurocontroller is analyzed by simulating weight and neuron failures in a simplified scheme of analog processing based on the functional architecture of the ETANN chip (Electrically Trainable Artificial Neural Network). The neural information processing is found to be only partially distributed throughout the set of weights of the neurocontroller synthesized with the backpropagation algorithm. Although the degree of distribution of the neural processing, and consequently the fault tolerance of the neurocontroller, could be enhanced using Locally Distributed Weight and Neuron Approaches, a satisfactory level of fault tolerance could only be obtained by retraining the degraded VLSI neurocontroller. The possibility of maintaining neurocontrol performance and stability in the presence of single weight or neuron failures was demonstrated through an automated retraining procedure based on a pre-programmed choice and sequence of the training parameters.
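A minimal sketch of the fail-then-retrain experiment on a toy linear classifier (this is not the ETANN neurocontroller; the stuck-at-zero failure model, the perceptron update, and all sizes are simplifying assumptions):

```python
import numpy as np

def accuracy(w, x, y):
    return ((x @ w > 0).astype(int) == y).mean()

rng = np.random.default_rng(2)
x = rng.standard_normal((200, 10))
w_true = rng.standard_normal(10)
y = (x @ w_true > 0).astype(int)

failed = int(np.argmax(np.abs(w_true)))  # fail the most influential weight
w = w_true.copy()
w[failed] = 0.0
print("degraded accuracy:", accuracy(w, x, y))

# retrain around the fault: perceptron updates with the failed weight held at zero
for _ in range(50):
    for xi, yi in zip(x, y):
        w += 0.01 * (yi - (xi @ w > 0)) * xi
        w[failed] = 0.0          # the hardware fault persists
print("retrained accuracy:", accuracy(w, x, y))
```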
Information network architectures
NASA Technical Reports Server (NTRS)
Murray, N. D.
1985-01-01
Graphs, charts, diagrams, and outlines of information related to information network architectures for advanced aerospace missions, such as the Space Station, are presented. Local area information networks are considered a likely technology solution. The principal needs for the network are listed.
SecureCore Security Architecture: Authority Mode and Emergency Management
2007-10-16
Art and Museum Librarianship; A Syllabus and Bibliography. Bibliographic Studies Number One.
ERIC Educational Resources Information Center
Lemke, Antje B.
This outline of library science in the area of museology and art history provides bibliographies on various facets of art librarianship; art; architecture; museums; history; current state; journals; professional programs and organizations; relationship with government, foundations, and business; information sources; processing of art books,…
Linguistically Motivated Features for CCG Realization Ranking
ERIC Educational Resources Information Center
Rajkumar, Rajakrishnan
2012-01-01
Natural Language Generation (NLG) is the process of generating natural language text from an input, which is a communicative goal and a database or knowledge base. Informally, the architecture of a standard NLG system consists of the following modules (Reiter and Dale, 2000): content determination, sentence planning (or microplanning) and surface…
MONARCH: A Morphable Networked micro-ARCHitecture
2002-09-01
Wright, Adam; Sittig, Dean F.
2008-01-01
In this paper we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. PMID:18434256
NASA Astrophysics Data System (ADS)
van de Burgt, Yoeri; Lubberman, Ewout; Fuller, Elliot J.; Keene, Scott T.; Faria, Grégorio C.; Agarwal, Sapan; Marinella, Matthew J.; Alec Talin, A.; Salleo, Alberto
2017-04-01
The brain is capable of massively parallel information processing while consuming only ~1-100 fJ per synaptic event. Inspired by the efficiency of the brain, CMOS-based neural architectures and memristors are being developed for pattern recognition and machine learning. However, the volatility, design complexity and high supply voltages for CMOS architectures, and the stochastic and energy-costly switching of memristors complicate the path to achieve the interconnectivity, information density, and energy efficiency of the brain using either approach. Here we describe an electrochemical neuromorphic organic device (ENODe) operating with a fundamentally different mechanism from existing memristors. ENODe switches at low voltage and energy (<10 pJ for 10³ μm² devices), displays >500 distinct, non-volatile conductance states within a ~1 V range, and achieves high classification accuracy when implemented in neural network simulations. Plastic ENODes are also fabricated on flexible substrates enabling the integration of neuromorphic functionality in stretchable electronic systems. Mechanical flexibility makes ENODes compatible with three-dimensional architectures, opening a path towards extreme interconnectivity comparable to the human brain.
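A hedged sketch of how such a finite set of programmable conductance states might enter the neural-network simulations mentioned above: weights are clipped to an assumed normalized device range and snapped to one of 512 discrete levels (the paper reports >500 states; the normalization and exact level count here are assumptions):

```python
import numpy as np

LEVELS = 512                 # assumed count; the paper reports >500 states
G_MIN, G_MAX = 0.0, 1.0      # assumed normalized conductance range

def program(w):
    """Clip weights to the device range and snap each to the nearest
    programmable conductance state."""
    step = (G_MAX - G_MIN) / (LEVELS - 1)
    w = np.clip(w, G_MIN, G_MAX)
    return G_MIN + np.round((w - G_MIN) / step) * step

rng = np.random.default_rng(0)
w = rng.uniform(size=(3, 3))
err = np.abs(program(w) - w).max()
print(err <= 0.5 * (G_MAX - G_MIN) / (LEVELS - 1))   # True: error <= half a step
```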
Sample RFP for Architectural Services, 2000.
ERIC Educational Resources Information Center
Arizona State School Facilities Board, Phoenix.
This document presents a sample request for proposal that Arizona school districts can use when requesting architectural services, from the general request requirements to response information and signature sheet. General proposal requirements cover such areas as information on special terms and conditions, the scope of architectural services…
Velsko, Stephan; Bates, Thomas
2016-06-17
Despite numerous calls for improvement, the U.S. biosurveillance enterprise remains a patchwork of uncoordinated systems that fail to take advantage of the rapid progress in information processing, communication, and analytics made in the past decade. By synthesizing components from the extensive biosurveillance literature, we propose a conceptual framework for a national biosurveillance architecture and provide suggestions for implementation. The framework differs from the current federal biosurveillance development pathway in that it is not focused on systems useful for “situational awareness,” but is instead focused on the long-term goal of having true warning capabilities. Therefore, a guiding design objective is the ability to digitally detect emerging threats that span jurisdictional boundaries, because attempting to solve the most challenging biosurveillance problem first provides the strongest foundation to meet simpler surveillance objectives. Core components of the vision are: (1) a whole-of-government approach to support currently disparate federal surveillance efforts that have a common data need, including those for food safety, vaccine and medical product safety, and infectious disease surveillance; (2) an information architecture that enables secure, national access to electronic health records, yet does not require that data be sent to a centralized location for surveillance analysis; (3) an inference architecture that leverages advances in ‘big data’ analytics and learning inference engines, a significant departure from the statistical process control paradigm that underpins nearly all current syndromic surveillance systems; and (4) an organizational architecture with a governance model aimed at establishing national biosurveillance as a critical part of the U.S. national infrastructure. Although it will take many years to implement, and a national campaign of education and debate to acquire public buy-in for such a comprehensive system, the potential benefits warrant increased consideration within the U.S. government.
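A conceptual sketch of component (2), analysis without centralizing records: each site computes an aggregate locally and only the aggregates leave the site (the site names, record schema, and query are invented for illustration):

```python
# Federated count query: patient-level records never leave their site.
site_records = {
    "clinic_a": [{"syndrome": "ILI"}, {"syndrome": "GI"}],
    "clinic_b": [{"syndrome": "ILI"}, {"syndrome": "ILI"}],
}

def local_count(records, syndrome):
    # runs inside each site's boundary; returns only an aggregate
    return sum(r["syndrome"] == syndrome for r in records)

national_ili = sum(local_count(recs, "ILI") for recs in site_records.values())
print(national_ili)   # 3 -- computed without pooling any raw records
```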
Natural language processing: an introduction.
Nadkarni, Prakash M; Ohno-Machado, Lucila; Chapman, Wendy W
2011-01-01
To provide an overview and tutorial of natural language processing (NLP) and modern NLP-system design. This tutorial targets the medical informatics generalist who has limited acquaintance with the principles behind NLP and/or limited knowledge of the current state of the art. We describe the historical evolution of NLP, and summarize common NLP sub-problems in this extensive field. We then provide a synopsis of selected highlights of medical NLP efforts. After providing a brief description of common machine-learning approaches that are being used for diverse NLP sub-problems, we discuss how modern NLP architectures are designed, with a summary of the Apache Foundation's Unstructured Information Management Architecture. We finally consider possible future directions for NLP, and reflect on the possible impact of IBM Watson on the medical field.
Introducing Technical Aspects of Research Data Management in the Leipzig Health Atlas.
Meineke, Frank A; Löbe, Matthias; Stäubert, Sebastian
2018-01-01
Medical research is an active field in which a wide range of information is collected, collated, combined, and analyzed. Essential results are reported in publications, but it is often problematic to make the data (raw and processed), algorithms, and tools associated with a publication available. The Leipzig Health Atlas (LHA) project has therefore set itself the goal of providing a repository for this purpose and enabling controlled access to it via a web-based portal. A data sharing concept in accordance with FAIR and OAIS is the basis for the processing and provision of data in the LHA. An IT architecture has been designed for this purpose. The paper presents essential aspects of the data sharing concept, the IT architecture, and the methods used.
A message passing kernel for the hypercluster parallel processing test bed
NASA Technical Reports Server (NTRS)
Blech, Richard A.; Quealy, Angela; Cole, Gary L.
1989-01-01
A Message-Passing Kernel (MPK) for the Hypercluster parallel-processing test bed is described. The Hypercluster is being developed at the NASA Lewis Research Center to support investigations of parallel algorithms and architectures for computational fluid and structural mechanics applications. The Hypercluster resembles the hypercube architecture except that each node consists of multiple processors communicating through shared memory. The MPK efficiently routes information through the Hypercluster, using a message-passing protocol when necessary and faster shared-memory communication whenever possible. The MPK also interfaces all of the processors with the Hypercluster operating system (HYCLOPS), which runs on a Front-End Processor (FEP). This approach distributes many of the I/O tasks to the Hypercluster processors and eliminates the need for a separate I/O support program on the FEP.
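A minimal sketch of the routing decision described above: use shared memory when sender and receiver live on the same node, and fall back to message passing across nodes (the data structures are simple stand-ins, not the kernel's actual interfaces):

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Proc:
    node: int   # which Hypercluster node the processor belongs to
    pid: int    # processor index within the node

mailboxes = defaultdict(list)   # per-processor shared-memory mailboxes
network_log = []                # stand-in for the message-passing protocol

def send(msg, src, dst):
    if src.node == dst.node:
        mailboxes[dst].append(msg)                      # fast intra-node path
    else:
        network_log.append((src.node, dst.node, msg))   # inter-node message

send("halo-exchange", Proc(0, 0), Proc(0, 1))   # same node -> shared memory
send("halo-exchange", Proc(0, 0), Proc(1, 0))   # different node -> network
print(len(mailboxes[Proc(0, 1)]), len(network_log))   # 1 1
```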
Rapid prototyping strategy for a surgical data warehouse.
Tang, S-T; Huang, Y-F; Hsiao, M-L; Yang, S-H; Young, S-T
2003-01-01
Healthcare processes typically generate an enormous volume of patient information. This information largely represents unexploited knowledge, since current hospital operational systems (e.g., HIS, RIS) are not suitable for knowledge exploitation. Data warehousing provides an attractive method for solving these problems, but the process is very complicated. This study presents a novel strategy for effectively implementing a healthcare data warehouse. This study adopted the rapid prototyping (RP) method, which involves intensive interactions. System developers and users were closely linked throughout the life cycle of the system development. The presence of iterative RP loops meant that the system requirements were increasingly integrated and problems were gradually solved, such that the prototype system evolved into the final operational system. The results were analyzed by monitoring the series of iterative RP loops. First a definite workflow for ensuring data completeness was established, taking a patient-oriented viewpoint when collecting the data. Subsequently the system architecture was determined for data retrieval, storage, and manipulation. This architecture also clarifies the relationships among the novel system and legacy systems. Finally, a graphic user interface for data presentation was implemented. Our results clearly demonstrate the potential for adopting an RP strategy in the successful establishment of a healthcare data warehouse. The strategy can be modified and expanded to provide new services or support new application domains. The design patterns and modular architecture used in the framework will be useful in solving problems in different healthcare domains.
Distributed Data Collection for the ATLAS EventIndex
NASA Astrophysics Data System (ADS)
Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.
2015-12-01
The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then is unpacked, sorted and reformatted in order to be stored and catalogued into a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture to convey this information from the running jobs through the messaging system to the Hadoop server.
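A toy producer/consumer sketch of the flow described above, with Python's standard-library queue standing in for the ActiveMQ broker and a dictionary standing in for the Hadoop catalogue (the record fields and GUID format are assumptions):

```python
import json
import queue
import threading

broker = queue.Queue()   # stand-in for the central ActiveMQ broker

def produce(job_id, events):
    # one snippet per permanent output file: (file GUID, internal pointer) pairs
    broker.put(json.dumps({"job": job_id, "events": events}))

def consume(catalog):
    while (snippet := broker.get()) is not None:
        for guid, pointer in json.loads(snippet)["events"]:
            catalog.setdefault(guid, []).append(pointer)  # unpack and catalogue

catalog = {}
worker = threading.Thread(target=consume, args=(catalog,))
worker.start()
produce("grid-job-1", [("file-guid-A", 42), ("file-guid-A", 43)])
broker.put(None)   # sentinel shuts the consumer down
worker.join()
print(catalog)     # {'file-guid-A': [42, 43]}
```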
Study on Global GIS architecture and its key technologies
NASA Astrophysics Data System (ADS)
Cheng, Chengqi; Guan, Li; Lv, Xuefeng
2009-09-01
Global GIS (G2IS) is a system that supports huge-data processing and direct global manipulation on a global grid based on a spheroid or ellipsoid surface. Building on the global subdivision grid (GSG), a Global GIS architecture is presented in this paper, taking advantage of computer cluster theory, space-time integration technology, and virtual reality technology. The Global GIS system architecture is composed of five layers: a data storage layer, a data representation layer, a network and cluster layer, a data management layer, and a data application layer. Within this architecture, a four-level protocol framework and a three-layer data management pattern are designed for the organization, management, and publication of spatial information. Three core supporting technologies, namely computer cluster theory, space-time integration technology, and virtual reality technology, and their application patterns in Global GIS are introduced in detail. The ideas presented in this paper represent an important development direction for GIS.
Cavity-based architecture to preserve quantum coherence and entanglement
NASA Astrophysics Data System (ADS)
Man, Zhong-Xiao; Xia, Yun-Jie; Lo Franco, Rosario
2015-09-01
Quantum technology relies on the utilization of resources, like quantum coherence and entanglement, which allow quantum information and computation processing. This achievement is however jeopardized by the detrimental effects of the environment surrounding any quantum system, so that finding strategies to protect quantum resources is essential. Non-Markovian and structured environments are useful tools to this aim. Here we show how a simple environmental architecture made of two coupled lossy cavities enables a switch between Markovian and non-Markovian regimes for the dynamics of a qubit embedded in one of the cavities. Furthermore, qubit coherence can be indefinitely preserved if the cavity without the qubit is perfect. We then focus on entanglement control of two independent qubits locally subject to such an engineered environment and discuss its feasibility in the framework of circuit quantum electrodynamics. With up-to-date experimental parameters, we show that our architecture allows entanglement lifetimes orders of magnitude longer than the spontaneous lifetime without local cavity couplings. This cavity-based architecture is straightforwardly extendable to many qubits for scalability.
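For background, the textbook single lossy-cavity model already shows the kind of crossover exploited here; the excited-state amplitude c(t) of a qubit coupled to a lossy cavity with a Lorentzian spectral density obeys a memory-kernel equation (this is the standard damped Jaynes-Cummings result, quoted for orientation, not the paper's two-cavity model):

$$\dot{c}(t) = -\int_0^{t} f(t-t')\, c(t')\, dt', \qquad f(\tau) = \frac{\gamma_0 \lambda}{2}\, e^{-\lambda \tau},$$

where $\lambda$ is the spectral width and $\gamma_0$ the coupling strength: the dynamics are essentially Markovian for $\gamma_0 < \lambda/2$ and non-Markovian, with memory-induced oscillations, for $\gamma_0 > \lambda/2$. Broadly speaking, the second coupled cavity in the architecture above reshapes this effective spectral density, which is what makes the regime switchable.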
Molecular Sticker Model Simulation on Silicon for a Maximum Clique Problem
Ning, Jianguo; Li, Yanmei; Yu, Wen
2015-01-01
Molecular computers (also called DNA computers), as an alternative to traditional electronic computers, are smaller in size, more energy efficient, and have massive parallel processing capacity. However, DNA computers may not outperform electronic computers owing to their higher error rates and some limitations of the biological laboratory. The stickers model, as a typical DNA-based computer, is computationally complete and universal, and can be viewed as a bit-vertically operating machine. This makes it attractive for silicon implementation. Inspired by the information processing method of the stickers computer, we propose a novel parallel computing model called DEM (DNA Electronic Computing Model) on a System-on-a-Programmable-Chip (SOPC) architecture. Except for the significant difference in the computing medium (transistor chips rather than bio-molecules), the DEM works similarly to DNA computers in immense parallel information processing. Additionally, a plasma display panel (PDP) is used to show the change of solutions and helps us directly see the distribution of assignments. The feasibility of the DEM is tested by applying it to a maximum clique problem (MCP) with eight vertices. Owing to the limited computing resources on the SOPC architecture, the DEM can solve moderate-size problems in polynomial time. PMID:26075867
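A plain bitmask-enumeration sketch of an eight-vertex maximum clique instance (the example graph is invented; the bit-per-vertex encoding loosely mirrors the stickers model's bit-vertical operation, although this sketch runs sequentially rather than in parallel):

```python
from itertools import combinations

# Invented 8-vertex example graph; adj[u] is a bitmask of u's neighbours.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (3, 5), (5, 6), (6, 7)]
adj = [0] * 8
for u, v in edges:
    adj[u] |= 1 << v
    adj[v] |= 1 << u

def is_clique(vs):
    # every pair in vs must be connected
    return all(adj[u] >> v & 1 for u, v in combinations(vs, 2))

def max_clique(n=8):
    # exhaustive search, largest candidate sets first
    for k in range(n, 0, -1):
        for vs in combinations(range(n), k):
            if is_clique(vs):
                return vs

print(max_clique())  # (0, 1, 2)
```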
How to Build a Hybrid Neurofeedback Platform Combining EEG and fMRI
Mano, Marsel; Lécuyer, Anatole; Bannier, Elise; Perronnet, Lorraine; Noorzadeh, Saman; Barillot, Christian
2017-01-01
Multimodal neurofeedback estimates brain activity using information acquired with more than one neurosignal measurement technology. In this paper we describe how to set up and use a hybrid platform based on simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), then we illustrate how to use it for conducting bimodal neurofeedback experiments. The paper is intended for those willing to build a multimodal neurofeedback system, to guide them through the different steps of the design, setup, and experimental applications, and help them choose a suitable hardware and software configuration. Furthermore, it reports practical information from bimodal neurofeedback experiments conducted in our lab. The platform presented here has a modular parallel processing architecture that promotes real-time signal processing performance and simple future addition and/or replacement of processing modules. Various unimodal and bimodal neurofeedback experiments conducted in our lab showed high performance and accuracy. Currently, the platform is able to provide neurofeedback based on electroencephalography and functional magnetic resonance imaging, but the architecture and the working principles described here are valid for any other combination of two or more real-time brain activity measurement technologies. PMID:28377691
Masanz, James J; Ogren, Philip V; Zheng, Jiaping; Sohn, Sunghwan; Kipper-Schuler, Karin C; Chute, Christopher G
2010-01-01
We aim to build and evaluate an open-source natural language processing system for information extraction from electronic medical record clinical free-text. We describe and evaluate our system, the clinical Text Analysis and Knowledge Extraction System (cTAKES), released open-source at http://www.ohnlp.org. The cTAKES builds on existing open-source technologies—the Unstructured Information Management Architecture framework and OpenNLP natural language processing toolkit. Its components, specifically trained for the clinical domain, create rich linguistic and semantic annotations. Performance of individual components: sentence boundary detector accuracy=0.949; tokenizer accuracy=0.949; part-of-speech tagger accuracy=0.936; shallow parser F-score=0.924; named entity recognizer and system-level evaluation F-score=0.715 for exact and 0.824 for overlapping spans, and accuracy for concept mapping, negation, and status attributes for exact and overlapping spans of 0.957, 0.943, 0.859, and 0.580, 0.939, and 0.839, respectively. Overall performance is discussed against five applications. The cTAKES annotations are the foundation for methods and modules for higher-level semantic processing of clinical free-text. PMID:20819853
Information Architecture in Library and Information Science Curricula.
ERIC Educational Resources Information Center
Robins, David
2002-01-01
Discusses how information architecture is being handled in some library and information science (LIS) programs and suggests mappings between traditional LIS curricula and the marketplace for information architects. Topics include terminology used in LIS curricula; current job opportunities; and projections for the future. (LRW)
A Cognitive Architecture for Human Performance Process Model Research
1992-11-01
High-Level Vision: Top-Down Processing in Neurally Inspired Architectures
2008-02-01
NASA Astrophysics Data System (ADS)
Kirst, Christoph
It is astonishing how the sub-parts of a brain co-act to produce coherent behavior. What are the mechanisms that coordinate information processing and communication, and how can they be changed flexibly to cope with variable contexts? Here we show that when information is encoded in the deviations around a collective dynamical reference state of a recurrent network, the propagation of these fluctuations depends strongly on precisely this underlying reference. Information here 'surfs' on top of the collective dynamics, and switching between reference states enables fast and flexible rerouting of information. This in turn affects local processing and consequently the global reference dynamics, which re-regulate the distribution of information. This provides a generic mechanism for self-organized information processing, as we demonstrate with an oscillatory Hopfield network that performs contextual pattern recognition. Deep neural networks have proven very successful recently. Here we show that generating information channels via collective reference dynamics can effectively compress a deep multi-layer architecture into a single layer, making this mechanism a promising candidate for the organization of information processing in biological neuronal networks.
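A minimal Hopfield recall sketch (the standard Hebbian, non-oscillatory variant; the patterns and noisy cue are invented, and the paper's contextual switching mechanism is not modeled here):

```python
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)               # Hebbian weights, no self-coupling

state = np.array([1, -1, 1, -1, 1, -1, -1, -1])   # cue: pattern 0 with one unit flipped
for _ in range(5):                                # synchronous recall updates
    state = np.sign(W @ state).astype(int)
print(state)   # recovers the stored pattern closest to the cue
```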
Efficient self-organizing multilayer neural network for nonlinear system modeling.
Han, Hong-Gui; Wang, Li-Dan; Qiao, Jun-Fei
2013-07-01
It has been shown extensively that the dynamic behaviors of a neural system are strongly influenced by the network architecture and the learning process. To establish an artificial neural network (ANN) with a self-organizing architecture and a suitable learning algorithm for nonlinear system modeling, an automatic axon-neural network (AANN) is investigated in the following respects. First, the network architecture is constructed automatically, changing both the number of hidden neurons and the topology of the network during training. The adaptive connecting-and-pruning algorithm (ACP) is a mixed-mode operation, equivalent to pruning or adding connections between neurons as well as inserting required neurons directly. Second, the weights are adjusted using a feedforward computation (FC) to obtain the gradient information during learning. Unlike most previous studies, the AANN is able to self-organize both its architecture and its weights, improving network performance. The proposed AANN has been tested on a number of benchmark problems, ranging from nonlinear function approximation to nonlinear system modeling. The experimental results show that the AANN can outperform some existing neural networks. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
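A hedged sketch of a self-organizing step in the spirit of connecting-and-pruning (the pruning criterion, thresholds, and sizes below are assumptions for illustration; the ACP algorithm's actual rules are defined in the paper):

```python
import numpy as np

def adapt_architecture(W_out, hidden_act, train_error, tol=0.05, prune_frac=0.01):
    # Contribution of each hidden neuron: outgoing weight magnitude
    # scaled by how much the neuron's activation actually varies.
    contrib = np.abs(W_out).sum(axis=1) * hidden_act.var(axis=0)
    keep = contrib > prune_frac * contrib.max()   # prune negligible neurons
    add_neuron = train_error > tol                # grow when the net underfits
    return keep, add_neuron

rng = np.random.default_rng(0)
W_out = rng.standard_normal((8, 1))        # 8 hidden neurons -> 1 output
hidden_act = rng.uniform(size=(100, 8))    # activations over 100 samples
hidden_act[:, 2] = 0.5                     # a neuron with constant output...
keep, add_neuron = adapt_architecture(W_out, hidden_act, train_error=0.02)
print(keep, add_neuron)                    # ...is flagged for pruning
```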
A Multi-Agent System Architecture for Sensor Networks
Fuentes-Fernández, Rubén; Guijarro, María; Pajares, Gonzalo
2009-01-01
The design of control systems for sensor networks presents important challenges. Besides the traditional problems of how to process the sensor data to obtain the target information, engineers need to consider additional aspects such as the heterogeneity and large number of sensors, and the flexibility of these networks regarding topologies and the sensors in them. Although there are partial approaches for resolving these issues, their integration relies on ad hoc solutions requiring significant development effort. In order to provide an effective approach for this integration, this paper proposes an architecture based on the multi-agent system paradigm with a clear separation of concerns. The architecture considers sensors as devices used by an upper layer of manager agents. These agents are able to communicate and negotiate services to achieve the required functionality. Activities are organized according to roles related to the different aspects to integrate, mainly sensor management, data processing, communication, and adaptation to changes in the available devices and their capabilities. This organization largely isolates and decouples the data management from the changing network, while encouraging reuse of solutions. The use of the architecture is facilitated by a specific modelling language developed through metamodelling. A case study concerning a generic distributed system for fire fighting illustrates the approach and its comparison with related work. PMID:22303172
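As a toy illustration of this separation of concerns (hypothetical Python classes of our own, not the paper's agent metamodel), sensors stay plain devices while a manager agent composes its behavior from narrowly scoped roles:

```python
# Minimal sketch: sensors are devices; a manager agent's behavior is the
# composition of the roles it plays (sensor management, data processing, ...).
class Sensor:
    def __init__(self, sensor_id, kind):
        self.sensor_id, self.kind = sensor_id, kind
    def read(self):
        return {"id": self.sensor_id, "kind": self.kind, "value": 42.0}  # stub

class ManagerAgent:
    """An agent whose behavior is the union of the roles it plays."""
    def __init__(self, roles):
        self.roles = roles
    def step(self, sensors):
        data = [s.read() for s in sensors]           # sensor-management role
        for role in self.roles:
            data = role(data)                        # data-processing, adaptation, ...
        return data

drop_missing = lambda readings: [r for r in readings if r["value"] is not None]
agent = ManagerAgent(roles=[drop_missing])
print(agent.step([Sensor("t1", "temperature")]))
```

Swapping or adding roles changes the agent's behavior without touching the sensor layer, which is the decoupling the abstract describes.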
A fault-tolerant multiprocessor architecture for aircraft, volume 1. [autopilot configuration]
NASA Technical Reports Server (NTRS)
Smith, T. B.; Hopkins, A. L.; Taylor, W.; Ausrotas, R. A.; Lala, J. H.; Hanley, L. D.; Martin, J. H.
1978-01-01
A fault-tolerant multiprocessor architecture is reported. This architecture, together with a comprehensive information system architecture, has important potential for future aircraft applications. A preliminary definition and assessment of a suitable multiprocessor architecture for such applications are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Kenneth; Oxstrand, Johanna
The Digital Architecture effort is part of the Department of Energy (DOE)-sponsored Light-Water Reactor Sustainability (LWRS) Program conducted at Idaho National Laboratory (INL). The LWRS program is performed in close collaboration with industry research and development (R&D) programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants (NPPs). One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. Therefore, a major objective of the LWRS program is the development of a seamless digital environment for plant operations and support by integrating information from plant systems with plant processes for nuclear workers through an array of interconnected technologies. To get the most benefit from the advanced technologies suggested by the different research activities in the LWRS program, nuclear utilities need a digital architecture in place to support them. A digital architecture can be defined as a collection of information technology (IT) capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. Many processes within the plant can be greatly improved, from both a system and a human performance perspective, by utilizing a plant-wide (or near plant-wide) wireless network. For example, such a network allows real-time plant status information to be accessed easily in the control room, field workers' computer-based procedures to be updated based on real-time plant status, and the status of ongoing procedures to be incorporated into smart schedules in the outage command center for more accurate planning of critical tasks. The goal of the Digital Architecture project is to provide a long-term strategy to integrate plant systems, plant processes, and plant workers. This includes technologies to improve nuclear worker efficiency and human performance; to offset a range of plant surveillance and testing activities with new on-line monitoring technologies; to improve command, control, and collaboration in settings such as outage control centers and work execution centers; and to improve operator performance with new operator-aid technologies for the control room. The requirements identified through the activities of the Digital Architecture project will be used to estimate the amount of traffic on the network and hence the minimum bandwidth needed.
A Novel Software Architecture for the Provision of Context-Aware Semantic Transport Information
Moreno, Asier; Perallos, Asier; López-de-Ipiña, Diego; Onieva, Enrique; Salaberria, Itziar; Masegosa, Antonio D.
2015-01-01
The effectiveness of Intelligent Transportation Systems depends largely on the ability to integrate information from diverse sources and the suitability of this information for the specific user. This paper describes a new approach for the management and exchange of this information, related to multimodal transportation. A novel software architecture is presented, with particular emphasis on the design of the data model and the enablement of services for information retrieval, thereby obtaining a semantic model for the representation of transport information. The publication of transport data as semantic information is established through the development of a Multimodal Transport Ontology (MTO) and the design of a distributed architecture allowing dynamic integration of transport data. The advantages afforded by the proposed system through its use of Linked Open Data and a distributed architecture are stated and compared with other existing solutions. The adequacy of the information generated with regard to the specific user's context is also addressed. Finally, a working solution of a semantic trip planner using actual transport data and running on the proposed architecture is presented, as a demonstration and validation of the system. PMID:26016915
NASA Technical Reports Server (NTRS)
Farah, Jeffrey J.
1992-01-01
Developing a robust, task-level error recovery and on-line planning architecture is an open research area. There is previously published work on both error recovery and on-line planning; however, none incorporates error recovery and on-line planning into one integrated platform. The integration of these two functionalities requires an architecture that possesses the following characteristics. The architecture must provide for the inclusion of new information without the destruction of existing information. The architecture must provide for relating pieces of information, old and new, to one another in a non-trivial rather than trivial manner (e.g., object one is related to object two under the following constraints, versus: yes, they are related; no, they are not related). Finally, the architecture must not only stand alone, but also be easily integrable as a supplement to some existing architecture. This thesis proposal addresses architectural development. Its intent is to integrate error recovery and on-line planning onto a single, integrated, multi-processor platform. This intelligent x-autonomous platform, called the Planning Coordinator, will be used initially to supplement existing x-autonomous systems and eventually replace them.
EPA's Information Architecture and Web Taxonomy
EPA's Information Architecture creates a topical organization of our website, instead of an ownership-based organization. The EPA Web Taxonomy allows audiences easy access to relevant information from EPA programs, by using a common vocabulary.
Possible Circuit Architectures for Molecular Nanoelectronics
NASA Astrophysics Data System (ADS)
Likharev, Konstantin
2003-03-01
Chemically-directed self-assembly of molecular devices is apparently the only feasible way to continue the fast progress of microelectronics after its Moore's-Law-based development runs into the wall of physical and economic limitations [1]. The architectures of VLSI circuits using such devices should be substantially fault-tolerant and accommodate their other features, including low transconductance. The most significant feature of all promising architectures suggested so far is the hybridization of three technologies: advanced CMOS, simple nanowire arrays, and molecular devices self-assembled on these wires. Molecular memory arrays may have a simple structure, and simple prototypes have already been implemented experimentally [2]. In contrast, logic circuit development is just starting. I will describe a family of neuromorphic networks based on so-called CrossNet arrays [3] that look promising for advanced information processing, starting from fast image recognition and beyond. This architecture may combine very high density (above 10^12 functions per cm^2) and relatively high speed (100-ns-scale latency of cell-to-cell communications) at acceptable power consumption. In the future, these features may make it possible to put an artificial analog of the human cerebral cortex, capable of processing information and (hopefully) self-evolving 4 to 5 orders of magnitude faster than its biological prototype, on a 20x20 cm^2 silicon wafer. [1] K. Likharev, "Electronics Below 20-nm", see http://rsfq1.physics.sunysb.edu/ likharev/nano/ForMorkoc.pdf. [2] See, e.g., http://nanotechweb.org/articles/news/1/9/8/1. [3] O. Turel and K. Likharev, Int. J. of Circuit Theory and Applications 31, No. 1 (2003); see http://rsfq1.physics.sunysb.edu/ likharev/nano/Preprint070102.pdf.
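The computational primitive underlying such crossbar arrays can be sketched in a few lines (a generic illustration under our own assumptions, not Likharev's circuit design): crosspoint conductances weight the row voltages, and each column current sums its contributions, yielding a one-step matrix-vector product.

```python
# Generic crossbar sketch: conductances G act as synaptic weights, and
# Kirchhoff current summation performs the matrix-vector multiply in analog.
import numpy as np

rng = np.random.default_rng(2)
G = rng.uniform(0.0, 1.0, size=(4, 8))   # crosspoint conductances (weights)
V = rng.uniform(-1.0, 1.0, size=8)       # input voltages on the row wires

I = G @ V                                 # Kirchhoff sums: one current per column
y = np.tanh(I)                            # soma-like nonlinearity per output cell
print(y)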
1990-12-01
data rate to the electronics would be much lower on the average and the data much "richer" in information. Intelligent use of...system bottleneck, a high data rate should be provided by I/O systems. 2. machines with intelligent storage management specially designed for logic...management information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics, and propagation, and electronic reliability/maintainability and compatibility.
Quantitative imaging methods in osteoporosis.
Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G
2016-12-01
Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
NASA Enterprise Architecture and Its Use in Transition of Research Results to Operations
NASA Astrophysics Data System (ADS)
Frisbie, T. E.; Hall, C. M.
2006-12-01
Enterprise architecture describes the design of the components of an enterprise, their relationships and how they support the objectives of that enterprise. NASA Stennis Space Center leads several projects involving enterprise architecture tools used to gather information on research assets within NASA's Earth Science Division. In the near future, enterprise architecture tools will link and display the relevant requirements, parameters, observatories, models, decision systems, and benefit/impact information relationships and map to the Federal Enterprise Architecture Reference Models. Components configured within the enterprise architecture serving the NASA Applied Sciences Program include the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool. The Earth Science Components Knowledge Base systematically catalogues NASA missions, sensors, models, data products, model products, and network partners appropriate for consideration in NASA Earth Science applications projects. The Systems Components database is a centralized information warehouse of NASA's Earth Science research assets and a critical first link in the implementation of enterprise architecture. The Earth Science Architecture Tool is used to analyze potential NASA candidate systems that may be beneficial to decision-making capabilities of other Federal agencies. Use of the current configuration of NASA enterprise architecture (the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool) has far exceeded its original intent and has tremendous potential for the transition of research results to operational entities.
NASA Technical Reports Server (NTRS)
Bryant, N. A.; Zobrist, A. L.
1978-01-01
The paper describes the development of an image based information system and its use to process a Landsat thematic map showing land use or land cover in conjunction with a census tract polygon file to produce a tabulation of land use acreages per census tract. The system permits the efficient cross-tabulation of two or more geo-coded data sets, thereby setting the stage for the practical implementation of models of diffusion processes or cellular transformation. Characteristics of geographic information systems are considered, and functional requirements, such as data management, geocoding, image data management, and data analysis are discussed. The system is described, and the potentialities of its use are examined.
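The cross-tabulation step has a compact modern analogue (our own toy sketch with an assumed pixel size; the original system naturally predates this tooling): two co-registered rasters, one holding a land-cover class per pixel and one a census-tract ID, are combined into an acreage table per (tract, class) pair.

```python
# Hedged sketch of geo-coded cross-tabulation: count pixels per
# (tract, land-cover class) pair and convert counts to acreage.
import numpy as np

ACRES_PER_PIXEL = 0.22                          # assumed ~30 m pixel footprint

land_cover = np.array([[1, 1, 2], [2, 3, 3]])   # class codes per pixel
tracts     = np.array([[7, 7, 7], [8, 8, 8]])   # tract IDs per pixel

pairs, counts = np.unique(
    np.stack([tracts.ravel(), land_cover.ravel()]), axis=1, return_counts=True)
for (tract, cls), n in zip(pairs.T, counts):
    print(f"tract {tract}: class {cls} -> {n * ACRES_PER_PIXEL:.2f} acres")
```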
2016-09-01
BEHAVIORAL MODELING OF SYSTEM- AND SOFTWARE-ARCHITECTURE SPECIFICATIONS TO INFORM RESOURCING DECISIONS, by Monica F. Farah-Stapleton. The views expressed in this thesis are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government.
A National Medical Information System for Senegal: Architecture and Services.
Camara, Gaoussou; Diallo, Al Hassim; Lo, Moussa; Tendeng, Jacques-Noël; Lo, Seynabou
2016-01-01
In Senegal, large amounts of data are generated daily by medical activities such as consultation, hospitalization, blood tests, x-rays, births, deaths, etc. These data are still recorded in registers, printed images, audio and video recordings, which are processed manually. However, some medical organizations have their own software for non-standardized patient record management, appointments, wages, etc., without any possibility of sharing these data or communicating with other medical structures. This leads to many limitations in reusing or sharing these data because of their possible structural and semantic heterogeneity. To overcome these problems we have proposed a National Medical Information System for Senegal (SIMENS). As an integrated platform, SIMENS provides an EHR system that supports healthcare activities, a mobile version, and a web portal. The SIMENS architecture also proposes data and application integration services for supporting interoperability and decision making.
Patterns-Based IS Change Management in SMEs
NASA Astrophysics Data System (ADS)
Makna, Janis; Kirikova, Marite
The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes an approach, a method, and a prototype designed especially for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. This set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were derived from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. Use of the prototype requires only basic knowledge of organizational business processes and information management.
75 FR 68806 - Statement of Organization, Functions and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-09
... Agency business applications architectures, the engineering of business processes, the building and... architecture, engineers technology for business processes, builds, deploys, maintains and manages enterprise systems and data collections efforts; (5) applies business applications architecture to process specific...
NREL's Buildings Research Honored by Architecture Magazine
NREL's Buildings Research Honored by Architecture Magazine For more information contact: Kerry Masson, (303) 275-4083 Golden, Colo., January 15, 1997. Architecture magazine's Progressive Architecture
Partially Decentralized Control Architectures for Satellite Formations
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Bauer, Frank H.
2002-01-01
In a partially decentralized control architecture, more than one but less than all nodes have supervisory capability. This paper describes an approach to choosing the number of supervisors in such an architecture, based on a reliability vs. cost trade. It also considers the implications of these results for the design of navigation systems for satellite formations that could be controlled with a partially decentralized architecture. Using an assumed cost model, analytic and simulation-based results indicate that it may be cheaper to achieve a given overall system reliability with a partially decentralized architecture containing only a few supervisors than with either fully decentralized or purely centralized architectures. Nominally, the subset of supervisors may act as centralized estimation and control nodes for corresponding subsets of the remaining subordinate nodes, and as decentralized estimation and control peers with respect to each other. However, in the context of partially decentralized satellite formation control, the absolute positions and velocities of each spacecraft are unique, so that correlations which make estimates using only local information suboptimal occur only through common biases and process noise. Covariance and Monte Carlo analyses of a simplified system show that this lack of correlation may allow simplification of the local estimators while preserving the global optimality of the maneuvers commanded by the supervisors.
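A toy Monte Carlo version of the reliability-versus-cost trade reads as follows (our own assumed failure probabilities and cost parameters, not the paper's cost model): the formation is taken to survive if at least one of its k supervisors survives, and supervisors cost more than subordinates.

```python
# Hedged sketch: sweep the number of supervisors k and estimate system
# reliability (>= 1 supervisor alive) against a simple linear cost model.
import numpy as np

rng = np.random.default_rng(3)
N_NODES, P_FAIL = 10, 0.05
COST_SUPERVISOR, COST_SUBORDINATE = 5.0, 1.0
TRIALS = 100_000

for k in range(1, N_NODES + 1):
    sup_fail = rng.random((TRIALS, k)) < P_FAIL          # independent failures
    reliability = 1.0 - sup_fail.all(axis=1).mean()      # at least one survives
    cost = k * COST_SUPERVISOR + (N_NODES - k) * COST_SUBORDINATE
    print(f"k={k:2d}  reliability~{reliability:.4f}  cost={cost:.0f}")
```

Even this crude model shows the qualitative effect the abstract reports: reliability saturates quickly in k, so a few supervisors may buy most of the benefit at a fraction of the cost of full decentralization.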
Software synthesis using generic architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay
1993-01-01
A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing that architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. The approach is illustrated with an implemented example of a generic tracking architecture that was customized in two different domains. How the designs produced using KASE compare to the original designs of the two systems is discussed, along with current work and plans for extending KASE to other application areas.
USDA-ARS?s Scientific Manuscript database
Scientific data integration and computational service discovery are challenges for the bioinformatic community. This process is made more difficult by the separate and independent construction of biological databases, which makes the exchange of scientific data between information resources difficu...
Collaborative Annotation System Environment (CASE) for Online Learning
ERIC Educational Resources Information Center
Glover, Ian; Hardaker, Glenn; Xu, Zhijie
2004-01-01
This paper outlines the design and development process of an online annotation system and how it is applied to the sphere of collaborative online learning. The architecture and design of the annotation system, illustrated in this paper, have been developed to enrich collaborative learning content through adding a layer of information in online…
Neural architecture underlying classification of face perception paradigms.
Laird, Angela R; Riedel, Michael C; Sutherland, Matthew T; Eickhoff, Simon B; Ray, Kimberly L; Uecker, Angela M; Fox, P Mickle; Turner, Jessica A; Fox, Peter T
2015-10-01
We present a novel strategy for deriving a classification system of functional neuroimaging paradigms that relies on hierarchical clustering of experiments archived in the BrainMap database. The goal of our proof-of-concept application was to examine the underlying neural architecture of the face perception literature from a meta-analytic perspective, as these studies include a wide range of tasks. Task-based results exhibiting similar activation patterns were grouped as similar, while tasks activating different brain networks were classified as functionally distinct. We identified four sub-classes of face tasks: (1) Visuospatial Attention and Visuomotor Coordination to Faces, (2) Perception and Recognition of Faces, (3) Social Processing and Episodic Recall of Faces, and (4) Face Naming and Lexical Retrieval. Interpretation of these sub-classes supports an extension of a well-known model of face perception to include a core system for visual analysis and extended systems for personal information, emotion, and salience processing. Overall, these results demonstrate that a large-scale data mining approach can inform the evolution of theoretical cognitive models by probing the range of behavioral manipulations across experimental tasks. Copyright © 2015 Elsevier Inc. All rights reserved.
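A minimal sketch of this clustering strategy on synthetic data (our illustration in Python/SciPy; the BrainMap pipeline is far richer) treats each experiment as a binary activation vector over brain regions and groups experiments by pattern similarity:

```python
# Hedged sketch: hierarchical clustering of experiments by activation pattern.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)
# 12 synthetic experiments x 20 regions; two planted "task families".
base_a, base_b = rng.random(20) < 0.3, rng.random(20) < 0.3
experiments = np.array(
    [base_a ^ (rng.random(20) < 0.05) for _ in range(6)]
    + [base_b ^ (rng.random(20) < 0.05) for _ in range(6)], dtype=float)

Z = linkage(experiments, method="average", metric="jaccard")
print(fcluster(Z, t=2, criterion="maxclust"))   # recovered sub-classes
```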
NASA Astrophysics Data System (ADS)
Xu, Boyi; Xu, Li Da; Fei, Xiang; Jiang, Lihong; Cai, Hongming; Wang, Shuai
2017-08-01
Facing rapidly changing business environments, implementation of flexible business processes is crucial, but difficult, especially in data-intensive application areas. This study aims to provide scalable and easily accessible information resources to leverage business process management. In this article, with a resource-oriented approach, enterprise data resources are represented as data-centric Web services, grouped on demand of business requirements and configured dynamically to adapt to changing business processes. First, a configurable architecture, CIRPA, involving an information resource pool is proposed to act as a scalable and dynamic platform for virtualising enterprise information resources as data-centric Web services. By exposing data-centric resources as REST services at larger granularities, tenant-isolated information resources can be accessed during business process execution. Second, a dynamic information resource pool is designed to fulfil configurable and on-demand data access in business process execution. CIRPA also isolates transaction data from business processes while supporting the composition of diverse business processes. Finally, a case study of using our method in a logistics application shows that CIRPA provides enhanced performance in both static service encapsulation and dynamic service execution in a cloud computing environment.
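As a rough sketch of the information-resource-pool idea (hypothetical names of our own, not CIRPA's actual interfaces), business process steps bind to resource URIs rather than to concrete data stores:

```python
# Hedged sketch: a pool of named, data-centric resources that process steps
# look up by URI, decoupling processes from underlying data sources.
class ResourcePool:
    def __init__(self):
        self.resources = {}
    def register(self, uri, fetch_fn):
        self.resources[uri] = fetch_fn          # expose data as a REST-like resource
    def get(self, uri, **params):
        return self.resources[uri](**params)

pool = ResourcePool()
pool.register("/orders", lambda customer: [{"id": 1, "customer": customer}])
# A business process step binds to the URI, not to the underlying store:
print(pool.get("/orders", customer="acme"))
```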
Agent-oriented privacy-based information brokering architecture for healthcare environments.
Masaud-Wahaishi, Abdulmutalib; Ghenniwa, Hamada
2009-01-01
The healthcare industry is facing a major reform at all levels: locally, regionally, nationally, and internationally. Healthcare services and systems have become very complex and comprise a vast number of components (software systems, doctors, patients, etc.) characterized by shared, distributed and heterogeneous information sources with a variety of clinical and other settings. The challenge now faced in decision making and care management is to operate effectively in order to meet the information needs of healthcare personnel. Currently, researchers, developers, and systems engineers are working toward achieving better efficiency and quality of service in various sectors of healthcare, such as hospital management, patient care, and treatment. This paper presents a novel information brokering architecture that supports privacy-based information gathering in healthcare. Architecturally, the brokering is viewed as a layer of services where a brokering service is modeled as an agent with a specific architecture and interaction protocol appropriate to serve various requests. Within the context of brokering, we model privacy in terms of an entity's ability to hide or reveal information related to its identities, requests, and/or capabilities. A prototype of the proposed architecture has been implemented to support information-gathering capabilities in healthcare environments using the FIPA-compliant platform JADE.
Integrative Potential of Architectural Activities
NASA Astrophysics Data System (ADS)
Davydova, O. V.
2017-11-01
The integrative potential of architectural activity is considered through the combination and organization of its necessary components: universal human and professional, artificial and natural, social and individual architectural activities in a multidimensional unity. These components reflect and influence public thinking through the artistic-figurative language of international communication, using experimental form-building, interactive presentations, theatrical and gaming expressiveness, and methods of design and advertising to establish easier contact with the consumer. The methodology is used to reflect the mutual influence of personal and social problems through globalization and the identification of these problems in public life, to study the existing methods of solving them, to analyze their effectiveness, and to search for topical problems and new solutions using the latest achievements of technological progress and artistic patterns, creating a holistic architectural image that reflects the author's worldview in the general picture of the modern world with its inherent tendencies of “Surah” and “entertainment”. Operative means of communication are developed in the chain social experience - teacher - trainee - new educational result, used to transmit updated information in a generalized form and to conduct current and final control through the use of feedback sheets, supporting summaries, and info cards. The paper also considers the efficient use of study time through the organization of research activity, which allows students to obtain generalized theoretical information (the creator's limitation) in the process of filling in or compiling informative and diagnostic maps; these provide the theoretical framework for creative activity through gaming activity that turns into work activity with a diagnosable result.
Bønes, Erlend; Hasvold, Per; Henriksen, Eva; Strandenaes, Thomas
2007-09-01
Instant messaging (IM) is suited for immediate communication because messages are delivered almost in real time. Results from studies of IM use in enterprise work settings suggest that IM-based services may prove useful within the healthcare sector as well. However, today's public instant messaging services do not have the level of information security required for the adoption of IM in healthcare. We proposed MedIMob, our own architecture for a secure enterprise IM service for use in healthcare. MedIMob supports IM clients on mobile devices in addition to desktop-based clients. Security threats were identified in a risk analysis of the MedIMob architecture. The risk analysis process consists of context identification, threat identification, analysis of consequences and likelihood, risk evaluation, and proposals for risk treatment. The risk analysis revealed a number of potential threats to the information security of a service like this. Many of the identified threats are general when dealing with mobile devices and sensitive data; others are more specific to our service and architecture. Individual threats identified in the risk analysis are discussed and possible countermeasures presented. The risk analysis showed that most of the proposed risk treatment measures must be implemented to obtain an acceptable risk level, among them blocking much of the additional functionality of the smartphone. To conclude on the usefulness of this IM service, it will be evaluated in a trial study of the human-computer interaction. Further work also includes an improved design of the proposed MedIMob architecture. 2006 Elsevier Ireland Ltd
Architectural Heritage Visualization Using Interactive Technologies
NASA Astrophysics Data System (ADS)
Albourae, A. T.; Armenakis, C.; Kyan, M.
2017-08-01
With increased exposure to tourists, historical monuments are at an ever-growing risk of disappearing. Building Information Modelling (BIM) offers a process for digitally documenting all the features that are made or incorporated into a building over its life-span, and thus affords unique opportunities for information preservation. BIMs of historical buildings are called Historical Building Information Models (HBIM). This involves documenting a building in detail throughout its history. Geomatics professionals have the potential to play a major role in this area, as they are often the first professionals involved on construction development sites for many Architectural, Engineering, and Construction (AEC) projects. In this work, we discuss how to establish an architectural database of a heritage site, digitally reconstruct and preserve it, and then interact with it through an immersive environment that leverages BIM for exploring historic buildings. The reconstructed heritage site under investigation was constructed in the early 15th century. In our proposed approach, the site selection was based on factors such as architectural value, size, and accessibility. The 3D model is extracted from the original collected and integrated data (image-based, range-based, CAD modelling, and land survey methods), after which the elements of the 3D objects are identified by creating a database using the BIM software platform (Autodesk Revit). The use of modern and widely accessible game-engine technology (Unity3D) is explored, allowing the user to fully embed in and interact with the scene using handheld devices. The details of implementing an integrated pipeline between HBIM, GIS and augmented and virtual reality (AVR) tools, and the findings of the work, are presented.
NASA Technical Reports Server (NTRS)
Wakim, Nagi T.; Srivastava, Sadanand; Bousaidi, Mehdi; Goh, Gin-Hua
1995-01-01
Agent-based technologies answer to several challenges posed by additional information processing requirements in today's computing environments. In particular, (1) users desire interaction with computing devices in a mode similar to that used between people, (2) the efficiency and successful completion of information processing tasks often require a high level of expertise in complex and multiple domains, and (3) information processing tasks often require handling of large volumes of data and, therefore, continuous and endless processing activities. The concept of an agent is an attempt to address these new challenges by introducing information processing environments in which (1) users can communicate with a system in a natural way, (2) an agent is a specialist and a self-learner and, therefore, qualifies to be trusted to perform tasks independent of the human user, and (3) an agent is an entity that is continuously active, performing tasks that are either delegated to it or self-imposed. The work described in this paper focuses on the development of an interface agent for users of a complex information processing environment (IPE). This activity is part of an ongoing effort to build a model for developing agent-based information systems. Such systems will be highly applicable to environments which require a high degree of automation, such as flight control operations and/or processing of large volumes of data in complex domains such as the EOSDIS environment and other multidisciplinary, scientific data systems. The concept of an agent as an information processing entity is fully described, with emphasis on characteristics of special interest to the User-System Interface Agent (USIA). Issues such as agent 'existence' and 'qualification' are discussed in this paper. Based on a definition of an agent and its main characteristics, we propose an architecture for the development of interface agents for users of an agent-oriented IPE whose resources are likely to be distributed and heterogeneous in nature. The architecture of USIA is outlined in two main components: (1) the user interface, which is concerned with issues such as user dialog and interaction, user modeling, and adaptation to the user profile, and (2) the system interface part, which deals with identification of IPE capabilities, task understanding and feasibility assessment, and task delegation and coordination of assistant agents.
Information Architecture as Reflected in Classrooms.
ERIC Educational Resources Information Center
Zhang, Xiangmin; Strand, Linda; Fisher, Nancy; Kneip, Jason; Ayoub, Olga
2002-01-01
Explores information architecture curricula at North American universities based on an analysis of 40 course descriptions available on the Web. Academic disciplines related to IA education include library and information science, information technology, business administration, literature, arts, and design as well as continuing education programs.…
Touring by Design: Using Information Architecture To Create a Virtual Library Tour.
ERIC Educational Resources Information Center
Kittelson, Pat; Jones, Sarah
2002-01-01
Describes the development of a Web-based virtual tour of the University of Otago (New Zealand) science library. Highlights include information literacy learning outcomes; information architecture, including information organization and navigation; integrating the tour into course work; and evaluation results. (LRW)
Evaluation of an Atmosphere Revitalization Subsystem for Deep Space Exploration Missions
NASA Technical Reports Server (NTRS)
Perry, Jay L.; Abney, Morgan B.; Conrad, Ruth E.; Frederick, Kenneth R.; Greenwood, Zachary W.; Kayatin, Matthew J.; Knox, James C.; Newton, Robert L.; Parrish, Keith J.; Takada, Kevin C.;
2015-01-01
An Atmosphere Revitalization Subsystem (ARS) suitable for deployment aboard deep space exploration mission vehicles has been developed and functionally demonstrated. This modified ARS process design architecture was derived from the International Space Station's (ISS) basic ARS. Primary functions considered in the architecture include trace contaminant control, carbon dioxide removal, carbon dioxide reduction, and oxygen generation. Candidate environmental monitoring instruments were also evaluated. The process architecture rearranges unit operations and employs equipment operational changes to reduce mass, simplify, and improve the functional performance for trace contaminant control, carbon dioxide removal, and oxygen generation. Results from integrated functional demonstration are summarized and compared to the performance observed during previous testing conducted on an ISS-like subsystem architecture and a similarly evolved process architecture. Considerations for further subsystem architecture and process technology development are discussed.
Selecting a Benchmark Suite to Profile High-Performance Computing (HPC) Machines
2014-11-01
architectures. Machines now contain central processing units (CPUs), graphics processing units (GPUs), and many integrated core (MIC) architectures all...evaluate the feasibility and applicability of a new architecture just released to the market. Researchers are often unsure how available resources will...architectures. Having a suite of programs running on different architectures, such as GPUs, MICs, and CPUs, adds complexity and technical challenges
Fault-tolerant onboard digital information switching and routing for communications satellites
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo; Quintana, Jorge A.; Soni, Nitin J.; Kim, Heechul
1993-01-01
The NASA Lewis Research Center is developing an information-switching processor for future meshed very-small-aperture terminal (VSAT) communications satellites. The information-switching processor will switch and route baseband user data onboard the VSAT satellite to connect thousands of Earth terminals. Fault tolerance is a critical issue in developing information-switching processor circuitry that will provide and maintain reliable communications services. In parallel with the conceptual development of the meshed VSAT satellite network architecture, NASA designed and built a simple test bed for developing and demonstrating baseband switch architectures and fault-tolerance techniques. The meshed VSAT architecture and the switching demonstration test bed are described, and the initial switching architecture and the fault-tolerance techniques that were developed and tested are discussed.
How predictive quantitative modelling of tissue organisation can inform liver disease pathogenesis.
Drasdo, Dirk; Hoehme, Stefan; Hengstler, Jan G
2014-10-01
From the more than 100 liver diseases described, many of those with high incidence rates manifest themselves through histopathological changes, such as hepatitis, alcoholic liver disease, fatty liver disease, fibrosis and, in its later stages, cirrhosis, hepatocellular carcinoma, primary biliary cirrhosis and other disorders. Studies of disease pathogenesis are largely based on integrating -omics data pooled from cells at different locations with spatial information from stained liver structures in animal models. Even though this has led to significant insights, the complexity of the interactions as well as the involvement of processes at many different time and length scales constrains the possibility of condensing disease processes in illustrations, schemes and tables. The combination of modern imaging modalities with image processing and analysis, and mathematical models, opens up a promising new approach towards a quantitative understanding of pathologies and of disease processes. This strategy is discussed for two examples: ammonia metabolism after drug-induced acute liver damage, and the recovery of liver mass as well as architecture during the subsequent regeneration process. This interdisciplinary approach permits the integration of biological mechanisms and models of processes contributing to disease progression at various scales into mathematical models. These can be used to perform in silico simulations to help unravel the relation between architecture and function, as illustrated below for liver regeneration, and to bridge from the in vitro situation and animal models to humans. In the near future, novel mechanisms will usually not be directly elucidated by modelling. However, models will falsify hypotheses and guide towards the most informative experimental design. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.
Roads towards fault-tolerant universal quantum computation
NASA Astrophysics Data System (ADS)
Campbell, Earl T.; Terhal, Barbara M.; Vuillot, Christophe
2017-09-01
A practical quantum computer must not merely store information, but also process it. To prevent errors introduced by noise from multiplying and spreading, a fault-tolerant computational architecture is required. Current experiments are taking the first steps toward noise-resilient logical qubits. But to convert these quantum devices from memories to processors, it is necessary to specify how a universal set of gates is performed on them. The leading proposals for doing so, such as magic-state distillation and colour-code techniques, have high resource demands. Alternative schemes, such as those that use high-dimensional quantum codes in a modular architecture, have potential benefits, but need to be explored further.
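For contrast, the storage half of the problem can be illustrated with a classical toy, a 3-bit repetition code with majority voting (an analogy only: real quantum codes must also handle phase errors, and, as the abstract stresses, the genuinely hard part is applying a universal gate set to encoded information):

```python
# Toy classical analogy: a 3-bit repetition code suppresses the error rate
# of stored bits from p to roughly 3p^2, the storage side of fault tolerance.
import numpy as np

rng = np.random.default_rng(5)
P_FLIP, TRIALS = 0.05, 100_000

logical = rng.integers(0, 2, TRIALS)
codewords = np.repeat(logical[:, None], 3, axis=1)          # encode: b -> bbb
noisy = codewords ^ (rng.random(codewords.shape) < P_FLIP)  # independent bit flips
decoded = (noisy.sum(axis=1) >= 2).astype(int)              # majority vote

print("raw error rate:    ", P_FLIP)
print("logical error rate:", np.mean(decoded != logical))   # ~3p^2 - 2p^3
```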
Feeding People's Curiosity: Leveraging the Cloud for Automatic Dissemination of Mars Images
NASA Technical Reports Server (NTRS)
Knight, David; Powell, Mark
2013-01-01
Smartphones and tablets have made wireless computing ubiquitous, and users expect instant, on-demand access to information. The Mars Science Laboratory (MSL) operations software suite, MSL InterfaCE (MSLICE), employs a different back-end image processing architecture compared to that of the Mars Exploration Rovers (MER) in order to better satisfy modern consumer-driven usage patterns and to offer greater server-side flexibility. Cloud services are a centerpiece of the server-side architecture that allows new image data to be delivered automatically to both scientists using MSLICE and the general public through the MSL website (http://mars.jpl.nasa.gov/msl/).
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang (Editor); Schenker, Paul (Editor)
1987-01-01
The papers presented in this volume provide an overview of current research in both optical and digital pattern recognition, with a theme of identifying overlapping research problems and methodologies. Topics discussed include image analysis and low-level vision, optical system design, object analysis and recognition, real-time hybrid architectures and algorithms, high-level image understanding, and optical matched filter design. Papers are presented on synthetic estimation filters for a control system; white-light correlator character recognition; optical AI architectures for intelligent sensors; interpreting aerial photographs by segmentation and search; and optical information processing using a new photopolymer.
Simulating motivated cognition
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
A research effort to develop a sophisticated computer model of human behavior is described. A computer framework for motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach, from the perspective of information processing in the human brain, is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.
An Autonomous Sensor System Architecture for Active Flow and Noise Control Feedback
NASA Technical Reports Server (NTRS)
Humphreys, William M, Jr.; Culliton, William G.
2008-01-01
Multi-channel sensor fusion represents a powerful technique to simply and efficiently extract information from complex phenomena. While the technique has traditionally been used for military target tracking and situational awareness, a study has been successfully completed that demonstrates that sensor fusion can be applied equally well to aerodynamic applications. A prototype autonomous hardware processor was successfully designed and used to detect in real-time the two-dimensional flow reattachment location generated by a simple separated-flow wind tunnel model. The success of this demonstration illustrates the feasibility of using autonomous sensor processing architectures to enhance flow control feedback signal generation.
NASA Astrophysics Data System (ADS)
Roh, Won B.
Computational systems based on photonic technologies are projected to offer order-of-magnitude improvements in processing speed, due to their intrinsic architectural parallelism and ultrahigh switching speeds; these architectures also minimize connectors, thereby enhancing reliability, and preclude EMP vulnerability. The use of optoelectronic ICs would also extend weapons capabilities in such areas as automated target recognition, system-state monitoring, and detection avoidance. Fiber-optic technologies have an information-carrying capacity fully five orders of magnitude greater than copper-wire-based systems; energy loss in transmission is two orders of magnitude lower, and error rates are one order of magnitude lower. Attention is being given to ZrF glasses for optical fibers with unprecedentedly low scattering levels.
Parallel processing via a dual olfactory pathway in the honeybee.
Brill, Martin F; Rosenbaum, Tobias; Reus, Isabelle; Kleineidam, Christoph J; Nawrot, Martin P; Rössler, Wolfgang
2013-02-06
In their natural environment, animals face complex and highly dynamic olfactory input. Thus vertebrates as well as invertebrates require fast and reliable processing of olfactory information. Parallel processing has been shown to improve processing speed and power in other sensory systems and is characterized by extraction of different stimulus parameters along parallel sensory information streams. Honeybees possess an elaborate olfactory system with unique neuronal architecture: a dual olfactory pathway comprising a medial projection-neuron (PN) antennal lobe (AL) protocerebral output tract (m-APT) and a lateral PN AL output tract (l-APT) connecting the olfactory lobes with higher-order brain centers. We asked whether this neuronal architecture serves parallel processing and employed a novel technique for simultaneous multiunit recordings from both tracts. The results revealed response profiles from a high number of PNs of both tracts to floral, pheromonal, and biologically relevant odor mixtures tested over multiple trials. PNs from both tracts responded to all tested odors, but with different characteristics indicating parallel processing of similar odors. Both PN tracts were activated by widely overlapping response profiles, which is a requirement for parallel processing. The l-APT PNs had broad response profiles suggesting generalized coding properties, whereas the responses of m-APT PNs were comparatively weaker and less frequent, indicating higher odor specificity. Comparison of response latencies within and across tracts revealed odor-dependent latencies. We suggest that parallel processing via the honeybee dual olfactory pathway provides enhanced odor processing capabilities serving sophisticated odor perception and olfactory demands associated with a complex olfactory world of this social insect.
Performance measurement integrated information framework in e-Manufacturing
NASA Astrophysics Data System (ADS)
Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José
2014-11-01
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied to decision support in an e-Manufacturing environment. Its application improves the interoperability needed in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform, and a PM Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance demonstrates the utility of the model.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web-based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication among users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
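The principle of making metrics collection an implicit by-product of the workflow can be sketched briefly (a hypothetical tracker of our own, not WISE's implementation): because every change request already passes through one system, counts and cycle times fall out of normal use for free.

```python
# Hedged sketch: metrics derived implicitly from routine change tracking.
from collections import Counter
from datetime import date

class Tracker:
    def __init__(self):
        self.events = []                       # (issue_id, status, day)
    def record(self, issue_id, status, day):
        self.events.append((issue_id, status, day))
    def metrics(self):
        opened = {i: d for i, s, d in self.events if s == "open"}
        closed = {i: d for i, s, d in self.events if s == "closed"}
        cycle = [(closed[i] - opened[i]).days for i in closed if i in opened]
        return {"status_counts": Counter(s for _, s, _ in self.events),
                "mean_cycle_days": sum(cycle) / len(cycle) if cycle else None}

t = Tracker()
t.record("CR-1", "open", date(2024, 1, 2))
t.record("CR-1", "closed", date(2024, 1, 9))
print(t.metrics())
```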
Design of SIP transformation server for efficient media negotiation
NASA Astrophysics Data System (ADS)
Pack, Sangheon; Paik, Eun Kyoung; Choi, Yanghee
2001-07-01
Voice over IP (VoIP) is one of the advanced services supported by next-generation mobile communication. VoIP should support various media formats and terminals existing together. This heterogeneous environment may prevent diverse users from establishing VoIP sessions among themselves. To solve this problem, an efficient media negotiation mechanism is required. In this paper, we propose an efficient media negotiation architecture using a transformation server and an Intelligent Location Server (ILS). The transformation server is an extended Session Initiation Protocol (SIP) proxy server. It can modify an unacceptable session INVITE message into an acceptable one using the ILS. The ILS is a directory server based on the Lightweight Directory Access Protocol (LDAP) that keeps users' location information and available media information. The proposed architecture can eliminate the unnecessary response and re-INVITE messages of the standard SIP architecture. It takes only 1.5 round-trip times to negotiate two different media types, while the standard media negotiation mechanism takes 2.5 round-trip times. The extra processing time in message handling is negligible in comparison to the reduced round-trip time. The experimental results show that the session setup time in the proposed architecture is less than the setup time in standard SIP. These results verify that the proposed media negotiation mechanism is more efficient in solving diversity problems.
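The transformation step can be sketched as a codec-set intersection (hypothetical ILS records and function names of our own; the real system rewrites SIP/SDP messages):

```python
# Hedged sketch: the proxy consults stored capability records and rewrites an
# unacceptable codec offer before forwarding the INVITE.
ILS = {  # hypothetical Intelligent Location Server records
    "alice@example.com": {"codecs": ["PCMU", "G729"]},
    "bob@example.com":   {"codecs": ["G729", "AMR"]},
}

def transform_offer(offer_codecs, callee):
    """Return an offer the callee can accept; fall back to transcoding."""
    supported = ILS[callee]["codecs"]
    common = [c for c in offer_codecs if c in supported]
    return common if common else supported[:1]

print(transform_offer(["PCMU", "G729"], "bob@example.com"))  # -> ['G729']
```

Because the rewritten INVITE is already acceptable, the callee's rejection and the caller's re-INVITE disappear, which is where the saved round trip comes from.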
García-Cabezas, Miguel Ángel; Barbas, Helen
2018-01-01
Noninvasive imaging and tractography methods have yielded information on broad communication networks but lack resolution to delineate intralaminar cortical and subcortical pathways in humans. An important unanswered question is whether we can use the wealth of precise information on pathways from monkeys to understand connections in humans. We addressed this question within a theoretical framework of systematic cortical variation and used identical high-resolution methods to compare the architecture of cortical gray matter and the white matter beneath, which gives rise to short- and long-distance pathways in humans and rhesus monkeys. We used the prefrontal cortex as a model system because of its key role in attention, emotions, and executive function, which are processes often affected in brain diseases. We found striking parallels and consistent trends in the gray and white matter architecture in humans and monkeys and between the architecture and actual connections mapped with neural tracers in rhesus monkeys and, by extension, in humans. Using the novel architectonic portrait as a base, we found significant changes in pathways between nearby prefrontal and distant areas in autism. Our findings reveal that a theoretical framework allows study of normal neural communication in humans at high resolution and specific disruptions in diverse psychiatric and neurodegenerative diseases. PMID:29401206
Plug-and-Play Environmental Monitoring Spacecraft Subsystem
NASA Technical Reports Server (NTRS)
Patel, Jagdish; Brinza, David E.; Tran, Tuan A.; Blaes, Brent R.
2011-01-01
A Space Environment Monitor (SEM) subsystem architecture has been developed and demonstrated that can benefit future spacecraft by providing (1) real-time knowledge of the spacecraft state in terms of exposure to the environment; (2) critical, instantaneous information for anomaly resolution; and (3) invaluable environmental data for designing future missions. The SEM architecture consists of a network of plug-and- play (PnP) Sensor Interface Units (SIUs), each servicing one or more environmental sensors. The SEM architecture is influenced by the IEEE Smart Transducer Interface Bus standard (IEEE Std 1451) for its PnP functionality. A network of PnP Spacecraft SIUs is enabling technology for gathering continuous real-time information critical to validating spacecraft health in harsh space environments. The demonstrated system that provided a proof-of-concept of the SEM architecture consisted of three SIUs for measurement of total ionizing dose (TID) and single event upset (SEU) radiation effects, electromagnetic interference (EMI), and deep dielectric charging through use of a prototype Internal Electro-Static Discharge Monitor (IESDM). Each SIU consists of two stacked 2X2 in. (approximately 5X5 cm) circuit boards: a Bus Interface Unit (BIU) board that provides data conversion, processing and connection to the SEM power-and-data bus, and a Sensor Interface Electronics (SIE) board that provides sensor interface needs and data path connection to the BIU.
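The IEEE 1451-style plug-and-play registration that the SIUs borrow can be sketched as follows (our simplified illustration with hypothetical names, not the flight design): each unit self-describes with a TEDS-like record, so the bus controller can enumerate sensors without prior configuration.

```python
# Hedged sketch of plug-and-play sensor registration on a shared bus.
from dataclasses import dataclass

@dataclass
class Teds:                 # Transducer Electronic Data Sheet (simplified)
    sensor_id: str
    quantity: str
    units: str

class SensorBus:
    def __init__(self):
        self.units = {}
    def plug_in(self, teds: Teds, read_fn):
        """Plug-and-play: the self-describing record is all the bus needs."""
        self.units[teds.sensor_id] = (teds, read_fn)
    def poll(self):
        return {sid: (t.quantity, t.units, read())
                for sid, (t, read) in self.units.items()}

bus = SensorBus()
bus.plug_in(Teds("tid-1", "total ionizing dose", "rad"), lambda: 0.13)
print(bus.poll())
```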
An information model for a virtual private optical network (OVPN) using virtual routers (VRs)
NASA Astrophysics Data System (ADS)
Vo, Viet Minh Nhat
2002-05-01
This paper describes a virtual private optical network architecture (Optical VPN - OVPN) based on virtual routers (VRs). It improves over architectures suggested for virtual private networks by using virtual routers within optical networks. What is new in this architecture are the changes necessary to adapt to the devices and protocols used in optical networks. This paper also presents information models for the OVPN, at the architecture level and at the service level. These are extensions of the DEN (directory enabled network) and CIM (Common Information Model) approaches for OVPNs using VRs. The goal is to propose a common management model using policies.
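A toy rendering of this kind of class extension is shown below; the classes and attributes are illustrative guesses of mine, not the paper's actual DEN/CIM schema.

```python
# Hypothetical sketch of a CIM/DEN-style information model for an OVPN built
# from virtual routers; names and attributes are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class VirtualRouter:
    name: str
    optical_ports: list = field(default_factory=list)  # e.g. wavelength channels

@dataclass
class OVPN:
    customer: str
    routers: list = field(default_factory=list)
    policies: dict = field(default_factory=dict)        # policy-based management

vpn = OVPN("acme",
           [VirtualRouter("vr-1", ["lambda-1550.12nm"])],
           {"restoration": "1+1 protection"})
print(vpn.routers[0].optical_ports)
```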
NASA Technical Reports Server (NTRS)
Jones, J. R.; Bodenheimer, R. E.
1976-01-01
A simple programmable Tse processor organization and arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Kacpura, Thomas J.
2004-01-01
The NASA Glenn Research Center is investigating the development and suitability of a software-based open-architecture for space-based reconfigurable transceivers (RTs) and software-defined radios (SDRs). The main objectives of this project are to enable advanced operations and reduce mission costs. SDRs are becoming more common because of the capabilities of reconfigurable digital signal processing technologies such as field programmable gate arrays and digital signal processors, which place radio functions in firmware and software that were traditionally performed with analog hardware components. Features of interest of this communications architecture include nonproprietary open standards and application programming interfaces to enable software reuse and portability, independent hardware and software development, and hardware and software functional separation. The goals for RT and SDR technologies for NASA space missions include prelaunch and on-orbit frequency and waveform reconfigurability and programmability, high data rate capability, and overall communications and processing flexibility. These operational advances over current state-of-the-art transceivers will be provided to reduce the power, mass, and cost of RTs and SDRs for space communications. The open architecture for NASA communications will support existing (legacy) communications needs and capabilities while providing a path to more capable, advanced waveform development and mission concepts (e.g., ad hoc constellations with self-healing networks and high-rate science data return). A study was completed to assess the state of the art in RT architectures, implementations, and technologies. In-house researchers conducted literature searches and analysis, interviewed Government and industry contacts, and solicited information and white papers from industry on space-qualifiable RTs and SDRs and their associated technologies for space-based NASA applications. The white papers were evaluated, compiled, and used to assess RT and SDR system architectures and core technology elements to determine an appropriate investment strategy to advance these technologies to meet future mission needs. The use of these radios in the space environment represents a challenge because of the space radiation suitability of the components, which drastically reduces the processing capability. The radios available for space are considered to be RTs (as opposed to SDRs), which are digitally programmable radios with selectable changes from an architecture combining analog and digital components. The limited flexibility of this design contrasts with the desire for a power-efficient solution and an open architecture.
Open architecture for health care systems: the European RICHE experience.
Frandji, B
1997-01-01
Groupe RICHE is bringing the Open Systems approach to the health IT market, allowing a new generation of health information systems to arise, with benefits for patients, health care professionals, hospital managers, agencies and citizens. Groupe RICHE is a forum for exchanging information and expertise around open systems in health care. It is open to any organisation interested in open systems in health care and wanting to participate in and influence the work done by its user, marketing and technical committees. The Technical Committee is in charge of the maintenance of the architecture and feeds the results of industrial experiences into new releases. Any Groupe RICHE member is entitled to participate in this process. This approach, unique in Europe, allows health care professionals to benefit from applications supporting their business processes, including a cooperative working environment and a shared electronic record, in an integrated system where information is entered only once, customised according to user needs and available to the administrative applications. This allows hospital managers to satisfy their health care professionals, to migrate smoothly from their existing environment (protecting their investment), and to choose products in a competitive environment, being able to mix and match system components and services from different suppliers and free to change suppliers without having to replace their existing system (minimising risk), in line with national and regional strategies. For suppliers, this means being able to commercialise products well fitted to their field of competence in a large market, reducing investment and increasing returns. The RICHE approach also allows agencies to define a strategy, to create a supporting infrastructure, and to organise the market while leaving enough freedom to health care organisations and suppliers. Such an approach is based on the definition of an open standard architecture. The RICHE ESPRIT project defined the three-layered architecture in 1993, with four main components and their services, of which the main principles have recently been adopted by CEN TC251 as a European pre-standard. From these architecture specifications various implementations have been completed, including the IMS DHE, the GESI DHE and the REFERENCE Kernel. However, putting this approach into practice on a large scale is not so easy. Interesting lessons have been learned over the last years in different countries.
Berkowitz, Murray R
2013-01-01
Current information systems for use in detecting bioterrorist attacks lack a consistent, overarching information architecture. An overview of the use of biological agents as weapons during a bioterrorist attack is presented. Proposed are the design, development, and implementation of a medical informatics system to mine pertinent databases, retrieve relevant data, invoke appropriate biostatistical and epidemiological software packages, and automatically analyze these data. The top-level information architecture is presented. Systems requirements and functional specifications for this level are presented. Finally, future studies are identified.
Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System
NASA Technical Reports Server (NTRS)
Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana
2011-01-01
The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-breed endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This is compared to the traditional NASA Earth Science missions where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing well into the 5 terabyte/day and tens of thousands of job range, both of which comprise a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the use of the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.
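To put the stated requirements in perspective, a quick back-of-envelope calculation (assuming steady 24-hour operation, and taking 20,000 jobs/day as one point in the "tens of thousands" range) gives the sustained rates the distributed architecture must absorb:

```python
# Back-of-envelope check of the stated load, under the assumption of steady
# 24-hour operation: 5 TB/day of data and "tens of thousands" of jobs.
TB = 1e12
seconds_per_day = 86_400

sustained_bw = 5 * TB / seconds_per_day          # bytes per second
jobs_per_sec = 20_000 / seconds_per_day          # taking 20k jobs/day as an example

print(f"~{sustained_bw/1e6:.0f} MB/s sustained ingest")   # ~58 MB/s
print(f"~{jobs_per_sec:.2f} job launches per second")     # ~0.23 jobs/s
```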
Acoustics in Architectural Design, an Annotated Bibliography on Architectural Acoustics.
ERIC Educational Resources Information Center
Doelle, Leslie L.
The purpose of this annotated bibliography on architectural acoustics was: (1) to compile a classified bibliography, including most of those publications on architectural acoustics, published in English, French, and German, which can supply a useful and up-to-date source of information for those encountering any architectural-acoustic design…
Signori, Marcos R; Garcia, Renato
2010-01-01
This paper presents a model that aids Clinical Engineering in dealing with risk management in the healthcare technological process. The healthcare technological setting is complex and supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An Enterprise Architecture framework - MODAF (Ministry of Defence Architecture Framework) - was used to model this process for risk management. Thus, a new model was created to contribute to risk management in the HT process, from the Clinical Engineering viewpoint. This architecture model can support and improve Clinical Engineering decision making for risk management in the healthcare technological process.
Localized analysis of paint-coat drying using dynamic speckle interferometry
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel
2018-07-01
Paint-coating is part of several industrial processes, including the automotive industry, architectural coatings, machinery and appliances. These paint-coatings must comply with high quality standards, and for this reason evaluation techniques for paint-coatings are in constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry. This technique allows for the temporal activity evaluation of the paint-coating drying process, providing localized information on drying. This localized information is relevant in order to address drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indicator of the different drying process stages. The experimental results presented were validated using gravimetric drying curves.
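A generic form of such a per-pixel activity measure can be sketched with numpy. Note this uses a plain mean absolute frame difference and a quantile grouping as stand-ins, not the authors' specific temporal-history definition or clustering method.

```python
# Minimal numpy sketch of a localized speckle-activity measure: per-pixel mean
# absolute intensity change across a stack of frames, then a coarse 3-level
# grouping standing in for the clustering step.
import numpy as np

def activity_map(frames):
    """frames: (T, H, W) array of speckle images over drying time."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=0)                       # high where paint is still wet

def stage_labels(act, n_levels=3):
    edges = np.quantile(act, np.linspace(0, 1, n_levels + 1)[1:-1])
    return np.digitize(act, edges)                  # 0 = least active ... 2 = most

rng = np.random.default_rng(0)
frames = rng.random((50, 64, 64))                   # synthetic stand-in data
labels = stage_labels(activity_map(frames))
print(labels.shape, labels.min(), labels.max())     # (64, 64) 0 2
```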
Something old, something new: data warehousing in the digital age
NASA Astrophysics Data System (ADS)
Maguire, Rob; Woolf, Andrew
2015-04-01
The implications of digital transformation for Earth science data managers are significant: big data, internet of things, new sources of third-party observations. This at a time when many are struggling to deal with half a century of legacy data infrastructure since the International Geophysical Year. While data management best practice has evolved over this time, large-scale migration activities are rare, with processes and applications instead built up around a plethora of different technologies and approaches. It is perhaps more important than ever, before embarking on major investments in new technologies, to consider the benefits first of 'catching up' with mature best-practice. Data warehousing, as an architectural formalism, was developed in the 1990s as a response to the growing challenges in corporate environments of assembling, integrating, and quality controlling large amounts of data from multiple sources and for multiple purposes. A layered architecture separates transactional data, integration and staging areas, the warehouse itself, and analytical 'data marts', with optimised ETL (Extract, Transform, Load) processes used to promote data through the layers. The data warehouse, together with associated techniques of 'master data management' and 'business intelligence', provides a classic foundation for 'enterprise information management' ("an integrative discipline for structuring, describing and governing information assets across organizational and technological boundaries to improve efficiency, promote transparency and enable business insight", Gartner). The Australian Bureau of Meteorology, like most Earth-science agencies, maintains a large amount of observation data in a variety of systems and architectures. These data assets evolve over decades, usually for operational, rather than information management, reasons. Consequently there can be inconsistency in architectures and technologies. We describe our experience with two major data assets: the Australian Water Resource Information System (AWRIS) and the Australian Data Archive for Meteorology (ADAM). These maintain the national archive of hydrological and climate data. We are undertaking a migration of AWRIS from a 'software-centric' system to a 'data-centric' warehouse, with significant benefits in performance, scalability, and maintainability. As well, the architecture supports the use of conventional BI tools for product development and visualisation. We have also experimented with a warehouse ETL replacement for custom tsunameter ingest code in ADAM, with considerable success. Our experience suggests that there is benefit to be gained through adoption by science agencies of professional IT best practice that is mature in industry but may have been overlooked by scientific information practitioners. In the case of data warehousing, the practice requires a change of perspective from a focus on code development to a focus on data. It will continue to be relevant in the 'digital age' as vendors increasingly support integrated warehousing and 'big data' platforms.
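The layered promotion the warehouse formalism prescribes (staging area, quality-controlled transform, warehouse load) can be illustrated end to end with a toy ETL pass; the table and column names below are invented for illustration.

```python
# A toy ETL pass through the layered architecture described above -- staging
# table -> cleaned warehouse table -- using sqlite3 so it runs anywhere.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging_obs (station TEXT, temp_c REAL)")
con.execute("CREATE TABLE warehouse_obs (station TEXT, temp_c REAL)")

# Extract: raw observations land in staging, quality problems included.
con.executemany("INSERT INTO staging_obs VALUES (?, ?)",
                [("SYD", 21.4), ("MEL", None), ("PER", 999.0)])

# Transform + Load: promote only records passing quality control.
con.execute("""INSERT INTO warehouse_obs
               SELECT station, temp_c FROM staging_obs
               WHERE temp_c IS NOT NULL AND temp_c BETWEEN -60 AND 60""")

print(con.execute("SELECT * FROM warehouse_obs").fetchall())  # [('SYD', 21.4)]
```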
On the Inevitable Intertwining of Requirements and Architecture
NASA Astrophysics Data System (ADS)
Sutcliffe, Alistair
The chapter investigates the relationship between architecture and requirements, arguing that architectural issues need to be addressed early in the RE process. Three trends are driving architectural implications for RE: the growth of intelligent, context-aware, and adaptable systems. First, the relationship between architecture and requirements is considered from a theoretical viewpoint of problem frames and abstract conceptual models. The relationships between architectural decisions and non-functional requirements are reviewed, and then the impact of architecture on the RE process is assessed using a case study of developing configurable, semi-intelligent software to support medical researchers in e-science domains.
Architecture for Survivable System Processing (ASSP)
NASA Astrophysics Data System (ADS)
Wood, Richard J.
1991-11-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g. the International Organization for Standardization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
A parallel-pipelined architecture for a multi carrier demodulator
NASA Astrophysics Data System (ADS)
Kwatra, S. C.; Jamali, M. M.; Eugene, Linus P.
1991-03-01
Analog devices have been used for processing the information on board the satellites. Presently, digital devices are being used because they are economical and flexible as compared to their analog counterparts. Several schemes of digital transmission can be used depending on the data rate requirement of the user. An economical scheme of transmission for small earth stations uses single channel per carrier/frequency division multiple access (SCPC/FDMA) on the uplink and time division multiplexing (TDM) on the downlink. This is a typical communication service offered to low data rate users in the commercial mass market. These channels usually pertain to either voice or data transmission. An efficient digital demodulator architecture is provided for a large number of low data rate users. A demodulator primarily consists of carrier, clock, and data recovery modules. This design uses principles of parallel processing, pipelining, and time sharing to process large numbers of voice or data channels. It maintains the optimum throughput, which is derived from the designed architecture and from the use of high speed components. The design is optimized for reduced power and area requirements. This is essential for satellite applications. The design is also flexible in processing a group of a varying number of channels. The algorithms that are used are verified by the use of a computer aided software engineering (CASE) tool called the Block Oriented System Simulator. The data flow, control circuitry, and interface of the hardware design are simulated in C language. Also, a multiprocessor approach is provided to map, model, and simulate the demodulation algorithms, mainly from a speed viewpoint. A hypercube-based architecture implementation is provided for such a scheme of operation. The hypercube structure and the demodulation models on hypercubes are simulated in Ada.
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
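The plug-in principle described (new analysis routines added without touching the platform, with calls logged for later reconstruction) is commonly realized with a registry-plus-decorator pattern. The sketch below is a generic illustration with hypothetical names, not Geotool's actual plug-in API.

```python
# Generic plug-in registry sketch: analysis routines register themselves
# against the display platform without the platform knowing them in advance.
PLUGINS = {}

def plugin(name):
    def register(func):
        PLUGINS[name] = func
        return func
    return register

@plugin("demean")
def demean(waveform):
    m = sum(waveform) / len(waveform)
    return [x - m for x in waveform]

# The GUI would dispatch by name and log the call to a metadata file,
# so the processing history can be reconstructed (or altered) later.
history = []
def run(name, data):
    history.append(name)
    return PLUGINS[name](data)

print(run("demean", [1.0, 2.0, 3.0]), history)
```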
Information Architecture and the Comic Arts: Knowledge Structure and Access
ERIC Educational Resources Information Center
Farmer, Lesley S. J.
2015-01-01
This article explains information architecture, focusing on comic arts' features for representing and structuring knowledge. Then it details information design theory and information behaviors relative to this format, also noting visual literacy. Next, applications of comic arts in education are listed. With this background, several research…
1994-09-01
Just as Copernicus brought about a revolutionary paradigm shift in astronomy, the Copernicus Architecture was so named because it represents a... Driving the evolution of JMCIS are DoD's Corporate Information Management (CIM), the Joint Staff's "C4I for the Warrior", and the Navy's Copernicus architecture programs.
Invocation oriented architecture for agile code and agile data
NASA Astrophysics Data System (ADS)
Verma, Dinesh; Chan, Kevin; Leung, Kin; Gkelias, Athanasios
2017-05-01
In order to address the unique requirements of sensor information fusion in a tactical coalition environment, we are proposing a new architecture - one based on the concept of invocations. An invocation is a combination of a software code and a piece of data, both managed using techniques from Information Centric networking. This paper will discuss limitations of current approaches, present the architecture for an invocation oriented architecture, illustrate how it works with an example scenario, and provide reasons for its suitability in a coalition environment.
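A sketch of an "invocation" as the paper defines it, code plus data handled as one named object, with ICN-flavoured content addressing via a hash, is given below. The structure is my illustration of the concept, not the authors' implementation.

```python
# Hypothetical invocation object: code and data bundled and named by content,
# so any cache in the coalition can serve or execute it.
import hashlib

class Invocation:
    def __init__(self, code: bytes, data: bytes):
        self.code, self.data = code, data
        # A content-derived name lets caching work Information-Centric style.
        self.name = hashlib.sha256(code + data).hexdigest()[:16]

    def execute(self):
        env = {}
        exec(self.code, env)            # the code is expected to define fuse(data)
        return env["fuse"](self.data)

inv = Invocation(b"def fuse(d):\n    return len(d)", b"sensor-readings")
print(inv.name, inv.execute())          # deterministic name, result 15
```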
DFT algorithms for bit-serial GaAs array processor architectures
NASA Technical Reports Server (NTRS)
Mcmillan, Gary B.
1988-01-01
Systems and Processes Engineering Corporation (SPEC) has developed an innovative array processor architecture for computing Fourier transforms and other commonly used signal processing algorithms. This architecture is designed to extract the highest possible array performance from state-of-the-art GaAs technology. SPEC's architectural design includes a high performance RISC processor implemented in GaAs, along with a Floating Point Coprocessor and a unique Array Communications Coprocessor, also implemented in GaAs technology. Together, these data processors represent the latest in technology, both from an architectural and implementation viewpoint. SPEC has examined numerous algorithms and parallel processing architectures to determine the optimum array processor architecture. SPEC has developed an array processor architecture with integral communications ability to provide maximum node connectivity. The Array Communications Coprocessor embeds communications operations directly in the core of the processor architecture. A Floating Point Coprocessor architecture has been defined that utilizes Bit-Serial arithmetic units, operating at very high frequency, to perform floating point operations. These Bit-Serial devices reduce the device integration level and complexity to a level compatible with state-of-the-art GaAs device technology.
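For orientation, the computation such an array accelerates is the direct DFT below; each output term is the multiply-accumulate loop that the bit-serial floating-point units would evaluate across nodes. This plain-Python version is purely illustrative of the math, not of SPEC's hardware.

```python
# Direct DFT: X[k] = sum_n x[n] * exp(-2*pi*i*k*n/N), the multiply-accumulate
# kernel an array processor parallelizes across nodes.
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

X = dft([1, 0, 0, 0])
print([round(abs(v), 6) for v in X])    # impulse -> flat spectrum: [1.0, 1.0, 1.0, 1.0]
```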
System Architecture for Anti-Ship Ballistic Missile Defense (ASBMD)
2009-12-01
...this threat. This thesis documents the process that was used to select and integrate the proposed ASBMD architecture.
DOT National Transportation Integrated Search
2002-04-01
The Physical Architecture identifies the physical subsystems and architecture flows between subsystems that will implement the processes and support the data flows of the ITS Logical Architecture. The Physical Architecture further identifies the sys...
The Swedish strategy and method for development of a national healthcare information architecture.
Rosenälv, Jessica; Lundell, Karl-Henrik
2012-01-01
"We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted the National e-Health - the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for development of National Healthcare Information Architecture is to achieve high level semantic interoperability for clinical content and clinical contexts. High level semantic interoperability requires consistently structured clinical data and other types of data with coherent traceability to be mapped to reference clinical models. Archetypes that are formal definitions of the clinical and demographic concepts and some administrative data were developed. Each archetype describes the information structure and content of overarching core clinical concepts. Information that is defined in archetypes should be used for different purposes. Generic clinical process model was made concrete and analyzed. For each decision-making step in the process where information is processed, the amount and type of information and its structure were defined in terms of reference templates. Reference templates manage clinical, administrative and demographic types of information in a specific clinical context. Based on a survey of clinical processes at the reference level, the identification of specific clinical processes such as diabetes and congestive heart failure in adults were made. Process-specific templates were defined by using reference templates and populated with information that was relevant to each health problem in a specific clinical context. Throughout this process, medical data for knowledge management were collected for each health problem. Parallel with the efforts to define archetypes and templates, terminology binding work is on-going. Different strategies are used depending on the terminology binding level.
NASA Technical Reports Server (NTRS)
Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil
2007-01-01
The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Application (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity issues among various NASA systems that impact schedules and planning.
A security architecture for interconnecting health information systems.
Gritzalis, Dimitris; Lambrinoudakis, Costas
2004-03-31
Several hereditary and other chronic diseases necessitate continuous and complicated health care procedures, typically offered in different, often distant, health care units. Inevitably, the medical records of patients suffering from such diseases become complex, grow in size very fast and are scattered all over the units involved in the care process, hindering communication of information between health care professionals. Web-based electronic medical records have been recently proposed as the solution to the above problem, facilitating the interconnection of the health care units in the sense that health care professionals can now access the complete medical record of the patient, even if it is distributed in several remote units. However, by allowing users to access information from virtually anywhere, the universe of ineligible people who may attempt to harm the system is dramatically expanded, thus severely complicating the design and implementation of a secure environment. This paper presents a security architecture that has been mainly designed for providing authentication and authorization services in web-based distributed systems. The architecture has been based on a role-based access scheme and on the implementation of an intelligent security agent per site (i.e. health care unit). This intelligent security agent: (a) authenticates the users, local or remote, that can access the local resources; (b) assigns, through temporary certificates, access privileges to the authenticated users in accordance with their role; and (c) communicates to other sites (through the respective security agents) information about the local users that may need to access information stored in other sites, as well as about local resources that can be accessed remotely.
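The agent's duties map naturally onto a token-issuing pattern. The sketch below uses an HMAC-signed expiring payload as a stand-in for the temporary certificates, with illustrative roles, permissions, and lifetimes; it is a simplification of, not a substitute for, the paper's architecture.

```python
# Sketch of the per-site security agent's duties -- authenticate/issue a
# temporary role certificate, then authorize -- with hypothetical roles.
import hmac, hashlib, time

SITE_KEY = b"per-site-secret"
ROLES = {"dr_jones": "physician"}
PERMS = {"physician": {"read_record", "write_record"}, "clerk": {"read_record"}}

def issue_certificate(user, ttl=300):
    role, exp = ROLES[user], int(time.time()) + ttl
    payload = f"{user}|{role}|{exp}".encode()
    sig = hmac.new(SITE_KEY, payload, hashlib.sha256).hexdigest()
    return payload, sig

def authorize(payload, sig, action):
    expected = hmac.new(SITE_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False                                 # forged certificate
    user, role, exp = payload.decode().split("|")
    return int(exp) > time.time() and action in PERMS[role]

cert = issue_certificate("dr_jones")
print(authorize(*cert, "write_record"))              # True while unexpired
```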
Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn
2006-09-01
Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.
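A toy rule in the spirit of the described knowledge module is sketched below, checking one prescribing pattern against a patient record. The rule content and field names are placeholders of mine for illustration only; they are neither clinical guidance nor the authors' actual rule base.

```python
# Hypothetical rule-based prescribing check; record fields and thresholds
# are invented for illustration, not clinical guidance.
def check_prescription(record):
    alerts = []
    # Illustrative pattern: frequent reliever use without a preventer on file.
    if record.get("saba_canisters_90d", 0) >= 3 and not record.get("on_ics", False):
        alerts.append("High reliever use without inhaled corticosteroid: review.")
    if record.get("drug") in record.get("allergies", []):
        alerts.append("Prescribed drug appears in allergy list.")
    return alerts

patient = {"saba_canisters_90d": 4, "on_ics": False,
           "drug": "salbutamol", "allergies": []}
print(check_prescription(patient))
```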
Diamond Eye: a distributed architecture for image data mining
NASA Astrophysics Data System (ADS)
Burl, Michael C.; Fowlkes, Charless; Roden, Joe; Stechert, Andre; Mukhtar, Saleem
1999-02-01
Diamond Eye is a distributed software architecture, which enables users (scientists) to analyze large image collections by interacting with one or more custom data mining servers via a Java applet interface. Each server is coupled with an object-oriented database and a computational engine, such as a network of high-performance workstations. The database provides persistent storage and supports querying of the 'mined' information. The computational engine provides parallel execution of expensive image processing, object recognition, and query-by-content operations. Key benefits of the Diamond Eye architecture are: (1) the design promotes trial evaluation of advanced data mining and machine learning techniques by potential new users (all that is required is to point a web browser to the appropriate URL), (2) software infrastructure that is common across a range of science mining applications is factored out and reused, and (3) the system facilitates closer collaborations between algorithm developers and domain experts.
Trinczek, B.; Köpcke, F.; Leusch, T.; Majeed, R.W.; Schreiweis, B.; Wenk, J.; Bergh, B.; Ohmann, C.; Röhrig, R.; Prokosch, H.U.; Dugas, M.
2014-01-01
Objective: (1) To define features and data items of a Patient Recruitment System (PRS); (2) to design a generic software architecture of such a system covering the requirements; (3) to identify implementation options available within different Hospital Information System (HIS) environments; (4) to implement five PRS following the architecture and utilizing the implementation options as proof of concept. Methods: Existing PRS were reviewed and interviews with users and developers conducted. All reported PRS features were collected and prioritized according to their published success and user's request. Common feature sets were combined into software modules of a generic software architecture. Data items to process and transfer were identified for each of the modules. Each site collected implementation options available within their respective HIS environment for each module, provided a prototypical implementation based on available implementation possibilities and supported the patient recruitment of a clinical trial as a proof of concept. Results: 24 commonly reported and requested features of a PRS were identified, 13 of them prioritized as being mandatory. A UML version 2 based software architecture containing 5 software modules covering these features was developed. 13 data item groups processed by the modules, thus required to be available electronically, have been identified. Several implementation options could be identified for each module, most of them being available at multiple sites. Utilizing available tools, a PRS could be implemented in each of the five participating German university hospitals. Conclusion: A set of required features and data items of a PRS has been described for the first time. The software architecture covers all features in a clear, well-defined way. The variety of implementation options and the prototypes show that it is possible to implement the given architecture in different HIS environments, thus enabling more sites to successfully support patient recruitment in clinical trials. PMID:24734138
An open, interoperable, and scalable prehospital information technology network architecture.
Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy
2011-01-01
Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.
39 CFR 501.7 - Postage Evidencing System requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Information-Based Indicia and Security Architecture for Open IBI Postage Evidencing Systems or Performance Criteria for Information-Based Indicia and Security Architecture for Closed IBI Postage Metering Systems...
Long-term knowledge acquisition using contextual information in a memory-inspired robot architecture
NASA Astrophysics Data System (ADS)
Pratama, Ferdian; Mastrogiovanni, Fulvio; Lee, Soon Geul; Chong, Nak Young
2017-03-01
In this paper, we present a novel cognitive framework allowing a robot to form memories of relevant traits of its perceptions and to recall them when necessary. The framework is based on two main principles: on the one hand, we propose an architecture inspired by current knowledge of human memory organisation; on the other hand, we integrate such an architecture with the notion of context, which is used to modulate the knowledge acquisition process when consolidating memories and forming new ones, as well as with the notion of familiarity, which is employed to retrieve proper memories given relevant cues. Although much research has been carried out that exploits Machine Learning approaches to provide robots with internal models of their environment (including objects and occurring events therein), we argue that such approaches may not be the right direction to follow if long-term, continuous knowledge acquisition is to be achieved. As a case study scenario, we focus on both robot-environment and human-robot interaction processes. In the case of robot-environment interaction, the robot performs pick and place movements using the objects in the workspace, at the same time observing their displacement on a table in front of it, and progressively forms memories of relevant cues (e.g. colour, shape or relative position) in a context-aware fashion. As far as human-robot interaction is concerned, the robot can recall specific snapshots representing past events using both sensory information and contextual cues upon request by humans.
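A minimal sketch of familiarity-driven recall as described, where memories are stored with the context in which they formed and a cue retrieves the memory with the highest trait-plus-context overlap, is given below; the scoring function is a stand-in of mine, not the authors' model.

```python
# Toy familiarity-based retrieval: score each memory by cue overlap plus a
# context match bonus, and recall the best-scoring one.
def familiarity(memory, cue, context):
    traits = len(memory["traits"] & cue)            # overlapping perceptual cues
    ctx = 1 if memory["context"] == context else 0  # same formation context?
    return traits + ctx

def recall(memories, cue, context):
    return max(memories, key=lambda m: familiarity(m, cue, context))

memories = [
    {"traits": {"red", "cube", "left-of-bowl"}, "context": "tidy-table"},
    {"traits": {"blue", "ball"}, "context": "handover"},
]
best = recall(memories, cue={"red", "cube"}, context="tidy-table")
print(best["traits"])
```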
Meta-Design and the Triple Learning Organization in Architectural Design Process
NASA Astrophysics Data System (ADS)
Barelkowski, Robert
2017-10-01
The paper delves into the improvement of the Meta-Design methodology resulting from the implementation of a triple learning organization. Grown from the concept of reflective practice, it offers an opportunity to segregate and hierarchize both criteria and knowledge management, with at least a twofold application. It induces constant feedback loops recharging the basic level of "design" with a second level of "learning from design" and a third level of "learning from learning". While learning from design reflects the absorption of knowledge, the structuralization of skills and the management of information, learning from learning gives deeper understanding and provides the axiological perspective necessary when combining cultural, social, and abstract conceptual problems. The second level involves multidisciplinary applications imported from many engineering disciplines and technical sciences, but also from psychological background or the social environment. The third level confronts these applications with their respective sciences (wide extra-architectural knowledge) and axiological issues. This distinction may be represented in the difference between, e.g., the purposeful, systemic use of participatory design, which again generates experience-by-doing, versus the use of disciplinary knowledge starting from its theoretical framework and then narrowed down to be relevant to the particular design task. The paper discusses the application in two cases: the awarded competition proposal for the Digital Arts Museum in Madrid and the BAIRI university building. Both cases summarize the effects of implementation and expose the impact of triple-loop knowledge circles on design, teaching architects or helping them to learn how to manage information flows and how to accommodate paradigm shifts in the architectural design process.
NASA Astrophysics Data System (ADS)
Stanga, C.; Spinelli, C.; Brumana, R.; Oreni, D.; Valente, R.; Banfi, F.
2017-08-01
This essay describes the combination of 3D solutions and software techniques with traditional studies and research in order to achieve an integrated digital documentation linking performed surveys, collected data, and historical research. The approach of this study is based on the comparison of survey data with historical research, and on interpretations deduced from a cross-check between the two sources. The case study is the Basilica of S. Ambrogio in Milan, one of the greatest monuments in the city, a pillar of Christianity and of the history of architecture. It is characterized by a complex stratification of phases of restoration and transformation. Rediscovering the great richness of the traditional architectural notebook, which collected surveys and data, this research aims to realize a virtual notebook, based on a 3D model that supports the dissemination of the collected information. It can potentially be made understandable and accessible to anyone through the development of a mobile app. The 3D model was used to explore the different historical phases, starting from the recent layers and moving to the oldest ones through a virtual subtraction process, following the methods of the Archaeology of Architecture. Its components can be imported into parametric software and recognized in both their morphological and typological aspects. It is based on the concepts of LoD and ReverseLoD in order to fit the accuracy required by each step of the research.
Security Shift in Future Network Architectures
2010-11-01
Hartog, Tim (TNO, Information Security Dept.)
In current practice, military communication infrastructures are deployed as stand-alone networked information systems. Network-Enabled Capabilities (NEC) and... information architects and security specialists about the separation of network and information security, the consequences of this shift, and our view.
A Framework for Architectural Heritage HBIM Semantization and Development
NASA Astrophysics Data System (ADS)
Brusaporci, S.; Maiezza, P.; Tata, A.
2018-05-01
Despite the recognized advantages of the use of BIM in the fields of architecture and engineering, the extension of this procedure to architectural heritage is neither immediate nor trivial. The uniqueness and irregularity of historical architecture, on the one hand, and the great quantity of information necessary for the knowledge of architectural heritage, on the other, require appropriate reflection. The aim of this paper is to define a general framework for the use of BIM procedures for architectural heritage. The proposed methodology distinguishes three different Levels of Development (LoD), depending on the characteristics of the building and the objectives of the study: a simplified model with low geometric accuracy and a minimum quantity of information (LoD 200); a model closer to reality but still with a high deviation between the virtual and the real model (LoD 300); and a detailed BIM model that reproduces as much as possible the geometric irregularities of the building and is enriched with the maximum quantity of information available (LoD 400).
Design and implementation for integrated UAV multi-spectral inspection system
NASA Astrophysics Data System (ADS)
Zhu, X.; Li, X.; Yan, F.
2018-04-01
In order to improve the working efficiency of transmission line inspection and reduce the labour intensity of inspectors, this paper presents an Unmanned Aerial Vehicle (UAV) inspection system architecture for transmission line inspection. The light-duty design of the different inspection equipment and processing terminals is completed. A reference design is presented for the information-processing terminal, which supports access by inspection and interactive equipment, and all performance indicators of the inspection information processing are obtained through tests. Practical application shows that the UAV inspection system supports access and management of different types of mainstream fault detection equipment, and can perform autonomous diagnosis of the detected information to generate inspection reports in line with industry norms, meeting the fast, timely, and efficient requirements of power line inspection work.
Gis-Hbim Integration for the Management of Historical Buildings
NASA Astrophysics Data System (ADS)
Vacca, G.; Quaquero, E.; Pili, D.; Brandolini, M.
2018-05-01
As is well known, Italy's very substantial building stock has become the major field for real estate investments and for the related projects and actions. It is a heritage that is often barely known and extremely complex, whose management has until now been addressed in a rather casual and uninformed manner, with unsatisfactory and sometimes disastrous outcomes. The situation is worse in the case of buildings of particular historical, artistic and architectural value, which are so frequent within the heritage of our country. This paper shows the findings of ongoing research aimed at structuring the cognitive process and assessing enhancement and re-functionalisation scenarios for our historical and architectural heritage through the use and integration of information systems such as BIM and GIS. The work led to the development of a workflow able to integrate the contributions of the HBIM and GIS methodologies in structuring and managing a wide range of digital data and information useful for building management. The research, focused on "La Gran Torre" di Oristano, is aimed at creating the best conditions for an integrated and multidisciplinary strategy of requalification and re-functionalisation of historical and architectural heritage.