Sample records for architecture knowledge management

  1. An Object-Oriented Software Architecture for the Explorer-2 Knowledge Management Environment

    PubMed Central

    Tarabar, David B.; Greenes, Robert A.; Slosser, Eric T.

    1989-01-01

    Explorer-2 is a workstation-based environment designed to facilitate knowledge management. It provides consistent access to a broad range of knowledge on the basis of purpose, not type. We have developed a software architecture for Explorer-2 based on Object-Oriented programming. We have defined three classes of program objects: Knowledge ViewFrames, Knowledge Resources, and Knowledge Bases. This results in knowledge management at three levels: the screen level, the disk level, and the meta-knowledge level. We have applied this design to several knowledge bases and believe it has broad applicability.

  2. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter examines the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed relative to CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects; they simply do not develop code the way CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  3. Design, Analysis and User Acceptance of Architectural Design Education in Learning System Based on Knowledge Management Theory

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Lin, Yu-An; Wen, Ming-Hui; Perng, Yeng-Hong; Hsu, I-Ting

    2016-01-01

    The major purpose of this study is to develop an architectural design knowledge management learning system with corresponding learning activities to help the students have meaningful learning and improve their design capability in their learning process. Firstly, the system can help the students to obtain and share useful knowledge. Secondly,…

  4. An architecture for automated fault diagnosis [Space Station Module/Power Management And Distribution]

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry R.

    1989-01-01

    A description is given of the SSM/PMAD power system automation testbed, which was developed using a systems engineering approach. The architecture includes a knowledge-based system and has been successfully used in power system management and fault diagnosis. Architectural issues which affect overall system activities and performance are examined. The knowledge-based system is discussed along with its associated automation implications, and interfaces throughout the system are presented.

  5. DataHub knowledge based assistance for science visualization and analysis using large distributed databases

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.

    1991-01-01

    Viewgraphs on DataHub knowledge-based assistance for science visualization and analysis using large distributed databases. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.

  6. A Collaborative Knowledge Plane for Autonomic Networks

    NASA Astrophysics Data System (ADS)

    Mbaye, Maïssa; Krief, Francine

    Autonomic networking aims to give network components self-managing capabilities. Several autonomic architectures have been proposed. Each of these architectures includes some form of knowledge plane, which is essential for mimicking autonomic behavior. The knowledge plane plays a central role for self-functions by providing suitable knowledge to equipment, and it needs to learn new strategies for greater accuracy. However, defining the knowledge plane's architecture is still a challenge for researchers, especially defining the way cognitive supports interact with each other in the knowledge plane and how to implement them. The decision-making process depends on these interactions between the reasoning and learning parts of the knowledge plane. In this paper we propose a knowledge plane architecture based on a machine learning (inductive logic programming) paradigm and a situated view to deal with distributed environments. This architecture is focused on two self-functions that encompass all other self-functions: self-adaptation and self-organization. Case studies are given and implemented.

  7. Establishment of a Digital Knowledge Conversion Architecture Design Learning with High User Acceptance

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Apollo; Weng, Kuo-Hua

    2017-01-01

    The purpose of this study is to design a knowledge conversion and management digital learning system for architecture design learning, helping students to share, extract, use and create their design knowledge through web-based interactive activities based on socialization, internalization, combination and externalization process in addition to…

  8. Molecular basis of angiosperm tree architecture

    USDA-ARS?s Scientific Manuscript database

    The shoot architecture of trees greatly impacts orchard and forest management methods. Amassing greater knowledge of the molecular genetics behind tree form can benefit these industries as well as contribute to basic knowledge of plant developmental biology. This review covers basic components of ...

  9. Beginning to manage drug discovery and development knowledge.

    PubMed

    Sumner-Smith, M

    2001-05-01

    Knowledge management approaches and technologies are beginning to be implemented by the pharmaceutical industry in support of new drug discovery and development processes aimed at greater efficiency and effectiveness. This trend coincides with moves to reduce paper, to coordinate larger teams with more diverse skills distributed around the globe, and to comply with regulatory requirements for electronic submissions and the associated maintenance of electronic records. Concurrently, the available technologies have adopted web-based architectures with a greater range of collaborative tools and personalization through portal approaches. However, successful application of knowledge management methods depends on effective cultural change management, as well as proper architectural design to match the organizational and work processes within a company.

  10. Knowledge Innovation System: The Common Language.

    ERIC Educational Resources Information Center

    Rogers, Debra M. Amidon

    1993-01-01

    The Knowledge Innovation System is a management technique in which a networked enterprise uses knowledge flow as a collaborative advantage. Enterprise Management System-Architecture, which can be applied to collaborative activities, has five domains: economic, sociological, psychological, managerial, and technological. (SK)

  11. Knowledge Management System Model for Learning Organisations

    ERIC Educational Resources Information Center

    Amin, Yousif; Monamad, Roshayu

    2017-01-01

    Based on the literature of knowledge management (KM), this paper reports on the progress of developing a new knowledge management system (KMS) model with a component architecture distributed over the widely recognised socio-technical system (STS) aspects, to guide developers in selecting the most applicable components to support their KM…

  12. Knowledge management model for teleconsulting in telemedicine.

    PubMed

    Pico, Lilia Edith Aparicio; Cuenca, Orlando Rodriguez; Alvarez, Daniel José Salas; Salgado, Piere Augusto Peña

    2008-01-01

    This article reports a study of the requirements for teleconsulting in a telemedicine solution, conducted in order to create a knowledge management system. Several concepts related to the term teleconsulting in telemedicine were identified, which serve to clarify its applications, potential, and scope. Different theories on the state of the art in knowledge management were then considered, exploring methodologies and architectures to establish the trends in knowledge management and the possibilities of using them in teleconsulting. Furthermore, local and international experiences were examined to assess knowledge management systems focused on telemedicine. The objective of this study is to obtain a model for developing teleconsulting systems in Colombia, because the country has many health-information management systems that do not offer telemedicine services for remote areas. Many people in Colombia's rural areas have diverse needs and lack medical services; teleconsulting would be a good solution to this problem. Lastly, a model of a knowledge system is proposed for teleconsulting in telemedicine. The model includes philosophical principles and an architecture that shows the fundamental layers for its development.

  13. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations

    PubMed Central

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts. PMID:25993414

  14. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations.

    PubMed

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts.

  15. Content and Knowledge Management in a Digital Library and Museum.

    ERIC Educational Resources Information Center

    Yeh, Jian-Hua; Chang, Jia-Yang; Oyang, Yen-Jen

    2000-01-01

    Discusses the design of the National Taiwan University Digital Library and Museum that addresses both content and knowledge management. Describes a two-tier repository architecture that facilitates content management, includes an object-oriented model to facilitate the management of temporal information, and eliminates the need to manually…

  16. Applying Service-Oriented Architecture on The Development of Groundwater Modeling Support System

    NASA Astrophysics Data System (ADS)

    Li, C. Y.; WANG, Y.; Chang, L. C.; Tsai, J. P.; Hsiao, C. T.

    2016-12-01

    Groundwater simulation has become an essential step in groundwater resources management and assessment. There are many stand-alone pre- and post-processing software packages to reduce the model simulation workload, but these stand-alone packages neither provide centralized management of data and simulation results nor offer network sharing functions. Hence, it is difficult to share and reuse data and knowledge (simulation cases) systematically within or across companies. Therefore, this study develops a centralized, network-based groundwater modeling support system to assist model construction. The system is based on a service-oriented architecture and allows remote users to develop their modeling cases over the internet. The data and cases (knowledge) are thus easy to manage centrally. MODFLOW, the most widely used groundwater model in the world, is the modeling engine of the system. The system provides a data warehouse to store groundwater observations, along with a MODFLOW Support Service, a MODFLOW Input File & Shapefile Convert Service, a MODFLOW Service, and an Expert System Service to assist researchers in building models. Since the system architecture is service-oriented, it is scalable and flexible, and can be easily extended to include scenario analysis and knowledge management to facilitate the reuse of groundwater modeling knowledge.
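
    To make the service-oriented arrangement more concrete, the sketch below wires a few illustrative services into a small registry that a remote client could call to convert inputs and run a simulation case. The service names echo those listed in the abstract, but the interfaces, payloads, and registry are assumptions for illustration, not the authors' implementation.

```python
# Minimal, hypothetical sketch of a service-oriented modeling-support layer.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class ServiceRegistry:
    """Central registry so remote clients can discover and call services by name."""
    services: Dict[str, Callable[[dict], dict]] = field(default_factory=dict)

    def register(self, name: str, handler: Callable[[dict], dict]) -> None:
        self.services[name] = handler

    def call(self, name: str, request: dict) -> dict:
        return self.services[name](request)


def convert_shapefile(request: dict) -> dict:
    # Placeholder for an "Input File & Shapefile Convert Service".
    return {"modflow_inputs": f"converted({request['shapefile']})"}


def run_modflow(request: dict) -> dict:
    # Placeholder for a "MODFLOW Service"; a real deployment would invoke the
    # MODFLOW executable and archive the results in the central data warehouse.
    return {"heads": "simulated-head-field", "case_id": request["case_id"]}


registry = ServiceRegistry()
registry.register("convert", convert_shapefile)
registry.register("modflow", run_modflow)

inputs = registry.call("convert", {"shapefile": "aquifer_boundary.shp"})
result = registry.call("modflow", {"case_id": "demo-01", **inputs})
print(result)
```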

  17. Architecture and Initial Development of a Digital Library Platform for Computable Knowledge Objects for Health.

    PubMed

    Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P

    2017-01-01

    Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.

  18. A Knowledge Management Technology Architecture for Educational Research Organisations: Scaffolding Research Projects and Workflow Processing

    ERIC Educational Resources Information Center

    Muthukumar; Hedberg, John G.

    2005-01-01

    There is growing recognition that the economic climate of the world is shifting towards a knowledge-based economy where knowledge will be cherished as the most prized asset. In this regard, technology can be leveraged as a useful tool in effectually managing the knowledge capital of an organisation. Although several research studies have advanced…

  19. A Knowledge Management Technology Architecture for Educational Research Organisations: Scaffolding Research Projects and Workflow Processing

    ERIC Educational Resources Information Center

    Muthukumar; Hedberg, John G.

    2005-01-01

    There is growing recognition that the economic climate of the world is shifting towards a knowledge-based economy where knowledge will be cherished as the most prized asset. In this regard, technology can be leveraged as a useful tool in effectually managing the knowledge capital of an organisation. Although several research studies have advanced…

  20. Knowledge Framework Implementation with Multiple Architectures - 13090

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, H.; Lagos, L.; Quintero, W.

    2013-07-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises, in large and small organizations, and under a variety of business models, which makes the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there has been sweeping advancement in the information technology area, leading to the development of sophisticated frameworks and architectures. These platforms can be used for the development of integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses the knowledge framework and architecture that can be used for system development and its application to real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time as the workforce evolves and ages, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  1. Application of a Multimedia Service and Resource Management Architecture for Fault Diagnosis

    PubMed Central

    Castro, Alfonso; Sedano, Andrés A.; García, Fco. Javier; Villoslada, Eduardo

    2017-01-01

    Nowadays, the complexity of global video products has substantially increased. They are composed of several associated services whose functionalities need to adapt across heterogeneous networks with different technologies and administrative domains. Each of these domains has different operational procedures; therefore, the comprehensive management of multi-domain services presents serious challenges. This paper discusses an approach to service management linking fault diagnosis system and Business Processes for Telefónica’s global video service. The main contribution of this paper is the proposal of an extended service management architecture based on Multi Agent Systems able to integrate the fault diagnosis with other different service management functionalities. This architecture includes a distributed set of agents able to coordinate their actions under the umbrella of a Shared Knowledge Plane, inferring and sharing their knowledge with semantic techniques and three types of automatic reasoning: heterogeneous, ontology-based and Bayesian reasoning. This proposal has been deployed and validated in a real scenario in the video service offered by Telefónica Latam. PMID:29283398

  2. Application of a Multimedia Service and Resource Management Architecture for Fault Diagnosis.

    PubMed

    Castro, Alfonso; Sedano, Andrés A; García, Fco Javier; Villoslada, Eduardo; Villagrá, Víctor A

    2017-12-28

    Nowadays, the complexity of global video products has substantially increased. They are composed of several associated services whose functionalities need to adapt across heterogeneous networks with different technologies and administrative domains. Each of these domains has different operational procedures; therefore, the comprehensive management of multi-domain services presents serious challenges. This paper discusses an approach to service management linking fault diagnosis system and Business Processes for Telefónica's global video service. The main contribution of this paper is the proposal of an extended service management architecture based on Multi Agent Systems able to integrate the fault diagnosis with other different service management functionalities. This architecture includes a distributed set of agents able to coordinate their actions under the umbrella of a Shared Knowledge Plane, inferring and sharing their knowledge with semantic techniques and three types of automatic reasoning: heterogeneous, ontology-based and Bayesian reasoning. This proposal has been deployed and validated in a real scenario in the video service offered by Telefónica Latam.

  3. Knowledge Resources - A Knowledge Management Approach for Digital Ecosystems

    NASA Astrophysics Data System (ADS)

    Kurz, Thomas; Eder, Raimund; Heistracher, Thomas

    This paper presents an innovative approach to the conception and implementation of knowledge management in Digital Ecosystems. Based on a reflection on Digital Ecosystem research of recent years, an architecture is outlined that utilizes Knowledge Resources as the central and simplest entities of knowledge transfer. After discussing the underlying conception, the results of a first prototypical implementation are described; the prototype helps transform implicit knowledge into explicit knowledge for wide use.

  4. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  5. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    PubMed

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  6. Participative Knowledge Production of Learning Objects for E-Books.

    ERIC Educational Resources Information Center

    Dodero, Juan Manuel; Aedo, Ignacio; Diaz, Paloma

    2002-01-01

    Defines a learning object as any digital resource that can be reused to support learning and thus considers electronic books as learning objects. Highlights include knowledge management; participative knowledge production, i.e. authoring electronic books by a distributed group of authors; participative knowledge production architecture; and…

  7. A knowledge base architecture for distributed knowledge agents

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

    A tuple-space-based, object-oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed, depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms, as in the LINDA model, as well as using better-known message passing mechanisms. An implementation of the model is presented, describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
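
    As a rough illustration of the coordination style described above, the sketch below implements a LINDA-like tuple space with `out`, `rd`, and `in` primitives and shows two hypothetical agents exchanging tuples about a power-distribution fault. The tuple formats and agent roles are invented for demonstration and do not reproduce the paper's language or implementation.

```python
from typing import Any, List, Optional, Tuple


class TupleSpace:
    """A tiny in-memory tuple space: agents coordinate by writing and matching tuples."""

    def __init__(self) -> None:
        self._tuples: List[Tuple[Any, ...]] = []

    def out(self, tup: Tuple[Any, ...]) -> None:
        self._tuples.append(tup)

    def _match(self, pattern: Tuple[Any, ...], tup: Tuple[Any, ...]) -> bool:
        # None fields in the pattern act as wildcards.
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup)
        )

    def rd(self, pattern: Tuple[Any, ...]) -> Optional[Tuple[Any, ...]]:
        # Non-destructive read.
        return next((t for t in self._tuples if self._match(pattern, t)), None)

    def in_(self, pattern: Tuple[Any, ...]) -> Optional[Tuple[Any, ...]]:
        # Destructive read, as in LINDA's `in` primitive.
        t = self.rd(pattern)
        if t is not None:
            self._tuples.remove(t)
        return t


space = TupleSpace()
# A monitoring agent posts an observation about a power bus.
space.out(("observation", "bus-3", "undervoltage"))
# A diagnosis agent consumes observations and posts a fault hypothesis.
obs = space.in_(("observation", None, None))
if obs is not None:
    space.out(("hypothesis", obs[1], f"fault suspected: {obs[2]}"))
print(space.rd(("hypothesis", "bus-3", None)))
```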

  8. Knowledge representation and management enabling intelligent interoperability - principles and standards.

    PubMed

    Blobel, Bernd

    2013-01-01

    Based on the paradigm changes for health, health services and underlying technologies as well as the need for at best comprehensive and increasingly automated interoperability, the paper addresses the challenge of knowledge representation and management for medical decision support. After introducing related definitions, a system-theoretical, architecture-centric approach to decision support systems (DSSs) and appropriate ways for representing them using systems of ontologies is given. Finally, existing and emerging knowledge representation and management standards are presented. The paper focuses on the knowledge representation and management part of DSSs, excluding the reasoning part from consideration.

  9. An architecture for heuristic control of real-time processes

    NASA Technical Reports Server (NTRS)

    Raulefs, P.; Thorndyke, P. W.

    1987-01-01

    Process management combines the complementary approaches of heuristic reasoning and analytical process control. Management of a continuous process requires monitoring the environment and the controlled system, assessing the ongoing situation, developing and revising planned actions, and controlling the execution of the actions. For knowledge-intensive domains, process management entails the potentially time-stressed cooperation among a variety of expert systems. By redesigning a blackboard control architecture in an object-oriented framework, researchers obtain an approach to process management that considerably extends blackboard control mechanisms and overcomes limitations of blackboard systems.
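
    A minimal sketch of the blackboard control idea referenced above: independent knowledge sources inspect a shared blackboard and contribute when their activation condition holds. The knowledge sources, entries, and control cycle below are illustrative assumptions, not the authors' system.

```python
from typing import Callable, Dict, List, Tuple

Blackboard = Dict[str, object]
# A knowledge source is a (condition, action) pair over the shared blackboard.
KnowledgeSource = Tuple[Callable[[Blackboard], bool], Callable[[Blackboard], None]]


def monitor(bb: Blackboard) -> None:
    # Posts an assessment of the raw sensor reading.
    bb["situation"] = "overpressure" if bb["sensor_reading"] > 100 else "nominal"


def planner(bb: Blackboard) -> None:
    # Reacts to the assessed situation with a planned action.
    bb["plan"] = "open relief valve"


knowledge_sources: List[KnowledgeSource] = [
    (lambda bb: "sensor_reading" in bb and "situation" not in bb, monitor),
    (lambda bb: bb.get("situation") == "overpressure" and "plan" not in bb, planner),
]

blackboard: Blackboard = {"sensor_reading": 140}

# Simple control cycle: repeatedly fire any knowledge source whose condition holds.
changed = True
while changed:
    changed = False
    for condition, action in knowledge_sources:
        if condition(blackboard):
            action(blackboard)
            changed = True

print(blackboard)
```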

  10. Meta-Design and the Triple Learning Organization in Architectural Design Process

    NASA Astrophysics Data System (ADS)

    Barelkowski, Robert

    2017-10-01

    The paper delves into the improvement of the Meta-Design methodology resulting from the implementation of a triple learning organization. Grown from the concept of reflective practice, it offers an opportunity to segregate and hierarchize both criteria and knowledge management, with at least a twofold application. It induces constant feedback loops recharging the basic level of “design” with a second level of “learning from design” and a third level of “learning from learning”. While learning from design reflects the absorption of knowledge, the structuralization of skills, and the management of information, learning from learning gives deeper understanding and provides the axiological perspective that is necessary when combining cultural, social, and abstract conceptual problems. The second level involves multidisciplinary applications imported from many engineering disciplines and technical sciences, but also from psychological backgrounds or the social environment. The third level confronts these applications with their respective sciences (wide extra-architectural knowledge) and with axiological issues. This distinction may be seen in the difference between, for example, the purposeful, systemic use of participatory design, which generates experience-by-doing, and the use of disciplinary knowledge starting from its theoretical framework and then narrowed down to be relevant to a particular design task. The paper discusses the application in two cases: an awarded competition proposal for the Digital Arts Museum in Madrid and the BAIRI university building. Both cases summarize the effects of implementation and expose the impact of triple-loop knowledge circles on design, teaching architects or helping them to learn how to manage information flows and how to accommodate paradigm shifts in the architectural design process.

  11. Communication management between architects and clients

    NASA Astrophysics Data System (ADS)

    Taleb, Hala; Ismail, Syuhaida; Wahab, Mohammad Hussaini; Rani, Wan Nurul Mardiah Wan Mohd.

    2017-10-01

    Architectural projects begin with the design phase, which translates and materializes the client's requirements and needs. This phase is highly and directly affected by the information exchanged and the communication between architects and their clients. Nevertheless, despite its importance, studies have shown that communication management, although a significant field of project management, is distinctly overlooked by architects in the architectural industry. Thus, this paper highlights the current practices and attributes of communication management in the context of the architectural design phase. It outlines definitions of the different aspects of communication, as well as communication management standards and practices. The findings are expected to increase communication management knowledge amongst architects and to promote project success by improving the relationships between architects and their clients. Finally, this paper uncovers the architects' need for significant improvement in communication management as a pressing matter for ultimately achieving project success.

  12. Knowledge base and sensor bus messaging service architecture for critical tsunami warning and decision-support

    NASA Astrophysics Data System (ADS)

    Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.

    2012-04-01

    The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of a robust and scalable service-oriented infrastructure supported by an agile knowledge base for critical decision support. In the TRIDEC project (TRIDEC 2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge base is being implemented to enable on-demand access to semantically rich, OGC SWE compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscribers' requirements as the tsunami crisis develops. This is achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge base development work, coupled with that of the generic sensor bus platform, will be presented to demonstrate advanced decision support with situation awareness in the context of tsunami early warning and crisis management.

  13. Information Architecture for Quality Management Support in Hospitals.

    PubMed

    Rocha, Álvaro; Freixo, Jorge

    2015-10-01

    Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture helps meet the challenge of doing more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) gives organizations enhanced knowledge and, above all, supports management. It simplifies the reinvention of processes, the reformulation of procedures, and bridging and cooperation amongst the multiple actors of an organization. In the present work we planned the IA for the Quality Management System (QMS) of a Hospital, which allowed us to develop and implement the QUALITUS computer application (developed to support Quality Management in a Hospital Unit). This solution translated into significant gains for the Hospital Unit under study, accelerating the quality management process and reducing tasks, the number of documents, the information to be filled in, and information errors, among others.

  14. Knowledge management in healthcare: towards 'knowledge-driven' decision-support services.

    PubMed

    Abidi, S S

    2001-09-01

    In this paper, we highlight the involvement of Knowledge Management in a healthcare enterprise. We argue that the 'knowledge quotient' of a healthcare enterprise can be enhanced by procuring diverse facets of knowledge from the seemingly placid healthcare data repositories, and subsequently operationalising the procured knowledge to derive a suite of Strategic Healthcare Decision-Support Services that can impact strategic decision-making, planning and management of the healthcare enterprise. We first present a reference Knowledge Management environment, a Healthcare Enterprise Memory, with the functionality to acquire, share and operationalise the various modalities of healthcare knowledge. Next, we present the functional and architectural specification of a Strategic Healthcare Decision-Support Services Info-structure, which effectuates a synergy between knowledge procurement (vis-à-vis Data Mining) and knowledge operationalisation (vis-à-vis Knowledge Management) techniques to generate a suite of strategic knowledge-driven decision-support services. In conclusion, we argue that the proposed Healthcare Enterprise Memory is an attempt to rethink the possible sources of leverage to improve healthcare delivery, thereby providing a valuable strategic planning and management resource to healthcare policy makers.

  15. The SysMan monitoring service and its management environment

    NASA Astrophysics Data System (ADS)

    Debski, Andrzej; Janas, Ekkehard

    1996-06-01

    Management of modern information systems is becoming more and more complex. There is a growing need for powerful, flexible and affordable management tools to assist system managers in maintaining such systems. It is at the same time evident that effective management should integrate network management, system management and application management in a uniform way. Object oriented OSI management architecture with its four basic modelling concepts (information, organization, communication and functional models) together with widely accepted distribution platforms such as ANSA/CORBA, constitutes a reliable and modern framework for the implementation of a management toolset. This paper focuses on the presentation of concepts and implementation results of an object oriented management toolset developed and implemented within the framework of the ESPRIT project 7026 SysMan. An overview is given of the implemented SysMan management services including the System Management Service, Monitoring Service, Network Management Service, Knowledge Service, Domain and Policy Service, and the User Interface. Special attention is paid to the Monitoring Service which incorporates the architectural key entity responsible for event management. Its architecture and building components, especially filters, are emphasized and presented in detail.

  16. Knowledge Management in Role Based Agents

    NASA Astrophysics Data System (ADS)

    Kır, Hüseyin; Ekinci, Erdem Eser; Dikenelli, Oguz

    In the multi-agent system literature, the role concept is increasingly researched as an abstraction for scoping the beliefs, norms, and goals of agents and for shaping the relationships of the agents in the organization. In this research, we propose a knowledgebase architecture to increase the applicability of roles in the MAS domain, drawing inspiration from the self concept in the role theory of sociology. The proposed knowledgebase architecture has a granulated structure that is dynamically organized according to the agent's identification in a social environment. Thanks to this dynamic structure, agents can work on consistent knowledge in spite of inevitable conflicts between roles and the agent. The knowledgebase architecture is also implemented and incorporated into the SEAGENT multi-agent system development framework.
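
    To picture how role-scoped beliefs might be organized, the sketch below keeps a separate belief overlay per role and resolves queries against the agent's currently active role before falling back to agent-level beliefs. The structure, names, and conflict rule are assumptions for illustration and do not reproduce the SEAGENT implementation.

```python
from typing import Any, Dict, Optional


class RoleScopedKnowledgeBase:
    """Agent-level beliefs plus per-role overlays; the active role wins on conflicts."""

    def __init__(self) -> None:
        self.agent_beliefs: Dict[str, Any] = {}
        self.role_beliefs: Dict[str, Dict[str, Any]] = {}
        self.active_role: Optional[str] = None

    def assume_role(self, role: str) -> None:
        self.role_beliefs.setdefault(role, {})
        self.active_role = role

    def tell(self, key: str, value: Any, role: Optional[str] = None) -> None:
        target = self.role_beliefs.setdefault(role, {}) if role else self.agent_beliefs
        target[key] = value

    def ask(self, key: str) -> Any:
        # Consult the active role's overlay first, then the agent's own beliefs.
        if self.active_role and key in self.role_beliefs[self.active_role]:
            return self.role_beliefs[self.active_role][key]
        return self.agent_beliefs.get(key)


kb = RoleScopedKnowledgeBase()
kb.tell("meeting_norm", "optional attendance")            # agent-level belief
kb.tell("meeting_norm", "mandatory attendance", "chair")  # belief scoped to the 'chair' role
kb.assume_role("chair")
print(kb.ask("meeting_norm"))  # -> 'mandatory attendance' while acting as chair
```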

  17. Ground support system methodology and architecture

    NASA Technical Reports Server (NTRS)

    Schoen, P. D.

    1991-01-01

    A synergistic approach to systems test and support is explored. A building-block architecture provides transportability of data, procedures, and knowledge. The synergistic approach also lowers cost and risk over the life cycle of a program, and detecting design errors at the earliest phase reduces the cost of vehicle ownership. The distributed, scalable architecture is based on industry standards, maximizing transparency and maintainability. An autonomous control structure provides for distributed and segmented systems. Control of interfaces maximizes compatibility and reuse, reducing long-term program cost. An intelligent data management architecture also reduces analysis time and cost through automation.

  18. An ontology-based telemedicine tasks management system architecture.

    PubMed

    Nageba, Ebrahim; Fayn, Jocelyne; Rubel, Paul

    2008-01-01

    The recent developments in ambient intelligence and ubiquitous computing offer new opportunities for the design of advanced Telemedicine systems providing high quality services, anywhere, anytime. In this paper we present an approach for building an ontology-based, task-driven telemedicine system. The architecture is composed of a task management server, a communication server and a knowledge base, enabling decision making that takes into account different telemedical concepts such as actors, resources, services and the Electronic Health Record. The final objective is to provide intelligent management of the different types of available human, material and communication resources.

  19. A task-based support architecture for developing point-of-care clinical decision support systems for the emergency department.

    PubMed

    Wilk, S; Michalowski, W; O'Sullivan, D; Farion, K; Sayyad-Shirabad, J; Kuziemsky, C; Kukawka, B

    2013-01-01

    The purpose of this study was to create a task-based support architecture for developing clinical decision support systems (CDSSs) that assist physicians in making decisions at the point-of-care in the emergency department (ED). The backbone of the proposed architecture was established by a task-based emergency workflow model for a patient-physician encounter. The architecture was designed according to an agent-oriented paradigm. Specifically, we used the O-MaSE (Organization-based Multi-agent System Engineering) method that allows for iterative translation of functional requirements into architectural components (e.g., agents). The agent-oriented paradigm was extended with ontology-driven design to implement ontological models representing knowledge required by specific agents to operate. The task-based architecture allows for the creation of a CDSS that is aligned with the task-based emergency workflow model. It facilitates decoupling of executable components (agents) from embedded domain knowledge (ontological models), thus supporting their interoperability, sharing, and reuse. The generic architecture was implemented as a pilot system, MET3-AE--a CDSS to help with the management of pediatric asthma exacerbation in the ED. The system was evaluated in a hospital ED. The architecture allows for the creation of a CDSS that integrates support for all tasks from the task-based emergency workflow model, and interacts with hospital information systems. Proposed architecture also allows for reusing and sharing system components and knowledge across disease-specific CDSSs.

  20. Definition of information technology architectures for continuous data management and medical device integration in diabetes.

    PubMed

    Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J

    2008-09-01

    The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.

  1. Knowledge-based processing for aircraft flight control

    NASA Technical Reports Server (NTRS)

    Painter, John H.; Glass, Emily; Economides, Gregory; Russell, Paul

    1994-01-01

    This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative and Hybrid. Software architectures were sought that employ the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common across a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered particular processing methods from the stochastic and knowledge-based worlds to be duals, that is, similar in a broad context. They provide architectural design concepts which serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. This research was applied to the control of a subsonic transport aircraft in the airport terminal area.

  2. A Technical Infrastructure to Integrate Dynamics AX ERP and CRM into University Curriculum

    ERIC Educational Resources Information Center

    Wimmer, Hayden; Hall, Kenneth

    2016-01-01

    Enterprise Resource Planning and Customer Relationship Management are becoming important topics at the university level, and are increasingly receiving course-level attention in the curriculum. In fact, the Information Systems Body of Knowledge specifically identifies Enterprise Architecture as an Information Systems-specific knowledge area. The…

  3. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    PubMed

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adopts the "4+1" architectural view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be reasonably substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system provides powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.

  4. 36 CFR 905.735-401 - Standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... knowledge and experience in one or more fields of history, architecture, city planning, retailing, real... responsibilities for the operation and management of the Corporation consistent with these regulations, and other...

  5. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly and have offered limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middle-ware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.
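
    The client-mediator-server idea can be pictured with a toy mediator that hides systems heterogeneity behind per-source adapters and merges their answers into one patient view. The source formats and field names below are invented for illustration and are not the CASE-generated middleware described in the abstract.

```python
from typing import Dict, List


class HISAdapter:
    """Wraps a (simulated) hospital information system with its own record format."""

    def fetch(self, patient_id: str) -> Dict[str, str]:
        return {"PATIENT_NAME": "DOE^JANE", "DOB": "19700101"}

    def normalize(self, rec: Dict[str, str]) -> Dict[str, str]:
        last, first = rec["PATIENT_NAME"].split("^")
        return {"name": f"{first.title()} {last.title()}", "birth_date": rec["DOB"]}


class RISAdapter:
    """Wraps a (simulated) radiology information system with a different schema."""

    def fetch(self, patient_id: str) -> Dict[str, str]:
        return {"exam": "CHEST XRAY", "report_status": "FINAL"}

    def normalize(self, rec: Dict[str, str]) -> Dict[str, str]:
        return {"exam": rec["exam"].title(), "status": rec["report_status"].lower()}


class Mediator:
    """Clients query the mediator; it fans out to adapters and merges normalized results."""

    def __init__(self, adapters: List[object]) -> None:
        self.adapters = adapters

    def patient_view(self, patient_id: str) -> Dict[str, str]:
        merged: Dict[str, str] = {}
        for adapter in self.adapters:
            merged.update(adapter.normalize(adapter.fetch(patient_id)))
        return merged


mediator = Mediator([HISAdapter(), RISAdapter()])
print(mediator.patient_view("12345"))
```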

  6. Recommended Architecture for a Knowledge Management System for the Undersea Launchers Division at the Naval Undersea Warfare Center

    DTIC Science & Technology

    2010-09-01

  7. A Collaborative Reasoning Maintenance System for a Reliable Application of Legislations

    NASA Astrophysics Data System (ADS)

    Tamisier, Thomas; Didry, Yoann; Parisot, Olivier; Feltz, Fernand

    Decision support systems are nowadays used to disentangle all kinds of intricate situations and perform sophisticated analyses. Moreover, they are applied in areas where the knowledge can be heterogeneous, partially un-formalized, implicit, or diffuse. The representation and management of this knowledge becomes the key to ensuring the proper functioning of the system and keeping an intuitive view of its expected behavior. This paper presents a generic architecture for implementing knowledge-based systems used in collaborative business, where the knowledge is organized into different databases according to the usage, persistence and quality of the information. This approach is illustrated with Cadral, a customizable automated tool built on this architecture and used for processing family benefits applications at the National Family Benefits Fund of the Grand-Duchy of Luxembourg.

  8. Integrity Constraint Monitoring in Software Development: Proposed Architectures

    NASA Technical Reports Server (NTRS)

    Fernandez, Francisco G.

    1997-01-01

    In the development of complex software systems, designers are required to obtain from many sources and manage vast amounts of knowledge of the system being built, and to communicate this information to personnel with a variety of backgrounds. Knowledge concerning the properties of the system, including the structure of, relationships between, and limitations of the data objects in the system, becomes increasingly vital as the complexity of the system and the number of knowledge sources increase. Ensuring that violations of these properties do not occur becomes steadily more challenging. One approach toward managing the enforcement of system properties, called context monitoring, uses a centralized repository of integrity constraints and a constraint satisfiability mechanism for dynamic verification of property enforcement during program execution. The focus of this paper is to describe possible software architectures that define a mechanism for dynamically checking the satisfiability of a set of constraints on a program. The next section describes the context monitoring approach in general. Section 3 gives an overview of the work currently being done toward adding an integrity constraint satisfiability mechanism to a high-level programming language, SequenceL, and demonstrates how this model is being examined to develop a general software architecture. Section 4 describes possible architectures for a general constraint satisfiability mechanism, as well as an alternative approach that uses embedded database queries in lieu of an external monitor. The paper concludes with a brief summary outlining the current state of the research and future work.
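
    As a loose sketch of the context-monitoring approach, the snippet below keeps a central repository of integrity constraints (predicates over program state) and checks them at chosen execution points. The constraints and state shown are hypothetical, and the design is not tied to SequenceL.

```python
from typing import Callable, Dict, List, Tuple

Constraint = Tuple[str, Callable[[dict], bool]]


class ConstraintMonitor:
    """Central repository of integrity constraints checked against program state."""

    def __init__(self) -> None:
        self.constraints: List[Constraint] = []

    def register(self, description: str, predicate: Callable[[dict], bool]) -> None:
        self.constraints.append((description, predicate))

    def check(self, state: dict) -> List[str]:
        # Returns the descriptions of all constraints violated by this state.
        return [desc for desc, pred in self.constraints if not pred(state)]


monitor = ConstraintMonitor()
monitor.register("altitude must be non-negative", lambda s: s["altitude"] >= 0)
monitor.register("fuel must not exceed capacity", lambda s: s["fuel"] <= s["capacity"])

# A program would call check() at key execution points, e.g. after each state update.
state = {"altitude": -12.0, "fuel": 800, "capacity": 1000}
violations = monitor.check(state)
print(violations or "all constraints satisfied")
```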

  9. Architecture of next-generation information management systems for digital radiology enterprises

    NASA Astrophysics Data System (ADS)

    Wong, Stephen T. C.; Wang, Huili; Shen, Weimin; Schmidt, Joachim; Chen, George; Dolan, Tom

    2000-05-01

    Few information systems today offer a clear and flexible means to define and manage the automated part of radiology processes. None of them provide a coherent and scalable architecture that can easily cope with heterogeneity and inevitable local adaptation of applications. Most importantly, they often lack a model that can integrate clinical and administrative information to aid better decisions in managing resources, optimizing operations, and improving productivity. Digital radiology enterprises require cost-effective solutions to deliver information to the right person, in the right place, at the right time. We propose a new architecture of image information management systems for digital radiology enterprises. Such a system is based on the emerging technologies in workflow management, distributed object computing, and Java and Web techniques, as well as Philips' domain knowledge in radiology operations. Our design adopts the "4+1" architectural view approach. In this new architecture, PACS and RIS will become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be reasonably substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the system will provide powerful query and statistical functions for managing resources and improving productivity in real time. This work will lead to a new direction of image information management in the next millennium. We will illustrate the innovative design with implemented examples of a working prototype.

  10. Using Ada to implement the operations management system in a community of experts

    NASA Technical Reports Server (NTRS)

    Frank, M. S.

    1986-01-01

    An architecture is described for the Space Station Operations Management System (OMS), consisting of a distributed expert system framework implemented in Ada. The motivation for such a scheme is the desire to integrate the very diverse elements of the OMS while taking maximum advantage of knowledge-based systems technology. Part of the foundation of an Ada-based distributed expert system was accomplished in the form of a proof-of-concept prototype for the KNOMES project (Knowledge-based Maintenance Expert System). This prototype successfully used concurrently active experts to accomplish monitoring and diagnosis for the Remote Manipulator System. The basic concept of this software architecture is named ACTORS, for Ada Cognitive Task ORganization Scheme. It is when one considers the overall problem of integrating all of the OMS elements into a cooperative system that the AI solution stands out. By utilizing a distributed knowledge-based system as the framework for OMS, it is possible to integrate those components which need to share information in an intelligent manner.

  11. An architecture for integrating distributed and cooperating knowledge-based Air Force decision aids

    NASA Technical Reports Server (NTRS)

    Nugent, Richard O.; Tucker, Richard W.

    1988-01-01

    MITRE has been developing a Knowledge-Based Battle Management Testbed for evaluating the viability of integrating independently-developed knowledge-based decision aids in the Air Force tactical domain. The primary goal for the testbed architecture is to permit a new system to be added to a testbed with little change to the system's software. Each system that connects to the testbed network declares that it can provide a number of services to other systems. When a system wants to use another system's service, it does not address the server system by name, but instead transmits a request to the testbed network asking for a particular service to be performed. A key component of the testbed architecture is a common database which uses a relational database management system (RDBMS). The RDBMS provides a database update notification service to requesting systems. Normally, each system is expected to monitor data relations of interest to it. Alternatively, a system may broadcast an announcement message to inform other systems that an event of potential interest has occurred. Current research is aimed at dealing with issues resulting from integration efforts, such as dealing with potential mismatches of each system's assumptions about the common database, decentralizing network control, and coordinating multiple agents.

  12. From hospital information system components to the medical record and clinical guidelines & protocols.

    PubMed

    Veloso, M; Estevão, N; Ferreira, P; Rodrigues, R; Costa, C T; Barahona, P

    1997-01-01

    This paper introduces an ongoing project towards the development of a new generation HIS, aiming at the integration of clinical and administrative information within a common framework. Its design incorporates explicit knowledge about domain objects and professional activities to be processed by the system together with related knowledge management services and act management services. The paper presents the conceptual model of the proposed HIS architecture, that supports a rich and fully integrated patient data model, enabling the implementation of a dynamic electronic patient record tightly coupled with computerised guideline knowledge bases.

  13. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as an extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  14. A proposed clinical decision support architecture capable of supporting whole genome sequence information.

    PubMed

    Welch, Brandon M; Loya, Salvador Rodriguez; Eilbeck, Karen; Kawamoto, Kensaku

    2014-04-04

    Whole genome sequence (WGS) information may soon be widely available to help clinicians personalize the care and treatment of patients. However, considerable barriers exist, which may hinder the effective utilization of WGS information in a routine clinical care setting. Clinical decision support (CDS) offers a potential solution to overcome such barriers and to facilitate the effective use of WGS information in the clinic. However, genomic information is complex and will require significant considerations when developing CDS capabilities. As such, this manuscript lays out a conceptual framework for a CDS architecture designed to deliver WGS-guided CDS within the clinical workflow. To handle the complexity and breadth of WGS information, the proposed CDS framework leverages service-oriented capabilities and orchestrates the interaction of several independently-managed components. These independently-managed components include the genome variant knowledge base, the genome database, the CDS knowledge base, a CDS controller and the electronic health record (EHR). A key design feature is that genome data can be stored separately from the EHR. This paper describes in detail: (1) each component of the architecture; (2) the interaction of the components; and (3) how the architecture attempts to overcome the challenges associated with WGS information. We believe that service-oriented CDS capabilities will be essential to using WGS information for personalized medicine.
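
    A minimal sketch of how the independently-managed components described above might interact can make the orchestration concrete. All data structures, lookups, and the single pharmacogenomic rule below are invented for illustration; the paper defines the architecture conceptually rather than as an API.

        # Illustrative sketch of the proposed component interaction: the CDS
        # controller pulls patient context from the EHR, variant calls from a
        # genome database kept separate from the EHR, interpretations from a
        # genome variant knowledge base, and rules from a CDS knowledge base.
        # Every name and the example rule below are assumptions for illustration.

        GENOME_DB = {"patient-1": ["CYP2C19*2"]}                 # stored outside the EHR
        VARIANT_KB = {"CYP2C19*2": "reduced clopidogrel metabolism"}
        CDS_KB = [
            # (required phenotype, drug being ordered, advice)
            ("reduced clopidogrel metabolism", "clopidogrel",
             "consider alternative antiplatelet therapy"),
        ]
        EHR = {"patient-1": {"orders": ["clopidogrel"]}}

        def cds_controller(patient_id):
            """Orchestrate the components and return alerts for the clinical workflow."""
            phenotypes = {VARIANT_KB[v] for v in GENOME_DB.get(patient_id, [])
                          if v in VARIANT_KB}
            orders = EHR.get(patient_id, {}).get("orders", [])
            alerts = [advice for phenotype, drug, advice in CDS_KB
                      if phenotype in phenotypes and drug in orders]
            return alerts

        if __name__ == "__main__":
            print(cds_controller("patient-1"))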

  15. A Proposed Clinical Decision Support Architecture Capable of Supporting Whole Genome Sequence Information

    PubMed Central

    Welch, Brandon M.; Rodriguez Loya, Salvador; Eilbeck, Karen; Kawamoto, Kensaku

    2014-01-01

    Whole genome sequence (WGS) information may soon be widely available to help clinicians personalize the care and treatment of patients. However, considerable barriers exist, which may hinder the effective utilization of WGS information in a routine clinical care setting. Clinical decision support (CDS) offers a potential solution to overcome such barriers and to facilitate the effective use of WGS information in the clinic. However, genomic information is complex and will require significant considerations when developing CDS capabilities. As such, this manuscript lays out a conceptual framework for a CDS architecture designed to deliver WGS-guided CDS within the clinical workflow. To handle the complexity and breadth of WGS information, the proposed CDS framework leverages service-oriented capabilities and orchestrates the interaction of several independently-managed components. These independently-managed components include the genome variant knowledge base, the genome database, the CDS knowledge base, a CDS controller and the electronic health record (EHR). A key design feature is that genome data can be stored separately from the EHR. This paper describes in detail: (1) each component of the architecture; (2) the interaction of the components; and (3) how the architecture attempts to overcome the challenges associated with WGS information. We believe that service-oriented CDS capabilities will be essential to using WGS information for personalized medicine. PMID:25411644

  16. [Urban habitants' attitudes toward nature-approximating landscape architecture: taking Hongshan District of Wuhan City, China as a case].

    PubMed

    Yang, Yu-ping; Zhou, Zhi-xiang; Cai, Shao-ping; Gao, Kai; Jia, Ruo

    2011-07-01

    Nature-approximating landscape architecture (NALA) applies the concept of sustainable development to landscape architecture, and urban inhabitants' awareness and acceptance of the NALA idea is key to its successful application. Through semi-structured interviews, this paper explored the attitudes of inhabitants in Hongshan District of Wuhan City toward NALA design and management, and the influence of the respondents' socio-economic characteristics on those attitudes. A fairly low percentage of respondents approved of NALA design (10.3%-46.9%) and management (7.4%-34.9%). Attitudes toward NALA design were mainly affected by the respondents' age, while attitudes toward NALA management were significantly correlated with the respondents' age, educational level, and profession. The immediate reason why many respondents did not support NALA was that they attached importance to the aesthetic effect of green space and preferred cleanliness and order; the lack of related ecological knowledge and environmental awareness was the root cause of the weaker support. Establishing NALA demonstration bases and intensifying publicity and education on the NALA idea and related ecological knowledge could encourage more urban inhabitants to participate actively in NALA construction.

  17. Conservation Process Model (cpm): a Twofold Scientific Research Scope in the Information Modelling for Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Fiorani, D.; Acierno, M.

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both the BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, the applications developed so far have adapted BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible representation of knowledge, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalisation and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined primarily within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalised by the model. A special focus is directed at decay analysis and the conservation of surfaces.

  18. Knowledge-based system for flight information management. Thesis

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1990-01-01

    The use of knowledge-based system (KBS) architectures to manage information on the primary flight display (PFD) of commercial aircraft is described. The PFD information management strategy tailored the information on the PFD to the tasks the pilot performed. The KBS design and implementation of the task-tailored PFD information management application is described. The knowledge acquisition and subsequent system design of a flight-phase-detection KBS is also described. The flight-phase output of this KBS was used as input to the task-tailored PFD information management KBS. The implementation and integration of this KBS with existing aircraft systems and the other KBS are described. Flight tests of both KBSs, collectively called the Task-Tailored Flight Information Manager (TTFIM), are examined; these tests verified their implementation and integration and validated the software engineering advantages of the KBS approach in an operational environment.
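
    The pairing of a flight-phase-detection KBS with a display manager that tailors the PFD to the detected phase can be sketched as two simple rule tables. The thresholds, phase names, and display items below are illustrative assumptions and do not reproduce the TTFIM knowledge base.

        # Sketch of the two cooperating knowledge-based functions: a flight-phase
        # detector whose output drives the selection of information shown on the
        # primary flight display (PFD). Thresholds, phases, and display items are
        # illustrative assumptions only.

        def detect_flight_phase(altitude_ft, airspeed_kt, on_ground):
            """Toy rule base that classifies the current flight phase."""
            if on_ground and airspeed_kt < 40:
                return "taxi"
            if on_ground:
                return "takeoff_roll"
            if altitude_ft < 10000:
                return "climb_or_approach"
            return "cruise"

        # Phase-to-display mapping: only the information relevant to the pilot's
        # current tasks is presented on the PFD.
        PFD_CONTENT = {
            "taxi": ["heading", "groundspeed"],
            "takeoff_roll": ["airspeed", "engine_status", "runway_remaining"],
            "climb_or_approach": ["airspeed", "altitude", "vertical_speed", "glideslope"],
            "cruise": ["altitude", "heading", "fuel_state"],
        }

        def tailor_pfd(altitude_ft, airspeed_kt, on_ground):
            phase = detect_flight_phase(altitude_ft, airspeed_kt, on_ground)
            return phase, PFD_CONTENT[phase]

        if __name__ == "__main__":
            print(tailor_pfd(altitude_ft=35000, airspeed_kt=460, on_ground=False))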

  19. Development of a component centered fault monitoring and diagnosis knowledge based system for space power system

    NASA Technical Reports Server (NTRS)

    Lee, S. C.; Lollar, Louis F.

    1988-01-01

    The overall approach currently being taken in the development of AMPERES (Autonomously Managed Power System Extendable Real-time Expert System), a knowledge-based expert system for fault monitoring and diagnosis of space power systems, is discussed. The system architecture, knowledge representation, and fault monitoring and diagnosis strategy are examined. A 'component-centered' approach developed in this project is described. Critical issues requiring further study are identified.

  20. 78 FR 9951 - Excepted Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-12

    ...) Not to exceed 3000 positions that require unique cyber security skills and knowledge to perform cyber..., distributed control systems security, cyber incident response, cyber exercise facilitation and management, cyber vulnerability detection and assessment, network and systems engineering, enterprise architecture...

  1. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  2. Big data processing in the cloud - Challenges and platforms

    NASA Astrophysics Data System (ADS)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed: the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, being the most important and the most difficult to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
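
    The Lambda architecture mentioned above combines a batch layer computed over the full dataset with a speed layer over recent events, merging the two views at query time. The sketch below is a toy word-count illustration of that split, not tied to any particular platform or to the paper's examples.

        # Toy illustration of the Lambda architecture: a batch layer periodically
        # recomputes views over the master dataset, a speed layer maintains an
        # incremental view over events that arrived since the last batch run, and
        # queries merge both views. (The Kappa architecture would instead keep only
        # the streaming path and recompute by replaying the event log.)
        from collections import Counter

        master_dataset = ["energy", "water", "energy"]     # immutable store of all events
        recent_events = []                                  # events not yet in a batch view

        def batch_view(dataset):
            """Batch layer: full recomputation over the master dataset."""
            return Counter(dataset)

        def speed_view(events):
            """Speed layer: incremental view over recent events only."""
            return Counter(events)

        def query(term, batch, speed):
            """Serving layer: merge batch and real-time views at query time."""
            return batch.get(term, 0) + speed.get(term, 0)

        if __name__ == "__main__":
            batch = batch_view(master_dataset)
            recent_events.append("energy")                  # new event arrives after the batch run
            speed = speed_view(recent_events)
            print(query("energy", batch, speed))            # -> 3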

  3. Toward patient-centered, personalized and personal decision support and knowledge management: a survey.

    PubMed

    Leong, T-Y

    2012-01-01

    This paper summarizes the recent trends and highlights the challenges and opportunities in decision support and knowledge management for patient-centered, personalized, and personal health care. The discussions are based on a broad survey of related references, focusing on the most recent publications. Major advances are examined in the areas of i) shared decision making paradigms, ii) continuity of care infrastructures and architectures, iii) human factors and system design approaches, iv) knowledge management innovations, and v) practical deployment and change considerations. Many important initiatives, projects, and plans with promising results have been identified. The common themes focus on supporting the individual patients who are playing an increasing central role in their own care decision processes. New collaborative decision making paradigms and information infrastructures are required to ensure effective continuity of care. Human factors and usability are crucial for the successful development and deployment of the relevant systems, tools, and aids. Advances in personalized medicine can be achieved through integrating genomic, phenotypic and other biological, individual, and population level information, and gaining useful insights from building and analyzing biological and other models at multiple levels of abstraction. Therefore, new Information and Communication Technologies and evaluation approaches are needed to effectively manage the scale and complexity of biomedical and health information, and adapt to the changing nature of clinical decision support. Recent research in decision support and knowledge management combines heterogeneous information and personal data to provide cost-effective, calibrated, personalized support in shared decision making at the point of care. Current and emerging efforts concentrate on developing or extending conventional paradigms, techniques, systems, and architectures for the new predictive, preemptive, and participatory health care model for patient-centered, personalized medicine. There is also an increasing emphasis on managing complexity with changing care models, processes, and settings.

  4. Knowledge and Skill Competency Values of an Undergraduate University Managed Cooperative Internship Program: A Case Study in Design Education

    ERIC Educational Resources Information Center

    Barbarash, David

    2016-01-01

    Students from the Purdue University landscape architecture program undergo a year-long managed cooperative internship between their junior and senior years of enrollment. During this paid internship students experience the realities of a professional design office outside of the protection of the academic classroom. Through surveys of faculty…

  5. Making Sense of Rocket Science - Building NASA's Knowledge Management Program

    NASA Technical Reports Server (NTRS)

    Holm, Jeanne

    2002-01-01

    The National Aeronautics and Space Administration (NASA) has launched a range of KM activities-from deploying intelligent "know-bots" across millions of electronic sources to ensuring tacit knowledge is transferred across generations. The strategy and implementation focuses on managing NASA's wealth of explicit knowledge, enabling remote collaboration for international teams, and enhancing capture of the key knowledge of the workforce. An in-depth view of the work being done at the Jet Propulsion Laboratory (JPL) shows the integration of academic studies and practical applications to architect, develop, and deploy KM systems in the areas of document management, electronic archives, information lifecycles, authoring environments, enterprise information portals, search engines, experts directories, collaborative tools, and in-process decision capture. These systems, together, comprise JPL's architecture to capture, organize, store, and distribute key learnings for the U.S. exploration of space.

  6. Biology-inspired Architecture for Situation Management

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng

    2006-01-01

    Situation Management is a rapidly developing science combining new techniques for data collection with advanced methods of data fusion to facilitate the process leading to correct decisions prescribing action. Current research focuses on reducing increasing amounts of diverse data to knowledge used by decision makers and on reducing time between observations, decisions and actions. No new technology is more promising for increasing the diversity and fidelity of observations than sensor networks. However, current research on sensor networks concentrates on a centralized network architecture. We believe this trend will not realize the full potential of situation management. We propose a new architecture modeled after biological ecosystems where motes are autonomous and intelligent, yet cooperate with local neighborhoods. Providing a layered approach, they sense and act independently when possible, and cooperate with neighborhoods when necessary. The combination of their local actions results in global effects. While situation management research is currently dominated by military applications, advances envisioned for industrial and business applications have similar requirements. NASA has requirements for intelligent and autonomous systems in future missions that can benefit from advances in situation management. We describe requirements for the Integrated Vehicle Health Management program where our biology-inspired architecture provides a layered approach and decisions can be made at the proper level to improve safety, reduce costs, and improve efficiency in making diagnostic and prognostic assessments of the structural integrity, aerodynamic characteristics, and operation of aircraft.
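
    The layered, locally cooperative behaviour described above can be sketched with a simple mote model: each mote acts on its own reading when it is confident, and consults only its immediate neighbourhood when its evidence is inconclusive. The class name, thresholds, and majority-vote rule are illustrative assumptions, not the authors' design.

        # Sketch of a biology-inspired mote: sense and act independently when
        # possible, cooperate with the local neighbourhood when necessary.
        # Thresholds and the voting rule are illustrative assumptions.

        class Mote:
            def __init__(self, name, reading):
                self.name = name
                self.reading = reading
                self.neighbors = []

            def assess(self, alarm_threshold=0.8, suspect_threshold=0.5):
                """Act independently when confident, cooperate locally when not."""
                if self.reading >= alarm_threshold:
                    return f"{self.name}: local alarm"
                if self.reading >= suspect_threshold:
                    # Inconclusive: ask the local neighbourhood to corroborate.
                    votes = sum(n.reading >= suspect_threshold for n in self.neighbors)
                    if votes > len(self.neighbors) / 2:
                        return f"{self.name}: neighbourhood-confirmed alarm"
                return f"{self.name}: nominal"

        if __name__ == "__main__":
            a, b, c = Mote("a", 0.6), Mote("b", 0.7), Mote("c", 0.55)
            a.neighbors, b.neighbors, c.neighbors = [b, c], [a, c], [a, b]
            print(a.assess())   # corroborated by neighbours b and c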

  7. Concept of operations for knowledge discovery from Big Data across enterprise data warehouses

    NASA Astrophysics Data System (ADS)

    Sukumar, Sreenivas R.; Olama, Mohammed M.; McNair, Allen W.; Nutaro, James J.

    2013-05-01

    The success of data-driven business in government, science, and private industry is driving the need for seamless integration of intra and inter-enterprise data sources to extract knowledge nuggets in the form of correlations, trends, patterns and behaviors previously not discovered due to physical and logical separation of datasets. Today, as volume, velocity, variety and complexity of enterprise data keeps increasing, the next generation analysts are facing several challenges in the knowledge extraction process. Towards addressing these challenges, data-driven organizations that rely on the success of their analysts have to make investment decisions for sustainable data/information systems and knowledge discovery. Options that organizations are considering are newer storage/analysis architectures, better analysis machines, redesigned analysis algorithms, collaborative knowledge management tools, and query builders amongst many others. In this paper, we present a concept of operations for enabling knowledge discovery that data-driven organizations can leverage towards making their investment decisions. We base our recommendations on the experience gained from integrating multi-agency enterprise data warehouses at the Oak Ridge National Laboratory to design the foundation of future knowledge nurturing data-system architectures.

  8. Marshall Application Realignment System (MARS) Architecture

    NASA Technical Reports Server (NTRS)

    Belshe, Andrea; Sutton, Mandy

    2010-01-01

    The Marshall Application Realignment System (MARS) Architecture project was established to meet the certification requirements of the Department of Defense Architecture Framework (DoDAF) V2.0 Federal Enterprise Architecture Certification (FEAC) Institute program and to provide added value to the Marshall Space Flight Center (MSFC) Application Portfolio Management process. The MARS Architecture aims to: (1) address the NASA MSFC Chief Information Officer (CIO) strategic initiative to improve Application Portfolio Management (APM) by optimizing investments and improving portfolio performance, and (2) develop a decision-aiding capability by which applications registered within the MSFC application portfolio can be analyzed and considered for retirement or decommission. The MARS Architecture describes a to-be target capability that supports application portfolio analysis against scoring measures (based on value) and overall portfolio performance objectives (based on enterprise needs and policies). This scoring and decision-aiding capability supports the process by which MSFC application investments are realigned or retired from the application portfolio. The MARS Architecture is a multi-phase effort to: (1) conduct strategic architecture planning and knowledge development based on the DoDAF V2.0 six-step methodology, (2) describe one architecture through multiple viewpoints, (3) conduct portfolio analyses based on a defined operational concept, and (4) enable a new capability to support the MSFC enterprise IT management mission, vision, and goals. This report documents Phase 1 (Strategy and Design), which includes discovery, planning, and development of initial architecture viewpoints. Phase 2 will move forward the process of building the architecture, widening the scope to include application realignment (in addition to application retirement), and validating the underlying architecture logic before moving into Phase 3. The MARS Architecture key stakeholders are most interested in Phase 3 because this is where the data analysis, scoring, and recommendation capability is realized. Stakeholders want to see the benefits derived from reducing the steady-state application base and identify opportunities for portfolio performance improvement and application realignment.

  9. The Software Management Environment (SME)

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  10. Linguistic Model for Engine Power Loss

    DTIC Science & Technology

    2011-11-27

    Intelligent Vehicle Health Management System (IVHMS) for light trucks. In particular, this paper is focused on the system architecture for monitoring...developed for the cooling system of a diesel engine, integrating a priori, ‘expert’ knowledge, sensor data, and the adaptive network-based fuzzy...domain knowledge. However, in a nonlinear system in which not all possible causes to engine power loss are considered and measured, merely relying

  11. Counting Dependence Predictors

    DTIC Science & Technology

    2008-05-02

    sophisticated dependence predictors, such as Store Sets, have been tightly coupled to the fetch and execution streams, requiring global knowledge of...applicable to any architecture with distributed fetch and distributed memory banks, in which the comprehensive event completion knowledge needed by previous...adapted for Core Fusion [5] by giving its steering management unit (SMU) the responsibilities of the controller core. While Ipek et al. describe how a

  12. Taking the 'work' out of networking: strategies for smarter, simpler network architecture and administration

    NASA Technical Reports Server (NTRS)

    Luna, C. de

    2003-01-01

    This session will help you tune up your skills and knowledge on the latest advances in network design and management, to keep your agency's data communications running at peak performance, with minimal cost and effort.

  13. A Service Oriented Web Application for Learner Knowledge Representation, Management and Sharing Conforming to IMS LIP

    ERIC Educational Resources Information Center

    Lazarinis, Fotis

    2014-01-01

    iLM is a Web-based application for the representation, management and sharing of IMS LIP conformant user profiles. The tool is developed using a service-oriented architecture with emphasis on easy data sharing. Data elicitation from user profiles is based on the utilization of XQuery scripts, and sharing with other applications is achieved through…
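
    The project elicits data from the profiles with XQuery scripts; the sketch below performs an equivalent extraction with XPath in Python on an invented, simplified profile fragment. The element names are illustrative and do not reproduce the exact IMS LIP schema.

        # Extracting learner data from a simplified IMS LIP-style profile.
        # Element names are illustrative assumptions; the tool itself uses XQuery.
        import xml.etree.ElementTree as ET

        profile_xml = """
        <learnerinformation>
          <identification><name><fn>Ada Learner</fn></name></identification>
          <competency><description>Relational database design</description></competency>
          <competency><description>Service-oriented architecture</description></competency>
        </learnerinformation>
        """

        root = ET.fromstring(profile_xml)
        learner = root.findtext("./identification/name/fn")
        competencies = [c.text for c in root.findall("./competency/description")]
        print(learner, "->", competencies)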

  14. a Webgis for the Knowledge and Conservation of the Historical Wall Structures of the 13TH-18TH Centuries

    NASA Astrophysics Data System (ADS)

    Vacca, G.; Pili, D.; Fiorino, D. R.; Pintus, V.

    2017-05-01

    The presented work is part of the research project titled "Tecniche murarie tradizionali: conoscenza per la conservazione ed il miglioramento prestazionale" (Traditional building techniques: from knowledge to conservation and performance improvement), with the purpose of studying the building techniques of the 13th-18th centuries in the Sardinia Region (Italy) for their knowledge, conservation, and promotion. The end purpose of the entire study is to improve the performance of the examined structures. In particular, the task of the authors within the research project was to build a WebGIS to manage the data collected during the examination and study phases. This infrastructure was built entirely with Open Source software. The work consisted of designing a database built in PostgreSQL and its spatial extension PostGIS, which allow feature geometries and spatial data to be stored and managed. Data input is performed via a form built in HTML and PHP. The HTML part is based on Bootstrap, an open tools library for websites and web applications. The implementation of this template used both PHP and Javascript code. The PHP code manages the reading and writing of data to the database using embedded SQL queries. As of today, we have surveyed and archived more than 300 buildings, belonging to three main macro categories: fortification architectures, religious architectures, and residential architectures. More than 150 masonry samples have been investigated in relation to the construction techniques. The database is published on the Internet as a WebGIS built using the Leaflet Javascript open libraries, which allow creating map sites with background maps and navigation, input and query tools. This layer too uses an interaction of HTML, Javascript, PHP and SQL code.
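
    A minimal sketch of how such a PostGIS backend could be queried is shown below, here from Python rather than the PHP layer the project actually uses. The connection parameters, table name, column names, and coordinates are assumptions for illustration only.

        # Spatial query against a hypothetical "buildings" table in PostgreSQL/PostGIS,
        # returning geometries as GeoJSON suitable for a Leaflet front end.
        # All connection parameters and schema names are illustrative assumptions.
        import psycopg2

        conn = psycopg2.connect(dbname="wallstructures", user="gis",
                                password="secret", host="localhost")
        with conn, conn.cursor() as cur:
            # Find fortification architectures within 500 m of a point of interest.
            cur.execute(
                """
                SELECT name, ST_AsGeoJSON(geom)
                FROM buildings
                WHERE category = %s
                  AND ST_DWithin(geom::geography,
                                 ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
                                 500)
                """,
                ("fortification", 9.1160, 39.2175),
            )
            for name, geojson in cur.fetchall():
                print(name, geojson)
        conn.close()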

  15. Enhanced risk management by an emerging multi-agent architecture

    NASA Astrophysics Data System (ADS)

    Lin, Sin-Jin; Hsu, Ming-Fu

    2014-07-01

    Classification in imbalanced datasets has attracted much attention from researchers in the field of machine learning. Most existing techniques tend not to perform well on minority class instances when the dataset is highly skewed because they focus on minimising the forecasting error without considering the relative distribution of each class. This investigation proposes an emerging multi-agent architecture, grounded on cooperative learning, to solve the class-imbalanced classification problem. Additionally, this study deals further with the obscure nature of the multi-agent architecture and expresses comprehensive rules for auditors. The results from this study indicate that the presented model performs satisfactorily in risk management and is able to tackle a highly class-imbalanced dataset comparatively well. Furthermore, the knowledge visualisation process, supported by real examples, can assist both internal and external auditors who must allocate limited detection resources; they can take the rules as roadmaps for modifying the auditing programme.

  16. Genetic control of inflorescence architecture in legumes

    PubMed Central

    Benlloch, Reyes; Berbel, Ana; Ali, Latifeh; Gohari, Gholamreza; Millán, Teresa; Madueño, Francisco

    2015-01-01

    The architecture of the inflorescence, the shoot system that bears the flowers, is a main component of the huge diversity of forms found in flowering plants. Inflorescence architecture also has a strong impact on the production of fruits and seeds, and on crop management, two highly relevant agronomical traits. Elucidating the genetic networks that control inflorescence development, and how they vary between different species, is essential to understanding the evolution of plant form and to being able to breed key architectural traits in crop species. Inflorescence architecture depends on the identity and activity of the meristems in the inflorescence apex, which determines when flowers are formed, how many are produced and their relative position in the inflorescence axis. Arabidopsis thaliana, where the genetic control of inflorescence development is best known, has a simple inflorescence, where the primary inflorescence meristem directly produces the flowers, which are thus borne in the main inflorescence axis. In contrast, legumes represent a more complex inflorescence type, the compound inflorescence, where flowers are not directly borne in the main inflorescence axis but, instead, are formed by secondary or higher order inflorescence meristems. Studies in model legumes such as pea (Pisum sativum) or Medicago truncatula have led to a rather good knowledge of the genetic control of the development of the legume compound inflorescence. In addition, the increasing availability of genetic and genomic tools for legumes is making it possible to rapidly extend this knowledge to other grain legume crops. This review aims to describe the current knowledge of the genetic network controlling inflorescence development in legumes. It also discusses how the combination of this knowledge with the use of emerging genomic tools and resources may allow rapid advances in the breeding of grain legume crops. PMID:26257753

  17. NETMARK

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Koga, Dennis (Technical Monitor)

    2002-01-01

    This presentation discusses NASA's proposed NETMARK knowledge management tool, which aims 'to control and interoperate with every block in a document, email, spreadsheet, power point, database, etc. across the lifecycle'. Topics covered include software and hardware requirements, seamless information systems, computer architecture issues, and potential benefits to NETMARK users.

  18. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1991-01-01

    The main contribution of the effort in the last two years is the introduction of the MOPPS system. After an extensive literature search, we introduced the system, which is described next. MOPPS employs a new solution to the problem of managing programs which solve scientific and engineering applications in a distributed processing environment. Autonomous computers cooperate efficiently in solving large scientific problems with this solution. MOPPS has the advantage of not assuming the presence of any particular network topology or configuration, computer architecture, or operating system. It imposes little overhead on network and processor resources while efficiently managing programs concurrently. The core of MOPPS is an intelligent program manager that builds a knowledge base of the execution performance of the parallel programs it is managing under various conditions. The manager applies this knowledge to improve the performance of future runs. The program manager learns from experience.
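
    The "intelligent program manager that learns from experience" can be sketched as a small performance knowledge base: record how each program performed on each host, then use that experience to place the next run. The class name, data structure, and selection rule below are illustrative assumptions, not the actual MOPPS design.

        # Sketch of a program manager that accumulates execution-performance
        # knowledge and applies it to future runs. Names and rules are illustrative.
        from collections import defaultdict
        import statistics

        class ProgramManager:
            def __init__(self):
                # (program, host) -> list of observed run times in seconds
                self.history = defaultdict(list)

            def record_run(self, program, host, seconds):
                """Accumulate knowledge from completed runs."""
                self.history[(program, host)].append(seconds)

            def choose_host(self, program, hosts):
                """Prefer the host with the best observed mean time; try unknown hosts first."""
                untried = [h for h in hosts if not self.history[(program, h)]]
                if untried:
                    return untried[0]
                return min(hosts, key=lambda h: statistics.mean(self.history[(program, h)]))

        if __name__ == "__main__":
            pm = ProgramManager()
            pm.record_run("fluid_solver", "hostA", 120.0)
            pm.record_run("fluid_solver", "hostB", 95.0)
            pm.record_run("fluid_solver", "hostA", 130.0)
            print(pm.choose_host("fluid_solver", ["hostA", "hostB"]))   # -> hostB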

  19. An Ontology-based Architecture for Integration of Clinical Trials Management Applications

    PubMed Central

    Shankar, Ravi D.; Martins, Susana B.; O’Connor, Martin; Parrish, David B.; Das, Amar K.

    2007-01-01

    Management of complex clinical trials involves the coordinated use of a myriad of software applications by trial personnel. The applications typically use distinct knowledge representations and generate an enormous amount of information during the course of a trial. It becomes vital that the applications exchange trial semantics for efficient management of the trials and subsequent analysis of clinical trial data. Existing model-based frameworks do not address the requirements of semantic integration of heterogeneous applications. We have built an ontology-based architecture to support interoperation of clinical trial software applications. Central to our approach is a suite of clinical trial ontologies, which we call Epoch, that define the vocabulary and semantics necessary to represent information on clinical trials. We are continuing to demonstrate and validate our approach with different clinical trials management applications and a growing number of clinical trials. PMID:18693919

  20. Integrated System Health Management: Foundational Concepts, Approach, and Implementation

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    2009-01-01

    A sound basis to guide the community in the conception and implementation of ISHM (Integrated System Health Management) capability in operational systems was provided. The concept of an "ISHM Model of a System" and a related architecture defined as a unique Data, Information, and Knowledge (DIaK) architecture were described. The ISHM architecture is independent of the typical system architecture, which is based on grouping physical elements that are assembled to make up a subsystem, with subsystems combining to form systems, and so on. It was emphasized that ISHM capability needs to be implemented first at a low functional capability level (FCL), or limited ability to detect anomalies, diagnose, determine consequences, etc. As algorithms and tools to augment or improve the FCL are identified, they should be incorporated into the system. This means that the architecture, DIaK management, and software must be modular and standards-based, in order to enable systematic augmentation of FCL (no ad-hoc modifications). A set of technologies (and tools) needed to implement ISHM was described. One essential tool is a software environment to create the ISHM Model. The software environment encapsulates DIaK and an infrastructure to focus DIaK on determining health (detect anomalies, determine causes, determine effects, and provide integrated awareness of the system to the operator). The environment includes gateways to communicate in accordance with standards, especially the IEEE 1451.1 Standard for Smart Sensors and Actuators.
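
    A minimal sketch of the "ISHM Model of a System" idea at a low functional capability level might look like the following: each element owns a simple anomaly check, and the model rolls element health up into integrated awareness for the operator. All element names, limits, and the rollup rule are illustrative assumptions.

        # Toy ISHM model at a limited FCL: per-element out-of-range detection only,
        # with results rolled up into an integrated system view. Illustrative names.

        class Element:
            def __init__(self, name, low, high):
                self.name, self.low, self.high = name, low, high

            def detect_anomaly(self, value):
                """Limited FCL: detect out-of-range readings only."""
                return not (self.low <= value <= self.high)

        class ISHMModel:
            def __init__(self, elements):
                self.elements = elements

            def assess(self, readings):
                """Integrated awareness: report the health of every element."""
                return {e.name: ("anomaly" if e.detect_anomaly(readings[e.name]) else "nominal")
                        for e in self.elements}

        if __name__ == "__main__":
            model = ISHMModel([Element("tank_pressure_psi", 20, 60),
                               Element("valve_temp_K", 200, 320)])
            print(model.assess({"tank_pressure_psi": 72, "valve_temp_K": 290}))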

  1. Architecture and the Web.

    ERIC Educational Resources Information Center

    Money, William H.

    Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…

  2. Executive control systems in the engineering design environment. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Hurst, P. W.

    1985-01-01

    An executive control system (ECS) is a software structure for unifying various applications codes into a comprehensive system. It provides a library of applications, a uniform access method through a central user interface, and a data management facility. A survey of twenty-four executive control systems designed to unify various CAD/CAE applications for use in diverse engineering design environments within government and industry was conducted. The goals of this research were to establish system requirements, to survey state-of-the-art architectural design approaches, and to provide an overview of the historical evolution of these systems. Foundations for design are presented and include environmental settings, system requirements, major architectural components, and a system classification scheme based on knowledge of the supported engineering domain(s). An overview of the design approaches used in developing the major architectural components of an ECS is presented with examples taken from the surveyed systems. Attention is drawn to four major areas of ECS development: interdisciplinary usage; standardization; knowledge utilization; and computer science technology transfer.

  3. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  4. Alternative Architectures for Distributed Work in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Billings, Charles E.; Chapman, Roger; Obradovich, Heintz; McCoy, C. Elaine; Orasanu, Judith

    2000-01-01

    The architecture for the National Airspace System (NAS) in the United States has evolved over time to rely heavily on the distribution of tasks and control authority in order to keep cognitive complexity manageable for any one individual. This paper characterizes a number of different subsystems that have been recently incorporated in the NAS. The goal of this discussion is to begin to identify the critical parameters defining the differences among alternative architectures in terms of the locus of control and in terms of access to relevant data and knowledge. At an abstract level, this analysis can be described as an effort to describe alternative "rules of the game" for the NAS.

  5. Concept of Operations for Collaboration and Discovery from Big Data Across Enterprise Data Warehouses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olama, Mohammed M; Nutaro, James J; Sukumar, Sreenivas R

    2013-01-01

    The success of data-driven business in government, science, and private industry is driving the need for seamless integration of intra and inter-enterprise data sources to extract knowledge nuggets in the form of correlations, trends, patterns and behaviors previously not discovered due to physical and logical separation of datasets. Today, as volume, velocity, variety and complexity of enterprise data keeps increasing, the next generation analysts are facing several challenges in the knowledge extraction process. Towards addressing these challenges, data-driven organizations that rely on the success of their analysts have to make investment decisions for sustainable data/information systems and knowledge discovery. Options that organizations are considering are newer storage/analysis architectures, better analysis machines, redesigned analysis algorithms, collaborative knowledge management tools, and query builders amongst many others. In this paper, we present a concept of operations for enabling knowledge discovery that data-driven organizations can leverage towards making their investment decisions. We base our recommendations on the experience gained from integrating multi-agency enterprise data warehouses at the Oak Ridge National Laboratory to design the foundation of future knowledge nurturing data-system architectures.

  6. Designing and Developing a NASA Research Projects Knowledge Base and Implementing Knowledge Management and Discovery Techniques

    NASA Astrophysics Data System (ADS)

    Dabiru, L.; O'Hara, C. G.; Shaw, D.; Katragadda, S.; Anderson, D.; Kim, S.; Shrestha, B.; Aanstoos, J.; Frisbie, T.; Policelli, F.; Keblawi, N.

    2006-12-01

    The Research Project Knowledge Base (RPKB) is currently being designed and will be implemented in a manner that is fully compatible and interoperable with enterprise architecture tools developed to support NASA's Applied Sciences Program. Through user needs assessment and collaboration with Stennis Space Center, Goddard Space Flight Center, and NASA's DEVELOP staff, insight into information needs for the RPKB was gathered from across NASA's scientific communities of practice. To enable efficient, consistent, standard, structured, and managed data entry and research results compilation, a prototype RPKB has been designed and fully integrated with the existing NASA Earth Science Systems Components database. The RPKB will compile research project and keyword information of relevance to the six major science focus areas, the 12 national applications, and the Global Change Master Directory (GCMD). The RPKB will include information about projects awarded from NASA research solicitations, project investigator information, research publications, NASA data products employed, and model or decision support tools used or developed, as well as new data product information. The RPKB will be developed in a multi-tier architecture that will include a SQL Server relational database backend, middleware, and front-end client interfaces for data entry. The purpose of this project is to intelligently harvest the results of research sponsored by the NASA Applied Sciences Program and related research program results. We present various approaches for a wide spectrum of knowledge discovery of research results, publications, projects, etc. from the NASA Systems Components database and global information systems, and show how this is implemented in a SQL Server database. The application of knowledge discovery is useful for intelligent query answering and multiple-layered database construction. Using advanced EA tools such as the Earth Science Architecture Tool (ESAT), the RPKB will enable NASA and partner agencies to efficiently identify significant results for new experiment directions and will help principal investigators formulate experiment directions for new proposals.

  7. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
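
    The spatiotemporal event representation described above can be illustrated with a small XML fragment built and read back in Python. The element and attribute names are invented for illustration and are not the schema defined by the paper's cellular imaging markup language.

        # Representing a spatiotemporal imaging event as XML, in the spirit of the
        # markup-language approach described above. Element names are illustrative.
        import xml.etree.ElementTree as ET

        doc = ET.Element("cimlDocument")
        event = ET.SubElement(doc, "event", type="cell_division")
        ET.SubElement(event, "location", x="128", y="64", z="3")
        ET.SubElement(event, "time", frame="42")
        ET.SubElement(event, "object", id="cell-17", classLabel="lymphocyte")

        xml_text = ET.tostring(doc, encoding="unicode")
        print(xml_text)

        # Automated knowledge extraction can then match events by type and time.
        parsed = ET.fromstring(xml_text)
        for ev in parsed.iter("event"):
            frame = ev.find("time").get("frame")
            print(ev.get("type"), "at frame", frame)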

  8. An ontological knowledge framework for adaptive medical workflow.

    PubMed

    Dang, Jiangbo; Hedayati, Amir; Hampel, Ken; Toklu, Candemir

    2008-10-01

    As emerging technologies, the semantic Web and SOA (Service-Oriented Architecture) allow a BPMS (Business Process Management System) to automate business processes that can be described as services, which in turn can be used to wrap existing enterprise applications. BPMS provides tools and methodologies to compose Web services that can be executed as business processes and monitored by BPM (Business Process Management) consoles. Ontologies are a formal declarative knowledge representation model. They provide a foundation upon which machine-understandable knowledge can be obtained, and as a result, they make machine intelligence possible. Healthcare systems can adopt these technologies to become ubiquitous, adaptive, and intelligent, and thus serve patients better. This paper presents an ontological knowledge framework that covers the healthcare domains a hospital encompasses, from medical and administrative tasks to hospital assets, medical insurance, patient records, drugs, and regulations. Our ontology therefore makes our vision of personalized healthcare possible by capturing all the knowledge necessary for a complex personalized healthcare scenario involving patient care, insurance policies, drug prescriptions, and compliance. For example, our ontology enables a workflow management system to allow users, from physicians to administrative assistants, to manage, and even create, new context-aware medical workflows and execute them on the fly.

  9. Northeast Artificial Intelligence Consortium (NAIC). Volume 12. Computer Architecture for Very Large Knowledge Bases

    DTIC Science & Technology

    1990-12-01

    data rate to the electronics would be much lower on the average and the data much "richer" in information. Intelligent use of...system bottleneck, a high data rate should be provided by I/O systems. 2. machines with intelligent storage management specially designed for logic...management information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics, and propagation, and electronic reliability/maintainability and compatibility.

  10. Marshall Space Flight Center Propulsion Systems Department (PSD) Knowledge Management (KM) Initiative

    NASA Technical Reports Server (NTRS)

    Caraccioli, Paul; Varnedoe, Tom; Smith, Randy; McCarter, Mike; Wilson, Barry; Porter, Richard

    2006-01-01

    NASA Marshall Space Flight Center's Propulsion Systems Department (PSD) is four months into a fifteen-month Knowledge Management (KM) initiative to support enhanced engineering decision making and analyses and faster resolution of anomalies (near-term), and effective, efficient knowledge-infused engineering processes, reduced knowledge attrition, and reduced anomaly occurrences (long-term). The near-term objective of this initiative is developing a KM Pilot project, within the context of a 3-5 year KM strategy, to introduce and evaluate the use of KM within PSD. An internal NASA/MSFC PSD KM team was established early in project formulation to maintain a practitioner, user-centric focus throughout the conceptual development, planning and deployment of KM technologies and capabilities within the PSD. The PSD internal team is supported by the University of Alabama's Aging Infrastructure Systems Center of Excellence (AISCE), Intergraph Corporation, and The Knowledge Institute. The principal product of the initial four-month effort has been strategic planning of PSD KM implementation, by first determining the "as is" state of KM capabilities and then developing, planning and documenting the roadmap to achieve the desired "to be" state. Activities undertaken to support the planning phase have included data gathering, cultural surveys, group work sessions, interviews, documentation review, and independent research. Assessments and analyses have been performed, covering industry benchmarking, related local and Agency initiatives, specific tools and techniques used, and strategies for leveraging existing resources, people and technology to achieve common KM goals. Key findings captured in the PSD KM Strategic Plan include the system vision, purpose, stakeholders, prioritized strategic objectives mapped to the top ten practitioner needs, and an analysis of current resource usage. Opportunities identified from research, analyses, cultural/KM surveys and practitioner interviews include executive and senior management sponsorship; KM awareness, promotion and training; cultural change management; process improvement; and leveraging existing resources and new innovative technologies to align with other NASA KM initiatives (convergence: the big picture). To enable results-based incremental implementation and future growth of the KM initiative, key performance measures have been identified, including stakeholder value, system utility, learning and growth (knowledge capture, sharing, reduced anomaly recurrence), cultural change, process improvement and return on investment. The next steps for the initial implementation spiral (focused on SSME Turbomachinery) have been identified, largely based on the organization and compilation of summary-level engineering process models, data capture matrices, functional models and a conceptual-level systems architecture. Key elements include detailed KM requirements definition; KM technology architecture assessment, evaluation and selection; deployable KM Pilot design, development, implementation and evaluation; and justifying full implementation (estimated return on investment). Features identified for the notional system architecture include the knowledge presentation layer (and its components), the knowledge network layer (and its components), the knowledge storage layer (and its components), the user interface, and capabilities.
This paper provides a snapshot of the progress to date, the near-term planning for deploying the KM pilot project, and a forward look at results-based growth of KM capabilities within the MSFC PSD.

  11. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independently of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance also allows effective management of informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  12. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.

  13. Optimization of knowledge-based systems and expert system building tools

    NASA Technical Reports Server (NTRS)

    Yasuda, Phyllis; Mckellar, Donald

    1993-01-01

    The objectives of the NASA-AMES Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the grant were submitted to the co-investigators for the grant. Research studies concentrated on the field of artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.

  14. Flexible Delivery Approach in Architecture and Construction Management Course

    ERIC Educational Resources Information Center

    Chan, Eric

    2013-01-01

    The millennial generation is facing challenges in their career path, and they believe that tertiary education can better equip them to meet these challenges. However, some students find it difficult to rush back to the classroom due to work commitments. Fortunately, flexible education developed in recent years allows students to capture knowledge anytime…

  15. The Ontological Architectures in the Application of a Knowledge Management System for Curricular Assessment

    ERIC Educational Resources Information Center

    Olson, Brandon D.

    2012-01-01

    Institutions of higher education are facing increasing pressure to improve the effectiveness and quality of academic programs (Association of Governing Boards, Top public policy issues 2011-2012, 2011). These institutions apply curricular assessment processes as a means to evaluate and improve academic effectiveness and quality. Knowledge…

  16. Successful Architectural Knowledge Sharing: Beware of Emotions

    NASA Astrophysics Data System (ADS)

    Poort, Eltjo R.; Pramono, Agung; Perdeck, Michiel; Clerc, Viktor; van Vliet, Hans

    This chapter presents the analysis and key findings of a survey on architectural knowledge sharing. The responses of 97 architects working in the Dutch IT Industry were analyzed by correlating practices and challenges with project size and success. Impact mechanisms between project size, project success, and architectural knowledge sharing practices and challenges were deduced based on reasoning, experience and literature. We find that architects run into numerous and diverse challenges sharing architectural knowledge, but that the only challenges that have a significant impact are the emotional challenges related to interpersonal relationships. Thus, architects should be careful when dealing with emotions in knowledge sharing.

  17. The electronic encapsulation of knowledge in hydraulics, hydrology and water resources

    NASA Astrophysics Data System (ADS)

    Abbott, Michael B.

    The rapidly developing practice of encapsulating knowledge in electronic media is shown to lead necessarily to the restructuring of the knowledge itself. The consequences of this for hydraulics, hydrology and more general water-resources management are investigated in particular relation to current process-simulation, real-time control and advice-serving systems. The generic properties of the electronic knowledge encapsulator are described, and attention is drawn to the manner in which knowledge 'goes into hiding' through encapsulation. This property is traced in the simple situations of pure mathesis and in the more complex situations of taxinomia using one example each from hydraulics and hydrology. The consequences for systems architectures are explained, pointing to the need for multi-agent architectures for ecological modelling and for more general hydroinformatics systems also. The relevance of these developments is indicated by reference to ongoing projects in which they are currently being realised. In conclusion, some more general epistemological aspects are considered within the same context. As this contribution is so much concerned with the processes of signification and communication, it has been partly shaped by the theory of semiotics, as popularised by Eco ( A Theory of Semiotics, Indiana University, Bloomington, 1977).

  18. A failure management prototype: DR/Rx

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Baker, Carolyn G.; Kelly, Christine M.; Marsh, Christopher A.

    1991-01-01

    This failure management prototype performs failure diagnosis and recovery management of hierarchical, distributed systems. The prototype, which evolved from a series of previous prototypes following a spiral model for development, focuses on two functions: (1) the diagnostic reasoner (DR) performs integrated failure diagnosis in distributed systems; and (2) the recovery expert (Rx) develops plans to recover from the failure. Issues related to expert system prototype design and the previous history of this prototype are discussed. The architecture of the current prototype is described in terms of the knowledge representation and functionality of its components.

  19. Information revolution in nursing and health care: educating for tomorrow's challenge.

    PubMed

    Kooker, B M; Richardson, S S

    1994-06-01

    Current emphasis on the national electronic highway and a national health database for comparative health care reporting demonstrates society's increasing reliance on information technology. The efficient electronic processing and managing of data, information, and knowledge are critical for survival in tomorrow's health care organization. To take a leadership role in this information revolution, informatics nurse specialists must possess competencies that incorporate information science, computer science, and nursing science for successful information system development. In selecting an appropriate informatics educational program or to hire an individual capable of meeting this challenge, nurse administrators must look for the following technical knowledge and skill set: information management principles, system development life cycle, programming languages, file design and access, hardware and network architecture, project management skills, and leadership abilities.

  20. Level-2 Milestone 5588: Deliver Strategic Plan and Initial Scalability Assessment by Advanced Architecture and Portability Specialists Team

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draeger, Erik W.

    This report documents the fact that the work in creating a strategic plan and beginning customer engagements has been completed. The description of the milestone is: The newly formed advanced architecture and portability specialists (AAPS) team will develop a strategic plan to meet the goals of 1) sharing knowledge and experience with code teams to ensure that ASC codes run well on new architectures, and 2) supplying skilled computational scientists to put the strategy into practice. The plan will be delivered to ASC management in the first quarter. By the fourth quarter, the team will identify their first customers within PEM and IC, perform an initial assessment of scalability and performance bottlenecks for next-generation architectures, and embed AAPS team members with customer code teams to assist with initial portability development within standalone kernels or proxy applications.

  1. Flexible augmented reality architecture applied to environmental management

    NASA Astrophysics Data System (ADS)

    Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo

    2003-05-01

    Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent modules. It has a flexible design, which allows the transfer of some modules to and from the client side, according to the available processing capacities of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D model of the location and the multimedia geo-referenced database.

  2. A viewpoint-based case-based reasoning approach utilising an enterprise architecture ontology for experience management

    NASA Astrophysics Data System (ADS)

    Martin, Andreas; Emmenegger, Sandro; Hinkelmann, Knut; Thönssen, Barbara

    2017-04-01

    The accessibility of project knowledge obtained from experiences is an important and crucial issue in enterprises. This information need about project knowledge can be different from one person to another depending on the different roles he or she has. Therefore, a new ontology-based case-based reasoning (OBCBR) approach that utilises an enterprise ontology is introduced in this article to improve the accessibility of this project knowledge. Utilising an enterprise ontology improves the case-based reasoning (CBR) system through the systematic inclusion of enterprise-specific knowledge. This enterprise-specific knowledge is captured using the overall structure given by the enterprise ontology named ArchiMEO, which is a partial ontological realisation of the enterprise architecture framework (EAF) ArchiMate. This ontological representation, containing historical cases and specific enterprise domain knowledge, is applied in a new OBCBR approach. To support the different information needs of different stakeholders, this OBCBR approach has been built in such a way that different views, viewpoints, concerns and stakeholders can be considered. This is realised using a case viewpoint model derived from the ISO/IEC/IEEE 42010 standard. The introduced approach was implemented as a demonstrator and evaluated using an application case that has been elicited from a business partner in the Swiss research project.
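    A minimal sketch of the retrieval idea described above, filtering past project cases by a stakeholder viewpoint before ranking them by similarity to a query, might look as follows. The class, field and viewpoint names are illustrative assumptions, not the ArchiMEO ontology or the demonstrator's API.

    ```python
    # Viewpoint-filtered case retrieval sketch (hypothetical names throughout).
    from dataclasses import dataclass, field

    @dataclass
    class ProjectCase:
        name: str
        features: dict                                   # e.g. {"domain": "insurance"}
        viewpoints: set = field(default_factory=set)     # stakeholder concerns addressed

    def similarity(query: dict, case: ProjectCase) -> float:
        """Fraction of query features matched by the stored case."""
        matches = sum(1 for k, v in query.items() if case.features.get(k) == v)
        return matches / max(len(query), 1)

    def retrieve(cases, query: dict, viewpoint: str, k: int = 3):
        """Return the k most similar past cases relevant to a stakeholder viewpoint."""
        relevant = [c for c in cases if viewpoint in c.viewpoints]
        return sorted(relevant, key=lambda c: similarity(query, c), reverse=True)[:k]

    cases = [
        ProjectCase("ERP rollout", {"domain": "insurance", "budget": "large"}, {"project-manager"}),
        ProjectCase("Portal build", {"domain": "retail", "budget": "small"}, {"architect", "project-manager"}),
    ]
    print(retrieve(cases, {"domain": "insurance"}, "project-manager", k=1))
    ```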

  3. Knowledge Production in an Architectural Practice and a University Architectural Department

    ERIC Educational Resources Information Center

    Winberg, Chris

    2006-01-01

    Processes of knowledge production by professional architects and architects-in-training were studied and compared. Both professionals and students were involved in the production of knowledge about the architectural heritage of historical buildings in Cape Town. In a study of the artefacts produced, observations of the processes by means of which…

  4. A Knowledge Conversion Model Based on the Cognitive Load Theory for Architectural Design Education

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Liao, Shin; Wen, Ming-Hui; Weng, Kuo-Hua

    2017-01-01

    The education of architectural design requires balanced curricular arrangements of respectively theoretical knowledge and practical skills to really help students build their knowledge structures, particularly helping them in solving the problems of cognitive load. The purpose of this study is to establish an architectural design knowledge…

  5. My Three Wishes for Digital Repositories. Building Digital Libraries

    ERIC Educational Resources Information Center

    Huwe, Terence K.

    2005-01-01

    In this column on digital repository management, the author defines three areas within the sphere of digital repositories that need work. The first two pertain to information architecture, while the last one pertains to taking action. The author's first "wish" is for top-notch library Web sites that act as a gateway to any sphere of knowledge. He…

  6. Sustainable clinical knowledge management: an archetype development life cycle.

    PubMed

    Madsen, Maria; Leslie, Heather; Hovenga, Evelyn J S; Heard, Sam

    2010-01-01

    This chapter gives an educational overview of: 1. The significance of having a formal ontology of health care data 2. How openEHR has used an ontological approach to designing an electronic health record 3. The phases of archetype development and key steps in the process 4. The openEHR architecture and integrated development environment.

  7. Contextual and temporal clinical guidelines.

    PubMed Central

    Guarnero, A.; Marzuoli, M.; Molino, G.; Terenziani, P.; Torchio, M.; Vanni, K.

    1998-01-01

    In this paper, we propose an approach for managing clinical guidelines. We sketch a modular architecture, allowing us to separate conceptually distinct aspects in the management and use of clinical guidelines. In particular, we describe the clinical guidelines knowledge representation module and we sketch the acquisition module. The main focus of the paper is the definition of an expressive formalism for representing clinical guidelines, which allows one to deal with the context dependent character of clinical guidelines and takes into account different temporal aspects. PMID:9929306
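    To make the context-dependent and temporal character of such guideline representations concrete, here is a small sketch of a guideline step that carries both an applicability condition and a temporal constraint. The field names and the example rule are hypothetical, not the formalism defined in the paper.

    ```python
    # Guideline step with a context condition and a temporal window (illustrative).
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class GuidelineStep:
        action: str
        context: Callable[[dict], bool]   # context-dependent applicability test
        min_delay_h: float                # earliest start after the previous step (hours)
        max_delay_h: float                # latest start after the previous step (hours)

        def applicable(self, patient: dict, hours_since_previous: float) -> bool:
            return (self.context(patient)
                    and self.min_delay_h <= hours_since_previous <= self.max_delay_h)

    step = GuidelineStep(
        action="repeat liver function test",
        context=lambda p: p.get("ward") == "hepatology" and p.get("on_therapy", False),
        min_delay_h=24, max_delay_h=72,
    )
    print(step.applicable({"ward": "hepatology", "on_therapy": True}, 36))  # True
    ```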

  8. The neuron classification problem

    PubMed Central

    Bota, Mihail; Swanson, Larry W.

    2007-01-01

    A systematic account of neuron cell types is a basic prerequisite for determining the vertebrate nervous system global wiring diagram. With comprehensive lineage and phylogenetic information unavailable, a general ontology based on structure-function taxonomy is proposed and implemented in a knowledge management system, and a prototype analysis of select regions (including retina, cerebellum, and hypothalamus) presented. The supporting Brain Architecture Knowledge Management System (BAMS) Neuron ontology is online and its user interface allows queries about terms and their definitions, classification criteria based on the original literature and “Petilla Convention” guidelines, hierarchies, and relations—with annotations documenting each ontology entry. Combined with three BAMS modules for neural regions, connections between regions and neuron types, and molecules, the Neuron ontology provides a general framework for physical descriptions and computational modeling of neural systems. The knowledge management system interacts with other web resources, is accessible in both XML and RDF/OWL, is extendible to the whole body, and awaits large-scale data population requiring community participation for timely implementation. PMID:17582506

  9. Image Understanding Architecture

    DTIC Science & Technology

    1991-09-01

    Record excerpt: the report describes an architecture to support real-time, knowledge-based image understanding and the software support environment that will be needed to utilize it, noting that in addition to sensory and knowledge-based processing it is useful to introduce a level of symbolic processing. Subject terms: Image Understanding Architecture, knowledge-based vision, AI, real-time computer vision, software simulator, parallel processor.

  10. System and method for integrating and accessing multiple data sources within a data warehouse architecture

    DOEpatents

    Musick, Charles R [Castro Valley, CA; Critchlow, Terence [Livermore, CA; Ganesh, Madhaven [San Jose, CA; Slezak, Tom [Livermore, CA; Fidelis, Krzysztof [Brentwood, CA

    2006-12-19

    A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.
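    The behaviour attributed to the generated translation library, accessors that apply a transformation on set and derive a missing value on get, can be sketched as below. The class, attribute names and derivation rule are invented for illustration; they are not the patented system's actual generated code.

    ```python
    # Accessor sketch: transform on set, derive on get when the value is missing.
    class GeneSummary:
        def __init__(self):
            self._data = {}

        def set_length(self, raw_value: str):
            # transformation: normalise the source representation (e.g. "1.2kb" -> 1200)
            self._data["length"] = int(float(raw_value.rstrip("kb")) * 1000)

        def get_length(self) -> int:
            # derive the value from other attributes if it was never set explicitly
            if "length" not in self._data:
                start, end = self._data.get("start", 0), self._data.get("end", 0)
                self._data["length"] = abs(end - start)
            return self._data["length"]

    g = GeneSummary()
    g._data.update(start=100, end=1300)   # stands in for other generated setters
    print(g.get_length())                 # 1200, derived because no explicit length was set
    g.set_length("2.5kb")
    print(g.get_length())                 # 2500, transformed from the raw source value
    ```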

  11. Feasibility of using a knowledge-based system concept for in-flight primary flight display research

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1991-01-01

    A study was conducted to determine the feasibility of using knowledge-based systems architectures for inflight research of primary flight display information management issues. The feasibility relied on the ability to integrate knowledge-based systems with existing onboard aircraft systems. And, given the hardware and software platforms available, the feasibility also depended on the ability to use interpreted LISP software with the real time operation of the primary flight display. In addition to evaluating these feasibility issues, the study determined whether the software engineering advantages of knowledge-based systems found for this application in the earlier workstation study extended to the inflight research environment. To study these issues, two integrated knowledge-based systems were designed to control the primary flight display according to pre-existing specifications of an ongoing primary flight display information management research effort. These two systems were implemented to assess the feasibility and software engineering issues listed. Flight test results were successful in showing the feasibility of using knowledge-based systems inflight with actual aircraft data.

  12. A knowledge-base generating hierarchical fuzzy-neural controller.

    PubMed

    Kandadai, R M; Tien, J M

    1997-01-01

    We present an innovative fuzzy-neural architecture that is able to automatically generate a knowledge base, in an extractable form, for use in hierarchical knowledge-based controllers. The knowledge base is in the form of a linguistic rule base appropriate for a fuzzy inference system. First, we modify Berenji and Khedkar's (1992) GARIC architecture to enable it to automatically generate a knowledge base; a pseudosupervised learning scheme using reinforcement learning and error backpropagation is employed. Next, we further extend this architecture to a hierarchical controller that is able to generate its own knowledge base. Example applications are provided to underscore its viability.
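    The "extractable" knowledge base is a set of linguistic rules usable by a fuzzy inference system. The sketch below shows the kind of structure such an extracted rule base might take and how a rule's firing strength could be computed; the variables, labels and rules are made-up placeholders, not output of the actual controller.

    ```python
    # Illustrative linguistic rule base and rule firing (min of antecedent memberships).
    rule_base = [
        {"if": {"error": "negative_large", "delta_error": "small"}, "then": {"control": "increase_fast"}},
        {"if": {"error": "zero",           "delta_error": "small"}, "then": {"control": "hold"}},
        {"if": {"error": "positive_large", "delta_error": "small"}, "then": {"control": "decrease_fast"}},
    ]

    def fire(rule, fuzzified_inputs):
        """Degree to which a rule fires: min of the memberships of its antecedents."""
        return min(fuzzified_inputs[var][label] for var, label in rule["if"].items())

    fuzzified = {"error": {"negative_large": 0.7, "zero": 0.2, "positive_large": 0.0},
                 "delta_error": {"small": 0.9}}
    for rule in rule_base:
        print(rule["then"]["control"], fire(rule, fuzzified))
    ```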

  13. Multimedia Workstations: Electronic Assistants for Health-Care Professionals.

    PubMed

    Degoulet, P; Jean, F-C; Safran, C

    1996-01-01

    The increasing costs of health care and economic reality have produced an interesting paradox: the health professional must perform more clinical work with fewer support personnel. Moreover, an explosion of the knowledge base that underlies sound clinical care makes not only effective time management critical, but also knowledge management compelling. A multimedia workstation is an electronic assistant for the busy health professional that can help with administrative tasks and give access to clinical information and knowledge networks. The multimedia nature of processed information reflects an evolution of medical technologies that involve more and more complex objects such as video sequences or digitized signals. Analysis of the 445 Medline-indexed publications for the January 1991 to December 1994 period that included the word "workstation" either in their title or in their abstract helps in refining objectives and challenges both for health professionals and decision makers. From an engineering perspective, development of a workstation requires the integration into the same environment of tools to localize, access, manipulate and communicate the required information. The long-term goal is to establish easy access in a collaborative working environment that gives the end-user the feeling of a single virtual health enterprise, driven by an integrated computer system even when the information system relies on a set of heterogeneous and geographically distributed components. Consequences in terms of migration from traditional client/server architectures to more client/network architectures are considered.

  14. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
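    A small, generic sketch of module-based composition with message passing, in the spirit of the multi-level modularity and module interaction described above, is given below. It is not the SOCRAT API; the bus, topics and modules are assumptions for illustration only.

    ```python
    # Generic module interaction via a publish/subscribe bus (illustrative only).
    class MessageBus:
        def __init__(self):
            self._subscribers = {}

        def subscribe(self, topic, handler):
            self._subscribers.setdefault(topic, []).append(handler)

        def publish(self, topic, payload):
            for handler in self._subscribers.get(topic, []):
                handler(payload)

    bus = MessageBus()
    # a data-management module publishes cleaned data; a visualization module reacts to it
    bus.subscribe("data.ready", lambda rows: print(f"plotting {len(rows)} rows"))
    bus.publish("data.ready", [{"x": 1, "y": 2}, {"x": 2, "y": 3}])
    ```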

  15. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  16. Virtual management of radiology examinations in the virtual radiology environment using common object request broker architecture services.

    PubMed

    Martinez, R; Rozenblit, J; Cook, J F; Chacko, A K; Timboe, H L

    1999-05-01

    In the Department of Defense (DoD), US Army Medical Command is now embarking on an extremely exciting new project--creating a virtual radiology environment (VRE) for the management of radiology examinations. The business of radiology in the military is therefore being reengineered on several fronts by the VRE Project. In the VRE Project, a set of intelligent agent algorithms determines where examinations are to be routed for reading based on a knowledge base of the entire VRE. The set of algorithms, called the Meta-Manager, is hierarchical and uses object-based communications between medical treatment facilities (MTFs) and medical centers that have digital imaging network picture archiving and communications systems (DIN-PACS) networks. The communication is based on the use of common object request broker architecture (CORBA) objects and services to send patient demographics and examination images from DIN-PACS networks in the MTFs to the DIN-PACS networks at the medical centers for diagnosis. The Meta-Manager is also responsible for updating the diagnosis at the originating MTF. CORBA services are used to perform secure message communications between DIN-PACS nodes in the VRE network. The Meta-Manager has a fail-safe architecture that allows the master Meta-Manager function to float to regional Meta-Manager sites in case of server failure. A prototype of the CORBA-based Meta-Manager is being developed by the University of Arizona's Computer Engineering Research Laboratory using the unified modeling language (UML) as a design tool. The prototype will implement the main functions described in the Meta-Manager design specification. The results of this project are expected to reengineer the process of radiology in the military and have extensions to commercial radiology environments.
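    The core routing idea, choosing a reading site for an examination from a knowledge base of site capabilities and current state, can be sketched as below. The site names, fields and the scoring rule are invented for illustration; the real Meta-Manager is hierarchical and communicates over CORBA services rather than in-process data structures.

    ```python
    # Hedged sketch of examination routing from a simple site knowledge base.
    sites = [
        {"name": "Medical Center A", "modalities": {"CT", "MR"}, "queue": 12, "online": True},
        {"name": "Medical Center B", "modalities": {"CT", "CR"}, "queue": 4,  "online": True},
        {"name": "Medical Center C", "modalities": {"MR"},       "queue": 1,  "online": False},
    ]

    def route_exam(exam_modality: str, sites: list) -> str:
        """Pick an online site that can read the modality; simplest policy: shortest queue."""
        candidates = [s for s in sites if s["online"] and exam_modality in s["modalities"]]
        if not candidates:
            raise RuntimeError("no reading site available for " + exam_modality)
        return min(candidates, key=lambda s: s["queue"])["name"]

    print(route_exam("CT", sites))  # Medical Center B
    ```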

  17. Conceptual Modeling in the Time of the Revolution: Part II

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.

  18. Federated health information architecture: Enabling healthcare providers and policymakers to use data for decision-making.

    PubMed

    Kumar, Manish; Mostafa, Javed; Ramaswamy, Rohit

    2018-05-01

    Health information systems (HIS) in India, as in most other developing countries, support public health management but fail to enable healthcare providers to use data for delivering quality services. Such a failure is surprising, given that the population healthcare data that the system collects are aggregated from patient records. An important reason for this failure is that the health information architecture (HIA) of the HIS is designed primarily to serve the information needs of policymakers and program managers. India has recognised the architectural gaps in its HIS and proposes to develop an integrated HIA. An enabling HIA that attempts to balance the autonomy of local systems with the requirements of a centralised monitoring agency could meet the diverse information needs of various stakeholders. Given the lack of in-country knowledge and experience in designing such an HIA, this case study was undertaken to analyse HIS in the Bihar state of India and to understand whether it would enable healthcare providers, program managers and policymakers to use data for decision-making. Based on a literature review and data collected from interviews with key informants, this article proposes a federated HIA, which has the potential to improve HIS efficiency; provide flexibility for local innovation; cater to the diverse information needs of healthcare providers, program managers and policymakers; and encourage data-based decision-making.

  19. Foundational model of structural connectivity in the nervous system with a schema for wiring diagrams, connectome, and basic plan architecture

    PubMed Central

    Swanson, Larry W.; Bota, Mihail

    2010-01-01

    The nervous system is a biological computer integrating the body's reflex and voluntary environmental interactions (behavior) with a relatively constant internal state (homeostasis)—promoting survival of the individual and species. The wiring diagram of the nervous system's structural connectivity provides an obligatory foundational model for understanding functional localization at molecular, cellular, systems, and behavioral organization levels. This paper provides a high-level, downwardly extendible, conceptual framework—like a compass and map—for describing and exploring in neuroinformatics systems (such as our Brain Architecture Knowledge Management System) the structural architecture of the nervous system's basic wiring diagram. For this, the Foundational Model of Connectivity's universe of discourse is the structural architecture of nervous system connectivity in all animals at all resolutions, and the model includes two key elements—a set of basic principles and an internally consistent set of concepts (defined vocabulary of standard terms)—arranged in an explicitly defined schema (set of relationships between concepts) allowing automatic inferences. In addition, rules and procedures for creating and modifying the foundational model are considered. Controlled vocabularies with broad community support typically are managed by standing committees of experts that create and refine boundary conditions, and a set of rules that are available on the Web. PMID:21078980

  20. Integrating knowledge and control into hypermedia-based training environments: Experiments with HyperCLIPS

    NASA Technical Reports Server (NTRS)

    Hill, Randall W., Jr.

    1990-01-01

    The issues of knowledge representation and control in hypermedia-based training environments are discussed. The main objective is to integrate the flexible presentation capability of hypermedia with a knowledge-based approach to lesson discourse management. The instructional goals and their associated concepts are represented in a knowledge representation structure called a 'concept network'. Its functional usages are many: it is used to control the navigation through a presentation space, generate tests for student evaluation, and model the student. This architecture was implemented in HyperCLIPS, a hybrid system that creates a bridge between HyperCard, a popular hypertext-like system used for building user interfaces to data bases and other applications, and CLIPS, a highly portable government-owned expert system shell.
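    A minimal sketch of a 'concept network' used to order presentation and to drive test generation appears below. The concepts, prerequisite links and the trivial test generator are illustrative assumptions, not HyperCLIPS or CLIPS internals.

    ```python
    # Concept network sketch: prerequisite-driven navigation plus naive test generation.
    concepts = {
        "variables": {"requires": [],                  "card": "variables.intro"},
        "loops":     {"requires": ["variables"],       "card": "loops.intro"},
        "arrays":    {"requires": ["variables"],       "card": "arrays.intro"},
        "sorting":   {"requires": ["loops", "arrays"], "card": "sorting.intro"},
    }

    def next_concepts(mastered: set) -> list:
        """Concepts whose prerequisites are all mastered - candidates for presentation."""
        return [c for c, info in concepts.items()
                if c not in mastered and all(r in mastered for r in info["requires"])]

    def generate_test(concept: str) -> str:
        # trivial stand-in for generating an evaluation from the network
        return f"Answer three questions about: {concept}"

    student_model = {"variables"}                      # simple student model: mastered concepts
    for c in next_concepts(student_model):
        print(concepts[c]["card"], "->", generate_test(c))
    ```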

  1. Material/historic Reality: Catching the Transformation. From a Case of Applied Research to the Trans-Disciplinary Approach to Preserve Architecture

    NASA Astrophysics Data System (ADS)

    Aveta, A.; Marino, B. G.; Amore, R.

    2017-05-01

    The present paper deals with some issues in the knowledge of architectural heritage. Given the increasing use of innovative technologies in the field of cultural heritage, it is important to focus on their usefulness and potential for conservation project management. The role of new survey techniques and of accurate representations of the dimensional, structural and material consistency of historic buildings and their context is essential and can influence restoration choices. Starting from a recent applied research project concerning a significant and symbolic monument of Naples, Castel Nuovo, the paper highlights not only the importance of integrating specialist surveys, but also the role of critical interpretation. The results of the different disciplines involved in the knowledge process have to be evaluated critically in view of the conservation of tangible and intangible values. Furthermore, catching the complexity of the architecture of the past depends on the capacity to maintain close and constant contact with the building's physicality, and also on a comprehensive methodology that includes new interpretative instruments capable of reinforcing a virtuous hermeneutic circle.

  2. Supporting diagnosis and treatment in medical care based on Big Data processing.

    PubMed

    Lupşe, Oana-Sorina; Crişan-Vida, Mihaela; Stoicu-Tivadar, Lăcrămioara; Bernard, Elena

    2014-01-01

    With information and data in all domains growing every day, it is difficult to manage and extract useful knowledge for specific situations. This paper presents an integrated system architecture to support the activity of Ob-Gyn departments, with further developments in using new technology to manage Big Data processing - using Google BigQuery - in the medical domain. The data collected and processed with Google BigQuery come from different sources: two Obstetrics & Gynaecology Departments, the TreatSuggest application - an application for suggesting treatments - and a home foetal surveillance system. Data are uploaded to Google BigQuery from Bega Hospital Timişoara, Romania. The analysed data are useful for medical staff, researchers and statisticians in the public health domain. The current work describes the technological architecture and its processing possibilities, which in the future will be validated against quality criteria to support a better decision process in diagnosis and public health.
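    As a concrete illustration of the processing step, a query over such obstetrics records could be issued with the Google BigQuery Python client as sketched below. The dataset, table and column names are placeholders, not the project's actual schema, and credentials are assumed to be configured in the environment.

    ```python
    # Hedged sketch of an aggregate query with the google-cloud-bigquery client.
    from google.cloud import bigquery

    client = bigquery.Client()
    query = """
        SELECT diagnosis, COUNT(*) AS n_cases, AVG(gestational_age_weeks) AS avg_ga
        FROM `hospital_dataset.obgyn_admissions`
        WHERE admission_date >= '2013-01-01'
        GROUP BY diagnosis
        ORDER BY n_cases DESC
    """
    for row in client.query(query).result():     # runs the job and iterates result rows
        print(row.diagnosis, row.n_cases, round(row.avg_ga, 1))
    ```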

  3. A "Knowledge Trading Game" for Collaborative Design Learning in an Architectural Design Studio

    ERIC Educational Resources Information Center

    Wang, Wan-Ling; Shih, Shen-Guan; Chien, Sheng-Fen

    2010-01-01

    Knowledge-sharing and resource exchange are the key to the success of collaborative design learning. In an architectural design studio, design knowledge entails learning efforts that need to accumulate and recombine dispersed and complementary pieces of knowledge. In this research, firstly, "Knowledge Trading Game" is proposed to be a way for…

  4. Marshall Space Flight Center Propulsion Systems Department (PSD) KM Initiative

    NASA Technical Reports Server (NTRS)

    Caraccioli, Paul; Varnadoe, Tom; McCarter, Mike

    2006-01-01

    NASA Marshall Space Flight Center's Propulsion Systems Department (PSD) is four months into a fifteen-month Knowledge Management (KM) initiative to support enhanced engineering decision making and analyses, faster resolution of anomalies (near-term) and effective, efficient knowledge-infused engineering processes, reduced knowledge attrition, and reduced anomaly occurrences (long-term). The near-term objective of this initiative is developing a KM Pilot project, within the context of a 3-5 year KM strategy, to introduce and evaluate the use of KM within PSD. An internal NASA/MSFC PSD KM team was established early in project formulation to maintain a practitioner, user-centric focus throughout the conceptual development, planning and deployment of KM technologies and capabilities within the PSD. The PSD internal team is supported by the University of Alabama's Aging Infrastructure Systems Center Of Excellence (AISCE), Intergraph Corporation, and The Knowledge Institute. The principal product of the initial four-month effort has been strategic planning of PSD KM implementation by first determining the "as is" state of KM capabilities and developing, planning and documenting the roadmap to achieve the desired "to be" state. Activities undertaken to support the planning phase have included data gathering: cultural surveys, group work-sessions, interviews, documentation review, and independent research. Assessments and analyses have been performed, including industry benchmarking, related local and Agency initiatives, specific tools and techniques used, and strategies for leveraging existing resources, people and technology to achieve common KM goals. Key findings captured in the PSD KM Strategic Plan include the system vision, purpose, stakeholders, prioritized strategic objectives mapped to the top ten practitioner needs, and analysis of current resource usage. Opportunities identified from research, analyses, cultural/KM surveys and practitioner interviews include: executive and senior management sponsorship, KM awareness, promotion and training, cultural change management, process improvement, and leveraging existing resources and new innovative technologies to align with other NASA KM initiatives (convergence: the big picture). To enable results-based incremental implementation and future growth of the KM initiative, key performance measures have been identified including stakeholder value, system utility, learning and growth (knowledge capture, sharing, reduced anomaly recurrence), cultural change, process improvement and return-on-investment. The next steps for the initial implementation spiral (focused on SSME Turbomachinery) have been identified, largely based on the organization and compilation of summary level engineering process models, data capture matrices, functional models and conceptual-level systems architecture. Key elements include detailed KM requirements definition, KM technology architecture assessment, evaluation and selection, deployable KM Pilot design, development, implementation and evaluation, and justifying full implementation (estimated return-on-investment). Features identified for the notional system architecture include the knowledge presentation layer (and its components), knowledge network layer (and its components), knowledge storage layer (and its components), user interface and capabilities. This paper provides a snapshot of the progress to date, the near-term planning for deploying the KM pilot project, and a forward look at results-based growth of KM capabilities within the MSFC PSD.

  5. 2007 National Security Space Policy and Architecture Symposium

    DTIC Science & Technology

    2007-02-02

    Record excerpt (briefing bullets): Tactical Satellite (TacSat)-2 experiment - successful launch, 16 Dec 06, on an Orbital Minotaur; ground terminal at China Lake providing field tasking/data capability; reduce AIRSS risk; develop, build, and flight qualify a wide-field-of-view, full-Earth staring sensor; FX-AIRSS flight experiment to investigate data; demonstration through critical field experiments; government-industry technology and knowledge transfer (NLT this step); managing risks: program and technical.

  6. United States Navy Health Care Providers' Attitudes and Satisfaction toward the Usability of the Navy's Primary Learning Portal and Learning Management System

    ERIC Educational Resources Information Center

    Catanese, Anthony Peter

    2013-01-01

    The purpose of this study was to investigate if the architectural design factors affected usability of Navy Knowledge Online (NKO) technology along with the user dissatisfaction associated through restricted achievements of online education and training. In this study, attitudes, satisfaction, obstacles, and providers' demographics were also…

  7. Design and Architecture of Collaborative Online Communities: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Aviv, Reuven; Erlich, Zippy; Ravid, Gilad

    2004-01-01

    This paper considers four aspects of online communities: design, mechanisms, architecture, and the constructed knowledge. We hypothesize that different designs of communities drive different mechanisms, which give rise to different architectures, which in turn result in different levels of collaborative knowledge construction. To test this chain…

  8. Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms.

    ERIC Educational Resources Information Center

    Henderson, Rebecca M.; Clark, Kim B.

    1990-01-01

    Using an empirical study of the semiconductor photolithographic alignment equipment industry, this paper shows that architectural innovations destroy the usefulness of established firms' architectural knowledge. Because this knowledge is embedded in the firms' structure and information-processing procedures, the destruction is hard to detect.…

  9. An Object-Oriented Architecture for Intelligent Tutoring Systems. Technical Report No. LSP-3.

    ERIC Educational Resources Information Center

    Bonar, Jeffrey; And Others

    This technical report describes a generic architecture for building intelligent tutoring systems which is developed around objects that represent the knowledge elements to be taught by the tutor. Each of these knowledge elements, called "bites," inherits both a knowledge organization describing the kind of knowledge represented and…

  10. 41 CFR 102-77.15 - Who funds the Art-in-Architecture efforts?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-Architecture efforts? 102-77.15 Section 102-77.15 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.15 Who funds the Art-in-Architecture efforts? To the extent not...

  11. 41 CFR 102-77.15 - Who funds the Art-in-Architecture efforts?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...-Architecture efforts? 102-77.15 Section 102-77.15 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.15 Who funds the Art-in-Architecture efforts? To the extent not...

  12. 41 CFR 102-77.15 - Who funds the Art-in-Architecture efforts?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...-Architecture efforts? 102-77.15 Section 102-77.15 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.15 Who funds the Art-in-Architecture efforts? To the extent not...

  13. 41 CFR 102-77.15 - Who funds the Art-in-Architecture efforts?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...-Architecture efforts? 102-77.15 Section 102-77.15 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.15 Who funds the Art-in-Architecture efforts? To the extent not...

  14. An architecture for rule based system explanation

    NASA Technical Reports Server (NTRS)

    Fennel, T. R.; Johannes, James D.

    1990-01-01

    A system architecture is presented which incorporates both graphics and text into explanations provided by rule based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and allowing for more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.

  15. Continual planning and scheduling for managing patient tests in hospital laboratories.

    PubMed

    Marinagi, C C; Spyropoulos, C D; Papatheodorou, C; Kokkotos, S

    2000-10-01

    Hospital laboratories perform examination tests upon patients, in order to assist medical diagnosis or therapy progress. Planning and scheduling patient requests for examination tests is a complicated problem because it concerns both minimization of patient stay in hospital and maximization of laboratory resources utilization. In the present paper, we propose an integrated patient-wise planning and scheduling system which supports the dynamic and continual nature of the problem. The proposed combination of multiagent and blackboard architecture allows the dynamic creation of agents that share a set of knowledge sources and a knowledge base to service patient test requests.
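    The combination of dynamically created agents and a shared blackboard can be sketched as follows: one agent per patient request claims a laboratory slot from a common board. All names and the greedy "first feasible slot" rule are simplifications assumed for illustration, not the system's actual planner.

    ```python
    # Blackboard sketch: per-request agents share laboratory slots and a schedule.
    class Blackboard:
        def __init__(self, slots):
            self.free_slots = sorted(slots)          # shared laboratory time slots (hours)
            self.schedule = {}                       # patient -> assigned slot

    class PatientAgent:
        def __init__(self, patient_id, earliest_hour):
            self.patient_id, self.earliest_hour = patient_id, earliest_hour

        def act(self, board: Blackboard):
            for slot in board.free_slots:
                if slot >= self.earliest_hour:       # first feasible slot, to shorten stay
                    board.free_slots.remove(slot)
                    board.schedule[self.patient_id] = slot
                    return
            board.schedule[self.patient_id] = None   # must be re-planned later

    board = Blackboard(slots=[8, 9, 10, 11])
    for agent in [PatientAgent("P1", 9), PatientAgent("P2", 8), PatientAgent("P3", 11)]:
        agent.act(board)
    print(board.schedule)   # {'P1': 9, 'P2': 8, 'P3': 11}
    ```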

  16. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis.

    PubMed

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.

  17. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis

    PubMed Central

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented. PMID:18971242

  18. A Novel Architecture for E-Learning Knowledge Assessment Systems

    ERIC Educational Resources Information Center

    Gierlowski, Krzysztof; Nowicki, Krzysztof

    2009-01-01

    In this article we propose a novel e-learning system, dedicated strictly to knowledge assessment tasks. In its functioning it utilizes web-based technologies, but its design differs radically from currently popular e-learning solutions which rely mostly on thin-client architecture. Our research proved that such architecture, while well suited for…

  19. A functional-structural kiwifruit vine model integrating architecture, carbon dynamics and effects of the environment.

    PubMed

    Cieslak, Mikolaj; Seleznyova, Alla N; Hanan, Jim

    2011-04-01

    Functional-structural modelling can be used to increase our understanding of how different aspects of plant structure and function interact, identify knowledge gaps and guide priorities for future experimentation. By integrating existing knowledge of the different aspects of the kiwifruit (Actinidia deliciosa) vine's architecture and physiology, our aim is to develop conceptual and mathematical hypotheses on several of the vine's features: (a) plasticity of the vine's architecture; (b) effects of organ position within the canopy on its size; (c) effects of environment and horticultural management on shoot growth, light distribution and organ size; and (d) role of carbon reserves in early shoot growth. Using the L-system modelling platform, a functional-structural plant model of a kiwifruit vine was created that integrates architectural development, mechanistic modelling of carbon transport and allocation, and environmental and management effects on vine and fruit growth. The branching pattern was captured at the individual shoot level by modelling axillary shoot development using a discrete-time Markov chain. An existing carbon transport resistance model was extended to account for several source/sink components of individual plant elements. A quasi-Monte Carlo path-tracing algorithm was used to estimate the absorbed irradiance of each leaf. Several simulations were performed to illustrate the model's potential to reproduce the major features of the vine's behaviour. The model simulated vine growth responses that were qualitatively similar to those observed in experiments, including the plastic response of shoot growth to local carbon supply, the branching patterns of two Actinidia species, the effect of carbon limitation and topological distance on fruit size and the complex behaviour of sink competition for carbon. The model is able to reproduce differences in vine and fruit growth arising from various experimental treatments. This implies it will be a valuable tool for refining our understanding of kiwifruit growth and for identifying strategies to improve production.
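    The discrete-time Markov chain idea used for axillary shoot development, where each node's fate depends on the previous node's fate, can be sketched as below. The states and transition probabilities are invented for illustration; they are not fitted kiwifruit parameters.

    ```python
    # Discrete-time Markov chain sketch for node-by-node axillary shoot fates.
    import random

    states = ["blind", "vegetative", "floral"]
    transition = {                       # P(next node's fate | current node's fate) - illustrative
        "blind":      {"blind": 0.6, "vegetative": 0.3, "floral": 0.1},
        "vegetative": {"blind": 0.2, "vegetative": 0.5, "floral": 0.3},
        "floral":     {"blind": 0.3, "vegetative": 0.4, "floral": 0.3},
    }

    def simulate_shoot(n_nodes: int, start: str = "vegetative") -> list:
        """Sample the fate of each successive node along a parent shoot."""
        fates, current = [], start
        for _ in range(n_nodes):
            probs = transition[current]
            current = random.choices(states, weights=[probs[s] for s in states])[0]
            fates.append(current)
        return fates

    random.seed(1)
    print(simulate_shoot(10))
    ```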

  20. Software Architecture Evaluation in Global Software Development Projects

    NASA Astrophysics Data System (ADS)

    Salger, Frank

    Due to ever increasing system complexity, comprehensive methods for software architecture evaluation become more and more important. This is further stressed in global software development (GSD), where the software architecture acts as a central knowledge and coordination mechanism. However, existing methods for architecture evaluation do not take characteristics of GSD into account. In this paper we discuss what aspects are specific for architecture evaluations in GSD. Our experiences from GSD projects at Capgemini sd&m indicate, that architecture evaluations differ in how rigorously one has to assess modularization, architecturally relevant processes, knowledge transfer and process alignment. From our project experiences, we derive nine good practices, the compliance to which should be checked in architecture evaluations in GSD. As an example, we discuss how far the standard architecture evaluation method used at Capgemini sd&m already considers the GSD-specific good practices, and outline what extensions are necessary to achieve a comprehensive architecture evaluation framework for GSD.

  1. A generic architecture for an adaptive, interoperable and intelligent type 2 diabetes mellitus care system.

    PubMed

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Schulz, Stefan

    2015-01-01

    Chronic diseases such as Type 2 Diabetes Mellitus (T2DM) constitute a big burden to the global health economy. T2DM Care Management requires a multi-disciplinary and multi-organizational approach. Because of different languages and terminologies, education, experiences, skills, etc., such an approach establishes a special interoperability challenge. The solution is a flexible, scalable, business-controlled, adaptive, knowledge-based, intelligent system following a systems-oriented, architecture-centric, ontology-based and policy-driven approach. The architecture of real systems is described, using the basics and principles of the Generic Component Model (GCM). For representing the functional aspects of a system the Business Process Modeling Notation (BPMN) is used. The system architecture obtained is presented using a GCM graphical notation, class diagrams and BPMN diagrams. The architecture-centric approach considers the compositional nature of the real world system and its functionalities, guarantees coherence, and provides right inferences. The level of generality provided in this paper facilitates use case specific adaptations of the system. By that way, intelligent, adaptive and interoperable T2DM care systems can be derived from the presented model as presented in another publication.

  2. Study on establishment of Body of Knowledge of Taiwan's Traditional Wooden Structure Technology

    NASA Astrophysics Data System (ADS)

    Huang, M. T.; Chiou, S. C.; Hsu, T. W.; Su, P. C.

    2015-08-01

    The timber technology of Taiwan's traditional architecture was brought by early immigrants from Southern Fujian, China, and has been handed down for over a hundred years. In the past these traditional timber technologies were taught through mentoring; however, as society changed, the construction of traditional architecture faded away and was gradually replaced by repair work on existing traditional buildings. As a result, construction methods, forms of tool use and other factors now differ greatly from earlier practice, and the core technology faces the risk of being lost. There are many relevant studies on architectural style, construction methods, schools of craftsmen and craftsmen's technical capacity, and preservation efforts based on dictated historical records, skill studies and other approaches; yet for the timber craftsmen repairing traditional architecture on the front line, there is still room to discuss whether the original construction methods are maintained and whether the required repair quality of the core technology is achieved. This paper classified timber technology knowledge using document analysis and expert interviews, analysed the knowledge hierarchy, and built a preliminary framework for the timber technology knowledge system of Taiwan's traditional architecture. Based on this knowledge system, standards for craftsman training and skills identification can be formulated, so that changes in the way knowledge is transmitted do not degrade craftsmen's technical capacity and, in turn, the repair quality of traditional architecture. In addition, a database system can be derived from the knowledge structure to keep the content of the core technical capacity consistent; it can serve as interpretation data, and the standardized knowledge and the authority file established from it can be regarded as a technical specification, so that the technology is standardized and loss or distortion is avoided.

  3. An information extraction framework for cohort identification using electronic health records.

    PubMed

    Liu, Hongfang; Bielinski, Suzette J; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B; Jonnalagadda, Siddhartha R; Ravikumar, K E; Wu, Stephen T; Kullo, Iftikhar J; Chute, Christopher G

    2013-01-01

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report an IE framework for cohort identification using EHRs that is a knowledge-driven framework developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework.

  4. NASA Enterprise Architecture and Its Use in Transition of Research Results to Operations

    NASA Astrophysics Data System (ADS)

    Frisbie, T. E.; Hall, C. M.

    2006-12-01

    Enterprise architecture describes the design of the components of an enterprise, their relationships and how they support the objectives of that enterprise. NASA Stennis Space Center leads several projects involving enterprise architecture tools used to gather information on research assets within NASA's Earth Science Division. In the near future, enterprise architecture tools will link and display the relevant requirements, parameters, observatories, models, decision systems, and benefit/impact information relationships and map to the Federal Enterprise Architecture Reference Models. Components configured within the enterprise architecture serving the NASA Applied Sciences Program include the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool. The Earth Science Components Knowledge Base systematically catalogues NASA missions, sensors, models, data products, model products, and network partners appropriate for consideration in NASA Earth Science applications projects. The Systems Components database is a centralized information warehouse of NASA's Earth Science research assets and a critical first link in the implementation of enterprise architecture. The Earth Science Architecture Tool is used to analyze potential NASA candidate systems that may be beneficial to decision-making capabilities of other Federal agencies. Use of the current configuration of NASA enterprise architecture (the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool) has far exceeded its original intent and has tremendous potential for the transition of research results to operational entities.

  5. Modeling and Improving Information Flows in the Development of Large Business Applications

    NASA Astrophysics Data System (ADS)

    Schneider, Kurt; Lübke, Daniel

    Designing a good architecture for an application is a wicked problem. Therefore, experience and knowledge are considered crucial for informing work in software architecture. However, many organizations do not pay sufficient attention to experience exploitation and architectural learning. Many users of information systems are not aware of the options and the needs to report problems and requirements. They often do not have time to describe a problem encountered in sufficient detail for developers to remove it. And there may be a lengthy process for providing feedback. Hence, the knowledge about problems and potential solutions is not shared effectively. Architectural knowledge needs to include evaluative feedback as well as decisions and their reasons (rationale).

  6. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    NASA Astrophysics Data System (ADS)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. Through the use of standards, this blueprint can not only be facilitated for other plate observing systems (e.g., the European Plate Observing System EPOS), it also supports integrated approaches to include sensor networks that provide complementary processes for dynamic monitoring. Moreover, the integration of such observatories into superordinate research infrastructures (federation of virtual observatories) will be enabled.
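
    As a rough illustration of the event-driven, standards-based sharing described above, the following Python sketch shows a sensor adapter publishing observation records in a common schema to a small in-process broker; the topic, field names, and station identifier are hypothetical stand-ins for the real IPOC interfaces and encodings.

    ```python
    from collections import defaultdict
    from datetime import datetime, timezone

    class EventBus:
        """Tiny in-process publish/subscribe broker standing in for a real message broker."""
        def __init__(self):
            self.subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self.subscribers[topic].append(handler)

        def publish(self, topic, event):
            for handler in self.subscribers[topic]:
                handler(event)

    bus = EventBus()

    # Analysis component: consumes any observation encoded with the shared schema.
    bus.subscribe("observation", lambda ev: print("received", ev["sensor"], ev["value"]))

    # Sensor adapter: encodes a GPS displacement reading in the common form.
    bus.publish("observation", {
        "sensor": "gps-station-PB01",
        "observed_property": "ground_displacement_mm",
        "value": 3.2,
        "time": datetime.now(timezone.utc).isoformat(),
    })
    ```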

  7. An architecture for intelligent task interruption

    NASA Technical Reports Server (NTRS)

    Sharma, D. D.; Narayan, Srini

    1990-01-01

    In the design of real-time systems, the capability for task interruption is often considered essential. The problem of task interruption in knowledge-based domains is examined. It is proposed that task interruption can often be avoided by using appropriate functional architectures and knowledge engineering principles. For situations in which task interruption is indispensable, a preliminary architecture based on priority hierarchies is described.
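
    A minimal sketch of the priority-hierarchy idea follows: a task runs step by step and is set aside as soon as a strictly higher-priority task arrives, then resumes afterwards. The scheduler and task names are illustrative, not the architecture described in the paper.

    ```python
    import heapq
    import itertools

    class PriorityScheduler:
        """Cooperative scheduler sketch: tick() runs one step of the highest-priority
        task; a newly submitted, more urgent task takes over at the next tick,
        which is the 'interruption'. Lower number = higher priority."""

        def __init__(self):
            self.queue = []
            self.counter = itertools.count()  # tie-breaker so heap tuples always compare

        def submit(self, priority, name, steps):
            heapq.heappush(self.queue, (priority, next(self.counter), name, iter(steps)))

        def tick(self):
            if not self.queue:
                return False
            priority, order, name, steps = heapq.heappop(self.queue)
            step = next(steps, None)
            if step is None:                 # task finished; keep going if work remains
                return bool(self.queue)
            print(f"[{name}] {step}")
            heapq.heappush(self.queue, (priority, order, name, steps))  # resume later
            return True

    sched = PriorityScheduler()
    sched.submit(5, "routine trend analysis", ["load data", "fit model", "write report"])
    sched.tick()                                   # routine work starts
    sched.submit(1, "fault diagnosis", ["isolate fault", "recommend recovery"])
    while sched.tick():                            # diagnosis preempts, routine resumes after
        pass
    ```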

  8. The development of a post-test diagnostic system for rocket engines

    NASA Technical Reports Server (NTRS)

    Zakrajsek, June F.

    1991-01-01

    An effort was undertaken by NASA to develop an automated post-test, post-flight diagnostic system for rocket engines. The automated system is designed to be generic and to automate the rocket engine data review process. A modular, distributed architecture with a generic software core was chosen to meet the design requirements. The diagnostic system is initially being applied to the Space Shuttle Main Engine data review process. The system modules currently under development are the session/message manager, and portions of the applications section, the component analysis section, and the intelligent knowledge server. An overview is presented of a rocket engine data review process, the design requirements and guidelines, the architecture and modules, and the projected benefits of the automated diagnostic system.

  9. OntoGene web services for biomedical text mining.

    PubMed

    Rinaldi, Fabio; Clematide, Simon; Marques, Hernani; Ellendorff, Tilia; Romacker, Martin; Rodriguez-Esteban, Raul

    2014-01-01

    Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top-ranked results in several of them.
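
    The sketch below shows what a client call to such a BioC-based web service could look like in Python. The endpoint URL and the exact request fields are hypothetical; the real OntoGene service interface is defined by its own documentation.

    ```python
    import json
    import urllib.request

    # Hypothetical endpoint; the real service URL and parameters would come from
    # the provider's documentation.
    SERVICE_URL = "https://example.org/textmining/annotate"

    # A minimal BioC-style collection with one document and one passage.
    collection = {
        "source": "demo",
        "documents": [{
            "id": "doc-1",
            "passages": [{"offset": 0,
                          "text": "BRCA1 mutations are associated with breast cancer."}],
        }],
    }

    request = urllib.request.Request(
        SERVICE_URL,
        data=json.dumps(collection).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:   # expect annotated BioC JSON back
        annotated = json.load(response)
    print(annotated)
    ```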

  10. OntoGene web services for biomedical text mining

    PubMed Central

    2014-01-01

    Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top ranked results in several of them. PMID:25472638

  11. Lessons about Virtual-Environment Software Systems from 20 years of VE building

    PubMed Central

    Taylor, Russell M.; Jerald, Jason; VanderKnyff, Chris; Wendt, Jeremy; Borland, David; Marshburn, David; Sherman, William R.; Whitton, Mary C.

    2010-01-01

    What are desirable and undesirable features of virtual-environment (VE) software architectures? What should be present (and absent) from such systems if they are to be optimally useful? How should they be structured? To help answer these questions we present experience from application designers, toolkit designers, and VE system architects along with examples of useful features from existing systems. Topics are organized under the major headings of: 3D space management, supporting display hardware, interaction, event management, time management, computation, portability, and the observation that less can be better. Lessons learned are presented as discussion of the issues, field experiences, nuggets of knowledge, and case studies. PMID:20567602

  12. Influential aspects of leader’s Bourdieu capitals on Malaysian landscape architecture subordinates’ creativity

    NASA Astrophysics Data System (ADS)

    Zahari, R.; Ariffin, M. H.; Othman, N.

    2018-02-01

    Free Trade Agreements implemented by the Malaysian government call on local businesses, such as landscape architecture consultancy firms, to expand internationally and to strengthen their performance to compete locally. The performance of a landscape architecture firm, as a design firm, depends entirely on the creativity of the subordinates in the firm. Past research has neglected the influence of a leader’s capitals on subordinates’ creativity, especially in Malaysian landscape architecture firms. The aim of this research is to investigate how subordinates’ perceptions of the leader’s Bourdieu capitals promote subordinates’ creative behaviours in Malaysian landscape architecture firms. The sample chosen for this research comprises subordinates in registered landscape architecture firms. Data were collected using qualitative semi-structured interviews with 13 respondents and analysed using qualitative category coding. Aspects of the leader’s social capital (i.e. knowledge acquisition, problem solving, motivation boosting), human capital (guidance, demotivating leadership, experiential knowledge, knowledge acquisition), and emotional capital (chemistry with the leader, respect, knowledge acquisition, trust, understanding, self-inflicted demotivation) that influence subordinates’ creativity were uncovered from the data. The main finding is that the leader’s capitals promote creativity in subordinate and assistant landscape architects through three main mechanisms: knowledge acquisition, motivation, and the leader’s ability to exert influence through positive relationships. The finding contributes a new way of understanding the leader characteristics that influence subordinates’ creativity.

  13. Development of Information and Knowledge Architectures and an Associated Framework and Methodology for System Management of a Global Reserve Currency

    ERIC Educational Resources Information Center

    Cardullo, Mario W.

    2013-01-01

    The global financial system appears to be heading for a major financial crisis. This crisis is being driven by a growing global debt. This crisis is not limited to nations that are heavily in debt, such as Greece, Spain, Portugal, Ireland, Italy or Cyprus, but extends to others such as the United States. While there has been a great deal of emphasis on…

  14. 41 CFR 102-77.15 - Who funds the Art-in-Architecture efforts?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Who funds the Art-in... Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.15 Who funds the Art-in-Architecture efforts? To the extent not...

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soldevilla, M.; Salmons, S.; Espinosa, B.

    The new application BDDR (Reactor database) has been developed at CEA in order to manage nuclear reactors' technological and operating data. This application is a knowledge management tool which meets several internal needs: (a) to facilitate scenario studies for any set of reactors, e.g. non-proliferation assessments; (b) to make core physics studies easier, whatever the reactor design (PWR - Pressurized Water Reactor, BWR - Boiling Water Reactor, MAGNOX - Magnesium Oxide reactor, CANDU - CANada Deuterium Uranium, FBR - Fast Breeder Reactor, etc.); (c) to preserve the technological data of all reactors (past and present, power generating or experimental, naval propulsion, ...) in a unique repository. Within the application database are enclosed location data and operating history data as well as a tree-like structure containing numerous technological data. These data address all kinds of reactor features and components. A few neutronics data are also included (neutron fluxes). The BDDR application is based on open-source technologies and a thin client/server architecture. The software architecture has been made flexible enough to allow for any change. (authors)

  16. A knowledge-based system for patient image pre-fetching in heterogeneous database environments--modeling, design, and evaluation.

    PubMed

    Wei, C P; Hu, P J; Sheng, O R

    2001-03-01

    When performing primary reading on a newly taken radiological examination, a radiologist often needs to reference relevant prior images of the same patient for confirmation or comparison purposes. Support of such image references is of clinical importance and may have significant effects on radiologists' examination reading efficiency, service quality, and work satisfaction. To effectively support such image reference needs, we proposed and developed a knowledge-based patient image pre-fetching system, addressing several challenging requirements of the application that include representation and learning of image reference heuristics and management of data-intensive knowledge inferencing. Moreover, the system demands an extensible and maintainable architecture design capable of effectively adapting to a dynamic environment characterized by heterogeneous and autonomous data source systems. In this paper, we developed a synthesized object-oriented entity-relationship model, a conceptual model appropriate for representing radiologists' prior image reference heuristics that are heuristic-oriented and data-intensive. We detailed the system architecture and design of the knowledge-based patient image pre-fetching system. Our architecture design is based on a client-mediator-server framework, capable of coping with a dynamic environment characterized by distributed, heterogeneous, and highly autonomous data source systems. To adapt to changes in radiologists' patient prior image reference heuristics, ID3-based multidecision-tree induction and CN2-based multidecision induction learning techniques were developed and evaluated. Experimentally, we examined effects of the pre-fetching system we created on radiologists' examination readings. Preliminary results show that the knowledge-based patient image pre-fetching system more accurately supports radiologists' patient prior image reference needs than the current practice adopted at the study site and that radiologists may become more efficient, consultatively effective, and better satisfied when supported by the pre-fetching system than when relying on the study site's pre-fetching practice.
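
    As a rough stand-in for the ID3/CN2 induction described above, the following Python sketch (assuming scikit-learn is available) learns a pre-fetch rule from a handful of made-up examples; the features and labels are purely illustrative, not the paper's data.

    ```python
    # Illustrative stand-in for decision-tree induction of pre-fetch heuristics:
    # predict whether a prior image should be pre-fetched for a new examination.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Features: [same body part as new exam (0/1), months since prior, same modality (0/1)]
    X = [[1, 2, 1], [1, 30, 1], [0, 3, 1], [1, 6, 0], [0, 40, 0], [1, 1, 1]]
    y = [1, 0, 0, 1, 0, 1]          # 1 = radiologist actually referenced the prior image

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=["same_body_part", "months_old", "same_modality"]))
    print("pre-fetch?", tree.predict([[1, 4, 1]]))   # recent, same-body-part, same-modality prior
    ```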

  17. Architecture and Initial Development of a Knowledge-as-a-Service Activator for Computable Knowledge Objects for Health.

    PubMed

    Flynn, Allen J; Boisvert, Peter; Gittlen, Nate; Gross, Colin; Iott, Brad; Lagoze, Carl; Meng, George; Friedman, Charles P

    2018-01-01

    The Knowledge Grid (KGrid) is a research and development program toward infrastructure capable of greatly decreasing latency between the publication of new biomedical knowledge and its widespread uptake into practice. KGrid comprises digital knowledge objects, an online Library to store them, and an Activator that uses them to provide Knowledge-as-a-Service (KaaS). KGrid's Activator enables computable biomedical knowledge, held in knowledge objects, to be rapidly deployed at Internet-scale in cloud computing environments for improved health. Here we present the Activator, its system architecture and primary functions.
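
    A minimal sketch of the Knowledge-as-a-Service idea follows: a knowledge object bundles metadata with a small computable payload, and an activator exposes it over HTTP. The route, payload, and object structure are illustrative assumptions, not the actual KGrid Activator API.

    ```python
    # Sketch only: a "knowledge object" packages a computable rule plus metadata,
    # and a tiny HTTP activator serves it. POST JSON such as
    # {"weight_kg": 80, "height_m": 1.75} to http://localhost:8080/ to invoke it.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    KNOWLEDGE_OBJECT = {
        "metadata": {"id": "bmi-category", "version": "1.0"},
        # The computable payload: a toy body-mass-index threshold rule.
        "payload": lambda inputs: ("high" if inputs["weight_kg"] / inputs["height_m"] ** 2 >= 30
                                   else "normal"),
    }

    class Activator(BaseHTTPRequestHandler):
        def do_POST(self):
            body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
            result = KNOWLEDGE_OBJECT["payload"](body)
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(json.dumps({"result": result}).encode())

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), Activator).serve_forever()
    ```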

  18. Optimization of knowledge sharing through multi-forum using cloud computing architecture

    NASA Astrophysics Data System (ADS)

    Madapusi Vasudevan, Sriram; Sankaran, Srivatsan; Muthuswamy, Shanmugasundaram; Ram, N. Sankar

    2011-12-01

    Knowledge sharing is done through various knowledge sharing forums, which require multiple logins through multiple browser instances. Here, a single multi-forum knowledge sharing concept is introduced that requires only one login session, allowing the user to connect to multiple forums and display the data in a single browser window. A few optimization techniques are also introduced to speed up access time using a cloud computing architecture.

  19. Software synthesis using generic architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. The approach is illustrated using an implemented example of a generic tracking architecture that was customized in two different domains. How the designs produced using KASE compare to the original designs of the two systems is discussed, along with current work and plans for extending KASE to other application areas.

  20. Mapping Research in Landscape Architecture: Balancing Supply of Academic Knowledge and Demand of Professional Practice

    ERIC Educational Resources Information Center

    Chen, Zheng; Miller, Patrick A.; Clements, Terry L.; Kim, Mintai

    2017-01-01

    With increasing academic research in the past few decades, the knowledge scope of landscape architecture has expanded from traditional focus on aesthetics to a broad range of ecological, cultural and psychological issues. In order to understand how academic research and knowledge expansion may have redefined the practice, two surveys were…

  1. 77 FR 74178 - Notice of Intent To Grant Exclusive Patent License: Kismet Management Fund LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-13

    ...,248: Program Control for Resource Management Architecture and Corresponding Programs//U.S. Patent No. 7,171,654: System Specification Language for Resource Management Architecture and Corresponding... Architecture and Corresponding Programs//U.S. Patent No. 7,552,438: Resource Management Device. DATES: Anyone...

  2. An Information Extraction Framework for Cohort Identification Using Electronic Health Records

    PubMed Central

    Liu, Hongfang; Bielinski, Suzette J.; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B.; Jonnalagadda, Siddhartha R.; Ravikumar, K.E.; Wu, Stephen T.; Kullo, Iftikhar J.; Chute, Christopher G

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high-performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report a knowledge-driven IE framework for cohort identification using EHRs, developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework. PMID:24303255

  3. The Present of Architectural Psychology Researches in China- Based on the Bibliometric Analysis and Knowledge Mapping

    NASA Astrophysics Data System (ADS)

    Zhu, LeiYe; Wang, Qi; Xu, JunHua; Wu, Qing; Jin, MeiDong; Liao, RongJun; Wang, HaiBin

    2018-03-01

    Architectural Psychology is an interdisciplinary subject of psychology and architecture that focuses on architectural design by using Gestalt psychology, cognitive psychology and other related psychology principles. Researchers from China have made fruitful achievements in the field of architectural psychology during the past thirty-three years. To reveal the current situation of the field in China, 129 related papers from the China National Knowledge Infrastructure (CNKI) were analyzed with CiteSpace II software. The results show that: (1) studies in the field in China started in 1984, and the annual number of papers increased dramatically from 2008 and reached a historical peak in 2016. Shanxi Architecture tops the list of contributing publishing journals; Wuhan University, Southwest Jiaotong University and Chongqing University are the best performers among the contributing organizations. (2) “Environmental Psychology”, “Architectural Design” and “Architectural Psychology” are the most frequent keywords. The frontiers of the field in China are “architectural creation” and “environmental psychology”, while the popular research topics were “residential environment”, “spatial environment”, “environmental psychology”, “architectural theory” and “architectural psychology”.

  4. A multiprocessing architecture for real-time monitoring

    NASA Technical Reports Server (NTRS)

    Schmidt, James L.; Kao, Simon M.; Read, Jackson Y.; Weitzenkamp, Scott M.; Laffey, Thomas J.

    1988-01-01

    A multitasking architecture for performing real-time monitoring and analysis using knowledge-based problem solving techniques is described. To handle asynchronous inputs and perform in real time, the system consists of three or more distributed processes which run concurrently and communicate via a message passing scheme. The Data Management Process acquires, compresses, and routes the incoming sensor data to other processes. The Inference Process consists of a high performance inference engine that performs a real-time analysis on the state and health of the physical system. The I/O Process receives sensor data from the Data Management Process and status messages and recommendations from the Inference Process, updates its graphical displays in real time, and acts as the interface to the console operator. The distributed architecture has been interfaced to an actual spacecraft (NASA's Hubble Space Telescope) and is able to process the incoming telemetry in real-time (i.e., several hundred data changes per second). The system is being used in two locations for different purposes: (1) in Sunnyvale, California at the Space Telescope Test Control Center it is used in the preflight testing of the vehicle; and (2) in Greenbelt, Maryland at NASA/Goddard it is being used on an experimental basis in flight operations for health and safety monitoring.
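
    The three-process pattern described above can be sketched with standard Python multiprocessing, using queues as stand-ins for the telemetry and message links; the sensor names, threshold, and messages are illustrative.

    ```python
    # Minimal sketch of the three-process pattern (data management, inference, I/O)
    # communicating by message passing.
    from multiprocessing import Process, Queue

    def data_manager(raw, to_inference, to_io):
        for sample in raw:                      # acquire and route sensor data
            to_inference.put(sample)
            to_io.put(("data", sample))
        to_inference.put(None)                  # end-of-stream sentinel

    def inference(from_data, to_io):
        while (sample := from_data.get()) is not None:
            if sample["value"] > 100:           # toy health rule
                to_io.put(("alert", f"{sample['sensor']} out of range"))
        to_io.put(None)

    def io_console(from_all):
        while (msg := from_all.get()) is not None:
            print(msg)                          # update displays / operator console

    if __name__ == "__main__":
        q_data, q_io = Queue(), Queue()
        samples = [{"sensor": "battery_temp", "value": 42},
                   {"sensor": "battery_temp", "value": 120}]
        procs = [Process(target=data_manager, args=(samples, q_data, q_io)),
                 Process(target=inference, args=(q_data, q_io)),
                 Process(target=io_console, args=(q_io,))]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
    ```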

  5. TRIAD: The Translational Research Informatics and Data Management Grid

    PubMed Central

    Payne, P.; Ervin, D.; Dhaval, R.; Borlawsky, T.; Lai, A.

    2011-01-01

    Objective Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. Methods A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis “pipelines.” Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile “working interoperability” between data, information, and knowledge resources. Results Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile “working interoperability” between distributed data and knowledge sources. Conclusion Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling “working interoperability” in heterogeneous biomedical environments. PMID:23616879

  6. TRIAD: The Translational Research Informatics and Data Management Grid.

    PubMed

    Payne, P; Ervin, D; Dhaval, R; Borlawsky, T; Lai, A

    2011-01-01

    Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis "pipelines." Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile "working interoperability" between data, information, and knowledge resources. Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile "working interoperability" between distributed data and knowledge sources. Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling "working interoperability" in heterogeneous biomedical environments.

  7. Enterprise Management Network Architecture Distributed Knowledge Base Support

    DTIC Science & Technology

    1990-11-01

    Potentially, this makes a distributed system more powerful than a conventional, centralized one in two ways: first, it can be more reliable... does not completely apply [35]. The grain size of the processors measures the individual problem-solving power of the agents. In this definition... problem-solving power amounts to the conceptual size of a single action taken by an agent visible to the other agents in the system. If the grain is coarse...

  8. DIAMS revisited: Taming the variety of knowledge in fault diagnosis expert systems

    NASA Technical Reports Server (NTRS)

    Haziza, M.; Ayache, S.; Brenot, J.-M.; Cayrac, D.; Vo, D.-P.

    1994-01-01

    The DIAMS program, initiated in 1986, led to the development of a prototype expert system, DIAMS-1 dedicated to the Telecom 1 Attitude and Orbit Control System, and to a near-operational system, DIAMS-2, covering a whole satellite (the Telecom 2 platform and its interfaces with the payload), which was installed in the Satellite Control Center in 1993. The refinement of the knowledge representation and reasoning is now being studied, focusing on the introduction of appropriate handling of incompleteness, uncertainty and time, and keeping in mind operational constraints. For the latest generation of the tool, DIAMS-3, a new architecture has been proposed, that enables the cooperative exploitation of various models and knowledge representations. On the same baseline, new solutions enabling higher integration of diagnostic systems in the operational environment and cooperation with other knowledge intensive systems such as data analysis, planning or procedure management tools have been introduced.

  9. Natural language processing: an introduction.

    PubMed

    Nadkarni, Prakash M; Ohno-Machado, Lucila; Chapman, Wendy W

    2011-01-01

    To provide an overview and tutorial of natural language processing (NLP) and modern NLP-system design. This tutorial targets the medical informatics generalist who has limited acquaintance with the principles behind NLP and/or limited knowledge of the current state of the art. We describe the historical evolution of NLP, and summarize common NLP sub-problems in this extensive field. We then provide a synopsis of selected highlights of medical NLP efforts. After providing a brief description of common machine-learning approaches that are being used for diverse NLP sub-problems, we discuss how modern NLP architectures are designed, with a summary of the Apache Foundation's Unstructured Information Management Architecture. We finally consider possible future directions for NLP, and reflect on the possible impact of IBM Watson on the medical field.

  10. Case Study: Using The OMG SWRADIO Profile and SDR Forum Input for NASA's Space Telecommunications Radio System

    NASA Technical Reports Server (NTRS)

    Briones, Janette C.; Handler, Louis M.; Hall, Steve C.; Reinhart, Richard C.; Kacpura, Thomas J.

    2009-01-01

    The Space Telecommunications Radio System (STRS) standard is a Software Defined Radio (SDR) architecture standard developed by NASA. The goal of STRS is to reduce NASA's dependence on custom, proprietary architectures with unique and varying interfaces and hardware, and to support reuse of waveforms across platforms. The STRS project worked with members of the Object Management Group (OMG), the Software Defined Radio Forum, and industry partners to leverage existing standards and knowledge. This collaboration included investigating the use of the OMG's Platform-Independent Model (PIM) SWRadio as the basis for an STRS PIM. This paper details the influence of the OMG technologies on the STRS update effort and findings in the STRS/SWRadio mapping, and provides a summary of the SDR Forum recommendations.

  11. Natural language processing: an introduction

    PubMed Central

    Ohno-Machado, Lucila; Chapman, Wendy W

    2011-01-01

    Objectives To provide an overview and tutorial of natural language processing (NLP) and modern NLP-system design. Target audience This tutorial targets the medical informatics generalist who has limited acquaintance with the principles behind NLP and/or limited knowledge of the current state of the art. Scope We describe the historical evolution of NLP, and summarize common NLP sub-problems in this extensive field. We then provide a synopsis of selected highlights of medical NLP efforts. After providing a brief description of common machine-learning approaches that are being used for diverse NLP sub-problems, we discuss how modern NLP architectures are designed, with a summary of the Apache Foundation's Unstructured Information Management Architecture. We finally consider possible future directions for NLP, and reflect on the possible impact of IBM Watson on the medical field. PMID:21846786

  12. NOAA Ecosystem Data Assembly Center for the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Parsons, A. R.; Beard, R. H.; Arnone, R. A.; Cross, S. L.; Comar, P. G.; May, N.; Strange, T. P.

    2006-12-01

    Through research programs at the NOAA Northern Gulf of Mexico Cooperative Institute (CI), NOAA is establishing an Ecosystem Data Assembly Center (EDAC) for the Gulf of Mexico. The EDAC demonstrates the utility of integrating many heterogeneous data types and streams used to characterize and identify ecosystems for the purpose of determining the health of ecosystems and identifying applications of the data within coastal resource management activities. Data streams include meteorological, physical oceanographic, ocean color, benthic, biogeochemical surveys, and fishery data, as well as fresh water fluxes (rainfall and river flow). Additionally, the EDAC will provide an interface to the ecosystem data through an ontology based on the Coastal/Marine Ecological Classification System (CMECS). The ontological approach within the EDAC will be applied to increase public knowledge of habitat and ecosystem awareness. The EDAC plans to leverage companion socioeconomic studies to identify the essential data needed for continued EDAC operations. All data-management architectures and practices within the EDAC ensure interoperability with the Integrated Ocean Observing System (IOOS) national backbone by incorporating the IOOS Data Management and Communications Plan. Proven data protocols, standards, formats, applications, practices and architectures developed by the EDAC will be transitioned to the NOAA National Data Centers.

  13. 2008 Year in Review

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge Fernando

    2008-01-01

    In February 2008, NASA Stennis Space Center (SSC), NASA Kennedy Space Center (KSC), and The Applied Research Laboratory at Penn State University demonstrated a pilot implementation of an Integrated System Health Management (ISHM) capability at Launch Complex 20 of KSC. The following significant accomplishments are associated with this development: (1) implementation of an architecture for ground operations ISHM, based on networked intelligent elements; (2) use of standards for management of data, information, and knowledge (DIaK), leading to a modular ISHM implementation with interoperable elements communicating according to standards (three standards were used: the IEEE 1451 family of standards for smart sensors and actuators, the Open Systems Architecture for Condition Based Maintenance (OSA-CBM) standard for communicating DIaK describing the condition of elements of a system, and the OPC standard for communicating data); (3) ISHM implementation using interoperable modules addressing health management of subsystems; and (4) use of a physical intelligent sensor node (smart network element, or SNE, capable of providing data and health) along with classic sensors originally installed in the facility. An operational demonstration included detection of anomalies (sensor failures, leaks, etc.), determination of causes and effects, communication among health nodes, and user interfaces.

  14. Overview of the Smart Network Element Architecture and Recent Innovations

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M.; Mata, Carlos T.; Oostdyk, Rebecca L.

    2008-01-01

    In industrial environments, system operators rely on the availability and accuracy of sensors to monitor processes and detect failures of components and/or processes. The sensors must be networked in such a way that their data is reported to a central human interface, where operators are tasked with making real-time decisions based on the state of the sensors and the components that are being monitored. Incorporating health management functions at this central location aids the operator by automating the decision-making process to suggest, and sometimes perform, the action required by current operating conditions. Integrated Systems Health Management (ISHM) aims to incorporate data from many sources, including real-time and historical data and user input, and extract information and knowledge from that data to diagnose failures and predict future failures of the system. By distributing health management processing to lower levels of the architecture, less bandwidth is required for ISHM, data fusion is enhanced, systems and processes become more robust, and the resolution for detecting and isolating failures in a system, subsystem, component, or process is improved. The Smart Network Element (SNE) has been developed at NASA Kennedy Space Center to perform intelligent functions at the sensor and actuator level in support of ISHM.
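
    The following Python sketch illustrates the idea of pushing health management down to the sensor level: the node keeps a short history and reports only anomalous readings upward. The window size, threshold, and values are illustrative assumptions, not the SNE implementation.

    ```python
    # Sketch of node-level health management: flag readings that deviate strongly
    # from the local baseline instead of streaming every sample upward.
    from collections import deque
    from statistics import mean, stdev

    class SmartSensorNode:
        def __init__(self, window=20, threshold=3.0):
            self.history = deque(maxlen=window)
            self.threshold = threshold

        def ingest(self, reading):
            """Return a health event only when the reading deviates strongly."""
            event = None
            if len(self.history) >= 5:
                mu, sigma = mean(self.history), stdev(self.history)
                if sigma > 0 and abs(reading - mu) > self.threshold * sigma:
                    event = {"status": "anomaly", "reading": reading, "baseline": mu}
            self.history.append(reading)
            return event

    node = SmartSensorNode()
    for value in [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 35.7]:   # last value is a spike
        if (event := node.ingest(value)):
            print(event)
    ```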

  15. Flight elements: Fault detection and fault management

    NASA Technical Reports Server (NTRS)

    Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.

    1990-01-01

    Fault management for an intelligent computational system must be developed using a top-down integrated engineering approach. The proposed approach integrates the overall environment involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. Implementation of the concept to achieve a real-time intelligent fault detection and management system will be accomplished via several objectives, which are: development of fault-tolerant/FDIR requirements and specifications at the systems level which will carry through from conceptual design through implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels, providing fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and lowering of development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.
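
    The causal digraph idea mentioned above can be sketched as follows: edges record how faults propagate, and walking the edges backwards from an observed symptom yields candidate root causes. The component names are hypothetical.

    ```python
    # Sketch of a causal digraph for fault isolation: edges mean "a fault here can
    # propagate there". Given an observed symptom, walk edges backwards to list
    # candidate root causes.
    CAUSAL_EDGES = {
        "power_bus_undervoltage": ["sensor_dropout", "heater_off"],
        "battery_cell_failure":   ["power_bus_undervoltage"],
        "harness_short":          ["power_bus_undervoltage"],
        "heater_off":             ["propellant_line_freeze"],
    }

    def upstream_causes(symptom, edges):
        reverse = {}
        for cause, effects in edges.items():
            for effect in effects:
                reverse.setdefault(effect, []).append(cause)
        stack, causes = [symptom], set()
        while stack:
            node = stack.pop()
            for cause in reverse.get(node, []):
                if cause not in causes:
                    causes.add(cause)
                    stack.append(cause)
        return causes

    print(upstream_causes("propellant_line_freeze", CAUSAL_EDGES))
    # -> {'heater_off', 'power_bus_undervoltage', 'battery_cell_failure', 'harness_short'}
    ```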

  16. A Core Knowledge Architecture of Visual Working Memory

    ERIC Educational Resources Information Center

    Wood, Justin N.

    2011-01-01

    Visual working memory (VWM) is widely thought to contain specialized buffers for retaining spatial and object information: a "spatial-object architecture." However, studies of adults, infants, and nonhuman animals show that visual cognition builds on core knowledge systems that retain more specialized representations: (1) spatiotemporal…

  17. Quantitative descriptions of rice plant architecture and their application

    PubMed Central

    Li, Xumeng; Wang, Xiaohui; Peng, Yulin; Wei, Hailin; Zhu, Xinguang; Chang, Shuoqi; Li, Ming; Li, Tao; Huang, Huang

    2017-01-01

    Plant architecture is an important agronomic trait, and improving plant architecture has attracted the attention of scientists for decades, particularly studies to create desirable plant architecture for high grain yields through breeding and culture practices. However, many important structural phenotypic traits still lack quantitative description and modeling on structural-functional relativity. This study defined new architecture indices (AIs) derived from the digitalized plant architecture using the virtual blade method. The influences of varieties and crop management on these indices and the influences of these indices on biomass accumulation were analyzed using field experiment data at two crop growth stages: early and late panicle initiation. The results indicated that the vertical architecture indices (LAI, PH, 90%-DRI, MDI, 90%-LI) were significantly influenced by variety, water, nitrogen management and the interaction of water and nitrogen, and compact architecture indices (H-CI, Q-CI, 90%-LI, 50%-LI) were significantly influenced by nitrogen management and the interaction of variety and water. Furthermore, there were certain trends in the influence of variety, water, and nitrogen management on AIs. Biomass accumulation has a positive linear correlation with vertical architecture indices and has a quadratic correlation with compact architecture indices, respectively. Furthermore, the combination of vertical and compact architecture indices is the indicator for evaluating the effects of plant architecture on biomass accumulation. PMID:28545144
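
    To make the reported relationships concrete, the sketch below fits a linear model of biomass against a vertical index and a quadratic model against a compactness index using NumPy; the numbers are made up and are not data from the study.

    ```python
    # Illustrative fits only: hypothetical values standing in for measured indices.
    import numpy as np

    lai     = np.array([2.1, 2.8, 3.5, 4.2, 4.9, 5.6])      # a vertical index (e.g., LAI)
    compact = np.array([0.2, 0.35, 0.5, 0.65, 0.8, 0.95])   # a compactness index
    biomass = np.array([310, 420, 540, 650, 760, 880])      # g m^-2, hypothetical

    linear = np.polyfit(lai, biomass, deg=1)          # biomass ~ a*LAI + b
    quad   = np.polyfit(compact, biomass, deg=2)      # biomass ~ a*C^2 + b*C + c
    print("linear fit (slope, intercept):", linear)
    print("quadratic fit coefficients:", quad)
    ```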

  18. Quantitative descriptions of rice plant architecture and their application.

    PubMed

    Li, Xumeng; Wang, Xiaohui; Peng, Yulin; Wei, Hailin; Zhu, Xinguang; Chang, Shuoqi; Li, Ming; Li, Tao; Huang, Huang

    2017-01-01

    Plant architecture is an important agronomic trait, and improving plant architecture has attracted the attention of scientists for decades, particularly studies to create desirable plant architecture for high grain yields through breeding and culture practices. However, many important structural phenotypic traits still lack quantitative description and modeling on structural-functional relativity. This study defined new architecture indices (AIs) derived from the digitalized plant architecture using the virtual blade method. The influences of varieties and crop management on these indices and the influences of these indices on biomass accumulation were analyzed using field experiment data at two crop growth stages: early and late panicle initiation. The results indicated that the vertical architecture indices (LAI, PH, 90%-DRI, MDI, 90%-LI) were significantly influenced by variety, water, nitrogen management and the interaction of water and nitrogen, and compact architecture indices (H-CI, Q-CI, 90%-LI, 50%-LI) were significantly influenced by nitrogen management and the interaction of variety and water. Furthermore, there were certain trends in the influence of variety, water, and nitrogen management on AIs. Biomass accumulation has a positive linear correlation with vertical architecture indices and has a quadratic correlation with compact architecture indices, respectively. Furthermore, the combination of vertical and compact architecture indices is the indicator for evaluating the effects of plant architecture on biomass accumulation.

  19. Unified web-based network management based on distributed object orientated software agents

    NASA Astrophysics Data System (ADS)

    Djalalian, Amir; Mukhtar, Rami; Zukerman, Moshe

    2002-09-01

    This paper presents an architecture that provides a unified web interface to managed network devices that support CORBA, OSI or Internet-based network management protocols. A client gains access to managed devices through a web browser, which is used to issue management operations and receive event notifications. The proposed architecture is compatible with both the OSI Management Reference Model and CORBA. The steps required for designing the building blocks of such an architecture are identified.

  20. Northeast Artificial Intelligence Consortium Annual Report - 1988. Volume 12. Computer Architectures for Very Large Knowledge Bases

    DTIC Science & Technology

    1989-10-01

    Vol. 18, No. 5, 1975, pp. 253-263. [CAR84] D.B. Carlin, J.P. Bednarz, C.J. Kaiser, J.C. Connolly, M.G. Harvey, "Multichannel optical recording using... Kellog [31] takes a similar approach as ILEX in the sense that it uses existing systems rather than developing specialized hardware (the Xerox 1100... parallel complexity. In Proceedings of the International Conference on Database Theory, pages 1-30, September 1986. [31] C. Kellog. From data management to...

  1. Proceedings of the Annual Acquisition Research Symposium (2nd), Acquisition Research: The Foundation for Innovation, Held in Monterey, California on 18-19 May 2005

    DTIC Science & Technology

    2005-05-01

    Earned Value, Enterprise Architecture, Entropy, Markov Models, Perron-Frobenius Theorem. 1. INTRODUCTION: THE PROBLEM CONTEXT. For knowledge-intensive... trust. Academy of Management Review, 20, 709-734. McEvily, B., Perrone, V. & Zaheer, A. (2003). Trust as an organizing principle. Organization... dE(t)/dt < 0. Constant or increasing estimate variability for less capable organizations. That is, [2] dE(t)/dt > 0. 4.1. The Perron...

  2. Distributed, cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.

  3. Web-Based Course Management and Web Services

    ERIC Educational Resources Information Center

    Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.

    2004-01-01

    The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…

  4. Connecting Architecture and Implementation

    NASA Astrophysics Data System (ADS)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.
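
    A minimal sketch of the synchronization idea follows: the architecture model is expressed as a set of allowed dependencies, and dependencies mined from the implementation are checked against it. The module names and the hand-written dependency list are illustrative; a real tool would extract them from imports or the build system.

    ```python
    # Conformance check between an architecture model (allowed dependencies) and
    # dependencies observed in the implementation.
    ALLOWED = {
        ("ui", "service"),
        ("service", "persistence"),
    }

    # In a real tool these would be mined from the code base.
    IMPLEMENTATION_DEPENDENCIES = [
        ("ui", "service"),
        ("service", "persistence"),
        ("ui", "persistence"),        # architectural violation: layer skipped
    ]

    violations = [dep for dep in IMPLEMENTATION_DEPENDENCIES if dep not in ALLOWED]
    for source, target in violations:
        print(f"violation: {source} must not depend on {target}")
    ```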

  5. A Priori Knowledge and Heuristic Reasoning in Architectural Design.

    ERIC Educational Resources Information Center

    Rowe, Peter G.

    1982-01-01

    It is proposed that the various classes of a priori knowledge incorporated in heuristic reasoning processes exert a strong influence over architectural design activity. Some design problems require exercise of some provisional set of rules, inference, or plausible strategy which requires heuristic reasoning. A case study illustrates this concept.…

  6. Information Architecture and the Comic Arts: Knowledge Structure and Access

    ERIC Educational Resources Information Center

    Farmer, Lesley S. J.

    2015-01-01

    This article explains information architecture, focusing on comic arts' features for representing and structuring knowledge. Then it details information design theory and information behaviors relative to this format, also noting visual literacy. Next, applications of comic arts in education are listed. With this background, several research…

  7. EO Domain Specific Knowledge Enabled Services (KES-B)

    NASA Astrophysics Data System (ADS)

    Varas, J.; Busto, J.; Torguet, R.

    2004-09-01

    This paper recalls and describes a number of major statements with respect to the vision, mission and technological approaches of the Technological Research Project (TRP) "EO Domain Specific Knowledge Enabled Services" (project acronym KES-B), which is currently under development at the European Space Research Institute (ESRIN) under contract "16397/02/I-SB". Building on the on-going R&D activities, the KES-B project aims to demonstrate with a prototype system the feasibility of applying innovative knowledge-based technologies to provide services for easy, scheduled and controlled exploitation of EO resources (e.g., data, algorithms, procedures, storage, processors, ...), to automate the generation of products, and to support users in easily identifying and accessing the required information or products by using their own vocabulary, domain knowledge and preferences. The ultimate goals of KES-B are summarized in the provision of two main types of KES services: first, the Search service (also referred to as Product Exploitation or Information Retrieval); and second, the Production service (also referred to as Information Extraction), with the strategic advantage that both are enabled by knowledge consolidated (formalized) within the system. The KES-B technical solution is driven by a strong commitment to the adoption of industry (XML-based) language standards, aiming at an interoperable, scalable and flexible operational prototype. In that sense, the Search KES service builds on the adoption of consolidated and/or emerging W3C Semantic Web standards. Notably, the languages/models Dublin Core (DC), Uniform Resource Identifier (URI), Resource Description Framework (RDF) and Web Ontology Language (OWL), together with COTS such as Protege [1] and JENA [2], are being integrated into the system as building blocks for the construction of the KES-based Search services. The Production KES service, on the other hand, builds on top of workflow management standards and tools. On this side, the Business Process Execution Language (BPEL), the Web Services Description Language (WSDL), and the Collaxa [3] COTS workflow management tool are being integrated for the construction of the KES-B Production services. The KES-B platform (web portal and web server) architecture is built on the J2EE reference architecture. These languages represent the means for codifying the different types of knowledge that are to be formalized in the system, which constitutes the ontological architecture of the system. This in turn enables interoperability with other KES-based systems that commit to the same standards. The motivation behind this vision points towards the construction of a Semantic Web based GRID supply-chain infrastructure for EO services, in line with the suggestions of the INSPIRE initiative.
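
    The knowledge-enabled Search idea can be sketched with RDF tooling in Python (assuming the rdflib package is available): catalogue entries are described against a domain vocabulary and queried with SPARQL. The namespace, properties, and products below are made up and are not the KES-B ontology.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF

    EO = Namespace("http://example.org/eo#")   # hypothetical EO domain vocabulary
    g = Graph()

    # Describe two catalogue entries against the vocabulary.
    g.add((EO.scene42, RDF.type, EO.SARProduct))
    g.add((EO.scene42, EO.coversRegion, Literal("Gulf of Mexico")))
    g.add((EO.scene43, RDF.type, EO.OpticalProduct))
    g.add((EO.scene43, EO.coversRegion, Literal("Gulf of Mexico")))

    # A user query phrased against the domain vocabulary rather than file names.
    results = g.query("""
        PREFIX eo: <http://example.org/eo#>
        SELECT ?product WHERE {
            ?product a eo:SARProduct ;
                     eo:coversRegion "Gulf of Mexico" .
        }
    """)
    for row in results:
        print(row.product)
    ```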

  8. Patterns-Based IS Change Management in SMEs

    NASA Astrophysics Data System (ADS)

    Makna, Janis; Kirikova, Marite

    The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes the approach, the method, and the prototype that are designed especially for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were evolved from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. The use of prototype requires just basic knowledge in organizational business process and information management.

  9. Clinical engineering and risk management in healthcare technological process using architecture framework.

    PubMed

    Signori, Marcos R; Garcia, Renato

    2010-01-01

    This paper presents a model that helps Clinical Engineering deal with risk management in the healthcare technological process. The healthcare technological setting is complex and is supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An enterprise architecture framework, MODAF (Ministry of Defence Architecture Framework), was used to model this process for risk management. Thus, a new model was created to contribute to risk management in the HT process from the Clinical Engineering viewpoint. This architecture model can support and improve Clinical Engineering's decision-making process for risk management in the healthcare technological process.

  10. Towards sustainable infrastructure management: knowledge-based service-oriented computing framework for visual analytics

    NASA Astrophysics Data System (ADS)

    Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd

    2009-05-01

    Infrastructure management (and its associated processes) is complex to understand, perform and thus, hard to make efficient and effective informed decisions. The management involves a multi-faceted operation that requires the most robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs from the following aspects: 1) Better understanding and enforcement of complex inspection process that can bridge the gap between evidence gathering and decision making through the implementation of ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (i.e. infrared imaging, aerial photos and ground-mounted LIDAR etc.) with domain application knowledge to support machine understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through the flexible Service-oriented Architecture (SOA) framework to compose and provide services on-demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring both periodically (annually, monthly, even daily if needed) as well as after extreme events.

  11. The Study on Collaborative Manufacturing Platform Based on Agent

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-yan; Qu, Zheng-geng

    To address the knowledge-intensive trend in collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capabilities the agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval and reuse of manufacturing knowledge, the generalized knowledge repository based on an ontology library enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.

  12. The Muon Conditions Data Management:. Database Architecture and Software Infrastructure

    NASA Astrophysics Data System (ADS)

    Verducci, Monica

    2010-04-01

    The management of the Muon Conditions Database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored and their analysis. The Muon conditions database is responsible for almost all of the 'non-event' data and detector quality flags storage needed for debugging of the detector operations and for performing the reconstruction and the analysis. In particular for the early data, knowledge of the detector performance and the corrections in terms of efficiency and calibration will be extremely important for the correct reconstruction of the events. In this work, an overview of the entire Muon conditions database architecture is given, covering in particular the different sources of the data and the storage model used, including the associated database technology. Particular emphasis is given to the Data Quality chain: the flow of the data, the analysis and the final results are described. In addition, the software interfaces used to access the conditions data are described, in particular in the ATLAS offline reconstruction framework, the ATHENA environment.

  13. High temperature semiconductor diode laser pumps for high energy laser applications

    NASA Astrophysics Data System (ADS)

    Campbell, Jenna; Semenic, Tadej; Guinn, Keith; Leisher, Paul O.; Bhunia, Avijit; Mashanovitch, Milan; Renner, Daniel

    2018-02-01

    Existing thermal management technologies for diode laser pumps place a significant load on the size, weight and power consumption of High Power Solid State and Fiber Laser systems, thus making current laser systems very large, heavy, and inefficient in many important practical applications. To mitigate this thermal management burden, it is desirable for diode pumps to operate efficiently at high heat sink temperatures. In this work, we have developed a scalable cooling architecture, based on jet-impingement technology with industrial coolant, for efficient cooling of diode laser bars. We have demonstrated 60% electrical-to-optical efficiency from a 9xx nm two-bar laser stack operating with propylene-glycol/water coolant, at 50 °C coolant temperature. To our knowledge, this is the highest efficiency achieved from a diode stack using 50 °C industrial fluid coolant. The output power is greater than 100 W per bar. Stacks with additional laser bars are currently in development, as this cooler architecture is scalable to a 1 kW system. This work will enable compact and robust fiber-coupled diode pump modules for high energy laser applications.

  14. Heterogeneous Spacecraft Networks

    NASA Technical Reports Server (NTRS)

    Nakamura, Yosuke (Inventor); Faber, Nicolas T. (Inventor); Frost, Chad R. (Inventor); Alena, Richard L. (Inventor)

    2018-01-01

    The present invention provides a heterogeneous spacecraft network including a network management architecture to facilitate communication between a plurality of operations centers and a plurality of data user communities. The network management architecture includes a plurality of network nodes in communication with the plurality of operations centers. The present invention also provides a method of communication for a heterogeneous spacecraft network. The method includes: transmitting data from a first space segment to a first ground segment; transmitting the data from the first ground segment to a network management architecture; transmitting data from a second space segment to a second ground segment, the second space and ground segments having communication systems incompatible with those of the first space and ground segments; transmitting the data from the second ground segment to the network management architecture; and transmitting data from the network management architecture to a plurality of data user communities.
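    As a rough illustration of the relay just described, the sketch below (Python, with hypothetical segment names and packet formats; it is not drawn from the patent) shows a network-management layer that accepts data from two ground segments with incompatible downlink formats, normalizes them through per-segment adapters, and fans the result out to subscribed data user communities.

    # Illustrative sketch (not flight software): two ground segments with
    # incompatible telemetry formats hand data to a common network management
    # layer, which normalizes it and distributes it to user communities.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Packet:
        source: str          # originating space segment
        payload: dict        # normalized telemetry fields

    class NetworkManagement:
        def __init__(self) -> None:
            self.adapters: Dict[str, Callable[[bytes], Packet]] = {}
            self.subscribers: List[Callable[[Packet], None]] = []

        def register_ground_segment(self, name: str,
                                    adapter: Callable[[bytes], Packet]) -> None:
            # Each ground segment supplies an adapter converting its native
            # downlink format into the common Packet structure.
            self.adapters[name] = adapter

        def subscribe(self, handler: Callable[[Packet], None]) -> None:
            self.subscribers.append(handler)

        def ingest(self, ground_segment: str, raw: bytes) -> None:
            packet = self.adapters[ground_segment](raw)
            for handler in self.subscribers:
                handler(packet)

    # Usage: two incompatible formats (CSV vs. key=value) normalized by adapters.
    net = NetworkManagement()
    net.register_ground_segment(
        "GS-1", lambda raw: Packet("SC-1", dict(zip(("t", "v"), raw.decode().split(",")))))
    net.register_ground_segment(
        "GS-2", lambda raw: Packet("SC-2", dict(kv.split("=") for kv in raw.decode().split(";"))))
    net.subscribe(lambda p: print(p.source, p.payload))
    net.ingest("GS-1", b"100,3.7")
    net.ingest("GS-2", b"t=101;v=3.9")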

  15. Multiprocessor architectural study

    NASA Technical Reports Server (NTRS)

    Kosmala, A. L.; Stanten, S. F.; Vandever, W. H.

    1972-01-01

    An architectural design study was made of a multiprocessor computing system intended to meet functional and performance specifications appropriate to a manned space station application. Intermetrics, previous experience, and accumulated knowledge of the multiprocessor field is used to generate a baseline philosophy for the design of a future SUMC* multiprocessor. Interrupts are defined and the crucial questions of interrupt structure, such as processor selection and response time, are discussed. Memory hierarchy and performance is discussed extensively with particular attention to the design approach which utilizes a cache memory associated with each processor. The ability of an individual processor to approach its theoretical maximum performance is then analyzed in terms of a hit ratio. Memory management is envisioned as a virtual memory system implemented either through segmentation or paging. Addressing is discussed in terms of various register design adopted by current computers and those of advanced design.
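    As a textbook illustration of the hit-ratio analysis mentioned above (the figures below are not taken from the report), the effective memory access time of a processor with a private cache can be written as

        $t_{\mathrm{eff}} = h\,t_{\mathrm{cache}} + (1-h)\,t_{\mathrm{main}}$

    where $h$ is the hit ratio. With, say, $h = 0.95$, $t_{\mathrm{cache}} = 100$ ns and $t_{\mathrm{main}} = 1000$ ns, this gives $t_{\mathrm{eff}} = 0.95(100) + 0.05(1000) = 145$ ns, so a processor approaches its theoretical maximum performance only as $h$ approaches 1.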

  16. ELISA, a demonstrator environment for information systems architecture design

    NASA Technical Reports Server (NTRS)

    Panem, Chantal

    1994-01-01

    This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common database, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their system dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life-cycle activity. In late 1992, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the provision of Commercial Off-The-Shelf tools. ELISA was delivered to CNES in January 1994 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). New evolutions are under way.

  17. Clinical Decision Support Systems (CDSS) for preventive management of COPD patients.

    PubMed

    Velickovski, Filip; Ceccaroni, Luigi; Roca, Josep; Burgos, Felip; Galdiz, Juan B; Marina, Nuria; Lluch-Ariet, Magí

    2014-11-28

    The use of information and communication technologies to manage chronic diseases allows the application of integrated care pathways, and the optimization and standardization of care processes. Decision support tools can assist in the adherence to best-practice medicine at critical decision points during the execution of a care pathway. The objectives are to design, develop, and assess a clinical decision support system (CDSS) offering a suite of services for the early detection and assessment of chronic obstructive pulmonary disease (COPD), which can be easily integrated into a healthcare provider's work-flow. The software architecture model for the CDSS, interoperable clinical-knowledge representation, and inference engine were designed and implemented to form a base CDSS framework. The CDSS functionalities were iteratively developed through requirement-adjustment/development/validation cycles using enterprise-grade software-engineering methodologies and technologies. Within each cycle, clinical-knowledge acquisition was performed by a health-informatics engineer and a clinical-expert team. A suite of decision-support web services for (i) COPD early detection and diagnosis, (ii) spirometry quality-control support, and (iii) patient stratification was deployed on-line in a secured environment. The CDSS diagnostic performance was assessed using a validation set of 323 cases, with 90% specificity and 96% sensitivity. The web services were integrated into existing health information system platforms. Specialized decision support can be offered as a complementary service to existing policies of integrated care for chronic-disease management. The CDSS was able to issue recommendations with a high degree of accuracy to support COPD case-finding. Integration into healthcare providers' work-flow can be achieved seamlessly through the use of a modular design and a service-oriented architecture that connects to existing health information systems.
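    For illustration only, the following sketch shows how sensitivity and specificity figures of the kind reported above are computed from a labeled validation set; the toy labels are placeholders, not the study's 323 cases.

    # Minimal sketch of diagnostic performance computation from a labeled
    # validation set (standard definitions; example data is invented).
    def sensitivity_specificity(truth, predicted):
        """truth/predicted are sequences of booleans (True = COPD present/flagged)."""
        tp = sum(t and p for t, p in zip(truth, predicted))
        tn = sum((not t) and (not p) for t, p in zip(truth, predicted))
        fp = sum((not t) and p for t, p in zip(truth, predicted))
        fn = sum(t and (not p) for t, p in zip(truth, predicted))
        sensitivity = tp / (tp + fn)   # true positive rate
        specificity = tn / (tn + fp)   # true negative rate
        return sensitivity, specificity

    # Example with toy labels
    truth     = [True, True, True, False, False, False, False, True]
    predicted = [True, True, False, False, False, True, False, True]
    print(sensitivity_specificity(truth, predicted))  # (0.75, 0.75)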

  18. Clinical Decision Support Systems (CDSS) for preventive management of COPD patients

    PubMed Central

    2014-01-01

    Background The use of information and communication technologies to manage chronic diseases allows the application of integrated care pathways, and the optimization and standardization of care processes. Decision support tools can assist in the adherence to best-practice medicine in critical decision points during the execution of a care pathway. Objectives The objectives are to design, develop, and assess a clinical decision support system (CDSS) offering a suite of services for the early detection and assessment of chronic obstructive pulmonary disease (COPD), which can be easily integrated into a healthcare providers' work-flow. Methods The software architecture model for the CDSS, interoperable clinical-knowledge representation, and inference engine were designed and implemented to form a base CDSS framework. The CDSS functionalities were iteratively developed through requirement-adjustment/development/validation cycles using enterprise-grade software-engineering methodologies and technologies. Within each cycle, clinical-knowledge acquisition was performed by a health-informatics engineer and a clinical-expert team. Results A suite of decision-support web services for (i) COPD early detection and diagnosis, (ii) spirometry quality-control support, (iii) patient stratification, was deployed in a secured environment on-line. The CDSS diagnostic performance was assessed using a validation set of 323 cases with 90% specificity, and 96% sensitivity. Web services were integrated in existing health information system platforms. Conclusions Specialized decision support can be offered as a complementary service to existing policies of integrated care for chronic-disease management. The CDSS was able to issue recommendations that have a high degree of accuracy to support COPD case-finding. Integration into healthcare providers' work-flow can be achieved seamlessly through the use of a modular design and service-oriented architecture that connect to existing health information systems. PMID:25471545

  19. Architectural Large Constructed Environment. Modeling and Interaction Using Dynamic Simulations

    NASA Astrophysics Data System (ADS)

    Fiamma, P.

    2011-09-01

    How can the simulation derived from a large data model be used in architectural design? The topic concerns the phase that usually follows data acquisition, during the construction of the model and especially afterwards, when designers must interact with the simulation in order to develop and verify their ideas. In the case study, the concept of interaction includes the concept of real-time "flows". The work develops content and results that contribute to the broad debate about the current connection between "architecture" and "movement". The focus of the work is to realize a collaborative and participative virtual environment in which different specialist actors, the client and the final users can share knowledge, targets and constraints to better achieve the intended result. The goal is to use a dynamic micro-simulation digital resource that allows all the actors to explore the model in a powerful and realistic way and to have a new type of interaction in a complex architectural scenario. On the one hand, the work represents a base of knowledge that can be progressively extended; on the other hand, it represents an attempt to understand large constructed architecture simulation as a way of life, a way of being in time and space. The architectural design before, and the architectural fact after, both happen in a sort of "Spatial Analysis System". The way is open to offer to this "system" knowledge and theories that can support architectural design work at every application and scale. Architecture is a spatial configuration, one that can also be reconfigured through design.

  20. Inter-organizational design: exploring the relationship between formal architecture and ICT investments

    NASA Astrophysics Data System (ADS)

    Iubatti, Daniela; Masciarelli, Francesca; Simboli, Alberto

    This chapter aims to explore how the information-processing capabilities that emerge from a network structure affect the diffusion of innovation in a multidivisional organization. In particular, this study analyzes the role of firm investments in ICT to facilitate communication and knowledge diffusion. Using a qualitative approach, we investigate the behavior of an Italian multinational firm, Engineering S.p.A., analyzing our data using a content analysis procedure. Our results show the limited role of ICT in favoring knowledge exchange both inside and outside the firm's divisions: traditional communication patterns are generally preferred over the use of technologies for information sharing. Additionally, we find that key individuals who play a central role in the firm's communication network are unable to use ICTs for knowledge transfer. We conclude that this is the result of a strategic decision to keep top management practically unchanged since the firm was established. Therefore, key individuals act as filters to knowledge flows. Knowledge, in particular tacit knowledge, is transferred from key individuals to other actors through face-to-face contacts, thereby creating a diseconomy for the organization.

  1. Management Architecture and Solutions for French Tactical Systems

    DTIC Science & Technology

    2006-10-01

    RTO-MP-IST-062. Vincent COTTIGNIES, THALES Land & Joint Systems – Battlespace Transformation Center, 160 Boulevard de Valmy - BP 82, 92704 Colombes Cedex, FRANCE. ...planning, configuration and monitoring of Systems. Then, given the limitations of existing Management System Architecture, an innovative design based on

  2. Knowledge engineering for adverse drug event prevention: on the design and development of a uniform, contextualized and sustainable knowledge-based framework.

    PubMed

    Koutkias, Vassilis; Kilintzis, Vassilis; Stalidis, George; Lazou, Katerina; Niès, Julie; Durand-Texte, Ludovic; McNair, Peter; Beuscart, Régis; Maglaveras, Nicos

    2012-06-01

    The primary aim of this work was the development of a uniform, contextualized and sustainable knowledge-based framework to support adverse drug event (ADE) prevention via Clinical Decision Support Systems (CDSSs). The employed methodology involved first the systematic analysis and formalization of the knowledge sources elaborated in the scope of this work, through which an application-specific knowledge model was defined. The entire framework architecture was then specified and implemented, adopting Computer Interpretable Guidelines (CIGs) as the knowledge engineering formalism for its construction. The framework integrates diverse and dynamic knowledge sources in the form of rule-based ADE signals, all under a uniform Knowledge Base (KB) structure conforming to the defined knowledge model. Equally important, it employs the means to contextualize the encapsulated knowledge, in order to provide appropriate support for the specific local environment (hospital, medical department, language, etc.), as well as the mechanisms for knowledge querying, inference, sharing, and management. In this paper, we describe the establishment of the proposed knowledge framework, the employed methodology, and the results obtained regarding implementation, performance and validation, which highlight its applicability and value in medication safety. Copyright © 2012 Elsevier Inc. All rights reserved.
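    A minimal sketch of what a contextualized, rule-based ADE signal could look like is given below; the drug, laboratory threshold, department filter and all identifiers are hypothetical illustrations, not clinical content or the framework's actual CIG-based representation.

    # Hedged sketch of a contextualized, rule-based ADE signal: a signal fires
    # only when its clinical condition holds AND its context filter matches the
    # local environment. All rule content here is an invented placeholder.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class Context:
        hospital: str
        department: str
        language: str = "en"

    @dataclass
    class AdeSignal:
        name: str
        condition: Callable[[Dict], bool]       # fires on patient data
        applies_to: Callable[[Context], bool]   # contextualization filter
        message: str

    @dataclass
    class KnowledgeBase:
        signals: List[AdeSignal] = field(default_factory=list)

        def evaluate(self, patient: Dict, ctx: Context) -> List[str]:
            return [s.message for s in self.signals
                    if s.applies_to(ctx) and s.condition(patient)]

    kb = KnowledgeBase([
        AdeSignal(
            name="hypothetical-drug-lab-check",
            condition=lambda p: "drug_x" in p["medications"]
                                and p["labs"].get("creatinine", 0) > 2.0,
            applies_to=lambda c: c.department == "cardiology",
            message="Review drug_x dosing: elevated creatinine."),
    ])

    patient = {"medications": ["drug_x"], "labs": {"creatinine": 2.4}}
    print(kb.evaluate(patient, Context("hospital-a", "cardiology")))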

  3. Integrated System Health Management: Pilot Operational Implementation in a Rocket Engine Test Stand

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Schmalzel, John L.; Morris, Jonathan A.; Turowski, Mark P.; Franzl, Richard

    2010-01-01

    This paper describes a credible implementation of integrated system health management (ISHM) capability as a pilot operational system. Important core elements that make possible the fielding and evolution of ISHM capability have been validated in a rocket engine test stand, encompassing all phases of operation: stand-by, pre-test, test, and post-test. The core elements include an architecture (hardware/software) for ISHM, gateways for streaming real-time data from the data acquisition system into the ISHM system, automated configuration management employing transducer electronic data sheets (TEDSs) adhering to the IEEE 1451.4 Standard for Smart Sensors and Actuators, broadcasting and capture of sensor measurements and health information adhering to the IEEE 1451.1 Standard for Smart Sensors and Actuators, user interfaces for management of redlines/bluelines, and establishment of a health assessment database system (HADS) and browser for extensive post-test analysis. The ISHM system was installed in the Test Control Room, where test operators were exposed to the capability. All functionalities of the pilot implementation were validated during testing and in post-test data streaming through the ISHM system. The implementation enabled significant improvements in awareness of the status of the test stand and of events and their causes/consequences. The architecture and software elements embody a systems engineering, knowledge-based approach in conjunction with object-oriented environments. These qualities permit systematic augmentation of the capability and scaling to encompass other subsystems.
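    As an illustrative sketch only (not the ISHM software), the following shows the idea behind redline/blueline checks on streamed measurements; the sensor names, limits, and the reading of bluelines as lower caution limits and redlines as upper abort limits are assumptions made for the example.

    # Toy redline/blueline monitor over a stream of (sensor, value) samples.
    from typing import Dict, Iterable, Tuple

    # (blueline = assumed lower caution limit, redline = assumed upper limit)
    LIMITS: Dict[str, Tuple[float, float]] = {
        "chamber_pressure_psi": (50.0, 900.0),
        "turbine_temp_K": (250.0, 1100.0),
    }

    def assess(stream: Iterable[Tuple[str, float]]):
        """Yield a health event for each out-of-limits measurement."""
        for sensor, value in stream:
            low, high = LIMITS[sensor]
            if value > high:
                yield sensor, value, "REDLINE EXCEEDED"
            elif value < low:
                yield sensor, value, "BELOW BLUELINE"

    measurements = [("chamber_pressure_psi", 920.5), ("turbine_temp_K", 600.0)]
    for event in assess(measurements):
        print(event)   # ('chamber_pressure_psi', 920.5, 'REDLINE EXCEEDED')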

  4. Component-Level Electronic-Assembly Repair (CLEAR) System Architecture

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Bradish, Martin A.; Juergens, Jeffrey R.; Lewis, Michael J.; Vrnak, Daniel R.

    2011-01-01

    This document captures the system architecture for a Component-Level Electronic-Assembly Repair (CLEAR) capability needed for electronics maintenance and repair of the Constellation Program (CxP). CLEAR is intended to improve flight system supportability and reduce the mass of spares required to maintain the electronics of human-rated spacecraft on long-duration missions. By necessity it allows the crew to make repairs that would otherwise be performed by Earth-based repair depots. Because of the practical knowledge and skill limitations of small spaceflight crews, they must be augmented by Earth-based support crews and automated repair equipment. This system architecture covers the complete system from ground user to flight hardware and flight crew and defines an Earth segment and a Space segment. The Earth segment involves database management, operational planning, and remote equipment programming and validation processes. The Space segment involves the automated diagnostic, test and repair equipment required for a complete repair process. This document defines three major subsystems: tele-operations that link the flight hardware to ground support; highly reconfigurable diagnostic and test instruments; and a CLEAR Repair Apparatus that automates the physical repair process.

  5. The Architecture of Risk for Type 2 Diabetes: Understanding Asia in the Context of Global Findings

    PubMed Central

    Attia, John; Oldmeadow, Christopher; Scott, Rodney J.; Holliday, Elizabeth G.

    2014-01-01

    The prevalence of Type 2 diabetes is rising rapidly in both developed and developing countries. Asia is developing as the epicentre of the escalating pandemic, reflecting rapid transitions in demography, migration, diet, and lifestyle patterns. The effective management of Type 2 diabetes in Asia may be complicated by differences in prevalence, risk factor profiles, genetic risk allele frequencies, and gene-environment interactions between different Asian countries, and between Asian and other continental populations. To reduce the worldwide burden of T2D, it will be important to understand the architecture of T2D susceptibility both within and between populations. This review will provide an overview of known genetic and nongenetic risk factors for T2D, placing the results from Asian studies in the context of broader global research. Given recent evidence from large-scale genetic studies of T2D, we place special emphasis on emerging knowledge about the genetic architecture of T2D and the potential contribution of genetic effects to population differences in risk. PMID:24744783

  6. Expert system shell to reason on large amounts of data

    NASA Technical Reports Server (NTRS)

    Giuffrida, Gionanni

    1994-01-01

    Current database management systems (DBMSs) do not provide a sophisticated environment for developing rule-based expert system applications. Some of the newer DBMSs come with some sort of rule mechanism; these are active and deductive database systems. However, neither is feature-rich enough to support full rule-based implementations. On the other hand, current expert system shells do not provide any link to external databases: all the data are kept in the system's working memory, which is maintained in main memory. For some applications, the limited size of the available working memory constrains development. Typically these are applications that require reasoning over huge amounts of data, which do not all fit into the computer's main memory. Moreover, in some cases these data may already be available in database systems and continuously updated while the expert system is running. This paper proposes an architecture that employs knowledge discovery techniques to reduce the amount of data to be stored in main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored in the DBMS. An interface between the two systems is responsible for inducing knowledge from the set of relations. Such induced knowledge is then transferred to the rule-based language's working memory.
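    A minimal sketch of the coupling idea, under assumed table and rule names: facts remain in a standard DBMS (sqlite3 here) and an interface loads only the relevant tuples into the rule engine's working memory before a rule is applied.

    # Illustrative coupling of a DBMS (extensional database) with a simple rule.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE parts (id INTEGER, weight REAL, material TEXT)")
    conn.executemany("INSERT INTO parts VALUES (?, ?, ?)",
                     [(1, 12.5, "steel"), (2, 3.0, "aluminum"), (3, 40.0, "steel")])

    def load_working_memory(query, params=()):
        """Interface layer: pull only the relevant facts from the EDB."""
        return conn.execute(query, params).fetchall()

    def heavy_steel_rule(working_memory):
        """IF part is steel AND weight > 20 THEN flag it (illustrative rule)."""
        return [pid for pid, weight, material in working_memory
                if material == "steel" and weight > 20.0]

    wm = load_working_memory(
        "SELECT id, weight, material FROM parts WHERE material = ?", ("steel",))
    print(heavy_steel_rule(wm))   # [3]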

  7. Architectural Physics: Lighting.

    ERIC Educational Resources Information Center

    Hopkinson, R. G.

    The author coordinates the many diverse branches of knowledge which have dealt with the field of lighting--physiology, psychology, engineering, physics, and architectural design. Part I, "The Elements of Architectural Physics", discusses the physiological aspects of lighting, visual performance, lighting design, calculations and measurements of…

  8. Evolutionary Local Search of Fuzzy Rules through a novel Neuro-Fuzzy encoding method.

    PubMed

    Carrascal, A; Manrique, D; Ríos, J; Rossi, C

    2003-01-01

    This paper proposes a new approach for constructing fuzzy knowledge bases using evolutionary methods. We have designed a genetic algorithm that automatically builds neuro-fuzzy architectures based on a new indirect encoding method. The neuro-fuzzy architecture represents the fuzzy knowledge base that solves a given problem; the search for this architecture takes advantage of a local search procedure that improves the chromosomes at each generation. Experiments conducted on both artificially generated and real-world problems confirm the effectiveness of the proposed approach.
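    The sketch below gives a toy flavor of this style of approach: a genetic algorithm whose chromosomes indirectly encode a small fuzzy rule base, with mutation of the best individuals standing in for the per-generation local search. The encoding, membership functions and fitness are illustrative stand-ins, not the paper's actual method.

    # Toy GA evolving an indirectly encoded fuzzy rule base for y = x^2 on [0, 1].
    import math
    import random

    random.seed(0)

    def decode(chrom):
        # Indirect encoding: consecutive gene pairs -> (membership centre, rule output)
        return [(chrom[i], chrom[i + 1]) for i in range(0, len(chrom), 2)]

    def infer(rules, x):
        # Simple fuzzy inference: Gaussian firing strengths, weighted average of outputs
        weights = [math.exp(-(x - centre) ** 2) for centre, _ in rules]
        return sum(w * out for w, (_, out) in zip(weights, rules)) / sum(weights)

    def fitness(chrom, samples):
        rules = decode(chrom)
        return -sum((infer(rules, x) - y) ** 2 for x, y in samples)   # negative squared error

    samples = [(x / 10, (x / 10) ** 2) for x in range(11)]
    population = [[random.uniform(0, 1) for _ in range(6)] for _ in range(20)]

    for _ in range(50):
        population.sort(key=lambda c: fitness(c, samples), reverse=True)
        parents = population[:10]
        children = [[gene + random.gauss(0, 0.05) for gene in random.choice(parents)]
                    for _ in range(10)]                               # mutation as crude local search
        population = parents + children

    best = max(population, key=lambda c: fitness(c, samples))
    print(round(-fitness(best, samples), 4))                          # residual error of best rule base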

  9. A study of diverse clinical decision support rule authoring environments and requirements for integration

    PubMed Central

    2012-01-01

    Background Efficient rule authoring tools are critical to allow clinical Knowledge Engineers (KEs), Software Engineers (SEs), and Subject Matter Experts (SMEs) to convert medical knowledge into machine executable clinical decision support rules. The goal of this analysis was to identify the critical success factors and challenges of a fully functioning Rule Authoring Environment (RAE) in order to define requirements for a scalable, comprehensive tool to manage enterprise level rules. Methods The authors evaluated RAEs in active use across Partners Healthcare, including enterprise wide, ambulatory only, and system specific tools, with a focus on rule editors for reminder and medication rules. We conducted meetings with users of these RAEs to discuss their general experience and perceived advantages and limitations of these tools. Results While the overall rule authoring process is similar across the 10 separate RAEs, the system capabilities and architecture vary widely. Most current RAEs limit the ability of the clinical decision support (CDS) interventions to be standardized, sharable, interoperable, and extensible. No existing system meets all requirements defined by knowledge management users. Conclusions A successful, scalable, integrated rule authoring environment will need to support a number of key requirements and functions in the areas of knowledge representation, metadata, terminology, authoring collaboration, user interface, integration with electronic health record (EHR) systems, testing, and reporting. PMID:23145874

  10. Phenotypic and genotypic data integration and exploration through a web-service architecture.

    PubMed

    Nuzzo, Angelo; Riva, Alberto; Bellazzi, Riccardo

    2009-10-15

    Linking genotypic and phenotypic information is one of the greatest challenges of current genetics research. The definition of an Information Technology infrastructure to support these kinds of studies, and in particular studies aimed at the analysis of complex traits, which require the definition of multifaceted phenotypes and the integration of genotypic information to discover the most prevalent diseases, is a paradigmatic goal of Biomedical Informatics. This paper describes the use of Information Technology methods and tools to develop a system for the management, inspection and integration of phenotypic and genotypic data. We present the design and architecture of the Phenotype Miner, a software system able to flexibly manage phenotypic information, and its extended functionalities to retrieve genotype information from external repositories and relate it to phenotypic data. For this purpose we developed a module that allows customized data upload by the user and a SOAP-based communications layer to retrieve data from existing biomedical knowledge management tools. In this paper we also demonstrate the system's functionality through an example application in which we analyze two related genomic datasets. We show how a comprehensive, integrated and automated workbench for genotype and phenotype integration can facilitate and improve the hypothesis generation process underlying modern genetic studies.

  11. Mayo clinical Text Analysis and Knowledge Extraction System (cTAKES): architecture, component evaluation and applications

    PubMed Central

    Masanz, James J; Ogren, Philip V; Zheng, Jiaping; Sohn, Sunghwan; Kipper-Schuler, Karin C; Chute, Christopher G

    2010-01-01

    We aim to build and evaluate an open-source natural language processing system for information extraction from electronic medical record clinical free-text. We describe and evaluate our system, the clinical Text Analysis and Knowledge Extraction System (cTAKES), released open-source at http://www.ohnlp.org. The cTAKES builds on existing open-source technologies—the Unstructured Information Management Architecture framework and OpenNLP natural language processing toolkit. Its components, specifically trained for the clinical domain, create rich linguistic and semantic annotations. Performance of individual components: sentence boundary detector accuracy=0.949; tokenizer accuracy=0.949; part-of-speech tagger accuracy=0.936; shallow parser F-score=0.924; named entity recognizer and system-level evaluation F-score=0.715 for exact and 0.824 for overlapping spans, and accuracy for concept mapping, negation, and status attributes for exact and overlapping spans of 0.957, 0.943, 0.859, and 0.580, 0.939, and 0.839, respectively. Overall performance is discussed against five applications. The cTAKES annotations are the foundation for methods and modules for higher-level semantic processing of clinical free-text. PMID:20819853

  12. A web-based system architecture for ontology-based data integration in the domain of IT benchmarking

    NASA Astrophysics Data System (ADS)

    Pfaff, Matthias; Krcmar, Helmut

    2018-03-01

    In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
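    The following sketch illustrates the ontology-as-access-layer idea under stated assumptions: ontology concepts are linked to the tables and columns that physically hold the data, and a lookup against the ontology resolves a concept to its storage location. The namespace, concepts and mappings are hypothetical, not the actual ITBM ontology or schema.

    # Hedged sketch: an ontology graph (rdflib) records where each concept's
    # data physically lives, so queries against concepts can be routed to the
    # right database. All URIs, tables and columns are invented placeholders.
    from rdflib import Graph, Literal, Namespace

    ITBM = Namespace("http://example.org/itbm#")
    g = Graph()

    # Link ontology concepts to their (hypothetical) physical storage locations.
    g.add((ITBM.ServerCost, ITBM.mappedToTable, Literal("benchmark_2017.costs")))
    g.add((ITBM.ServerCost, ITBM.mappedToColumn, Literal("server_cost_eur")))
    g.add((ITBM.TicketVolume, ITBM.mappedToTable, Literal("benchmark_2017.helpdesk")))
    g.add((ITBM.TicketVolume, ITBM.mappedToColumn, Literal("tickets_per_month")))

    def storage_for(concept):
        """Resolve a concept to (table, column) by reading the mapping triples."""
        table = g.value(concept, ITBM.mappedToTable)
        column = g.value(concept, ITBM.mappedToColumn)
        return str(table), str(column)

    print(storage_for(ITBM.ServerCost))   # ('benchmark_2017.costs', 'server_cost_eur')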

  13. "Re-Casting Terra Nullius Design-Blindness": Better Teaching of Indigenous Knowledge and Protocols in Australian Architecture Education

    ERIC Educational Resources Information Center

    Tucker, Richard; Choy, Darryl Low; Heyes, Scott; Revell, Grant; Jones, David

    2018-01-01

    This paper reviews the current status and focus of Australian Architecture programs with respect to Indigenous Knowledge and the extent to which these tertiary programs currently address reconciliation and respect to Indigenous Australians in relation to their professional institutions and accreditation policies. The paper draws upon the findings…

  14. Knowledge-based vision for space station object motion detection, recognition, and tracking

    NASA Technical Reports Server (NTRS)

    Symosek, P.; Panda, D.; Yalamanchili, S.; Wehner, W., III

    1987-01-01

    Computer vision, especially color image analysis and understanding, has much to offer in the area of the automation of Space Station tasks such as construction, satellite servicing, rendezvous and proximity operations, inspection, experiment monitoring, data management and training. Knowledge-based techniques improve the performance of vision algorithms for unstructured environments because of their ability to deal with imprecise a priori information or inaccurately estimated feature data and still produce useful results. Conventional techniques using statistical and purely model-based approaches lack flexibility in dealing with the variabilities anticipated in the unstructured viewing environment of space. Algorithms developed under NASA sponsorship for Space Station applications to demonstrate the value of a hypothesized architecture for a Video Image Processor (VIP) are presented. Approaches to the enhancement of the performance of these algorithms with knowledge-based techniques and the potential for deployment of highly-parallel multi-processor systems for these algorithms are discussed.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berra, P.B.; Chung, S.M.; Hachem, N.I.

    This article presents techniques for managing a very large data/knowledge base to support multiple inference mechanisms for logic programming. Because evaluation of goals can require accessing data from the extensional database, or EDB, in very general ways, one must often resort to indexing on all fields of the extensional database facts. This presents a formidable management problem in that the index data may be larger than the EDB itself. The problem becomes even more serious in the case of very large data/knowledge bases (hundreds of gigabytes), since considerably more hardware will be required to process and store the index data. In order to reduce the amount of index data considerably without losing generality, the authors form a surrogate file, which is a hashing transformation of the facts. Superimposed code words (SCW), concatenated code words (CCW), and transformed inverted lists (TIL) are possible structures for the surrogate file. Since these transformations are quite regular and compact, the authors consider possible computer architectures for the processing of the surrogate file.
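    A small sketch of the superimposed-code-word idea: each field of a fact sets a few hashed bits in a fixed-width code word, the field masks are OR-ed (superimposed) together, and a partially specified query retrieves every fact whose code word covers the query's bits, possibly with false drops that must then be checked against the EDB. The code-word width and bits-per-field below are illustrative choices, not the article's parameters.

    # Toy superimposed code word (SCW) surrogate file.
    import hashlib

    WIDTH = 64          # bits per code word
    BITS_PER_FIELD = 3  # bits set per attribute value

    def field_mask(value: str) -> int:
        mask = 0
        digest = hashlib.sha256(value.encode()).digest()
        for i in range(BITS_PER_FIELD):
            mask |= 1 << (digest[i] % WIDTH)
        return mask

    def code_word(fact: tuple) -> int:
        scw = 0
        for value in fact:                 # superimpose (OR) all field masks
            scw |= field_mask(value)
        return scw

    facts = [("supplies", "acme", "bolts"), ("supplies", "zenith", "nuts")]
    surrogate_file = [code_word(f) for f in facts]

    query_mask = field_mask("acme")        # partially specified query: one field = "acme"
    candidates = [f for f, scw in zip(facts, surrogate_file)
                  if scw & query_mask == query_mask]
    print(candidates)                      # [('supplies', 'acme', 'bolts')] plus any false drops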

  16. A Reference Architecture for Space Information Management

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.

    2006-01-01

    We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.

  17. Flood Risk Assessments of Architectural Heritage - Case of Changgyeonggung Palace

    NASA Astrophysics Data System (ADS)

    Lee, Hyosang; Kim, Ji-sung; Lee, Ho-jin

    2014-05-01

    The risk of natural disasters such as floods and earthquakes has increased due to recent extreme weather events. Therefore, the necessity of a risk management system to protect architectural properties, a cultural heritage of humanity, from natural disasters has been consistently felt. Solutions for managing flood risk focusing on architectural heritage are suggested and applied to protect Changgyeonggung Palace, a major palace heritage in Seoul. Probable rainfall scenarios for risk assessment (frequency: 100 years, 200 years, and 500 years) and a probable maximum precipitation (PMP) scenario are constructed, and a previous rainfall event (from July 26th to 28th, 2011) is identified; these are used in the models (HEC-HMS, SWMM) to estimate flood volumes for the area covering Changgyeonggung Palace. The estimated flood volumes make it possible to identify inundation risks based on GIS models and to assess the flood risk of individual architectural heritage items. The results of the risk assessment are used to establish a disaster risk management system that managers of architectural properties can utilize. According to the flood risk assessment of Changgyeonggung Palace, inundation occurs near the outlets of Changgyeonggung Palace and sections of the river channel under all flood risk scenarios, but the inundation risk of major architectural properties was estimated to be low. The methods for assessing the flood risk of architectural heritage proposed in this study, and the risk management system for Changgyeonggung Palace based on them, provide a thorough approach to flood risk management with high potential for practical use. A comprehensive management system for architectural heritage will be established in the future through a review of diverse disaster factors.

  18. DAsHER CD: Developing a Data-Oriented Human-Centric Enterprise Architecture for EarthCube

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Yu, M.; Sun, M.; Qin, H.; Robinson, E.

    2015-12-01

    One of the biggest challenges facing Earth scientists is discovering, accessing, and sharing resources in the desired fashion. EarthCube aims to enable geoscientists to address these challenges by fostering community-governed efforts that develop a common cyberinfrastructure for collecting, accessing, analyzing, sharing and visualizing all forms of data and related resources, through the use of advanced technological and computational capabilities. Here we design an Enterprise Architecture (EA) for EarthCube to facilitate knowledge management, communication and human collaboration in pursuit of unprecedented data sharing across the geosciences. The design results will provide EarthCube with a reference framework for developing geoscience cyberinfrastructure in collaboration with different stakeholders and for identifying topics of high interest to the community. The development of this EarthCube EA framework leverages popular frameworks such as Zachman, Gartner, DoDAF, and FEAF. The science driver for this design is the set of needs of the EarthCube community, including the analyzed user requirements from the EarthCube End User Workshop reports and EarthCube working group roadmaps, and feedback and comments from scientists gathered at workshops. The final product of this Enterprise Architecture is a four-volume reference document: 1) Volume one is this document and comprises an executive summary of the EarthCube architecture, serving as an overview in the initial phases of architecture development; 2) Volume two is the major body of the design product and outlines all the architectural design components and viewpoints; 3) Volume three provides a taxonomy of the EarthCube enterprise augmented with semantic relations; 4) Volume four describes an example of utilizing this architecture for a geoscience project.

  19. Generalized Information Architecture for Managing Requirements in IBM's Rational DOORS® Application.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner

    When a requirements engineering effort fails to meet expectations, the requirements management tool is often blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool.

  20. Examining the architecture of cellular computing through a comparative study with a computer

    PubMed Central

    Wang, Degeng; Gribskov, Michael

    2005-01-01

    The computer and the cell both use information embedded in simple coding, the binary software code and the quadruple genomic code, respectively, to support system operations. A comparative examination of their system architecture as well as their information storage and utilization schemes is performed. On top of the code, both systems display a modular, multi-layered architecture, which, in the case of a computer, arises from human engineering efforts through a combination of hardware implementation and software abstraction. Using the computer as a reference system, a simplistic mapping of the architectural components between the two is easily detected. This comparison also reveals that a cell abolishes the software–hardware barrier through genomic encoding for the constituents of the biochemical network, a cell's ‘hardware’ equivalent to the computer central processing unit (CPU). The information loading (gene expression) process acts as a major determinant of the encoded constituent's abundance, which, in turn, often determines the ‘bandwidth’ of a biochemical pathway. Cellular processes are implemented in biochemical pathways in parallel manners. In a computer, on the other hand, the software provides only instructions and data for the CPU. A process represents just sequentially ordered actions by the CPU and only virtual parallelism can be implemented through CPU time-sharing. Whereas process management in a computer may simply mean job scheduling, coordinating pathway bandwidth through the gene expression machinery represents a major process management scheme in a cell. In summary, a cell can be viewed as a super-parallel computer, which computes through controlled hardware composition. While we have, at best, a very fragmented understanding of cellular operation, we have a thorough understanding of the computer throughout the engineering process. The potential utilization of this knowledge to the benefit of systems biology is discussed. PMID:16849179

  1. Examining the architecture of cellular computing through a comparative study with a computer.

    PubMed

    Wang, Degeng; Gribskov, Michael

    2005-06-22

    The computer and the cell both use information embedded in simple coding, the binary software code and the quadruple genomic code, respectively, to support system operations. A comparative examination of their system architecture as well as their information storage and utilization schemes is performed. On top of the code, both systems display a modular, multi-layered architecture, which, in the case of a computer, arises from human engineering efforts through a combination of hardware implementation and software abstraction. Using the computer as a reference system, a simplistic mapping of the architectural components between the two is easily detected. This comparison also reveals that a cell abolishes the software-hardware barrier through genomic encoding for the constituents of the biochemical network, a cell's "hardware" equivalent to the computer central processing unit (CPU). The information loading (gene expression) process acts as a major determinant of the encoded constituent's abundance, which, in turn, often determines the "bandwidth" of a biochemical pathway. Cellular processes are implemented in biochemical pathways in parallel manners. In a computer, on the other hand, the software provides only instructions and data for the CPU. A process represents just sequentially ordered actions by the CPU and only virtual parallelism can be implemented through CPU time-sharing. Whereas process management in a computer may simply mean job scheduling, coordinating pathway bandwidth through the gene expression machinery represents a major process management scheme in a cell. In summary, a cell can be viewed as a super-parallel computer, which computes through controlled hardware composition. While we have, at best, a very fragmented understanding of cellular operation, we have a thorough understanding of the computer throughout the engineering process. The potential utilization of this knowledge to the benefit of systems biology is discussed.

  2. The Need for Software Architecture Evaluation in the Acquisition of Software-Intensive Sysetms

    DTIC Science & Technology

    2014-01-01

    Function and Performance Specification; GIG Global Information Grid; ISO International Standard Organisation; MDA Model Driven Architecture ... architecture and design, which is a key part of a knowledge-based economy (DSTO-TR-2936) ... Allow Australian SMEs to

  3. Using ArchE in the Classroom: One Experience

    DTIC Science & Technology

    2007-09-01

    The Architecture Expert (ArchE) tool serves as a software architecture design assistant. It embodies knowledge of quality attributes and the relation...between the achievement of quality attribute requirements and architecture design. This technical note describes the use of a pre-alpha release of

  4. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semi-autonomous agent process consists of a number of knowledge sources together with inter-agent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes or demons provide an event-driven means of giving active objects shared access to resources, and to each other, without violating their security.
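    A toy blackboard sketch in the spirit of the agents described above: knowledge sources watch a shared blackboard and post new entries when their trigger fires, giving event-driven activation. The knowledge sources shown are invented examples, not AI Bus code.

    # Minimal blackboard with event-driven knowledge sources (illustrative only).
    from typing import Callable, Dict, List, Tuple

    class Blackboard:
        def __init__(self) -> None:
            self.entries: Dict[str, object] = {}
            self.sources: List[Tuple[Callable[[Dict], bool], Callable[[Dict], Dict]]] = []

        def add_knowledge_source(self, trigger: Callable[[Dict], bool],
                                 action: Callable[[Dict], Dict]) -> None:
            self.sources.append((trigger, action))

        def post(self, key: str, value: object) -> None:
            self.entries[key] = value
            self._run()                                   # event-driven activation

        def _run(self) -> None:
            for trigger, action in self.sources:
                if trigger(self.entries):
                    for k, v in action(self.entries).items():
                        if k not in self.entries:
                            self.post(k, v)

    bb = Blackboard()
    bb.add_knowledge_source(
        trigger=lambda e: "raw_telemetry" in e and "anomaly" not in e,
        action=lambda e: {"anomaly": e["raw_telemetry"] > 100})
    bb.add_knowledge_source(
        trigger=lambda e: e.get("anomaly") is True,
        action=lambda e: {"alert": "notify operator"})

    bb.post("raw_telemetry", 142)
    print(bb.entries)   # {'raw_telemetry': 142, 'anomaly': True, 'alert': 'notify operator'}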

  5. Knowledge-acquisition tools for medical knowledge-based systems.

    PubMed

    Lanzola, G; Quaglini, S; Stefanelli, M

    1995-03-01

    Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts. The proposed solution is to build efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is to develop a computational framework that allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated with examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.

  6. EXPECT: Explicit Representations for Flexible Acquisition

    NASA Technical Reports Server (NTRS)

    Swartout, BIll; Gil, Yolanda

    1995-01-01

    To create more powerful knowledge acquisition systems, we not only need better acquisition tools, we need to change the architecture of the knowledge-based systems we create so that their structure provides better support for acquisition. Current acquisition tools permit users to modify factual knowledge, but they provide limited support for modifying problem-solving knowledge. In this paper, the authors argue that this limitation (and others) stems from the use of incomplete models of problem-solving knowledge and the inflexible specification of the interdependencies between problem-solving and factual knowledge. We describe the EXPECT architecture, which addresses these problems by providing an explicit representation of problem-solving knowledge and intent. Using this more explicit representation, EXPECT can automatically derive the interdependencies between problem-solving and factual knowledge. By deriving these interdependencies from the structure of the knowledge-based system itself, EXPECT supports more flexible and powerful knowledge acquisition.

  7. Molecular basis of angiosperm tree architecture.

    PubMed

    Hollender, Courtney A; Dardick, Chris

    2015-04-01

    The architecture of trees greatly impacts the productivity of orchards and forestry plantations. Amassing greater knowledge on the molecular genetics that underlie tree form can benefit these industries, as well as contribute to basic knowledge of plant developmental biology. This review describes the fundamental components of branch architecture, a prominent aspect of tree structure, as well as genetic and hormonal influences inferred from studies in model plant systems and from trees with non-standard architectures. The bulk of the molecular and genetic data described here is from studies of fruit trees and poplar, as these species have been the primary subjects of investigation in this field of science. No claim to original US Government works. New Phytologist © 2014 New Phytologist Trust.

  8. Artificial intelligence costs, benefits, risks for selected spacecraft ground system automation scenarios

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walter F.; Silverman, Barry G.; Kahn, Martha; Hexmoor, Henry

    1988-01-01

    In response to a number of high-level strategy studies in the early 1980s, expert systems and artificial intelligence (AI/ES) efforts for spacecraft ground systems have proliferated in the past several years primarily as individual small to medium scale applications. It is useful to stop and assess the impact of this technology in view of lessons learned to date, and hopefully, to determine if the overall strategies of some of the earlier studies both are being followed and still seem relevant. To achieve that end four idealized ground system automation scenarios and their attendant AI architecture are postulated and benefits, risks, and lessons learned are examined and compared. These architectures encompass: (1) no AI (baseline), (2) standalone expert systems, (3) standardized, reusable knowledge base management systems (KBMS), and (4) a futuristic unattended automation scenario. The resulting artificial intelligence lessons learned, benefits, and risks for spacecraft ground system automation scenarios are described.

  9. Artificial intelligence costs, benefits, and risks for selected spacecraft ground system automation scenarios

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walter F.; Silverman, Barry G.; Kahn, Martha; Hexmoor, Henry

    1988-01-01

    In response to a number of high-level strategy studies in the early 1980s, expert systems and artificial intelligence (AI/ES) efforts for spacecraft ground systems have proliferated in the past several years primarily as individual small to medium scale applications. It is useful to stop and assess the impact of this technology in view of lessons learned to date, and hopefully, to determine if the overall strategies of some of the earlier studies both are being followed and still seem relevant. To achieve that end four idealized ground system automation scenarios and their attendant AI architecture are postulated and benefits, risks, and lessons learned are examined and compared. These architectures encompass: (1) no AI (baseline); (2) standalone expert systems; (3) standardized, reusable knowledge base management systems (KBMS); and (4) a futuristic unattended automation scenario. The resulting artificial intelligence lessons learned, benefits, and risks for spacecraft ground system automation scenarios are described.

  10. Developing a New Framework for Integration and Teaching of Computer Aided Architectural Design (CAAD) in Nigerian Schools of Architecture

    ERIC Educational Resources Information Center

    Uwakonye, Obioha; Alagbe, Oluwole; Oluwatayo, Adedapo; Alagbe, Taiye; Alalade, Gbenga

    2015-01-01

    As a result of globalization of digital technology, intellectual discourse on what constitutes the basic body of architectural knowledge to be imparted to future professionals has been on the increase. This digital revolution has brought to the fore the need to review the already overloaded architectural education curriculum of Nigerian schools of…

  11. Integrating planning, execution, and learning

    NASA Technical Reports Server (NTRS)

    Kuokka, Daniel R.

    1989-01-01

    To achieve the goal of building an autonomous agent, the usually disjoint capabilities of planning, execution, and learning must be used together. An architecture, called MAX, within which cognitive capabilities can be purposefully and intelligently integrated is described. The architecture supports the codification of capabilities as explicit knowledge that can be reasoned about. In addition, specific problem solving, learning, and integration knowledge is developed.

  12. The architecture of personality.

    PubMed

    Cervone, David

    2004-01-01

    This article presents a theoretical framework for analyzing psychological systems that contribute to the variability, consistency, and cross-situational coherence of personality functioning. In the proposed knowledge-and-appraisal personality architecture (KAPA), personality structures and processes are delineated by combining 2 principles: distinctions (a) between knowledge structures and appraisal processes and (b) among intentional cognitions with varying directions of fit, with the latter distinction differentiating among beliefs, evaluative standards, and aims. Basic principles of knowledge activation and use illuminate relations between knowledge and appraisal, yielding a synthetic account of personality structures and processes. Novel empirical data illustrate the heuristic value of the knowledge/appraisal distinction by showing how self-referent and situational knowledge combine to foster cross-situational coherence in appraisals of self-efficacy.

  13. What the Logs Can Tell You: Mediation to Implement Feedback in Training

    NASA Technical Reports Server (NTRS)

    Maluf, David; Wiederhold, Gio; Abou-Khalil, Ali; Norvig, Peter (Technical Monitor)

    2000-01-01

    The problem addressed by Mediation to Implement Feedback in Training (MIFT) is to customize the feedback from training exercises by exploiting knowledge about the training scenario, training objectives, and specific student/teacher needs. We achieve this by inserting an intelligent mediation layer into the information flow from observations collected during training exercises to the display and user interface. Knowledge about training objectives, scenarios, and tasks is maintained in the mediating layer. A design constraint is that domain experts must be able to extend mediators by adding domain-specific knowledge that supports additional aggregations, abstractions, and views of the results of training exercises. The MIFT mediation concept is intended to be integrated with existing military training exercise management tools and to reduce the cost of developing and maintaining separate feedback and evaluation tools for every training simulator and every set of customer needs. The MIFT architecture is designed as a set of independently reusable components that interact with each other through standardized formalisms such as the Knowledge Interchange Format (KIF) and the Knowledge Query and Manipulation Language (KQML).

  14. Reference architecture of application services for personal wellbeing information management.

    PubMed

    Tuomainen, Mika; Mykkänen, Juha

    2011-01-01

    Personal information management has been proposed as an important enabler for individual empowerment concerning citizens' wellbeing and health information. In the MyWellbeing project in Finland, a strictly citizen-driven concept of "Coper" and related architectural and functional guidelines have been specified. We present a reference architecture and a set of identified application services to support personal wellbeing information management. In addition, the related standards and developments are discussed.

  15. A digital protection system incorporating knowledge based learning

    NASA Astrophysics Data System (ADS)

    Watson, Karan; Russell, B. Don; McCall, Kurt

    A digital system architecture used to diagnose the operating state and health of electric distribution lines and to generate actions for line protection is presented. The architecture is described functionally and, to a limited extent, at the hardware level. This architecture incorporates multiple analysis and fault-detection techniques utilizing a variety of parameters. In addition, a knowledge-based decision maker, a long-term memory retention and recall scheme, and a learning environment are described. Preliminary laboratory implementations of the system elements have been completed. Enhanced protection for electric distribution feeders is provided by this system. Advantages of the system are enumerated.
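
    The following is a hedged sketch of the general idea of combining several independent fault-detection techniques with a weighted, knowledge-based decision rule; the detector functions, thresholds, and weights are invented for illustration and are not taken from the paper.

    ```python
    # Minimal sketch (not the authors' system): several detectors vote, and a
    # weighted decision rule decides whether to trip the feeder protection.
    def overcurrent_detector(samples):
        return max(abs(s) for s in samples) > 400.0       # amps; hypothetical threshold

    def harmonic_detector(samples):
        # Crude proxy for high-frequency content associated with arcing faults.
        diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
        return sum(diffs) / len(diffs) > 50.0

    def decide(samples, weights=None, trip_level=0.6):
        """Weighted vote over detector outputs; trips when confidence exceeds trip_level."""
        weights = weights or {"overcurrent": 0.7, "harmonic": 0.5}
        votes = {"overcurrent": overcurrent_detector(samples),
                 "harmonic": harmonic_detector(samples)}
        confidence = sum(weights[k] for k, fired in votes.items() if fired) / sum(weights.values())
        return ("TRIP" if confidence >= trip_level else "MONITOR"), confidence

    print(decide([120, 180, 430, 390, 410]))  # both detectors fire -> ('TRIP', 1.0)
    ```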

  16. Harnessing the Risk-Related Data Supply Chain: An Information Architecture Approach to Enriching Human System Research and Operations Knowledge

    NASA Technical Reports Server (NTRS)

    Buquo, Lynn E.; Johnson-Throop, Kathy A.

    2011-01-01

    An Information Architecture facilitates the understanding and, hence, harnessing of the human system risk-related data supply chain, which enhances the ability to securely collect, integrate, and share data assets that improve human system research and operations. By mapping the risk-related data flow from raw data to usable information and knowledge (think of it as a data supply chain), the Human Research Program (HRP) and Space Life Science Directorate (SLSD) are building an information architecture plan to leverage their existing, and often shared, IT infrastructure.

  17. SigmaCLIPSE = presentation management + NASA CLIPS + SQL

    NASA Technical Reports Server (NTRS)

    Weiss, Bernard P., Jr.

    1990-01-01

    SigmaCLIPSE provides an expert systems and 'intelligent' data base development program for diverse systems integration environments that require support for automated reasoning and expert systems technology, presentation management, and access to 'intelligent' SQL data bases. The SigmaCLIPSE technology, with its integrated ability to access 4th generation application development and decision support tools through a portable SQL interface, comprises a sophisticated software development environment for solving knowledge engineering and expert systems development problems in information-intensive commercial environments -- financial services, health care, and distributed process control -- where the expert system must be extendable, a major architectural advantage of NASA CLIPS. SigmaCLIPSE is a research effort intended to test the viability of merging SQL data bases with expert systems technology.
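
    CLIPS itself is not shown here; the hedged sketch below only illustrates the general pattern of pulling facts out of an SQL database and evaluating rules over them, using Python's built-in sqlite3 module and a plain predicate standing in for a CLIPS rule. The table, data, and rule are hypothetical.

    ```python
    import sqlite3

    # Hypothetical data: the real system would query an external "intelligent" SQL database.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE claims (claim_id TEXT, amount REAL, prior_claims INTEGER)")
    db.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                   [("C-1", 1200.0, 0), ("C-2", 98000.0, 4)])

    # Facts are derived from SQL rows, then handed to rules; in SigmaCLIPSE that role
    # belongs to CLIPS, here it is a plain Python predicate purely for illustration.
    def fetch_facts():
        cur = db.execute("SELECT claim_id, amount, prior_claims FROM claims")
        return [{"claim_id": r[0], "amount": r[1], "prior_claims": r[2]} for r in cur]

    def review_rule(fact):
        """Flag high-value claims from claimants with a history of prior claims."""
        return fact["amount"] > 50000 and fact["prior_claims"] >= 3

    for fact in fetch_facts():
        if review_rule(fact):
            print("refer to specialist:", fact["claim_id"])
    ```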

  18. A conceptual cognitive architecture for robots to learn behaviors from demonstrations in robotic aid area.

    PubMed

    Tan, Huan; Liang, Chen

    2011-01-01

    This paper proposes a conceptual hybrid cognitive architecture for cognitive robots to learn behaviors from demonstrations in robotic aid situations. Unlike current cognitive architectures, this architecture concentrates on the requirements of safety, interaction, and non-centralized processing in robotic aid situations. Imitation learning technologies for cognitive robots have been integrated into this architecture for rapidly transferring knowledge and skills between human teachers and robots.

  19. Pumping up the volume - vacuole biogenesis in Arabidopsis thaliana.

    PubMed

    Krüger, Falco; Schumacher, Karin

    2017-07-08

    Plant architecture follows the need to collect CO2, solar energy, water and mineral nutrients via large surface areas. It is by the presence of a central vacuole that fills much of the cell volume that plants manage to grow at low metabolic cost. In addition, vacuoles buffer the fluctuating supply of essential nutrients and help to detoxify the cytosol when plants are challenged by harmful molecules. Despite their large size and multiple important functions, our knowledge of vacuole biogenesis and the machinery underlying their amazing dynamics is still fragmentary. In this review, we try to reconcile past and present models for vacuole biogenesis with the current knowledge of multiple parallel vacuolar trafficking pathways and the molecular machineries driving membrane fusion and organelle shape. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Developing a New Thesaurus for Art and Architecture.

    ERIC Educational Resources Information Center

    Petersen, Toni

    1990-01-01

    This description of the development of the Art and Architecture Thesaurus from 1979 to the present explains the processes and policies that were used to construct a language designed to represent knowledge in art and architecture, as well as to be a surrogate for the image and object being described. (EAM)

  1. The Architecture of "Educare": Motion and Emotion in Postwar Educational Spaces

    ERIC Educational Resources Information Center

    Kozlovsky, Roy

    2010-01-01

    This essay explores the interplay between educational and architectural methodologies for analysing the school environment. It historicises the affinity between architectural and educational practices and modes of knowledge pertaining to the child's body during the period of postwar reconstruction in England to argue that educational spaces were…

  2. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    ERIC Educational Resources Information Center

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  3. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IVV) Program, with Software Assurance Research Program support, extracted FM architectures across the IVV portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IVV projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management.

  4. Knowledge-Based Systems Research

    DTIC Science & Technology

    1990-08-24

    Subject keywords: artificial intelligence, blackboard systems, constraint satisfaction, knowledge acquisition, symbolic simulation, logic-based systems with self-awareness; SOAR, an architecture for general intelligence and learning. Cited reference: Rosenbloom, P. S., Laird, J. E., Newell, A., and McCarl, R. (1991), "A Preliminary Analysis of the SOAR Architecture as a Basis for General Intelligence," Artificial Intelligence.

  5. Re-Engineering Complex Legacy Systems at NASA

    NASA Technical Reports Server (NTRS)

    Ruszkowski, James; Meshkat, Leila

    2010-01-01

    The Flight Production Process (FPP) Re-engineering project has established a Model-Based Systems Engineering (MBSE) methodology and the technological infrastructure for the design and development of a reference, product-line architecture as well as an integrated workflow model for the Mission Operations System (MOS) for human space exploration missions at NASA Johnson Space Center. The design and architectural artifacts have been developed based on the expertise and knowledge of numerous Subject Matter Experts (SMEs). The technological infrastructure developed by the FPP Re-engineering project has enabled the structured collection and integration of this knowledge and further provides simulation and analysis capabilities for optimization purposes. A key strength of this strategy has been the judicious combination of COTS products with custom coding. The lean management approach that has led to the success of this project is based on having a strong vision for the whole lifecycle of the project and its progress over time, a goal-based design and development approach, a small team of highly specialized people in areas that are critical to the project, and an interactive approach for infusing new technologies into existing processes. This project, which has had a relatively small amount of funding, is on the cutting edge with respect to the utilization of model-based design and systems engineering. An overarching challenge that was overcome by this project was to convince upper management of the needs and merits of giving up more conventional design methodologies (such as paper-based documents and unwieldy and unstructured flow diagrams and schedules) in favor of advanced model-based systems engineering approaches.

  6. 41 CFR 102-76.65 - What standards must facilities subject to the Architectural Barriers Act meet?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What standards must... Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.65 What standards...

  7. 41 CFR 102-76.65 - What standards must facilities subject to the Architectural Barriers Act meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What standards must... Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.65 What standards...

  8. 41 CFR 102-76.65 - What standards must facilities subject to the Architectural Barriers Act meet?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What standards must... Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.65 What standards...

  9. 41 CFR 102-76.65 - What standards must facilities subject to the Architectural Barriers Act meet?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What standards must... Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.65 What standards...

  10. 41 CFR 102-76.65 - What standards must facilities subject to the Architectural Barriers Act meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What standards must... Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.65 What standards...

  11. 41 CFR 102-76.60 - To which facilities does the Architectural Barriers Act apply?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false To which facilities does... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.60 To which facilities does the...

  12. 41 CFR 102-76.60 - To which facilities does the Architectural Barriers Act apply?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false To which facilities does... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.60 To which facilities does the...

  13. 41 CFR 102-76.60 - To which facilities does the Architectural Barriers Act apply?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false To which facilities does... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.60 To which facilities does the...

  14. 41 CFR 102-76.60 - To which facilities does the Architectural Barriers Act apply?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false To which facilities does... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.60 To which facilities does the...

  15. 41 CFR 102-76.60 - To which facilities does the Architectural Barriers Act apply?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false To which facilities does... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Architectural Barriers Act § 102-76.60 To which facilities does the...

  16. Integrating software architectures for distributed simulations and simulation analysis communities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  17. Implementing partnership-driven clinical federated electronic health record data sharing networks.

    PubMed

    Stephens, Kari A; Anderson, Nicholas; Lin, Ching-Ping; Estiri, Hossein

    2016-09-01

    Building federated data sharing architectures requires supporting a range of data owners, effective and validated semantic alignment between data resources, and consistent focus on end-users. Establishing these resources requires development methodologies that support internal validation of data extraction and translation processes, sustaining meaningful partnerships, and delivering clear and measurable system utility. We describe findings from two federated data sharing case examples that detail critical factors, shared outcomes, and production environment results. Two federated data sharing pilot architectures developed to support network-based research associated with the University of Washington's Institute of Translational Health Sciences provided the basis for the findings. A spiral model for implementation and evaluation was used to structure iterations of development and support knowledge sharing between the two network development teams, which cross-collaborated to support and manage common stages. We found that using a spiral model of software development and multiple cycles of iteration was effective in achieving early network design goals. Both networks required time- and resource-intensive efforts to establish a trusted environment to create the data sharing architectures. Both networks were challenged by the need for adaptive use cases to define and test utility. An iterative cyclical model of development provided a process for developing trust with data partners and refining the design, and supported measurable success in the development of new federated data sharing architectures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Architecture Governance: The Importance of Architecture Governance for Achieving Operationally Responsive Ground Systems

    NASA Technical Reports Server (NTRS)

    Kolar, Mike; Estefan, Jeff; Giovannoni, Brian; Barkley, Erik

    2011-01-01

    Topics covered: (1) Why Governance and Why Now? (2) Characteristics of Architecture Governance (3) Strategic Elements (3a) Architectural Principles (3b) Architecture Board (3c) Architecture Compliance (4) Architecture Governance Infusion Process. Governance is concerned with decision making (i.e., setting directions, establishing standards and principles, and prioritizing investments). Architecture governance is the practice and orientation by which enterprise architectures and other architectures are managed and controlled at an enterprise-wide level.

  19. HYDRA: A Middleware-Oriented Integrated Architecture for e-Procurement in Supply Chains

    NASA Astrophysics Data System (ADS)

    Alor-Hernandez, Giner; Aguilar-Lasserre, Alberto; Juarez-Martinez, Ulises; Posada-Gomez, Ruben; Cortes-Robles, Guillermo; Garcia-Martinez, Mario Alberto; Gomez-Berbis, Juan Miguel; Rodriguez-Gonzalez, Alejandro

    The Service-Oriented Architecture (SOA) development paradigm has emerged to improve the critical issues of creating, modifying and extending solutions for business processes integration, incorporating process automation and automated exchange of information between organizations. Web services technology follows the SOA's principles for developing and deploying applications. Moreover, Web services are considered the platform for SOA, for both intra- and inter-enterprise communication. However, an SOA does not incorporate information about occurring events into business processes, and such events and information delivery are the main features of supply chain management; they are addressed in an Event-Driven Architecture (EDA). Taking this into account, we propose a middleware-oriented integrated architecture that offers a brokering service for the procurement of products in a Supply Chain Management (SCM) scenario. As salient contributions, our system provides a hybrid architecture combining features of both SOA and EDA and a set of mechanisms for business processes pattern management, monitoring based on UML sequence diagrams, Web services-based management, event publish/subscription and reliable messaging service.
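
    The event-driven side of such a hybrid architecture rests on publish/subscribe messaging. The sketch below is a minimal in-process illustration of that mechanism, assuming a hypothetical "stock.low" topic; a production SCM system would use Web services and a real message broker rather than an in-memory bus.

    ```python
    from collections import defaultdict

    class EventBus:
        """In-process publish/subscribe bus, standing in for the EDA messaging layer."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self._subscribers[topic].append(handler)

        def publish(self, topic, event):
            for handler in self._subscribers[topic]:
                handler(event)

    bus = EventBus()
    # Procurement services react to supply-chain events instead of polling for them.
    bus.subscribe("stock.low", lambda e: print("create purchase order for", e["sku"]))
    bus.subscribe("stock.low", lambda e: print("notify planner about", e["sku"]))
    bus.publish("stock.low", {"sku": "PUMP-114", "on_hand": 3, "reorder_point": 10})
    ```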

  20. Advanced information processing system: Input/output network management software

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  1. On Some Aspects of Study on Dimensions and Proportions of Church Architecture

    NASA Astrophysics Data System (ADS)

    Kolobaeva, T. V.

    2017-11-01

    Architecture forms and arranges the environment required for a comfortable life and human activity. The modern principles of architectural space arrangement and form-making are represented by a reliable system of buildings which are used in design. Architects apply these principles and knowledge of space arrangement, together with the study of specialist and regulatory literature, when performing a particular creative task. This system of accumulated knowledge is perceived as an existing stereotype, with no regard for the understanding of form-making and the experience of the architects and thinkers of previous ages. We attempt to restore this connection, since the specific form-making regularities known to ancient architects should be taken into account. The paper gives an insight into some aspects of the traditional dimensions and proportions of church architecture.

  2. Executive control systems in the engineering design environment

    NASA Technical Reports Server (NTRS)

    Hurst, P. W.; Pratt, T. W.

    1985-01-01

    Executive Control Systems (ECSs) are software structures for the unification of various engineering design application programs into comprehensive systems with a central user interface (uniform access) method and a data management facility. Attention is presently given to the most significant findings of a research program conducted on 24 ECSs used in government and industry engineering design environments to integrate CAD/CAE application programs. Characterizations are given for the systems' major architectural components and the alternative design approaches considered in their development. Attention is given to ECS development prospects in the areas of interdisciplinary usage, standardization, knowledge utilization, and computer science technology transfer.

  3. Can big data transform electronic health records into learning health systems?

    PubMed

    Harper, Ellen

    2014-01-01

    In the United States and globally, healthcare delivery is in the midst of an acute transformation with the adoption and use of health information technology (health IT), thus generating increasing amounts of patient care data available in computable form. Secure and trusted use of these data, beyond their original purpose, can change the way we think about business, health, education, and innovation in the years to come. "Big Data" is data whose scale, diversity, and complexity require new architecture, techniques, algorithms, and analytics to manage it and extract value and hidden knowledge from it.

  4. Net-Centric Information and Knowledge Management and Dissemination for Data-to-Decision C2 Applications Using Intelligent Agents and Service-Oriented Architectures

    DTIC Science & Technology

    2011-11-01

  5. A practical approach for active camera coordination based on a fusion-driven multi-agent system

    NASA Astrophysics Data System (ADS)

    Bustamante, Alvaro Luis; Molina, José M.; Patricio, Miguel A.

    2014-04-01

    In this paper, we propose a multi-agent system architecture to manage spatially distributed active (or pan-tilt-zoom) cameras. Traditional video surveillance algorithms are of no use for active cameras, and we have to look at different approaches. Such multi-sensor surveillance systems have to be designed to solve two related problems: data fusion and coordinated sensor-task management. Generally, architectures proposed for the coordinated operation of multiple cameras are based on the centralisation of management decisions at the fusion centre. However, the existence of intelligent sensors capable of decision making brings with it the possibility of conceiving alternative decentralised architectures. This problem is approached by means of a MAS, integrating data fusion as an integral part of the architecture for distributed coordination purposes. This paper presents the MAS architecture and system agents.

  6. IPG Job Manager v2.0 Design Documentation

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2003-01-01

    This viewgraph presentation provides a high-level design of the IPG Job Manager, and satisfies its Master Requirement Specification v2.0 Revision 1.0, 01/29/2003. The presentation includes a Software Architecture/Functional Overview with the following: Job Model; Job Manager Client/Server Architecture; Job Manager Client (Job Manager Client Class Diagram and Job Manager Client Activity Diagram); Job Manager Server (Job Manager Client Class Diagram and Job Manager Client Activity Diagram); Development Environment; Project Plan; Requirement Traceability.

  7. 41 CFR 102-77.20 - With whom should Federal agencies collaborate with when commissioning and selecting art for...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false With whom should Federal...-77.20 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77...

  8. Component-Based Approach in Learning Management System Development

    ERIC Educational Resources Information Center

    Zaitseva, Larisa; Bule, Jekaterina; Makarov, Sergey

    2013-01-01

    The paper describes component-based approach (CBA) for learning management system development. Learning object as components of e-learning courses and their metadata is considered. The architecture of learning management system based on CBA being developed in Riga Technical University, namely its architecture, elements and possibilities are…

  9. The Expert Project Management System (EPMS)

    NASA Technical Reports Server (NTRS)

    Silverman, Barry G.; Diakite, Coty

    1986-01-01

    Successful project managers (PMs) have been shown to rely on 'intuition,' experience, and analogical reasoning heuristics. For new PMs to be trained and experienced PMs to avoid repeating others' mistakes, it is necessary to make the knowledge and heuristics of successful PMs more widely available. The preparers have evolved a model of PM thought processes over the last decade that is now ready to be implemented as a generic PM aid. This aid consists of a series of 'specialist' expert systems (CRITIC, LIBRARIAN, IDEA MAN, CRAFTSMAN, and WRITER) that communicate with each other via a 'blackboard' architecture. The various specialist expert systems are driven to support PM training and problem solving since any 'answers' they pass to the blackboard are subjected to conflict identification (AGENDA FORMULATOR) and GOAL SETTER inference engines.
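
    A minimal sketch of the blackboard coordination style described above is given below: specialist knowledge sources read partial results from a shared blackboard and post new ones, and a naive control loop keeps invoking them until nothing changes. The specialists' behaviors are invented for illustration; only their names echo the abstract.

    ```python
    class Blackboard:
        """Shared workspace that specialist knowledge sources read from and write to."""
        def __init__(self, problem):
            self.data = {"problem": problem}

    def librarian(bb):
        # Hypothetical behavior: attach prior-project references.
        bb.data["references"] = ["project-archive/section-4"]

    def idea_man(bb):
        # Hypothetical behavior: propose a candidate approach once references exist.
        if "references" in bb.data:
            bb.data["proposal"] = "phased rollout with weekly risk reviews"

    def critic(bb):
        # Hypothetical behavior: accept or reject the proposal.
        if "proposal" in bb.data:
            bb.data["verdict"] = "accepted"

    def control_loop(bb, specialists):
        """Naive controller: keep invoking specialists until no one adds new entries."""
        changed = True
        while changed:
            before = dict(bb.data)
            for specialist in specialists:
                specialist(bb)
            changed = bb.data != before
        return bb.data

    bb = Blackboard("schedule slip on subsystem integration")
    print(control_loop(bb, [critic, idea_man, librarian]))
    ```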

  10. Rapid Development: A Content Analysis Comparison of Literature and Purposive Sampling of AFRL Rapid Reaction Projects

    DTIC Science & Technology

    2011-12-01

    systems engineering technical and technical management processes. Technical Planning, Stakeholders Requirements Development, and Architecture Design were... Stakeholder Requirements Definition, Architecture Design and Technical Planning. A purposive sampling of AFRL rapid development program managers and engineers... emphasize one process over another; however, Architecture Design and Implementation scored higher among Technical Processes. Decision Analysis, Technical

  11. Application of Integration of HBIM and VR Technology to 3D Immersive Digital Management—Take Han Type Traditional Architecture as an Example

    NASA Astrophysics Data System (ADS)

    Lin, Y.-C.

    2017-08-01

    HBIM technology makes great contributions to the 3D digital preservation and management of existing traditional architecture, and VR technology has also received growing attention from 3D users in recent years, especially because a 3D immersive situation makes it easier for users to experience the real spatial field. Taking Han type traditional architecture, with its relatively complex geometrical structure, as an example, this research carries out digital preservation through HBIM technology and switches to a VR platform to allow users to enter a 3D immersive scene for management and display. The results show that applying the integration of HBIM and VR technology to Han type traditional architecture requires careful handling of the 3D digital model of the architecture: the polygon count should be kept below about 2 million to make operation in the VR environment smoother. The integration of the two technologies achieves the goal of 3D immersive digital management and provides a humanized application, close to real experience, for the display and subsequent management of ancient relics and architectural aesthetics.

  12. TMN: Introduction and interpretation

    NASA Astrophysics Data System (ADS)

    Pras, Aiko

    An overview of Telecommunications Management Network (TMN) status is presented. Its relation to Open System Interconnection (OSI) systems management is given, and the commonalities and distinctions are identified. Those aspects that distinguish TMN from OSI management are introduced; TMN's functional and physical architectures and TMN's logical layered architecture are discussed. An analysis of the concepts used by these architectures (reference point, interface, function block, and building block) is given. The use of these concepts to express geographical distribution and functional layering is investigated. This aspect is important for understanding how OSI management protocols can be used in a TMN environment. A statement regarding the applicability of TMN as a model that helps the designers of (management) networks is given.

  13. Students' Knowledge Sources and Knowledge Sharing in the Design Studio--An Exploratory Study

    ERIC Educational Resources Information Center

    Chiu, Sheng-Hsiao

    2010-01-01

    Architectural design is a knowledge-intensive activity; however, students frequently lack sufficient knowledge when they practice design. Collaborative learning can supplement the students' insufficient expertise. Successful collaborative learning relies on knowledge sharing between students. This implies that the peers are a considerable design…

  14. A development framework for distributed artificial intelligence

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1989-01-01

    The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.

  15. Utilizing Expert Knowledge in Estimating Future STS Costs

    NASA Technical Reports Server (NTRS)

    Fortner, David B.; Ruiz-Torres, Alex J.

    2004-01-01

    A method of estimating the costs of future space transportation systems (STSs) involves classical activity-based cost (ABC) modeling combined with systematic utilization of the knowledge and opinions of experts to extend the process-flow knowledge of existing systems to systems that involve new materials and/or new architectures. The expert knowledge is particularly helpful in filling gaps that arise in computational models of processes because of inconsistencies in historical cost data. Heretofore, the costs of planned STSs have been estimated following a "top-down" approach that tends to force the architectures of new systems to incorporate process flows like those of the space shuttles. In this ABC-based method, one makes assumptions about the processes, but otherwise follows a "bottom-up" approach that does not force the new system architecture to incorporate a space-shuttle-like process flow. Prototype software has been developed to implement this method. Through further development of software, it should be possible to extend the method beyond the space program to almost any setting in which there is a need to estimate the costs of a new system and to extend the applicable knowledge base in order to make the estimate.
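
    As a worked illustration of the bottom-up, activity-based approach, the sketch below sums driver quantity times unit rate over a handful of activities and scales each by an expert-elicited adjustment factor for a new architecture. Every activity, rate, and factor is made up for the example; none of it is real STS data.

    ```python
    # Hedged sketch of bottom-up activity-based costing: cost = sum over activities of
    # (driver quantity x unit rate), each scaled by an expert-elicited adjustment factor.
    activities = [
        # (activity, driver quantity, unit rate in $K, expert factor for new architecture)
        ("tank insulation inspection", 40, 12.0, 0.5),   # expert: new material halves effort
        ("engine refurbishment",        3, 900.0, 1.2),  # expert: new engines less mature
        ("avionics checkout",          25,   8.0, 1.0),
    ]

    def activity_based_cost(items):
        return sum(qty * rate * factor for _, qty, rate, factor in items)

    print(f"estimated per-flight processing cost: ${activity_based_cost(activities):,.0f}K")
    ```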

  16. An integrative architecture for a sensor-supported trust management system.

    PubMed

    Trček, Denis

    2012-01-01

    Trust plays a key role not only in e-worlds and emerging pervasive computing environments, but has also done so for millennia in human societies. Trust management solutions, which have been around for some fifteen years now, were primarily developed for the above-mentioned cyber environments and are typically focused on artificial agents, sensors, etc. However, this paper presents extensions of a new methodology, together with an architecture, for trust management support that is focused on humans and human-like agents. With this methodology and architecture, sensors play a crucial role. The architecture presents an already deployable tool for multi- and interdisciplinary research in various areas where humans are involved. It provides new ways to obtain insight into the dynamics and evolution of such structures, not only in pervasive computing environments, but also in other important areas like management and decision-making support.
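
    For a concrete, if simplified, picture of sensor-driven trust updating, the sketch below uses a standard beta-reputation style score (the expected probability of a cooperative interaction given observed outcomes). This is a generic model chosen for illustration, not the methodology of the paper.

    ```python
    # Hedged sketch: a beta-reputation style trust score updated from sensor-observed
    # interaction outcomes. A standard simple model, not the paper's methodology.
    class TrustRecord:
        def __init__(self):
            self.positive = 1.0   # prior pseudo-counts (uniform prior)
            self.negative = 1.0

        def observe(self, outcome_ok: bool):
            if outcome_ok:
                self.positive += 1
            else:
                self.negative += 1

        @property
        def trust(self):
            """Expected probability of a cooperative interaction."""
            return self.positive / (self.positive + self.negative)

    alice_about_bob = TrustRecord()
    for outcome in [True, True, False, True]:   # outcomes reported by sensors
        alice_about_bob.observe(outcome)
    print(round(alice_about_bob.trust, 2))      # -> 0.67
    ```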

  17. Software Management Environment (SME) concepts and architecture, revision 1

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1992-01-01

    This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamics Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.

  18. OXC management and control system architecture with scalability, maintenance, and distributed managing environment

    NASA Astrophysics Data System (ADS)

    Park, Soomyung; Joo, Seong-Soon; Yae, Byung-Ho; Lee, Jong-Hyun

    2002-07-01

    In this paper, we present the Optical Cross-Connect (OXC) management control system architecture, which offers scalability and robust maintenance and provides a distributed managing environment in the optical transport network. The OXC system we are developing, which is divided into the hardware and the internal and external software of the OXC system, is made up of the OXC subsystem with the Optical Transport Network (OTN) sublayer hardware and the optical switch control system, the signaling control protocol subsystem performing User-to-Network Interface (UNI) and Network-to-Network Interface (NNI) signaling control, the Operation Administration Maintenance & Provisioning (OAM&P) subsystem, and the network management subsystem. The OXC management control system can support flexible expansion of the optical transport network, provide connectivity to heterogeneous external network elements, be added or deleted without interrupting OAM&P services, be operated remotely, provide a global view and detailed information for network planners and operators, and offer a Common Object Request Broker Architecture (CORBA) based open system architecture in which intelligent service networking functions can easily be added or deleted in the future. To meet these considerations, we adopt an object-oriented development method throughout system analysis, design, and implementation to build an OXC management control system with scalability, maintainability, and a distributed managing environment. As a consequence, the componentization of the OXC operation management functions of each subsystem makes maintenance robust and increases code reusability. The component-based OXC management control system architecture is therefore flexible and scalable by nature.

  19. An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency

    NASA Astrophysics Data System (ADS)

    Phillips, Dewanne Marie

    Software intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in the life cycle development to contribute a secure and dependable space system. Those who develop, implement, and operate software intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, system engineering, increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation will identify knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, various threats, defects, and vulnerabilities that impact space systems from hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software architecture framework and acquisition methodology to improve the resiliency of space systems from a software perspective with an emphasis on the early phases of the systems engineering life cycle. This methodology involves seven steps: 1) Define technical resiliency requirements, 1a) Identify standards/policy for software resiliency, 2) Develop a request for proposal (RFP)/statement of work (SOW) for resilient space systems software, 3) Define software resiliency goals for space systems, 4) Establish software resiliency quality attributes, 5) Perform architectural tradeoffs and identify risks, 6) Conduct architecture assessments as part of the procurement process, and 7) Ascertain space system software architecture resiliency metrics. Data illustrates that software vulnerabilities can lead to opportunities for malicious cyber activities, which could degrade the space mission capability for the user community. Reducing the number of vulnerabilities by improving architecture and software system engineering practices can contribute to making space systems more resilient. Since cyber-attacks are enabled by shortfalls in software, robust software engineering practices and an architectural design are foundational to resiliency, which is a quality that allows the system to "take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". 
To achieve software resiliency for space systems, acquirers and suppliers must identify relevant factors and systems engineering practices to apply across the lifecycle, in software requirements analysis, architecture development, design, implementation, verification and validation, and maintenance phases.

  20. A Lovely Building for Difficult Knowledge: The Architecture of the Canadian Museum for Human Rights

    ERIC Educational Resources Information Center

    Wodtke, Larissa

    2015-01-01

    One only needs to look at the Canadian Museum for Human Rights (CMHR) logo, with its abstract outline of the CMHR building, to see the way in which the museum's architecture has come to stand for the CMHR's immaterial meanings and content. The CMHR's architecture becomes a material intersection of discourses of cosmopolitanism, human rights, and…

  1. Teaching History of Architecture--Moving from a Knowledge Transfer to a Multi-Participative Methodology Based on IT Tools

    ERIC Educational Resources Information Center

    Cimadomo, Guido

    2014-01-01

    The changes that the European Higher Education Area (EHEA) framework obliged the School of Architecture of Malaga, University of Malaga. to make to its "History of Architecture" course are discussed in this paper. It was taken up as an opportunity to modify the whole course, introducing creative teaching and "imaginative…

  2. A Distributed Intelligent E-Learning System

    ERIC Educational Resources Information Center

    Kristensen, Terje

    2016-01-01

    An E-learning system based on a multi-agent (MAS) architecture combined with the Dynamic Content Manager (DCM) model of E-learning, is presented. We discuss the benefits of using such a multi-agent architecture. Finally, the MAS architecture is compared with a pure service-oriented architecture (SOA). This MAS architecture may also be used within…

  3. An Architecture for Performance Optimization in a Collaborative Knowledge-Based Approach for Wireless Sensor Networks

    PubMed Central

    Gadeo-Martos, Manuel Angel; Fernandez-Prieto, Jose Angel; Canada-Bago, Joaquin; Velasco, Juan Ramon

    2011-01-01

    Over the past few years, Intelligent Spaces (ISs) have received the attention of many Wireless Sensor Network researchers. Recently, several studies have been devoted to identifying their common capacities and to setting up ISs over these networks. However, little attention has been paid to integrating Fuzzy Rule-Based Systems into collaborative Wireless Sensor Networks for the purpose of implementing ISs. This work presents a distributed architecture proposal for collaborative Fuzzy Rule-Based Systems embedded in Wireless Sensor Networks, which has been designed to optimize the implementation of ISs. This architecture includes the following: (a) an optimized design for the inference engine; (b) a visual interface; (c) a module to reduce the redundancy and complexity of the knowledge bases; (d) a module to evaluate the accuracy of the new knowledge base; (e) a module to adapt the format of the rules to the structure used by the inference engine; and (f) a communications protocol. As a real-world application of this architecture and the proposed methodologies, we show an application to the problem of modeling two plagues of the olive tree: prays (olive moth, Prays oleae Bern.) and repilo (caused by the fungus Spilocaea oleagina). The results show that the architecture presented in this paper significantly decreases the consumption of resources (memory, CPU and battery) without a substantial decrease in the accuracy of the inferred values. PMID:22163687

  4. An architecture for performance optimization in a collaborative knowledge-based approach for wireless sensor networks.

    PubMed

    Gadeo-Martos, Manuel Angel; Fernandez-Prieto, Jose Angel; Canada-Bago, Joaquin; Velasco, Juan Ramon

    2011-01-01

    Over the past few years, Intelligent Spaces (ISs) have received the attention of many Wireless Sensor Network researchers. Recently, several studies have been devoted to identifying their common capacities and to setting up ISs over these networks. However, little attention has been paid to integrating Fuzzy Rule-Based Systems into collaborative Wireless Sensor Networks for the purpose of implementing ISs. This work presents a distributed architecture proposal for collaborative Fuzzy Rule-Based Systems embedded in Wireless Sensor Networks, which has been designed to optimize the implementation of ISs. This architecture includes the following: (a) an optimized design for the inference engine; (b) a visual interface; (c) a module to reduce the redundancy and complexity of the knowledge bases; (d) a module to evaluate the accuracy of the new knowledge base; (e) a module to adapt the format of the rules to the structure used by the inference engine; and (f) a communications protocol. As a real-world application of this architecture and the proposed methodologies, we show an application to the problem of modeling two plagues of the olive tree: prays (olive moth, Prays oleae Bern.) and repilo (caused by the fungus Spilocaea oleagina). The results show that the architecture presented in this paper significantly decreases the consumption of resources (memory, CPU and battery) without a substantial decrease in the accuracy of the inferred values.
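
    To make the rule-based inference concrete, the sketch below implements a minimal Mamdani-style fuzzy inference step with triangular membership functions and weighted-average defuzzification. The linguistic variables, membership shapes, and rules are invented for illustration and do not reproduce the olive-plague knowledge bases described in the paper.

    ```python
    # Minimal Mamdani-style fuzzy inference (illustrative; not the authors' knowledge base).
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def infer_risk(temp_c, humidity_pct):
        # Antecedent memberships (hypothetical shapes).
        warm = tri(temp_c, 15, 22, 30)
        humid = tri(humidity_pct, 50, 75, 100)
        cool = tri(temp_c, 0, 10, 18)

        # Rule strengths via min; the rules themselves are invented for illustration.
        high_risk = min(warm, humid)     # IF warm AND humid THEN risk is high
        low_risk = cool                  # IF cool THEN risk is low

        # Weighted-average (centroid-like) defuzzification over representative outputs.
        num = high_risk * 0.9 + low_risk * 0.1
        den = high_risk + low_risk
        return num / den if den else 0.5

    print(round(infer_risk(21, 80), 2))   # -> 0.9 for a warm, humid reading
    ```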

  5. Reusable Rocket Engine Advanced Health Management System. Architecture and Technology Evaluation: Summary

    NASA Technical Reports Server (NTRS)

    Pettit, C. D.; Barkhoudarian, S.; Daumann, A. G., Jr.; Provan, G. M.; ElFattah, Y. M.; Glover, D. E.

    1999-01-01

    In this study, we proposed an Advanced Health Management System (AHMS) functional architecture and conducted a technology assessment for liquid propellant rocket engine lifecycle health management. The purpose of the AHMS is to improve reusable rocket engine safety and to reduce between-flight maintenance. During the study, past and current reusable rocket engine health management-related projects were reviewed, data structures and health management processes of current rocket engine programs were assessed, and in-depth interviews with rocket engine lifecycle and system experts were conducted. A generic AHMS functional architecture, with primary focus on real-time health monitoring, was developed. Fourteen categories of technology tasks and development needs for implementation of the AHMS were identified, based on the functional architecture and our assessment of current rocket engine programs. Five key technology areas were recommended for immediate development, which (1) would provide immediate benefits to current engine programs, and (2) could be implemented with minimal impact on the current Space Shuttle Main Engine (SSME) and Reusable Launch Vehicle (RLV) engine controllers.

  6. Engineering Knowledge for Assistive Living

    NASA Astrophysics Data System (ADS)

    Chen, Liming; Nugent, Chris

    This paper introduces a knowledge based approach to assistive living in smart homes. It proposes a system architecture that makes use of knowledge in the lifecycle of assistive living. The paper describes ontology based knowledge engineering practices and discusses mechanisms for exploiting knowledge for activity recognition and assistance. It presents system implementation and experiments, and discusses initial results.

  7. 28 CFR 91.22 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...), pre-architectural programming, architectural design, preservation, construction, administration, construction management, or project management costs. Construction does not include the purchase of land. [61... U.S.C. 450b(e). (e) Construction means the erection, acquisition, renovation, repair, remodeling, or...

  8. 28 CFR 91.22 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...), pre-architectural programming, architectural design, preservation, construction, administration, construction management, or project management costs. Construction does not include the purchase of land. [61... U.S.C. 450b(e). (e) Construction means the erection, acquisition, renovation, repair, remodeling, or...

  9. 28 CFR 91.22 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...), pre-architectural programming, architectural design, preservation, construction, administration, construction management, or project management costs. Construction does not include the purchase of land. [61... U.S.C. 450b(e). (e) Construction means the erection, acquisition, renovation, repair, remodeling, or...

  10. 28 CFR 91.22 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...), pre-architectural programming, architectural design, preservation, construction, administration, construction management, or project management costs. Construction does not include the purchase of land. [61... U.S.C. 450b(e). (e) Construction means the erection, acquisition, renovation, repair, remodeling, or...

  11. 28 CFR 91.22 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...), pre-architectural programming, architectural design, preservation, construction, administration, construction management, or project management costs. Construction does not include the purchase of land. [61... U.S.C. 450b(e). (e) Construction means the erection, acquisition, renovation, repair, remodeling, or...

  12. An Integrative Architecture for a Sensor-Supported Trust Management System

    PubMed Central

    Trček, Denis

    2012-01-01

    Trust plays a key role not only in e-worlds and emerging pervasive computing environments, but has also done so for millennia in human societies. Trust management solutions, which have been around for some fifteen years now, were primarily developed for the above-mentioned cyber environments and are typically focused on artificial agents, sensors, etc. However, this paper presents extensions of a new methodology, together with an architecture, for trust management support that is focused on humans and human-like agents. With this methodology and architecture, sensors play a crucial role. The architecture presents an already deployable tool for multi- and interdisciplinary research in various areas where humans are involved. It provides new ways to obtain insight into the dynamics and evolution of such structures, not only in pervasive computing environments, but also in other important areas like management and decision-making support. PMID:23112628

  13. Prognostics and health management system for hydropower plant based on fog computing and docker container

    NASA Astrophysics Data System (ADS)

    Xiao, Jian; Zhang, Mingqiang; Tian, Haiping; Huang, Bo; Fu, Wenlong

    2018-02-01

    In this paper, a novel prognostics and health management system architecture for hydropower plant equipment is proposed based on fog computing and Docker containers. We employ fog nodes to improve the real-time processing ability of the cloud-based prognostics and health management architecture and to overcome problems such as long delay times and network congestion. Storm-based stream processing on the fog node is then presented, which can calculate the health index at the edge of the network. Moreover, a distributed microservice and Docker container architecture for hydropower plant equipment prognostics and health management is also proposed. Using the microservice architecture proposed in this paper, the hydropower unit can achieve business intercommunication and seamless integration of equipment from different manufacturers. Finally, a real application case is given.
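
    One plausible shape for the edge-side health index computation is sketched below: a fog node tracks an exponentially weighted baseline of a sensor stream and maps the relative deviation into a bounded index. The smoothing constant, scale, and formula are assumptions for illustration, not taken from the paper.

    ```python
    # Hedged sketch: a fog-node style health index computed at the network edge.
    # The index is 1.0 for readings at baseline and decays toward 0 as deviation grows.
    class EdgeHealthIndex:
        def __init__(self, alpha=0.1, scale=5.0):
            self.alpha = alpha      # EWMA smoothing for the running baseline
            self.scale = scale      # relative deviation considered fully "unhealthy"
            self.baseline = None

        def update(self, reading):
            if self.baseline is None:
                self.baseline = reading
            deviation = abs(reading - self.baseline) / (abs(self.baseline) + 1e-9)
            self.baseline = (1 - self.alpha) * self.baseline + self.alpha * reading
            return max(0.0, 1.0 - deviation / self.scale)

    hi = EdgeHealthIndex()
    for vibration_mm_s in [2.0, 2.1, 2.0, 2.2, 6.5]:   # last reading simulates a fault
        print(round(hi.update(vibration_mm_s), 2))
    ```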

  14. Soil conservation service landscape resource management

    Treesearch

    Sally Schauman; Carolyn Adams

    1979-01-01

    SCS Landscape Resource Management (LRM) is the application of landscape architecture to SCS conservation activities. LRM includes but is not limited to visual resource management. LRM can be summarized in three principles: (1) SCS landscape architecture considers the landscape as a composite of ecological, social and visual resources; (2) SCS landscapes exist in the...

  15. The Architecture of Personality

    ERIC Educational Resources Information Center

    Cervone, Daniel

    2004-01-01

    This article presents a theoretical framework for analyzing psychological systems that contribute to the variability, consistency, and cross-situational coherence of personality functioning. In the proposed knowledge-and-appraisal personality architecture (KAPA), personality structures and processes are delineated by combining 2 principles:…

  16. A Collaborative Extensible User Environment for Simulation and Knowledge Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Lansing, Carina S.; Porter, Ellen A.

    2015-06-01

    In scientific simulation, scientists use measured data to create numerical models, execute simulations and analyze results from advanced simulators executing on high performance computing platforms. This process usually requires a team of scientists collaborating on data collection, model creation and analysis, and on authorship of publications and data. This paper shows that scientific teams can benefit from a user environment called Akuna that permits subsurface scientists in disparate locations to collaborate on numerical modeling and analysis projects. The Akuna user environment is built on the Velo framework that provides both a rich client environment for conducting and analyzing simulations and a Web environment for data sharing and annotation. Akuna is an extensible toolset that integrates with Velo, and is designed to support any type of simulator. This is achieved through data-driven user interface generation, use of a customizable knowledge management platform, and an extensible framework for simulation execution, monitoring and analysis. This paper describes how the customized Velo content management system and the Akuna toolset are used to integrate and enhance an effective collaborative research and application environment. The extensible architecture of Akuna is also described and demonstrates its usage for creation and execution of a 3D subsurface simulation.

  17. Data Model Management for Space Information Systems

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search. This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool. We will also describe the current effort to provide interoperability with the European Space Agency (ESA)/Planetary Science Archive (PSA) which is critically dependent on a common data model.
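
    As an illustration of the kind of ontology-based data model management described above, the sketch below uses the open-source rdflib library to declare a toy data model, ingest one data instance, and export the result to RDF/XML; the namespace, class and property names are invented for this example and are far simpler than the actual PDS model:

      from rdflib import Graph, Literal, Namespace, RDF, RDFS

      # Hypothetical namespace standing in for a planetary-science data model.
      PDM = Namespace("http://example.org/planetary-data-model#")

      g = Graph()
      g.bind("pdm", PDM)

      # Model level: classes and a property, independent of any implementation schema.
      g.add((PDM.Instrument, RDF.type, RDFS.Class))
      g.add((PDM.Observation, RDF.type, RDFS.Class))
      g.add((PDM.acquiredBy, RDF.type, RDF.Property))
      g.add((PDM.acquiredBy, RDFS.domain, PDM.Observation))
      g.add((PDM.acquiredBy, RDFS.range, PDM.Instrument))

      # Instance level: ingesting a data instance turns the ontology into a small knowledge base.
      g.add((PDM.obs_001, RDF.type, PDM.Observation))
      g.add((PDM.obs_001, PDM.acquiredBy, PDM.camera_A))
      g.add((PDM.camera_A, RDF.type, PDM.Instrument))
      g.add((PDM.camera_A, RDFS.label, Literal("Camera A")))

      # Export to RDF/XML, one of the schema types mentioned in the record.
      print(g.serialize(format="xml"))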

  18. 41 CFR 102-77.25 - Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... responsibilities to provide national visibility for Art-in-Architecture? 102-77.25 Section 102-77.25 Public... MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.25 Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture? Yes, Federal...

  19. 41 CFR 102-77.25 - Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... responsibilities to provide national visibility for Art-in-Architecture? 102-77.25 Section 102-77.25 Public... MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.25 Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture? Yes, Federal...

  20. 41 CFR 102-77.25 - Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... responsibilities to provide national visibility for Art-in-Architecture? 102-77.25 Section 102-77.25 Public... MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.25 Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture? Yes, Federal...

  1. 41 CFR 102-77.25 - Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... responsibilities to provide national visibility for Art-in-Architecture? 102-77.25 Section 102-77.25 Public... MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.25 Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture? Yes, Federal...

  2. 41 CFR 102-77.25 - Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... responsibilities to provide national visibility for Art-in-Architecture? 102-77.25 Section 102-77.25 Public... MANAGEMENT REGULATION REAL PROPERTY 77-ART-IN-ARCHITECTURE Art-in-Architecture § 102-77.25 Do Federal agencies have responsibilities to provide national visibility for Art-in-Architecture? Yes, Federal...

  3. The architecture challenge: Future artificial-intelligence systems will require sophisticated architectures, and knowledge of the brain might guide their construction.

    PubMed

    Baldassarre, Gianluca; Santucci, Vieri Giuliano; Cartoni, Emilio; Caligiore, Daniele

    2017-01-01

    In this commentary, we highlight a crucial challenge posed by the proposal of Lake et al. to introduce key elements of human cognition into deep neural networks and future artificial-intelligence systems: the need to design effective sophisticated architectures. We propose that looking at the brain is an important means of facing this great challenge.

  4. Computer Architects.

    ERIC Educational Resources Information Center

    Betts, Janelle Lyon

    2001-01-01

    Describes a high school art assignment in which students utilize Appleworks or Claris Works to design their own house, after learning about architectural styles and how to use the computer program. States that the project develops student computer skills and increases student knowledge about architecture. (CMK)

  5. Architectural Design of a LMS with LTSA-Conformance

    ERIC Educational Resources Information Center

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper illustrates an approach for architectural design of a Learning Management System (LMS), which is verifiable against the Learning Technology System Architecture (LTSA) conformance rules. We introduce a new method for software architectural design that extends the Unified Modeling Language (UML) component diagram with the formal…

  6. A multi-agent architecture for geosimulation of moving agents

    NASA Astrophysics Data System (ADS)

    Vahidnia, Mohammad H.; Alesheikh, Ali A.; Alavipanah, Seyed Kazem

    2015-10-01

    In this paper, a novel architecture is proposed in which an axiomatic derivation system in the form of first-order logic facilitates declarative explanation and spatial reasoning. Simulation of environmental perception and interaction between autonomous agents is designed with a geographic belief-desire-intention and a request-inform-query model. The architecture has a complementary quantitative component that supports collaborative planning based on the concept of equilibrium and game theory. This new architecture presents a departure from current best practices in geographic agent-based modelling. Implementation tasks are discussed in some detail, as well as scenarios for fleet management and disaster management.
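
    The paper's agent model is only summarized in the record; the toy sketch below illustrates the general shape of a belief-desire-intention loop driven by request/inform messages, with all names and the one-step "plan" invented for illustration:

      from dataclasses import dataclass, field

      @dataclass
      class Message:
          performative: str   # "request", "inform", or "query" as in the record
          content: dict

      @dataclass
      class GeoAgent:
          """Toy belief-desire-intention loop for a moving agent on a grid."""
          name: str
          position: tuple
          beliefs: dict = field(default_factory=dict)
          desires: list = field(default_factory=list)

          def perceive(self, msg: Message):
              if msg.performative == "inform":
                  self.beliefs.update(msg.content)          # revise beliefs from peers
              elif msg.performative == "request":
                  self.desires.append(msg.content["goal"])  # adopt a requested goal

          def deliberate(self):
              # Pick the first pending desire as the current intention.
              return self.desires[0] if self.desires else None

          def act(self):
              goal = self.deliberate()
              if goal and goal.get("move_to"):
                  self.position = goal["move_to"]           # one-step plan: jump to target cell
              return self.position

      agent = GeoAgent("vehicle_1", (0, 0))
      agent.perceive(Message("inform", {"road_blocked": (3, 4)}))
      agent.perceive(Message("request", {"goal": {"move_to": (5, 5)}}))
      print(agent.act())   # (5, 5)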

  7. Developing intelligent transportation systems using the national ITS architecture : an executive edition for senior transportation managers

    DOT National Transportation Integrated Search

    1998-07-01

    This document has been produced to provide senior transportation managers of state and local departments of transportation with practical guidance for deploying Intelligent Transportation Systems (ITS) consistent with the National ITS Architecture. T...

  8. Programmable bandwidth management in software-defined EPON architecture

    NASA Astrophysics Data System (ADS)

    Li, Chengjun; Guo, Wei; Wang, Wei; Hu, Weisheng; Xia, Ming

    2016-07-01

    This paper proposes a software-defined EPON architecture which replaces the hardware-implemented DBA module with a reprogrammable DBA module. The DBA module allows pluggable bandwidth allocation algorithms among multiple ONUs adaptive to traffic profiles and network states. We also introduce a bandwidth management scheme executed at the controller to manage the customized DBA algorithms for all data queues of ONUs. Our performance investigation verifies the effectiveness of this new EPON architecture, and numerical results show that software-defined EPONs can achieve less traffic delay and provide better support to service differentiation in comparison with traditional EPONs.
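
    The record does not specify the DBA interface, so the following is a minimal sketch of what a pluggable, controller-managed bandwidth allocation policy could look like; the class names, the request format and the two example policies are hypothetical:

      from typing import Callable, Dict

      # A DBA policy splits the upstream cycle capacity among ONU queues.
      DbaPolicy = Callable[[Dict[str, int], int], Dict[str, int]]

      def fixed_share(requests: Dict[str, int], capacity: int) -> Dict[str, int]:
          """Equal split regardless of demand (baseline policy)."""
          share = capacity // max(len(requests), 1)
          return {onu: min(req, share) for onu, req in requests.items()}

      def demand_proportional(requests: Dict[str, int], capacity: int) -> Dict[str, int]:
          """Grant bandwidth in proportion to reported queue lengths."""
          total = sum(requests.values()) or 1
          return {onu: min(req, capacity * req // total) for onu, req in requests.items()}

      class SoftwareDefinedOLT:
          """Controller-side manager that swaps DBA policies at run time."""
          def __init__(self, policy: DbaPolicy):
              self.policy = policy

          def install_policy(self, policy: DbaPolicy):
              self.policy = policy                      # reprogram the DBA module

          def allocate(self, requests: Dict[str, int], capacity: int) -> Dict[str, int]:
              return self.policy(requests, capacity)

      olt = SoftwareDefinedOLT(fixed_share)
      reports = {"onu1": 400, "onu2": 100, "onu3": 700}
      print(olt.allocate(reports, 900))
      olt.install_policy(demand_proportional)           # adapt to a bursty traffic profile
      print(olt.allocate(reports, 900))

    The design point illustrated is that the allocation policy is ordinary software installed from the controller, so it can be swapped per traffic profile without touching OLT hardware.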

  9. From Architectural Photogrammetry Toward Digital Architectural Heritage Education

    NASA Astrophysics Data System (ADS)

    Baik, A.; Alitany, A.

    2018-05-01

    This paper considers the potential of using the documentation approach proposed for the heritage buildings in Historic Jeddah, Saudi Arabia (as a case study), based on close-range photogrammetry / Architectural Photogrammetry techniques, as a new academic experiment in digital architectural heritage education. Moreover, unlike most engineering education techniques related to architecture, this paper focuses on 3-D data acquisition technology as a tool to document and to learn the principles of digital architectural heritage documentation. The objective of this research is to integrate 3-D modelling and visualisation knowledge for the purposes of identifying, designing and evaluating an effective engineering educational experiment. Furthermore, the students will learn and understand the characteristics of the historical building while learning more advanced 3-D modelling and visualisation techniques. It can be argued that many of these technologies alone do little to improve education; therefore, it is important to integrate them in an educational framework, in line with the educational ethos of the academic discipline. Recently, a number of these technologies and methods have been used effectively in education and for other purposes, such as in virtual museums. However, these methods do not align directly with traditional architecture education and teaching. This research introduces the proposed approach as a new academic experiment in the architecture education sector. The new teaching approach is based on Architectural Photogrammetry to provide semantically rich models. The academic experiment will require students to have suitable knowledge of photogrammetry applications to engage with the process.

  10. Rocket Testing and Integrated System Health Management

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Schmalzel, John

    2005-01-01

    Integrated System Health Management (ISHM) describes a set of system capabilities that in aggregate perform: determination of condition for each system element, detection of anomalies, diagnosis of causes for anomalies, and prognostics for future anomalies and system behavior. The ISHM should also provide operators with situational awareness of the system by integrating contextual and timely data, information, and knowledge (DIaK) as needed. ISHM capabilities can be implemented using a variety of technologies and tools. This chapter provides an overview of ISHM contributing technologies and describes in further detail a novel implementation architecture along with associated taxonomy, ontology, and standards. The operational ISHM testbed is based on a subsystem of a rocket engine test stand. Such test stands contain many elements that are common to manufacturing systems, and thereby serve to illustrate the potential benefits and methodologies of the ISHM approach for intelligent manufacturing.

  11. Real-time artificial intelligence issues in the development of the adaptive tactical navigator

    NASA Technical Reports Server (NTRS)

    Green, Peter E.; Glasson, Douglas P.; Pomarede, Jean-Michel L.; Acharya, Narayan A.

    1987-01-01

    Adaptive Tactical Navigation (ATN) is a laboratory prototype of a knowledge based system to provide navigation system management and decision aiding in the next generation of tactical aircraft. ATN's purpose is to manage a set of multimode navigation equipment, dynamically selecting the best equipment to use in accordance with mission goals and phase, threat environment, equipment malfunction status, and battle damage. ATN encompasses functions as diverse as sensor data interpretation, diagnosis, and planning. Real-time issues identified in ATN and the approaches used to address them are discussed. Functional requirements and a global architecture for the ATN system are described. Decision making under time constraints is discussed, and two subproblems are identified: making decisions with incomplete information and making decisions with limited resources. Approaches used in ATN to achieve real-time performance are described and simulation results are discussed.

  12. Security in the Cache and Forward Architecture for the Next Generation Internet

    NASA Astrophysics Data System (ADS)

    Hadjichristofi, G. C.; Hadjicostis, C. N.; Raychaudhuri, D.

    The future Internet architecture will be composed predominantly of wireless devices. It is evident at this stage that the TCP/IP protocol that was developed decades ago will not properly support the required network functionalities since contemporary communication profiles tend to be data-driven rather than host-based. To address this paradigm shift in data propagation, a next generation architecture has been proposed, the Cache and Forward (CNF) architecture. This research investigates security aspects of this new Internet architecture. More specifically, we discuss content privacy, secure routing, key management and trust management. We identify security weaknesses of this architecture that need to be addressed and we derive security requirements that should guide future research directions. Aspects of this research can serve as a stepping stone as we build the future Internet.

  13. Ecological literacy and beyond: Problem-based learning for future professionals.

    PubMed

    Lewinsohn, Thomas M; Attayde, José Luiz; Fonseca, Carlos Roberto; Ganade, Gislene; Jorge, Leonardo Ré; Kollmann, Johannes; Overbeck, Gerhard E; Prado, Paulo Inácio; Pillar, Valério D; Popp, Daniela; da Rocha, Pedro L B; Silva, Wesley Rodrigues; Spiekermann, Annette; Weisser, Wolfgang W

    2015-03-01

    Ecological science contributes to solving a broad range of environmental problems. However, lack of ecological literacy in practice often limits application of this knowledge. In this paper, we highlight a critical but often overlooked demand on ecological literacy: to enable professionals of various careers to apply scientific knowledge when faced with environmental problems. Current university courses on ecology often fail to persuade students that ecological science provides important tools for environmental problem solving. We propose problem-based learning to improve the understanding of ecological science and its usefulness for real-world environmental issues that professionals in careers as diverse as engineering, public health, architecture, social sciences, or management will address. Courses should set clear learning objectives for cognitive skills they expect students to acquire. Thus, professionals in different fields will be enabled to improve environmental decision-making processes and to participate effectively in multidisciplinary work groups charged with tackling environmental issues.

  14. Automating a human factors evaluation of graphical user interfaces for NASA applications: An update on CHIMES

    NASA Technical Reports Server (NTRS)

    Jiang, Jian-Ping; Murphy, Elizabeth D.; Bailin, Sidney C.; Truszkowski, Walter F.

    1993-01-01

    Capturing human factors knowledge about the design of graphical user interfaces (GUIs) and applying this knowledge on-line are the primary objectives of the Computer-Human Interaction Models (CHIMES) project. The current CHIMES prototype is designed to check a GUI's compliance with industry-standard guidelines, general human factors guidelines, and human factors recommendations on color usage. Following the evaluation, CHIMES presents human factors feedback and advice to the GUI designer. The paper describes the approach to modeling human factors guidelines, the system architecture, a new method developed to convert quantitative RGB primaries into qualitative color representations, and the potential for integrating CHIMES with user interface management systems (UIMS). Both the conceptual approach and its implementation are discussed. This paper updates the presentation on CHIMES at the first International Symposium on Ground Data Systems for Spacecraft Control.
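
    CHIMES' actual conversion method is not described in the record; the sketch below shows one plausible way to map quantitative RGB values to qualitative color names using HSV thresholds, with the bin boundaries chosen arbitrarily for illustration:

      import colorsys

      def rgb_to_qualitative(r: float, g: float, b: float) -> str:
          """Map normalized RGB (0..1) to a coarse qualitative color name via HSV binning."""
          h, s, v = colorsys.rgb_to_hsv(r, g, b)
          if v < 0.15:
              return "black"
          if s < 0.15:
              return "white" if v > 0.85 else "gray"
          hue_deg = h * 360.0
          bins = [(30, "red"), (90, "yellow"), (150, "green"),
                  (210, "cyan"), (270, "blue"), (330, "magenta"), (360, "red")]
          for upper, name in bins:
              if hue_deg < upper:
                  return name
          return "red"

      # Example: a saturated primary and a desaturated background color.
      print(rgb_to_qualitative(0.9, 0.1, 0.1))    # red
      print(rgb_to_qualitative(0.8, 0.8, 0.82))   # gray (low saturation, mid lightness)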

  15. Diverter AI based decision aid, phases 1 and 2

    NASA Technical Reports Server (NTRS)

    Sexton, George A.; Bayles, Scott J.; Patterson, Robert W.; Schulke, Duane A.; Williams, Deborah C.

    1989-01-01

    It was determined that a system to incorporate artificial intelligence (AI) into airborne flight management computers is feasible. The AI functions that would be most useful to the pilot are to perform situational assessment, evaluate outside influences on the contemplated rerouting, perform flight planning/replanning, and perform maneuver planning. A study of the software architecture and software tools capable of demonstrating Diverter was also made. A skeletal planner known as the Knowledge Acquisition Development Tool (KADET), which is a combination script-based and rule-based system, was used to implement the system. A prototype system was developed which demonstrates advanced in-flight planning/replanning capabilities.

  16. Mountain research

    NASA Astrophysics Data System (ADS)

    The newly incorporated International Mountain Society (IMS) will in May begin publication of an interdisciplinary scientific journal, Mountain Research and Development. The quarterly will be copublished with the United Nations University; additional support will come from UNESCO. A primary objective of IMS is to ‘help solve mountain land-use problems by developing a foundation of scientific and technical knowledge on which to base management decisions,’ according to Jack D. Ives, president of the Boulder-based organization. ‘The Society is strongly committed to the belief that a rational worldwide approach to mountain problems must involve a wide range of disciplines in the natural and human sciences, medicine, architecture, engineering, and technology.’

  17. A three-dimensional architecture of vertically aligned multilayer graphene facilitates heat dissipation across joint solid surfaces

    NASA Astrophysics Data System (ADS)

    Liang, Qizhen; Yao, Xuxia; Wang, Wei; Wong, C. P.

    2012-02-01

    Low operation temperature and efficient heat dissipation are important for device life and speed in current electronic and photonic technologies. Being ultra-high thermally conductive, graphene is a promising material candidate for heat dissipation improvement in devices. In the application, graphene is expected to be vertically stacked between contact solid surfaces in order to facilitate efficient heat dissipation and reduced interfacial thermal resistance across contact solid surfaces. However, as an ultra-thin membrane-like material, graphene is susceptible to Van der Waals forces and usually tends to be recumbent on substrates. Thereby, direct growth of vertically aligned free-standing graphene on solid substrates in large scale is difficult and rarely available in current studies, bringing significant barriers in graphene's application as thermal conductive media between joint solid surfaces. In this work, a three-dimensional vertically aligned multi-layer graphene architecture is constructed between contacted Silicon/Silicon surfaces with pure Indium as a metallic medium. Significantly higher equivalent thermal conductivity and lower contact thermal resistance of vertically aligned multilayer graphene are obtained, compared with those of their recumbent counterpart. This finding provides knowledge of vertically aligned graphene architectures, which may not only facilitate current demanding thermal management but also promote graphene's widespread applications such as electrodes for energy storage devices, polymeric anisotropic conductive adhesives, etc.

  18. The NASA Navigator Program Ground Based Archives at the Michelson Science Center: Supporting the Search for Habitable Planets

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Ciardi, D. R.; Good, J. C.; Laity, A. C.; Zhang, A.

    2006-07-01

    At ADASS XIV, we described how the W. M. Keck Observatory Archive (KOA) re-uses and extends the component based architecture of the NASA/IPAC Infrared Science Archive (IRSA) to ingest and serve level 0 observations made with HIRES, the High Resolution Echelle Spectrometer. Since August 18, the KOA has ingested 325 GB of data from 135 nights of observations. The architecture exploits a service layer between the mass storage layer and the user interface. This service layer consists of standalone utilities called through a simple executive that perform generic query and retrieval functions, such as query generation, database table sub-setting, and return page generation etc. It has been extended to implement proprietary access to data through deployment of query management middleware developed for the National Virtual Observatory. The MSC archives have recently extended this design to query and retrieve complex data sets describing the properties of potential target stars for the Terrestrial Planet Finder (TPF) missions. The archives can now support knowledge based retrieval, as well as data retrieval. This paper describes how extensions to the IRSA architecture, which is applicable across all wavelengths and astronomical datatypes, supports the design and development of the MSC NP archives at modest cost.

  19. Architectural and Functional Design of an Environmental Information Network.

    DTIC Science & Technology

    1984-04-30

    (Abstract garbled in extraction; recoverable fragments indicate the study was accomplished under contract F08635-83-C-013, Task 83-2, for Headquarters Air Force Engineering and Services Center, and that the report's figures include a selection procedure, a general architecture of a distributed data management system, a schema architecture, and the MULTIBASE component architecture.)

  20. Multi-Agent Diagnosis and Control of an Air Revitalization System for Life Support in Space

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Kowing, Jeffrey; Nieten, Joseph; Graham, Jeffrey S.; Schreckenghost, Debra; Bonasso, Pete; Fleming, Land D.; MacMahon, Matt; Thronesbery, Carroll

    2000-01-01

    An architecture of interoperating agents has been developed to provide control and fault management for advanced life support systems in space. In this adjustable autonomy architecture, software agents coordinate with human agents and provide support in novel fault management situations. This architecture combines the Livingstone model-based mode identification and reconfiguration (MIR) system with the 3T architecture for autonomous flexible command and control. The MIR software agent performs model-based state identification and diagnosis. MIR identifies novel recovery configurations and the set of commands required for the recovery. The 3T procedural executive and the human operator use the diagnoses and recovery recommendations, and provide command sequencing. User interface extensions have been developed to support human monitoring of both 3T and MIR data and activities. This architecture has been demonstrated performing control and fault management for an oxygen production system for air revitalization in space. The software operates in a dynamic simulation testbed.

  1. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  2. Neurally and mathematically motivated architecture for language and thought.

    PubMed

    Perlovsky, L I; Ilin, R

    2010-01-01

    Neural structures of interaction between thinking and language are unknown. This paper suggests a possible architecture motivated by neural and mathematical considerations. A mathematical requirement of computability imposes significant constraints on possible architectures consistent with brain neural structure and with a wealth of psychological knowledge. How language interacts with cognition. Do we think with words, or is thinking independent from language with words being just labels for decisions? Why is language learned by the age of 5 or 7, but acquisition of knowledge represented by learning to use this language knowledge takes a lifetime? This paper discusses hierarchical aspects of language and thought and argues that high level abstract thinking is impossible without language. We discuss a mathematical technique that can model the joint language-thought architecture, while overcoming previously encountered difficulties of computability. This architecture explains a contradiction between human ability for rational thoughtful decisions and irrationality of human thinking revealed by Tversky and Kahneman; a crucial role in this contradiction might be played by language. The proposed model resolves long-standing issues: how the brain learns correct words-object associations; why animals do not talk and think like people. We propose the role played by language emotionality in its interaction with thought. We relate the mathematical model to Humboldt's "firmness" of languages; and discuss possible influence of language grammar on its emotionality. Psychological and brain imaging experiments related to the proposed model are discussed. Future theoretical and experimental research is outlined.

  3. Neurally and Mathematically Motivated Architecture for Language and Thought

    PubMed Central

    Perlovsky, L.I; Ilin, R

    2010-01-01

    Neural structures of interaction between thinking and language are unknown. This paper suggests a possible architecture motivated by neural and mathematical considerations. A mathematical requirement of computability imposes significant constraints on possible architectures consistent with brain neural structure and with a wealth of psychological knowledge. How language interacts with cognition. Do we think with words, or is thinking independent from language with words being just labels for decisions? Why is language learned by the age of 5 or 7, but acquisition of knowledge represented by learning to use this language knowledge takes a lifetime? This paper discusses hierarchical aspects of language and thought and argues that high level abstract thinking is impossible without language. We discuss a mathematical technique that can model the joint language-thought architecture, while overcoming previously encountered difficulties of computability. This architecture explains a contradiction between human ability for rational thoughtful decisions and irrationality of human thinking revealed by Tversky and Kahneman; a crucial role in this contradiction might be played by language. The proposed model resolves long-standing issues: how the brain learns correct words-object associations; why animals do not talk and think like people. We propose the role played by language emotionality in its interaction with thought. We relate the mathematical model to Humboldt’s “firmness” of languages; and discuss possible influence of language grammar on its emotionality. Psychological and brain imaging experiments related to the proposed model are discussed. Future theoretical and experimental research is outlined. PMID:21673788

  4. A new flight control and management system architecture and configuration

    NASA Astrophysics Data System (ADS)

    Kong, Fan-e.; Chen, Zongji

    2006-11-01

    An advanced fighter should offer capabilities such as supersonic cruise, stealth, agility, STOVL (Short Take-Off and Vertical Landing), and powerful communication and information processing. For this purpose, it is not enough to improve only the aerodynamic and propulsion systems; more importantly, the control system must be enhanced. A complete flight control system provides not only autopilot, auto-throttle and control augmentation, but also management of the given mission. The F-22 and JSF possess outstanding flight control systems built on the Pave Pillar and Pave Pace avionics architectures, but their control architectures are not sufficiently integrated. The main purpose of this paper is to build a novel fighter control system architecture. A control system constructed on this architecture should be highly integrated, inexpensive, fault-tolerant, safe, reliable and effective, and it will take charge of both flight control and mission management. To this end, the paper proceeds as follows. First, inspired by human nervous control, a three-level hierarchical control architecture is proposed: at the top, the decision level is in charge of decision making; in the middle, the organization and coordination level schedules resources, monitors the state of the fighter and switches control modes; at the bottom, the execution level holds the concrete actuation and measurement. Then, according to their function and resource requirements, all tasks involving flight control and mission management are assigned to the appropriate level. Finally, in order to validate the three-level architecture, a physical configuration is also shown. The configuration is distributed and applies recent advances from the information technology industry such as line-replaceable modules and cluster technology.

  5. Fault Management Architectures and the Challenges of Providing Software Assurance

    NASA Technical Reports Server (NTRS)

    Savarino, Shirley; Fitz, Rhonda; Fesq, Lorraine; Whitman, Gerek

    2015-01-01

    Satellite system Fault Management (FM) is focused on safety, the preservation of assets, and maintaining the desired functionality of the system. How FM is implemented varies among missions. Common to most is system complexity due to a need to establish a multi-dimensional structure across hardware, software and operations. This structure is necessary to identify and respond to system faults, mitigate technical risks and ensure operational continuity. These architecture, implementation and software assurance efforts increase with mission complexity. Because FM is a systems engineering discipline with a distributed implementation, providing efficient and effective verification and validation (V&V) is challenging. A breakout session at the 2012 NASA Independent Verification & Validation (IV&V) Annual Workshop titled "V&V of Fault Management: Challenges and Successes" exposed these issues in terms of V&V for a representative set of architectures. NASA's IV&V is funded by NASA's Software Assurance Research Program (SARP) in partnership with NASA's Jet Propulsion Laboratory (JPL) to extend the work performed at the Workshop session. NASA IV&V will extract FM architectures across the IV&V portfolio and evaluate the data set for robustness, assess visibility for validation and test, and define software assurance methods that could be applied to the various architectures and designs. This work focuses efforts on FM architectures from critical and complex projects within NASA. The identification of particular FM architectures, visibility, and associated V&V/IV&V techniques provides a data set that can enable higher assurance that a satellite system will adequately detect and respond to adverse conditions. Ultimately, results from this activity will be incorporated into the NASA Fault Management Handbook providing dissemination across NASA, other agencies and the satellite community. This paper discusses the approach taken to perform the evaluations and preliminary findings from the research including identification of FM architectures, visibility observations, and methods utilized for V&V/IV&V.

  6. An Architecture for Cross-Cloud System Management

    NASA Astrophysics Data System (ADS)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

    The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in an homogenous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
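
    The paper's interfaces are not reproduced in the record; the sketch below illustrates the general adapter pattern implied by managing resources from different cloud providers in a homogeneous manner, with all class and method names invented and the provider calls stubbed out:

      from abc import ABC, abstractmethod

      class ComputeProvider(ABC):
          """Homogeneous management interface; each cloud-specific adapter implements it."""
          @abstractmethod
          def start_instance(self, image: str) -> str: ...
          @abstractmethod
          def stop_instance(self, instance_id: str) -> None: ...

      class EC2Adapter(ComputeProvider):
          def start_instance(self, image: str) -> str:
              # A real adapter would call the EC2 API here; this sketch only fakes an id.
              return f"ec2-{image}-0001"
          def stop_instance(self, instance_id: str) -> None:
              print(f"terminating {instance_id} via EC2 interface")

      class OtherCloudAdapter(ComputeProvider):
          def start_instance(self, image: str) -> str:
              return f"other-{image}-42"
          def stop_instance(self, instance_id: str) -> None:
              print(f"stopping {instance_id} via the other provider's interface")

      class CrossCloudManager:
          """Single management point that hides provider heterogeneity from callers."""
          def __init__(self, providers: dict):
              self.providers = providers
          def start(self, provider: str, image: str) -> str:
              return self.providers[provider].start_instance(image)
          def stop(self, provider: str, instance_id: str) -> None:
              self.providers[provider].stop_instance(instance_id)

      mgr = CrossCloudManager({"ec2": EC2Adapter(), "other": OtherCloudAdapter()})
      vm = mgr.start("ec2", "ubuntu-22.04")
      mgr.stop("ec2", vm)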

  7. Location Management in a Transport Layer Mobility Architecture

    NASA Technical Reports Server (NTRS)

    Eddy, Wesley M.; Ishac, Joseph

    2005-01-01

    Mobility architectures that place complexity in end nodes rather than in the network interior have many advantageous properties and are becoming popular research topics. Such architectures typically push mobility support into higher layers of the protocol stack than network layer approaches like Mobile IP. The literature is ripe with proposals to provide mobility services in the transport, session, and application layers. In this paper, we focus on a mobility architecture that makes the most significant changes to the transport layer. A common problem amongst all mobility protocols at various layers is location management, which entails translating some form of static identifier into a mobile node's dynamic location. Location management is required for mobile nodes to be able to provide globally-reachable services on-demand to other hosts. In this paper, we describe the challenges of location management in a transport layer mobility architecture, and discuss the advantages and disadvantages of various solutions proposed in the literature. Our conclusion is that, in principle, secure dynamic DNS is most desirable, although it may have current operational limitations. We note that this topic has room for further exploration, and we present this paper largely as a starting point for comparing possible solutions.
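
    As a minimal illustration of the location-management problem discussed above, the sketch below keeps a mapping from a static identifier to a mobile node's current locator with a soft-state TTL, loosely analogous to a dynamic DNS record; it is not the paper's protocol, and a real system would add the authentication that secure dynamic DNS provides:

      import time

      class LocationRegistry:
          """Toy location manager: maps a static identifier to a node's current locator."""
          def __init__(self, ttl_seconds: float = 30.0):
              self.ttl = ttl_seconds
              self._table = {}   # identifier -> (locator, expiry_time)

          def update(self, identifier: str, locator: str):
              """Called by the mobile node after a handoff to a new network attachment."""
              self._table[identifier] = (locator, time.time() + self.ttl)

          def resolve(self, identifier: str):
              """Called by a correspondent host before opening a transport connection."""
              entry = self._table.get(identifier)
              if entry is None:
                  return None
              locator, expiry = entry
              return locator if time.time() < expiry else None

      registry = LocationRegistry()
      registry.update("node.example", "192.0.2.17")      # initial attachment
      print(registry.resolve("node.example"))            # 192.0.2.17
      registry.update("node.example", "198.51.100.4")    # after moving to a new network
      print(registry.resolve("node.example"))            # 198.51.100.4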

  8. Static Extraction and Conformance Analysis of Hierarchical Runtime Architectural Structure

    DTIC Science & Technology

    2010-05-14

    (Abstract garbled in extraction; recoverable fragments show a CryptoDB example in which architectural components such as CustomerManager (Java class cryptodb.test.CustomerManager, also known as the "crypto consumer") are mapped to implementation classes, and a Level-0 ownership object graph (OOG) with String objects partitioned into domains for plain text (PLAIN), encrypted data (CRYPTO), alias identifiers (ALIASID), key identifiers (KEYID), key management (KEYMANAGEMENT), key storage (KEYSTORAGE), and providers (PROVIDERS).)

  9. Sensitivity analysis by approximation formulas - Illustrative examples. [reliability analysis of six-component architectures

    NASA Technical Reports Server (NTRS)

    White, A. L.

    1983-01-01

    This paper examines the reliability of three architectures for six components. For each architecture, the probabilities of the failure states are given by algebraic formulas involving the component fault rate, the system recovery rate, and the operating time. The dominant failure modes are identified, and the change in reliability is considered with respect to changes in fault rate, recovery rate, and operating time. The major conclusions concern the influence of system architecture on failure modes and parameter requirements. Without this knowledge, a system designer may pick an inappropriate structure.
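
    The paper's algebraic formulas are not reproduced in the record; as an illustration of the kind of approximation involved (not the paper's own expressions), a triplex configuration with component fault rate \lambda, recovery rate \delta and operating time T, assuming \lambda T \ll 1 and \lambda/\delta \ll 1, might be approximated as:

      \begin{align*}
        P_{\text{near-coincident}} &\approx 3\lambda T \cdot \frac{2\lambda}{\delta}
          && \text{(second fault arrives before recovery completes)} \\
        P_{\text{exhaustion}} &\approx 3(\lambda T)^{2}
          && \text{(two faults over the operating time exhaust the redundancy)} \\
        P_{\text{fail}} &\approx P_{\text{near-coincident}} + P_{\text{exhaustion}}
      \end{align*}

    Reading such a formula directly exposes the sensitivities the paper studies: the near-coincident term grows linearly in T and inversely with the recovery rate, while the exhaustion term grows quadratically in \lambda T.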

  10. NASA Stennis Space Center Integrated System Health Management Test Bed and Development Capabilities

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Holland, Randy; Coote, David

    2006-01-01

    Integrated System Health Management (ISHM) is a capability that focuses on determining the condition (health) of every element in a complex System (detect anomalies, diagnose causes, prognosis of future anomalies), and provide data, information, and knowledge (DIaK)-not just data-to control systems for safe and effective operation. This capability is currently done by large teams of people, primarily from ground, but needs to be embedded on-board systems to a higher degree to enable NASA's new Exploration Mission (long term travel and stay in space), while increasing safety and decreasing life cycle costs of spacecraft (vehicles; platforms; bases or outposts; and ground test, launch, and processing operations). The topics related to this capability include: 1) ISHM Related News Articles; 2) ISHM Vision For Exploration; 3) Layers Representing How ISHM is Currently Performed; 4) ISHM Testbeds & Prototypes at NASA SSC; 5) ISHM Functional Capability Level (FCL); 6) ISHM Functional Capability Level (FCL) and Technology Readiness Level (TRL); 7) Core Elements: Capabilities Needed; 8) Core Elements; 9) Open Systems Architecture for Condition-Based Maintenance (OSA-CBM); 10) Core Elements: Architecture, taxonomy, and ontology (ATO) for DIaK management; 11) Core Elements: ATO for DIaK Management; 12) ISHM Architecture Physical Implementation; 13) Core Elements: Standards; 14) Systematic Implementation; 15) Sketch of Work Phasing; 16) Interrelationship Between Traditional Avionics Systems, Time Critical ISHM and Advanced ISHM; 17) Testbeds and On-Board ISHM; 18) Testbed Requirements: RETS AND ISS; 19) Sustainable Development and Validation Process; 20) Development of on-board ISHM; 21) Taxonomy/Ontology of Object Oriented Implementation; 22) ISHM Capability on the E1 Test Stand Hydraulic System; 23) Define Relationships to Embed Intelligence; 24) Intelligent Elements Physical and Virtual; 25) ISHM Testbeds and Prototypes at SSC Current Implementations; 26) Trailer-Mounted RETS; 27) Modeling and Simulation; 28) Summary ISHM Testbed Environments; 29) Data Mining - ARC; 30) Transitioning ISHM to Support NASA Missions; 31) Feature Detection Routines; 32) Sample Features Detected in SSC Test Stand Data; and 33) Health Assessment Database (DIaK Repository).

  11. Performance of Service-Discovery Architectures in Response to Node Failures

    DTIC Science & Technology

    2003-06-01

    cache manager (SCM). Multiple SCMs can be used to mitigate the effect of SCM failure. In both architectures, service discovery occurs passively, via...employed, the SCM operates as an intermediary, matching advertised SDs of SMs to SD requirements provided by SUs. In this study, each SM manages one SP...architecture in our experimental topology: with 12 SMs, one SU, and up to three SCMs. To animate our three-party model, we chose discovery behaviors from the

  12. Optimizing Security of Cloud Computing within the DoD

    DTIC Science & Technology

    2010-12-01

    (Snippet garbled in extraction; recoverable fragments list the security domains addressed: information security governance and risk management; application security; cryptography; security architecture and design; operations security; Operational Security (OPSEC); and Business Continuity Planning (BCP) and Disaster…)

  13. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    NASA Astrophysics Data System (ADS)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blue-prints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so called service-oriented architecture (SOA) 2.0 paradigm, which combines intelligence and proactiveness of event-driven with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both, static and real-time data in order to find correlations of disparate information that do not at first appear to be intuitively obvious: Analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and the interactions (choreographies) of all participants involved as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services, communicating via service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: Instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces the tight coupling at data level by a flexible dependency on loosely coupled services. The main component of the interoperability model is the comprehensive semantic description of the information, business logic and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge with the application of domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts for example, would be interested in geological models or borehole measurements at a certain depth, based on which it is possible to correlate and verify seismic profiles. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It has to be emphasised that this approach is scalable to the greatest possible extent: All information, necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or resources (service-endpoints), the information can be retrieved from. The entire infrastructure communication is subject to a broker-based business logic integration platform where the information exchanged between involved participants, is managed on the basis of standardised dictionaries, repositories, and registries. 
This approach also enables the development of Systems-of-Systems (SoS), which allow the collaboration of autonomous, large scale concurrent, and distributed systems, yet cooperatively interacting as a collective in a common environment.
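
    The record describes the SOA 2.0 combination of event-driven and service-oriented styles only in prose; the sketch below is a deliberately tiny, hypothetical illustration of that combination, in which a fusion service subscribes to a topic on a broker and reacts when a monitoring system publishes a new observation (names and payloads are invented):

      from collections import defaultdict
      from typing import Callable

      class Broker:
          """Minimal event broker: services subscribe to topics and react to events."""
          def __init__(self):
              self._subs = defaultdict(list)
          def subscribe(self, topic: str, handler: Callable):
              self._subs[topic].append(handler)
          def publish(self, topic: str, event: dict):
              for handler in self._subs[topic]:
                  handler(event)

      broker = Broker()

      # Service-oriented side: a fusion service exposed as a callable endpoint.
      def fusion_service(event: dict):
          # Correlate the seismic observation with (hypothetical) creepmeter readings.
          print(f"correlating seismic event at {event['station']} with creepmeter series")

      # Event-driven side: the monitoring system proactively pushes new observations.
      broker.subscribe("seismic.observation", fusion_service)
      broker.publish("seismic.observation", {"station": "GE.STU", "magnitude": 3.1})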

  14. Space station needs, attributes, and architectural options study. Volume 2: Program options, architecture, and technology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Mission scenarios and space station architectures are discussed. Electrical power subsystem (EPS), environmental control and life support subsystem (ECLSS), and reaction control subsystem (RCS) architectures are addressed. Thermal control subsystem (TCS), guidance/navigation and control (GN and C), information management system (IMS), communications and tracking (C and T), and propellant transfer and storage system architectures are discussed.

  15. Learning Outcomes in Affective Domain within Contemporary Architectural Curricula

    ERIC Educational Resources Information Center

    Savic, Marko; Kashef, Mohamad

    2013-01-01

    Contemporary architectural education has shifted from the traditional focus on providing students with specific knowledge and skill sets or "inputs" to an outcome based, student-centred educational approach. Within the outcome based model, students' performance is assessed against measurable objectives that relate acquired knowledge…

  16. A healthcare management system for Turkey based on a service-oriented architecture.

    PubMed

    Herand, Deniz; Gürder, Filiz; Taşkin, Harun; Yuksel, Emre Nuri

    2013-09-01

    The current Turkish healthcare management system has a structure that is extremely unwieldy, cumbersome and inflexible. Furthermore, this structure has no common point of view and thus has no interoperability and responds slowly to innovations. The purpose of this study is to show by which methods the Turkish healthcare management system could be given a structure that is more modern, more flexible and quicker to respond to innovations and changes, taking advantage of the benefits of a service-oriented architecture (SOA). In this paper, the Turkish healthcare management system is chosen to be examined since Turkey is considered one of the Third World countries and the information architecture of the existing healthcare management system of Turkey has not yet been configured with SOA, which is a contemporary innovative approach and should provide the base architecture of the new solution. The innovation of this study is the symbiosis of two main integration approaches, SOA and Health Level 7 (HL7), for integrating divergent healthcare information systems. A model is developed which is based on SOA and enables obtaining a healthcare management system conforming to the SSF standards (HSSP Service Specification Framework) developed within the framework of the HSSP (Healthcare Services Specification Project) under the leadership of HL7 and the Object Management Group.

  17. LTSA Conformance Testing to Architectural Design of LMS Using Ontology

    ERIC Educational Resources Information Center

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper proposes a new methodology for checking conformance of the software architectural design of Learning Management System (LMS) to Learning Technology System Architecture (LTSA). In our approach, the architectural designing of LMS follows the formal modeling style of Acme. An ontology is built to represent the LTSA rules and the software…

  18. PROCESS DOCUMENTATION: A MODEL FOR KNOWLEDGE MANAGEMENT IN ORGANIZATIONS.

    PubMed

    Haddadpoor, Asefeh; Taheri, Behjat; Nasri, Mehran; Heydari, Kamal; Bahrami, Gholamreza

    2015-10-01

    Continuous and interconnected processes are a chain of activities that turn the inputs of an organization into its outputs and help achieve partial and overall goals of the organization. These activities are carried out by two types of knowledge in the organization, called explicit and implicit knowledge. Among these, implicit knowledge is the knowledge that controls a major part of the activities of an organization, controls these activities internally and will not be transferred to the process owners unless they are present during the organization's work. Therefore the goal of this study is identification of implicit knowledge and its integration with explicit knowledge in order to improve human resources management, physical resource management, information resource management, training of new employees and other activities of Isfahan University of Medical Science. The project for documentation of activities in the department of health of Isfahan University of Medical Science was carried out in several stages. First the main processes and related sub-processes were identified and categorized with the help of a planning expert. The categorization was carried out from smaller processes to larger ones. In this stage the experts of each process wrote down all their daily activities and organized them into general categories based on logical and physical relations between different activities. Then each activity was assigned a specific code. The computer software was designed after understanding the different parts of the processes, including main and sub-processes, and categorization, which will be explained in the following sections. The findings of this study showed that documentation of activities can help expose implicit knowledge because all of the inputs and outputs of a process, along with the length, location, tools and different stages of the process, exchanged information, storage location of the information and information flow, can be identified using proper documentation. A documentation program can create a complete identifier for every process of an organization and also acts as the main tool for establishment of information technology as the basis of the organization and helps achieve the goal of having electronic and information technology based organizations. In other words documentation is the starting step in creating an organizational architecture. Afterwards, in order to reach the desired goal of documentation, computer software containing all tools, methods, instructions and guidelines and implicit knowledge of the organization was designed. This software links all relevant knowledge to the main text of the documentation and identification of a process and provides the users with electronic versions of all documentation and helps use the explicit and implicit knowledge of the organization to facilitate the reengineering of the processes in the organization.

  19. Recording Information on Architectural Heritage Should Meet the Requirements for Conservation Digital Recording Practices at the Summer Palace

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Cong, Y.; Wu, C.; Bai, C.; Wu, C.

    2017-08-01

    The recording of Architectural heritage information is the foundation of research, conservation, management, and the display of architectural heritage. In other words, the recording of architectural heritage information supports heritage research, conservation, management and architectural heritage display. What information do we record and collect and what technology do we use for information recording? How do we determine the level of accuracy required when recording architectural information? What method do we use for information recording? These questions should be addressed in relation to the nature of the particular heritage site and the specific conditions for the conservation work. In recent years, with the rapid development of information acquisition technology such as Close Range Photogrammetry, 3D Laser Scanning as well as high speed and high precision Aerial Photogrammetry, many Chinese universities, research institutes and heritage management bureaux have purchased considerable equipment for information recording. However, the lack of understanding of both the nature of architectural heritage and the purpose for which the information is being collected has led to several problems. For example: some institutions when recording architectural heritage information aim solely at high accuracy. Some consider that advanced measuring methods must automatically replace traditional measuring methods. Information collection becomes the purpose, rather than the means, of architectural heritage conservation. Addressing these issues, this paper briefly reviews the history of architectural heritage information recording at the Summer Palace (Yihe Yuan, first built in 1750), Beijing. Using the recording practices at the Summer Palace during the past ten years as examples, we illustrate our achievements and lessons in recording architectural heritage information with regard to the following aspects: (buildings') ideal status desired, (buildings') current status, structural distortion analysis, display, statue restoration and thematic research. Three points will be highlighted in our discussion: 1. Understanding of the heritage is more important than the particular technology used: Architectural heritage information collection and recording are based on an understanding of the value and nature of the architectural heritage. Understanding is the purpose, whereas information collection and recording are the means. 2. Demand determines technology: Collecting and recording architectural heritage information is to serve the needs of heritage research, conservation, management and display. These different needs determine the different technologies that we use. 3. Set the level of accuracy appropriately: For information recording, high accuracy is not the key criterion; rather an appropriate level of accuracy is key. There is considerable deviation between the nominal accuracy of any instrument and the accuracy of any particular measurement.

  20. Executable Architecture Research at Old Dominion University

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.

    2011-01-01

    Executable Architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. To close this gap and identify necessary elements of an executable architecture, a modeling language and a modeling formalism are the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into a common context of state-of-the-art systems engineering methods supporting more agility.

  1. Lifecycle Prognostics Architecture for Selected High-Cost Active Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Lybeck; B. Pham; M. Tawfik

    There is an extensive body of knowledge, and some commercial products are available, for calculating prognostics, remaining useful life, and damage index parameters. The application of these technologies within the nuclear power community is still in its infancy. Online monitoring and condition-based maintenance are seeing increasing acceptance and deployment, and these activities provide the technological basis for expanding to add predictive/prognostic capabilities. In looking to deploy prognostics, three key aspects of systems are presented and discussed: (1) component/system/structure selection, (2) prognostic algorithms, and (3) prognostic architectures. Criteria are presented for component selection: feasibility, failure probability, consequences of failure, and benefits of the prognostics and health management (PHM) system. The basis and methods commonly used for prognostic algorithms are reviewed and summarized. Criteria for evaluating PHM architectures are presented: open, modular architecture; platform independence; graphical user interface for system development and/or results viewing; web-enabled tools; scalability; and standards compatibility. Thirteen software products were identified and discussed in the context of being potentially useful for deployment in a PHM program applied to systems in a nuclear power plant (NPP). These products were evaluated using information available from company websites, product brochures, fact sheets, scholarly publications, and direct communication with vendors. The thirteen products were classified into four groups of software: (1) research tools, (2) PHM system development tools, (3) deployable architectures, and (4) peripheral tools. Eight software tools fell into the deployable architectures category. Of those eight, only two employ all six modules of a full PHM system. Five systems did not offer prognostic estimates, and one system employed the full health monitoring suite but lacked operations and maintenance support. Each product is briefly described in Appendix A. Selection of the most appropriate software package for a particular application will depend on the chosen component, system, or structure. Ongoing research will determine the most appropriate choices for a successful demonstration of PHM systems in aging NPPs.
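
    As a rough illustration of how the component-selection criteria above (feasibility, failure probability, consequences of failure, and PHM benefit) could be combined into a single ranking, the following minimal Python sketch applies a weighted score to hypothetical candidates. The 1-5 scoring convention, the weights, and the component names are invented for the example and are not taken from the report.

```python
# Hypothetical weighted-scoring helper for ranking candidate components for a
# PHM program. Criteria names mirror those listed in the abstract; weights and
# scores are illustrative placeholders, not values from the report.
from dataclasses import dataclass

CRITERIA = ("feasibility", "failure_probability", "failure_consequence", "phm_benefit")

@dataclass
class Candidate:
    name: str
    scores: dict  # criterion -> score on an assumed 1-5 scale

def rank_candidates(candidates, weights):
    """Return candidates sorted by weighted score, highest first."""
    def total(c):
        return sum(weights[k] * c.scores[k] for k in CRITERIA)
    return sorted(candidates, key=total, reverse=True)

if __name__ == "__main__":
    weights = {"feasibility": 0.2, "failure_probability": 0.3,
               "failure_consequence": 0.3, "phm_benefit": 0.2}
    pool = [
        Candidate("circulating water pump", {"feasibility": 4, "failure_probability": 3,
                                             "failure_consequence": 4, "phm_benefit": 4}),
        Candidate("emergency diesel generator", {"feasibility": 3, "failure_probability": 2,
                                                 "failure_consequence": 5, "phm_benefit": 3}),
    ]
    for c in rank_candidates(pool, weights):
        print(c.name)
```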

  2. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.

  3. NASA Performance Report

    NASA Technical Reports Server (NTRS)

    2000-01-01

    NASA's mission is to advance and communicate scientific knowledge and understanding of Earth, the solar system, and the universe; to advance human exploration, use, and development of space; and to research, develop, verify, and transfer advanced aeronautics, space, and related technologies. In support of this mission, NASA has a strategic architecture that consists of four Enterprises supported by four Crosscutting Processes. The Strategic Enterprises are NASA's primary mission areas: Earth Science, Space Science, Human Exploration and Development of Space, and Aerospace Technology. NASA's Crosscutting Processes are Manage Strategically, Provide Aerospace Products and Capabilities, Generate Knowledge, and Communicate Knowledge. The implementation of NASA programs, science, and technology research occurs primarily at our Centers. NASA consists of a Headquarters, nine Centers, and the Jet Propulsion Laboratory, as well as several ancillary installations and offices in the United States and abroad. The nine Centers are as follows: (1) Ames Research Center, (2) Dryden Flight Research Center (DFRC), (3) Glenn Research Center (GRC), (4) Goddard Space Flight Center (GSFC), (5) Johnson Space Center, (6) Kennedy Space Center (KSC), (7) Langley Research Center (LaRC), (8) Marshall Space Flight Center (MSFC), and (9) Stennis Space Center (SSC).

  4. Second Generation RLV Space Vehicle Concept

    NASA Astrophysics Data System (ADS)

    Bailey, M. D.; Daniel, C. C.

    2002-01-01

    NASA has a long history of conducting development programs and projects in a consistent fashion. Systems Engineering within those programs and projects has also followed a given method outlined by such documents as the NASA Systems Engineering Handbook. The relatively new NASA Space Launch Initiative (SLI) is taking a new approach to developing a space vehicle, with innovative management methods as well as new Systems Engineering processes. With the program less than a year into its life cycle, the efficacy of these new processes has yet to be proven or disproven. At $776M for Phase I, SLI represents a major portion of the NASA focus; however, the new processes being incorporated are not reflected in the training provided by NASA to its engineers. The NASA Academy of Program and Project Leadership (APPL) offers core classes in program and project management and systems engineering to NASA employees with the purpose of creating a "knowledge community where ideas, skills, and experiences are exchanged to increase each other's capacity for strong leadership". The SLI program is, in one sense, a combination of a conceptual design program and a technology program. The program as a whole doesn't map into the generic systems engineering project cycle as currently, and for some time, taught. For example, the NASA APPL Systems Engineering training course teaches that the "first step in developing an architecture is to define the external boundaries of the system", which will require definition of the interfaces with other systems, and that the next step will be to "define all the components that make up the next lower level of the system hierarchy", where fundamental requirements are allocated to each component. In contrast, the SLI technology risk reduction approach develops architecture subsystem technologies prior to developing architectures. The higher level architecture requirements are not allowed to fully develop and undergo decomposition and allocation down to the subsystems before the subsystems must develop allocated requirements based on the highest level of requirements. In the vernacular of the project cycles prior to the mid 1990's, the architecture definition portion of the program appears to be at a generic Phase A stage, while the subsystems are operating at Phase B. Even the management structure of the SLI program is innovative in its approach to Systems Engineering and is not reflected in the APPL training modules. The SLI program has established a Systems Engineering office separate from the architecture development and subsystem technology development offices, while that office does have representatives within these other offices. The distributed resources of the Systems Engineering Office are co-located with the respective Project Offices. This template is intended to provide systems engineering as an integrated function at the Program Level. Undoubtedly, the program management of SLI and the NIAT agree that "program/project managers and the systems engineering team must work closely together towards the single objective of delivering quality products that meet the customer needs". This paper will explore the differences between the methods being taught by NASA, which represent decades of ideas, and those currently in practice in SLI. Time will tell if the innovation employed by SLI will prove to be the model of the future. For now, it is suggested that the training of the present exercise the flexibility of recognizing the new processes employed by a major new NASA program.

  5. Second Generation RLV Space Vehicle Concept

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Throckmorton, David A. (Technical Monitor)

    2002-01-01

    NASA has a long history of conducting development programs and projects in a consistent fashion. Systems Engineering within those programs and projects has also followed a given method outlined by such documents as the NASA Systems Engineering Handbook. The relatively new NASA Space Launch Initiative (SLI) is taking a new approach to developing a space vehicle, with innovative management methods as well as new Systems Engineering processes. With the program less than a year into its life cycle, the efficacy of these new processes has yet to be proven or disproven. At $776M for Phase 1, SLI represents a major portion of the NASA focus; however, the new processes being incorporated are not reflected in the training provided by NASA to its engineers. The NASA Academy of Program and Project Leadership (APPL) offers core classes in program and project management and systems engineering to NASA employees with the purpose of creating a "knowledge community where ideas, skills, and experiences are exchanged to increase each other's capacity for strong leadership". The SLI program is, in one sense, a combination of a conceptual design program and a technology program. The program as a whole doesn't map into the generic systems engineering project cycle as currently, and for some time, taught. For example, the NASA APPL Systems Engineering training course teaches that the "first step in developing an architecture is to define the external boundaries of the system", which will require definition of the interfaces with other systems, and that the next step will be to "define all the components that make up the next lower level of the system hierarchy", where fundamental requirements are allocated to each component. In contrast, the SLI technology risk reduction approach develops architecture subsystem technologies prior to developing architectures. The higher level architecture requirements are not allowed to fully develop and undergo decomposition and allocation down to the subsystems before the subsystems must develop allocated requirements based on the highest level of requirements. In the vernacular of the project cycles prior to the mid 1990's, the architecture definition portion of the program appears to be at a generic Phase A stage, while the subsystems are operating at Phase B. Even the management structure of the SLI program is innovative in its approach to Systems Engineering and is not reflected in the APPL training modules. The SLI program has established a Systems Engineering office separate from the architecture development and subsystem technology development offices, while that office does have representatives within these other offices. The distributed resources of the Systems Engineering Office are co-located with the respective Project Offices. This template is intended to provide systems engineering as an integrated function at the Program Level. The program management of SLI and the NIAT agree that "program/project managers and the systems engineering team must work closely together towards the single objective of delivering quality products that meet the customer needs". This paper will explore the differences between the methods being taught by NASA, which represent decades of ideas, and those currently in practice in SLI. Time will tell if the innovation employed by SLI will prove to be the model of the future. For now, it is suggested that the training of the present exercise the flexibility of recognizing the new processes employed by a major new NASA program.

  6. Functional Interface Considerations within an Exploration Life Support System Architecture

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Sargusingh, Miriam J.; Toomarian, Nikzad

    2016-01-01

    As notional life support system (LSS) architectures are developed and evaluated, myriad options must be considered pertaining to process technologies, components, and equipment assemblies. Each option must be evaluated relative to its impact on key functional interfaces within the LSS architecture. A leading notional architecture has been developed to guide the path toward realizing future crewed space exploration goals. This architecture includes atmosphere revitalization, water recovery and management, and environmental monitoring subsystems. Guiding requirements for developing this architecture are summarized and important interfaces within the architecture are discussed. The role of environmental monitoring within the architecture is described.

  7. Evaluating non-relational storage technology for HEP metadata and meta-data catalog

    NASA Astrophysics Data System (ADS)

    Grigorieva, M. A.; Golosova, M. V.; Gubin, M. Y.; Klimentov, A. A.; Osipova, V. V.; Ryabinkin, E. A.

    2016-10-01

    Large-scale scientific experiments produce vast volumes of data. These data are stored, processed and analyzed in a distributed computing environment. The life cycle of an experiment is managed by specialized software such as Distributed Data Management and Workload Management Systems. In order to be interpreted and mined, experimental data must be accompanied by auxiliary metadata, which are recorded at each data processing step. Metadata describe the scientific data and represent scientific objects or results of scientific experiments, allowing them to be shared by various applications, recorded in databases, or published via the Web. Processing and analysis of the constantly growing volume of auxiliary metadata is a challenging task, no simpler than the management and processing of the experimental data itself. Furthermore, metadata sources are often loosely coupled and may lead to inconsistencies in combined information queries presented to end users. To aggregate and synthesize a range of primary metadata sources, and to enhance them with flexible, schema-less addition of aggregated data, we are developing the Data Knowledge Base architecture serving as the intelligence behind GUIs and APIs.
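
    A minimal sketch of the aggregation idea described above: metadata records from loosely coupled sources are merged into one schema-less document while keeping provenance and flagging inconsistencies. The source names and fields are hypothetical and do not describe the actual Data Knowledge Base implementation.

```python
# Illustrative sketch only: merging task metadata from loosely coupled sources
# into a single schema-less record, keeping provenance and flagging conflicts.
def aggregate(records_by_source):
    """records_by_source: {source_name: {field: value}} -> merged document."""
    merged, conflicts = {}, {}
    for source, record in records_by_source.items():
        for field, value in record.items():
            if field in merged and merged[field]["value"] != value:
                conflicts.setdefault(field, []).append((source, value))
            else:
                merged.setdefault(field, {"value": value, "source": source})
    return {"fields": merged, "conflicts": conflicts}

if __name__ == "__main__":
    doc = aggregate({
        "workload_manager": {"task_id": 12345, "status": "done", "n_events": 1000000},
        "data_manager": {"task_id": 12345, "dataset": "mc16_13TeV.some.sample", "n_events": 999500},
    })
    print(doc["conflicts"])   # {'n_events': [('data_manager', 999500)]}
```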

  8. DCS: A Case Study of Identification of Knowledge and Disposition Gaps Using Principles of Continuous Risk Management

    NASA Technical Reports Server (NTRS)

    Norcross, Jason; Steinberg, Susan; Kundrot, Craig; Charles, John

    2011-01-01

    The Human Research Program (HRP) is formulated around the program architecture of Evidence-Risk-Gap-Task-Deliverable. Review of accumulated evidence forms the basis for identification of high priority risks to human health and performance in space exploration. Gaps in knowledge or disposition are identified for each risk, and a portfolio of research tasks is developed to fill them. Deliverables from the tasks inform the evidence base, with the ultimate goal of defining the level of risk and reducing it to an acceptable level. A comprehensive framework for gap identification, focus, and metrics has been developed based on principles of continuous risk management and clinical care. Research on knowledge gaps improves understanding of the likelihood, consequence or timeframe of the risk. Disposition gaps include development of standards or requirements for risk acceptance, development of countermeasures or technology to mitigate the risk, and yearly technology assessments to track developments related to the risk. Standard concepts from clinical care (prevention, diagnosis, treatment, monitoring, rehabilitation, and surveillance) can be used to focus gaps dealing with risk mitigation. The research plan for the new HRP Risk of Decompression Sickness (DCS) used the framework to identify one disposition gap related to establishment of a DCS standard for acceptable risk, two knowledge gaps related to the DCS phenomenon and mission attributes, and three mitigation gaps focused on prediction, prevention, and a new-technology watch. These gaps were organized in this manner primarily based on the target for closure and the ease of organizing interim metrics, so that gap status could be quantified. Additional considerations for the knowledge gaps were that one was highly specific to the design reference mission and the other was focused on the DCS phenomenon.

  9. Improving Project Management Using Formal Models and Architectures

    NASA Technical Reports Server (NTRS)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages that formal modeling and architecture bring to project management. These emerging technologies have both great potential and challenges for improving the information available for decision-making. The presentation covers standards, tools and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  10. 41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What basic Art-in-Architecture policy governs Federal agencies? 102-77.10 Section 102-77.10 Public Contracts and Property... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy...

  11. 41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What basic Art-in-Architecture policy governs Federal agencies? 102-77.10 Section 102-77.10 Public Contracts and Property... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy...

  12. 41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What basic Art-in-Architecture policy governs Federal agencies? 102-77.10 Section 102-77.10 Public Contracts and Property... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy...

  13. Developing Enterprise Architectures to Address the Enterprise Dilemma of Deciding What Should Be Sustained versus What Should Be Changed

    ERIC Educational Resources Information Center

    Harrell, J. Michael

    2011-01-01

    Enterprise architecture is a relatively new concept that arose in the latter half of the twentieth century as a means of managing the information technology resources within the enterprise. Borrowing from the disciplines of brick and mortar architecture, software engineering, software architecture, and systems engineering, the enterprise…

  14. 75 FR 68806 - Statement of Organization, Functions and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-09

    ... Agency business applications architectures, the engineering of business processes, the building and... architecture, engineers technology for business processes, builds, deploys, maintains and manages enterprise systems and data collections efforts; (5) applies business applications architecture to process specific...

  15. Context-aware workflow management of mobile health applications.

    PubMed

    Salden, Alfons; Poortinga, Remco

    2006-01-01

    We propose a medical application management architecture that allows medical (IT) experts to readily design, develop and deploy context-aware mobile health (m-health) applications or services. In particular, we elaborate on how our application workflow management architecture enables chaining, coordinating, composing, and adapting context-sensitive medical application components such that critical Quality of Service (QoS) and Quality of Context (QoC) requirements typical for m-health applications or services can be met. This functional architectural support requires learning modules for distilling application-critical selections of attention and anticipation models. These models help medical experts construct and adjust m-health application workflows and workflow strategies on the fly. We illustrate our context-aware workflow management paradigm for an m-health data delivery problem in which optimal communication network configurations have to be determined.
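
    As a hedged illustration of the chaining-and-adaptation idea (not the authors' architecture), the sketch below runs a workflow of components that each report a quality estimate and swaps in a fallback component when an assumed QoS/QoC threshold is not met. All component names and thresholds are invented.

```python
# Minimal sketch under assumed interfaces: each step reports a quality estimate,
# and the workflow adapts by swapping in a fallback step when the estimate is
# below the required threshold. Names are hypothetical.
def run_workflow(steps, context, min_quality=0.8):
    """steps: list of (primary, fallback) callables returning (result, quality)."""
    results = []
    for primary, fallback in steps:
        result, quality = primary(context)
        if quality < min_quality and fallback is not None:
            result, quality = fallback(context)   # adapt the chain on the fly
        results.append((result, quality))
    return results

def deliver_over_wlan(ctx):
    return ("ecg sent via WLAN", 0.6 if ctx["wlan_congested"] else 0.95)

def deliver_over_cellular(ctx):
    return ("ecg sent via cellular", 0.9)

if __name__ == "__main__":
    print(run_workflow([(deliver_over_wlan, deliver_over_cellular)],
                       {"wlan_congested": True}))
```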

  16. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
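
    The following toy sketch illustrates, under assumed entity and column names, the kind of data-definition step described above: a declarative data model is turned into a relational table schema. It is only a conceptual illustration, not the patented implementation.

```python
# Illustrative only: a toy "data definition" step that turns a declarative
# entity description into CREATE TABLE statements, in the spirit of the data
# model / data definition language described above. Entities and columns are
# hypothetical.
MODEL = {
    "building": {"id": "INTEGER PRIMARY KEY", "name": "TEXT", "gross_area_m2": "REAL"},
    "zone":     {"id": "INTEGER PRIMARY KEY",
                 "building_id": "INTEGER REFERENCES building(id)",
                 "usage": "TEXT"},
}

def to_ddl(model):
    """Generate one CREATE TABLE statement per entity in the model."""
    stmts = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns.items())
        stmts.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return "\n\n".join(stmts)

if __name__ == "__main__":
    print(to_ddl(MODEL))
```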

  17. Harnessing the Risk-Related Data Supply Chain: An Information Architecture Approach to Enriching Human System Research and Operations Knowledge

    NASA Technical Reports Server (NTRS)

    Buquo, Lynn; Johnson-Throop, Kathy

    2010-01-01

    NASA's Human Research Program (HRP) and Space Life Sciences Directorate (SLSD), not unlike many NASA organizations today, struggle with the inherent inefficiencies caused by dependencies on heterogeneous data systems and silos of data and information spread across decentralized discipline domains. The capture of operational and research-based data/information (both in-flight and ground-based) in disparate IT systems impedes the extent to which that data/information can be efficiently and securely shared, analyzed, and enriched into knowledge that directly and more rapidly supports HRP's research-focused human system risk mitigation efforts and SLSD's operationally oriented risk management efforts. As a result, an integrated effort is underway to more fully understand and document how specific sets of risk-related data/information are generated and used and in what IT systems that data/information currently resides. By mapping the risk-related data flow from raw data to usable information and knowledge (think of it as the data supply chain), HRP and SLSD are building an information architecture plan to leverage their existing, shared IT infrastructure. In addition, it is important to create a centralized structured tool to represent risks, including attributes such as likelihood, consequence, contributing factors, and the evidence supporting the information in all these fields. Representing the risks in this way enables reasoning about the risks, e.g. revisiting a risk assessment when a mitigation strategy is unavailable, updating a risk assessment when new information becomes available, etc. Such a system also provides a concise way to communicate the risks both within the organization as well as with collaborators. Understanding and, hence, harnessing the human system risk-related data supply chain enhances both organizations' abilities to securely collect, integrate, and share data assets that improve human system research and operations.

  18. A context management system for a cost-efficient smart home platform

    NASA Astrophysics Data System (ADS)

    Schneider, J.; Klein, A.; Mannweiler, C.; Schotten, H. D.

    2012-09-01

    This paper presents an overview of state-of-the-art architectures for integrating wireless sensor and actuator networks into the Future Internet. Furthermore, we address advantages and disadvantages of the different architectures. With respect to these criteria, we develop a new architecture overcoming these weaknesses. Our system, called the Smart Home Context Management System, will be used for intelligent home utilities, appliances, and electronics and includes physical, logical as well as network context sources within one concept. It considers important aspects and requirements of modern context management systems for smart X applications: plug and play as well as plug and trust capabilities, scalability, extensibility, security, and adaptability. As such, it is able to control roller blinds and heating systems as well as learn, for example, the user's taste with respect to home entertainment (music, videos, etc.). Moreover, Smart Grid applications and Ambient Assisted Living (AAL) functions are applicable. With respect to AAL, we included an Emergency Handling function. It assures that emergency calls (police, ambulance or fire department) are processed appropriately. Our concept is based on a centralized Context Broker architecture, enhanced by a distributed Context Broker system. The goal of this concept is to develop a simple, low-priced, multi-functional, and safe architecture affordable for everybody. Individual components of the architecture are well tested. Implementation and testing of the architecture as a whole is in progress.
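
    A minimal sketch of the centralized Context Broker idea, with hypothetical context names: sources publish named context items and consumers subscribe to them. It is only meant to illustrate the pattern, not the Smart Home Context Management System itself.

```python
# Hypothetical context-broker sketch illustrating the centralized broker idea:
# context sources publish named context items, consumers subscribe to them.
from collections import defaultdict

class ContextBroker:
    def __init__(self):
        self._subscribers = defaultdict(list)   # context name -> callbacks
        self._latest = {}                       # context name -> last value

    def subscribe(self, name, callback):
        self._subscribers[name].append(callback)

    def publish(self, name, value):
        self._latest[name] = value
        for cb in self._subscribers[name]:
            cb(name, value)

if __name__ == "__main__":
    broker = ContextBroker()
    broker.subscribe("living_room.temperature",
                     lambda n, v: print(f"heating controller saw {n} = {v} degC"))
    broker.publish("living_room.temperature", 17.5)
```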

  19. Module Architecture for in Situ Space Laboratories

    NASA Technical Reports Server (NTRS)

    Sherwood, Brent

    2010-01-01

    The paper analyzes internal outfitting architectures for space exploration laboratory modules. ISS laboratory architecture is examined as a baseline for comparison, and applicable insights are derived. Laboratory functional programs are defined for seven planet-surface knowledge domains. Necessary and value-added departures from the ISS architecture standard are defined, and three sectional interior architecture options are assessed for practicality and potential performance. Contemporary guidelines for terrestrial analytical laboratory design are found to be applicable to the in-space functional program. Dense-packed racks of system equipment, and high module volume packing ratios, should not be assumed as the default solution for exploration laboratories whose primary activities include un-scriptable investigations and experimentation on the system equipment itself.

  20. Real-Time Cognitive Computing Architecture for Data Fusion in a Dynamic Environment

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A.; Duong, Vu A.

    2012-01-01

    A novel cognitive computing architecture is conceptualized for processing multiple channels of multi-modal sensory data streams simultaneously, and fusing the information in real time to generate intelligent reaction sequences. This unique architecture is capable of assimilating parallel data streams that could be analog, digital, or synchronous/asynchronous, and could be programmed to act as a knowledge synthesizer and/or an "intelligent perception" processor. In this architecture, the bio-inspired models of visual pathway and olfactory receptor processing are combined as processing components to achieve the composite function of "searching for a source of food while avoiding the predator." The architecture is particularly suited for scene analysis from visual and odorant data.

  1. Ervin Zube and landscape architecture

    Treesearch

    Paul H. Gobster

    2002-01-01

    As he grew in his knowledge about the landscape through his involvement in it as a person, student, practitioner, teacher, program director, and researcher, Ervin Zube's ideas about what landscape architecture is and should be continually evolved. He was a prolific writer whose publications span a broad range of audiences, and his contributions to ...

  2. Proposing an Optimal Learning Architecture for the Digital Enterprise.

    ERIC Educational Resources Information Center

    O'Driscoll, Tony

    2003-01-01

    Discusses the strategic role of learning in information age organizations; analyzes parallels between the application of technology to business and the application of technology to learning; and proposes a learning architecture that aligns with the knowledge-based view of the firm and optimizes the application of technology to achieve proficiency…

  3. Integrating Environmental and Information Systems Management: An Enterprise Architecture Approach

    NASA Astrophysics Data System (ADS)

    Noran, Ovidiu

    Environmental responsibility is fast becoming an important aspect of strategic management as the reality of climate change settles in and relevant regulations are expected to tighten significantly in the near future. Many businesses react to this challenge by implementing environmental reporting and management systems. However, the environmental initiative is often not properly integrated in the overall business strategy and its information system (IS) and as a result the management does not have timely access to (appropriately aggregated) environmental information. This chapter argues for the benefit of integrating the environmental management (EM) project into the ongoing enterprise architecture (EA) initiative present in all successful companies. This is done by demonstrating how a reference architecture framework and a meta-methodology using EA artefacts can be used to co-design the EM system, the organisation and its IS in order to achieve a much needed synergy.

  4. An Ontology of Quality Initiatives and a Model for Decentralized, Collaborative Quality Management on the (Semantic) World Wide Web

    PubMed Central

    2001-01-01

    This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may in the future interact with each other. This vision fits into the evolving "Semantic Web" architecture - i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. One first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of how this collaboration could look, and provide some recommendations on what the role of the World Health Organization (WHO) and other policy makers in this framework could be. PMID:11772549

  5. An ontology of quality initiatives and a model for decentralized, collaborative quality management on the (semantic) World-Wide-Web.

    PubMed

    Eysenbach, G

    2001-01-01

    This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may in the future interact with each other. This vision fits into the evolving "Semantic Web" architecture - i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. One first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of how this collaboration could look, and provide some recommendations on what the role of the World Health Organization (WHO) and other policy makers in this framework could be.

  6. Experiment Management System for the SND Detector

    NASA Astrophysics Data System (ADS)

    Pugachev, K.

    2017-10-01

    We present a new experiment management system for the SND detector at the VEPP-2000 collider (Novosibirsk). An important part of the system is access to the experimental databases (configuration, conditions, and metadata). The system is designed in a client-server architecture, and user interaction takes place through a web interface. The server side includes several logical layers: user interface templates; template variables description and initialization; and implementation details. The templates are meant to require as little IT knowledge as possible. Experiment configuration, conditions, and metadata are stored in a database. Node.js, a modern JavaScript framework, has been chosen to implement the server side, and a new template engine with an interesting feature has been designed. Part of the system has been put into production: it includes templates for showing and editing the first-level trigger configuration and the equipment configuration, as well as for showing experiment metadata and the experiment conditions data index.

  7. A Content Markup Language for Data Services

    NASA Astrophysics Data System (ADS)

    Noviello, C.; Acampa, P.; Mango Furnari, M.

    Network content delivery and document sharing are possible using a variety of technologies, such as distributed databases, service-oriented applications, and so forth. The development of such systems is a complex job, because the document life cycle involves strong cooperation between domain experts and software developers. Furthermore, emerging software methodologies, such as service-oriented architecture and knowledge organization (e.g., the semantic web), have not really solved the problems faced in a real distributed and cooperative setting. In this chapter the authors' efforts to design and deploy a distributed and cooperative content management system are described. The main features of the system are a user-configurable document type definition and a management middleware layer, which allows CMS developers to orchestrate the composition of specialized software components around the structure of a document. The chapter also reports some of the experience gained in deploying the developed framework in a cultural heritage dissemination setting.

  8. Cooperating knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Edward A.; Buchanan, Bruce G.

    1988-01-01

    This final report covers work performed under Contract NCC2-220 between NASA Ames Research Center and the Knowledge Systems Laboratory, Stanford University. The period of research was from March 1, 1987 to February 29, 1988. Topics covered were as follows: (1) concurrent architectures for knowledge-based systems; (2) methods for the solution of geometric constraint satisfaction problems, and (3) reasoning under uncertainty. The research in concurrent architectures was co-funded by DARPA, as part of that agency's Strategic Computing Program. The research has been in progress since 1985, under DARPA and NASA sponsorship. The research in geometric constraint satisfaction has been done in the context of a particular application, that of determining the 3-D structure of complex protein molecules, using the constraints inferred from NMR measurements.

  9. Analysis and Design of the Innovation and Entrepreneurship Training Management System based on School Enterprise Cooperation (Taking the School of Computer and Information Engineering of Beijing University of Agriculture as an example)

    NASA Astrophysics Data System (ADS)

    Qianyi, Zhang; Xiaoshun, Li; Ping, Hu; Lu, Ning

    2018-03-01

    With the promotion of the "3+1" undergraduate training mode at Beijing University of Agriculture, the mode and direction of training applied and interdisciplinary talent need to be made more concrete. At the same time, to compensate for the shortage of dual-qualified teachers in the school and the lack of teaching cases covering advanced industrial technology, the school actively encourages cooperation between the teaching units and enterprises, connecting enterprise resources closely with the school teaching system and using the "1" year of "3+1" to carry out innovation training for students. This approach helps college students integrate theory with practice and realize the goal of applying the knowledge gained in higher education. However, in actual student training management this kind of cooperation involves three parties and their personnel, so it is difficult to achieve unified management, and poor communication can lead to unsatisfactory training results. Moreover, without a good training supervision mechanism, student training easily becomes a mere formality. To solve these problems, this paper designs a student innovation and entrepreneurship training management system based on school-enterprise cooperation; the system can effectively manage the work related to student training and address the problems above. The work is based on the innovation and entrepreneurship training carried out in the School of Computer and Information Engineering of Beijing University of Agriculture. The system software is designed using a B/S (browser/server) architecture and is divided into three layers. The application logic layer covers the business related to student training management and implements the users' basic operations for student training: users can manage the basic information of enterprises, colleges and students through the system, and also carry out the information operations of student training management [1]. The data layer creates the database through MySQL and provides data storage for the whole system.

  10. Trust information-based privacy architecture for ubiquitous health.

    PubMed

    Ruotsalainen, Pekka Sakari; Blobel, Bernd; Seppälä, Antto; Nykänen, Pirkko

    2013-10-08

    Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network takes place in open and unsecure information space. It is created and managed by the DS, who sets rules that regulate the way personal health information is collected and used. Compared to health care, it is impossible in ubiquitous health to assume the existence of a priori trust between the DS and service providers and to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations that systems follow often remain unknown. Furthermore, health care-specific regulations do not rule the ways health data is processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service- and system-specific rules that regulate the way subject data is processed. The architecture should provide to the DS reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback upon how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. A sequential method that combines methodologies used in system theory, systems engineering, requirement analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. Based on principles, models, and requirements, architectural components and their interconnections were developed using system analysis. The architecture mimics the way humans use trust information in decision making, and enables the DS to design system-specific privacy policies using computational trust information that is based on systems' measured features. The trust attributes that were developed describe the level of systems' support for awareness and transparency, and how they follow general and domain-specific regulations and laws. The monitoring component of the architecture offers dynamic feedback concerning how the system enforces the policies of the DS. The privacy management architecture developed in this study enables the DS to dynamically manage information privacy in ubiquitous health and to define individual policies for all systems considering their trust value and corresponding attributes. The DS can also set policies for secondary use and reuse of health information. The architecture offers protection against privacy threats existing in ubiquitous environments. Although the architecture is targeted to ubiquitous health, it can easily be modified to other ubiquitous applications.
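
    To illustrate the general notion of computational trust used above (not the paper's actual attribute set or formulas), the sketch below aggregates assumed measured attributes into a trust value and checks it against a data subject's per-purpose policy threshold. Attribute names, weights, and thresholds are invented for the example.

```python
# Hedged illustration of computational trust: a system's measured attributes
# are aggregated into a trust value and compared against a DS policy threshold.
def trust_value(attributes, weights):
    """attributes: {name: score in [0, 1]}; returns weighted mean in [0, 1]."""
    total_weight = sum(weights.values())
    return sum(weights[k] * attributes.get(k, 0.0) for k in weights) / total_weight

def allowed(purpose, trust, policy):
    """policy: {purpose: minimum trust required}."""
    return trust >= policy.get(purpose, 1.0)

if __name__ == "__main__":
    measured = {"transparency": 0.9, "regulatory_compliance": 0.8, "awareness_support": 0.7}
    weights = {"transparency": 0.4, "regulatory_compliance": 0.4, "awareness_support": 0.2}
    t = trust_value(measured, weights)
    print(allowed("share_with_wellness_service", t,
                  {"share_with_wellness_service": 0.75}))   # True if t >= 0.75
```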

  11. Trust Information-Based Privacy Architecture for Ubiquitous Health

    PubMed Central

    2013-01-01

    Background: Ubiquitous health is defined as a dynamic network of interconnected systems that offers health services independent of time and location to a data subject (DS). The network takes place in open and unsecure information space. It is created and managed by the DS, who sets rules that regulate the way personal health information is collected and used. Compared to health care, it is impossible in ubiquitous health to assume the existence of a priori trust between the DS and service providers and to produce privacy using static security services. In ubiquitous health, the features, business goals, and regulations that systems follow often remain unknown. Furthermore, health care-specific regulations do not rule the ways health data is processed and shared. To be successful, ubiquitous health requires a novel privacy architecture. Objective: The goal of this study was to develop a privacy management architecture that helps the DS to create and dynamically manage the network and to maintain information privacy. The architecture should enable the DS to dynamically define service- and system-specific rules that regulate the way subject data is processed. The architecture should provide to the DS reliable trust information about systems and assist in the formulation of privacy policies. Furthermore, the architecture should give feedback upon how systems follow the policies of the DS and offer protection against privacy and trust threats existing in ubiquitous environments. Methods: A sequential method that combines methodologies used in system theory, systems engineering, requirement analysis, and system design was used in the study. In the first phase, principles, trust and privacy models, and viewpoints were selected. Thereafter, functional requirements and services were developed on the basis of a careful analysis of existing research published in journals and conference proceedings. Based on principles, models, and requirements, architectural components and their interconnections were developed using system analysis. Results: The architecture mimics the way humans use trust information in decision making, and enables the DS to design system-specific privacy policies using computational trust information that is based on systems' measured features. The trust attributes that were developed describe the level of systems' support for awareness and transparency, and how they follow general and domain-specific regulations and laws. The monitoring component of the architecture offers dynamic feedback concerning how the system enforces the policies of the DS. Conclusions: The privacy management architecture developed in this study enables the DS to dynamically manage information privacy in ubiquitous health and to define individual policies for all systems considering their trust value and corresponding attributes. The DS can also set policies for secondary use and reuse of health information. The architecture offers protection against privacy threats existing in ubiquitous environments. Although the architecture is targeted to ubiquitous health, it can easily be modified to other ubiquitous applications. PMID:25099213

  12. [E-Learning in radiology; the practical use of the content management system ILIAS].

    PubMed

    Schütze, B; Mildenberger, P; Kämmerer, M

    2006-05-01

    Due to the possibility of using different kinds of visualization, e-learning has the advantage of allowing individualized learning. The aim of this study was to determine whether the use of the web-based content management system ILIAS simplifies the writing and production of electronic learning modules in radiology. Internet-based e-learning provides access to existing learning modules regardless of time and location, since fast Internet connections are readily available. Web Content Management Systems (WCMS) are suitable platforms for imparting radiology-related information (visual abilities such as the recognition of patterns, as well as interdisciplinary specialized knowledge). The open source product ILIAS is a free WCMS. It is used by many universities and is accepted by both students and lecturers. Its modular and object-oriented software architecture makes it easy to adapt and enlarge the platform. The use of e-learning standards such as LOM and SCORM within ILIAS makes it possible to reuse content, even if the platform has to be changed. ILIAS renders it possible to provide students with texts, images, or files of any other kind within a learning context defined by the lecturer. Students can check their acquired knowledge via online testing and receive direct performance feedback. The significant interest that students have shown in ILIAS proves that e-learning can be a useful addition to conventional learning methods.

  13. A Case Study on Neural Inspired Dynamic Memory Management Strategies for High Performance Computing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vineyard, Craig Michael; Verzi, Stephen Joseph

    As high performance computing architectures pursue more computational power, there is a need for increased memory capacity and bandwidth as well. A multi-level memory (MLM) architecture addresses this need by combining multiple memory types with different characteristics as varying levels of the same architecture. How to efficiently utilize this memory infrastructure is an open challenge, and in this research we sought to investigate whether neural inspired approaches can meaningfully help with memory management. In particular we explored neurogenesis inspired resource allocation, and were able to show that a neural inspired mixed controller policy can beneficially impact how MLM architectures utilize memory.
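
    As a stand-in for the neural-inspired controller studied in the report (whose details are not given here), the following sketch shows a simple "mixed" placement policy for a two-level memory: a static size rule blended with a feedback signal based on expected access frequency. The capacities and thresholds are illustrative only.

```python
# Illustrative sketch only: a mixed allocation policy for a two-level memory,
# blending a static rule (small allocations go fast) with a feedback term based
# on expected access frequency. Not the report's actual algorithm.
class TwoLevelAllocator:
    def __init__(self, fast_capacity, small_threshold=4096):
        self.fast_capacity = fast_capacity
        self.fast_used = 0
        self.small_threshold = small_threshold

    def place(self, size_bytes, expected_accesses_per_byte):
        fits_fast = self.fast_used + size_bytes <= self.fast_capacity
        hot = expected_accesses_per_byte > 1.0        # feedback signal
        small = size_bytes <= self.small_threshold    # static rule
        if fits_fast and (hot or small):
            self.fast_used += size_bytes
            return "fast"
        return "slow"

if __name__ == "__main__":
    alloc = TwoLevelAllocator(fast_capacity=1 << 20)
    print(alloc.place(2048, 0.1))      # small -> fast
    print(alloc.place(1 << 19, 5.0))   # hot -> fast
    print(alloc.place(1 << 21, 0.2))   # large and cold -> slow
```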

  14. Internet-enabled collaborative agent-based supply chains

    NASA Astrophysics Data System (ADS)

    Shen, Weiming; Kremer, Rob; Norrie, Douglas H.

    2000-12-01

    This paper presents some results of our recent research work related to the development of a new Collaborative Agent System Architecture (CASA) and an Infrastructure for Collaborative Agent Systems (ICAS). Initially proposed as a general architecture for Internet-based collaborative agent systems (particularly complex industrial collaborative agent systems), the proposed architecture is very suitable for managing the Internet-enabled complex supply chain of a large manufacturing enterprise. The general collaborative agent system architecture, with its basic communication and cooperation services, domain-independent components, prototypes and mechanisms, is described. Benefits of implementing Internet-enabled supply chains with the proposed infrastructure are discussed. A case study on Internet-enabled supply chain management is presented.

  15. Information Architecture: Looking Ahead.

    ERIC Educational Resources Information Center

    Rosenfeld, Louis

    2002-01-01

    Considers the future of the field of information architecture. Highlights include a comparison with the growth of the field of professional management; the design of information systems since the Web; more demanding users; the need for an interdisciplinary approach; and how to define information architecture. (LRW)

  16. An architecture and protocol for communications satellite constellations regarded as multi-agent systems

    NASA Technical Reports Server (NTRS)

    Lindley, Craig A.

    1995-01-01

    This paper presents an architecture for satellites regarded as intercommunicating agents. The architecture is based upon a postmodern paradigm of artificial intelligence in which represented knowledge is regarded as text, inference procedures are regarded as social discourse and decision making conventions and the semantics of representations are grounded in the situated behaviour and activity of agents. A particular protocol is described for agent participation in distributed search and retrieval operations conducted as joint activities.

  17. The sixth generation robot in space

    NASA Technical Reports Server (NTRS)

    Butcher, A.; Das, A.; Reddy, Y. V.; Singh, H.

    1990-01-01

    The knowledge-based simulator developed in the artificial intelligence laboratory has become a working test bed for experimenting with intelligent reasoning architectures. Recently, small experiments have been carried out with this simulator with the aim of simulating robot behavior that avoids colliding paths. Automatically extending such experiments to intelligently planning robots in space demands advanced reasoning architectures. One such architecture for general purpose problem solving is explored. The robot, seen as a knowledge base machine, proceeds via a predesigned abstraction mechanism for problem understanding and response generation. The three phases in one such abstraction scheme are: abstraction for representation, abstraction for evaluation, and abstraction for resolution. Such abstractions require multimodality. This multimodality requires the use of intensional variables to deal with beliefs in the system. Abstraction mechanisms help in synthesizing possible propagating lattices for such beliefs. The machine controller enters into a sixth generation paradigm.

  18. Implementation of a frame-based representation in CLIPS

    NASA Technical Reports Server (NTRS)

    Assal, Hisham; Myers, Leonard

    1990-01-01

    Knowledge representation is one of the major concerns in expert systems. The representation of domain-specific knowledge should agree with the nature of the domain entities and their use in the real world. For example, architectural applications deal with objects and entities such as spaces, walls, and windows. A natural way of representing these architectural entities is provided by frames. This research explores the potential of using the expert system shell CLIPS, developed by NASA, to implement a frame-based representation that can accommodate architectural knowledge. These frames are similar to, but quite different from, the 'template' construct in version 4.3 of CLIPS. Templates support only the grouping of related information and the assignment of default values to template fields. In addition to these features, frames provide other capabilities, including definition of classes, inheritance between classes and subclasses, relation of objects of different classes with 'has-a', association of methods (demons) of different types (standard and user-defined) with fields (slots), and creation of new fields at run time. This frame-based representation is implemented completely in CLIPS. No change to the source code is necessary.
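
    The frame capabilities listed above (classes, inheritance, slots created at run time, 'has-a' relations, and demons) can be illustrated with a compact Python sketch; this is only a conceptual analogue, not the CLIPS implementation described in the paper.

```python
# Compact illustration of frame concepts: class frames, inheritance, slots with
# run-time creation, has-a relations, and demons fired on slot changes.
class Frame:
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent = name, parent
        self.slots = dict(slots)
        self.demons = {}               # slot name -> callback fired on change

    def get(self, slot):
        if slot in self.slots:
            return self.slots[slot]
        if self.parent is not None:    # inherit from the parent class frame
            return self.parent.get(slot)
        raise KeyError(slot)

    def set(self, slot, value):
        self.slots[slot] = value       # new slots may be created at run time
        if slot in self.demons:
            self.demons[slot](self, value)

if __name__ == "__main__":
    space = Frame("space", area_m2=0.0)
    office = Frame("office-101", parent=space, walls=[])
    window = Frame("window-1", width_m=1.2)
    office.set("has_windows", [window])                     # has-a relation
    office.demons["area_m2"] = lambda f, v: print(f"{f.name} area changed to {v}")
    office.set("area_m2", 18.5)                             # fires the demon
    print(len(office.get("has_windows")))
```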

  19. Extended artificial neural networks: incorporation of a priori chemical knowledge enables use of ion selective electrodes for in-situ measurement of ions at environmentally relevant levels.

    PubMed

    Mueller, Amy V; Hemond, Harold F

    2013-12-15

    A novel artificial neural network (ANN) architecture is proposed which explicitly incorporates a priori system knowledge, i.e., relationships between output signals, while preserving the unconstrained non-linear function estimator characteristics of the traditional ANN. A method is provided for architecture layout, disabling training on a subset of neurons, and encoding system knowledge into the neuron structure. The novel architecture is applied to raw readings from a chemical sensor multi-probe (electric tongue), comprised of off-the-shelf ion selective electrodes (ISEs), to estimate individual ion concentrations in solutions at environmentally relevant concentrations and containing environmentally representative ion mixtures. Conductivity measurements and the concept of charge balance are incorporated into the ANN structure, resulting in (1) removal of estimation bias typically seen with use of ISEs in mixtures of unknown composition and (2) improvement of signal estimation by an order of magnitude or more for both major and minor constituents relative to use of ISEs as stand-alone sensors and error reduction by 30-50% relative to use of standard ANN models. This method is suggested as an alternative to parameterization of traditional models (e.g., Nikolsky-Eisenman), for which parameters are strongly dependent on both analyte concentration and temperature, and to standard ANN models which have no mechanism for incorporation of system knowledge. Network architecture and weighting are presented for the base case where the dot product can be used to relate ion concentrations to both conductivity and charge balance as well as for an extension to log-normalized data where the model can no longer be represented in this manner. While parameterization in this case study is analyte-dependent, the architecture is generalizable, allowing application of this method to other environmental problems for which mathematical constraints can be explicitly stated. © 2013 Elsevier B.V. All rights reserved.
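
    A hedged NumPy sketch of the general idea, not the authors' exact architecture or parameterization: ion-concentration estimates from a trainable network feed fixed (untrainable) dot-product relations for charge balance and conductivity, which then constrain training through a penalty-augmented loss. The charges and molar conductivities below are placeholders.

```python
# Illustrative sketch: a priori chemical knowledge (electroneutrality and a
# conductivity relation) encoded as fixed dot products over the network's
# ion-concentration outputs. Values and sizes are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_ions = 6, 16, 4            # e.g. raw ISE voltages -> 4 ion estimates
W1, b1 = rng.normal(size=(n_in, n_hidden)) * 0.1, np.zeros(n_hidden)
W2, b2 = rng.normal(size=(n_hidden, n_ions)) * 0.1, np.zeros(n_ions)

z = np.array([+1, +1, -1, -2], float)        # ion charges (fixed, not trained)
lam = np.array([73.5, 50.1, 76.3, 160.0])    # molar conductivities (fixed, illustrative)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    conc = np.exp(h @ W2 + b2)               # positive concentration estimates
    charge_balance = conc @ z                # should be ~0 for a real solution
    conductivity = conc @ lam                # compared with the measured value
    return conc, charge_balance, conductivity

def loss(x, conc_true, conductivity_meas, alpha=1.0, beta=1.0):
    conc, q, kappa = forward(x)
    return (np.mean((conc - conc_true) ** 2)
            + alpha * q ** 2                            # a priori electroneutrality
            + beta * (kappa - conductivity_meas) ** 2)  # a priori conductivity relation

if __name__ == "__main__":
    x = rng.normal(size=n_in)
    print(loss(x, conc_true=np.array([1e-3, 2e-3, 1e-3, 1e-3]),
               conductivity_meas=0.5))
```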

  20. Satellite ATM Networks: Architectures and Guidelines Developed

    NASA Technical Reports Server (NTRS)

    vonDeak, Thomas C.; Yegendu, Ferit

    1999-01-01

    An important element of satellite-supported asynchronous transfer mode (ATM) networking will involve support for the routing and rerouting of active connections. Work published under the auspices of the Telecommunications Industry Association (http://www.tiaonline.org), describes basic architectures and routing protocol issues for satellite ATM (SATATM) networks. The architectures and issues identified will serve as a basis for further development of technical specifications for these SATATM networks. Three ATM network architectures for bent pipe satellites and three ATM network architectures for satellites with onboard ATM switches were developed. The architectures differ from one another in terms of required level of mobility, supported data rates, supported terrestrial interfaces, and onboard processing and switching requirements. The documentation addresses low-, middle-, and geosynchronous-Earth-orbit satellite configurations. The satellite environment may require real-time routing to support the mobility of end devices and nodes of the ATM network itself. This requires the network to be able to reroute active circuits in real time. In addition to supporting mobility, rerouting can also be used to (1) optimize network routing, (2) respond to changing quality-of-service requirements, and (3) provide a fault tolerance mechanism. Traffic management and control functions are necessary in ATM to ensure that the quality-of-service requirements associated with each connection are not violated and also to provide flow and congestion control functions. Functions related to traffic management were identified and described. Most of these traffic management functions will be supported by on-ground ATM switches, but in a hybrid terrestrial-satellite ATM network, some of the traffic management functions may have to be supported by the onboard satellite ATM switch. Future work is planned to examine the tradeoffs of placing traffic management functions onboard a satellite as opposed to implementing those functions at the Earth station components.

  1. Fault Management Architectures and the Challenges of Providing Software Assurance

    NASA Technical Reports Server (NTRS)

    Savarino, Shirley; Fitz, Rhonda; Fesq, Lorraine; Whitman, Gerek

    2015-01-01

    Fault Management (FM) is focused on safety, the preservation of assets, and maintaining the desired functionality of the system. How FM is implemented varies among missions. Common to most missions is system complexity due to a need to establish a multi-dimensional structure across hardware, software and spacecraft operations. FM is necessary to identify and respond to system faults, mitigate technical risks and ensure operational continuity. Generally, FM architecture, implementation, and software assurance efforts increase with mission complexity. Because FM is a systems engineering discipline with a distributed implementation, providing efficient and effective verification and validation (V&V) is challenging. A breakout session at the 2012 NASA Independent Verification & Validation (IV&V) Annual Workshop titled "V&V of Fault Management: Challenges and Successes" exposed this issue in terms of V&V for a representative set of architectures. NASA's Software Assurance Research Program (SARP) has provided funds to NASA IV&V to extend the work performed at the Workshop session in partnership with NASA's Jet Propulsion Laboratory (JPL). NASA IV&V will extract FM architectures across the IV&V portfolio and evaluate the data set, assess visibility for validation and test, and define software assurance methods that could be applied to the various architectures and designs. This SARP initiative focuses efforts on FM architectures from critical and complex projects within NASA. The identification of particular FM architectures and associated V&V/IV&V techniques provides a data set that can enable improved assurance that a system will adequately detect and respond to adverse conditions. Ultimately, results from this activity will be incorporated into the NASA Fault Management Handbook providing dissemination across NASA, other agencies and the space community. This paper discusses the approach taken to perform the evaluations and preliminary findings from the research.

  2. Non invasive sensing technologies for cultural heritage management and fruition

    NASA Astrophysics Data System (ADS)

    Soldovieri, Francesco; Masini, Nicola

    2016-04-01

    The relevance of the information produced by science and technology for the knowledge of cultural heritage depends on the quality of the feedback and, consequently, on the "cultural" distance between scientists and end-users. In particular, the solution to this problem mainly resides in the end-users' capability to assess and transform the knowledge produced by diagnostics with regard to: information on both cultural objects and sites (decay patterns, vulnerability, presence of buried archaeological remains); and decision making (management plans, conservation projects, and excavation plans). From our experience in the field of cultural heritage, namely the conservation of monuments, there is a significant information gap between technologists (geophysicists/physicists/engineers) and end-users (conservators/historians/architects). This cultural gap is due to the difficulty of interpreting the "indirect data" produced by non-invasive diagnostics (e.g. radargrams, thermal images, seismic tomography) in order to provide information useful to improve historical knowledge (e.g. the chronology of the different phases of a building), to characterise the state of conservation (e.g. detection of cracks in the masonry) and to monitor cultural heritage artifacts and sites over time. The possible answer to this difficulty is the set-up of a knowledge chain comprising the following steps: integrated application of novel and robust data processing methods; augmented reality as a tool to ease the interpretation of non-invasive investigations for the analysis of decay pathologies of masonry and architectural surfaces; comparison between direct data (core samples, visual inspection) and results from non-invasive tests, including geophysics, to improve the interpretation and the rendering of the monuments and even of the archaeological landscapes; and the use of specimens or test beds for the detection of archaeological features and the monitoring of monuments and sites. In this way, we will be able to improve the appreciation of diagnostics and remote sensing technologies by the end-users. At the conference, we will show and discuss several case studies depicting the deployment of this knowledge chain in realistic conditions of cultural heritage management. References: Leucci G., Masini N., Persico R., Soldovieri F. 2011. GPR and sonic tomography for structural restoration: the case of the cathedral of Tricarico, Journal of Geophysics and Engineering, 8 (3), 76-92, doi:10.1088/1742-2132/8/3/S08; Masini N., Soldovieri F. 2011. Editorial: Integrated non-invasive sensing techniques and geophysical methods for the study and conservation of architectural, archaeological and artistic heritage, Journal of Geophysics and Engineering, 8 (3), 1-2, doi:10.1088/1742-2132/8/3/E01; Masini N., Persico R., Rizzo E., Calia A., Giannotta M.T., Quarta G., Pagliuca A. 2010. Integrated Techniques for Analysis and Monitoring of Historical Monuments: the case of S. Giovanni al Sepolcro in Brindisi (Southern Italy), Near Surface Geophysics, 8(5), 423-432, doi:10.3997/1873-0604.2010012

  3. The Evolution of the DARWIN System

    NASA Technical Reports Server (NTRS)

    Walton, Joan D.; Filman, Robert E.; Korsmeyer, David J.; Norvig, Peter (Technical Monitor)

    1999-01-01

    DARWIN is a web-based system for presenting the results of wind-tunnel testing and computational model analyses to aerospace designers. DARWIN captures the data, maintains the information, and manages derived knowledge (e.g., visualizations) of large quantities of aerospace data. In addition, it provides tools and an environment for distributed collaborative engineering. We are currently constructing the third version of the DARWIN software system. DARWIN's development history has, in some sense, tracked the development of web applications. The 1995 DARWIN reflected the latest web technologies--CGI scripts, Java applets and a three-layer architecture--available at that time. The 1997 version of DARWIN expanded on this base, making extensive use of a plethora of web technologies, including Java/JavaScript and Dynamic HTML. While more powerful, this multiplicity has proven to be a maintenance and development headache. The year 2000 version of DARWIN will provide a more stable and uniform foundation environment, composed primarily of Java mechanisms. In this paper, we discuss this evolution, comparing the strengths and weaknesses of the various architectural approaches and describing the lessons learned about building complex web applications.

  4. The development of a prototype intelligent user interface subsystem for NASA's scientific database systems

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Roelofs, Larry H.; Short, Nicholas M., Jr.

    1987-01-01

    The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components the development of an Intelligent User Interface (IUI). The intent of the latter is to develop a friendly and intelligent user interface service that is based on expert systems and natural language processing technologies. The purpose is to support the large number of potential scientific and engineering users presently having need of space and land related research and technical data but who have little or no experience in query languages or understanding of the information content or architecture of the databases involved. This technical memorandum presents a prototype Intelligent User Interface Subsystem (IUIS) using the Crustal Dynamics Project Database as a test bed for the implementation of CRUDDES (the Crustal Dynamics Expert System). The knowledge base has more than 200 rules and represents a single application view and the architectural view. Operational performance using CRUDDES has allowed nondatabase users to obtain useful information from the database previously accessible only to an expert database user or the database designer.

  5. Graphical explanation in an expert system for Space Station Freedom rack integration

    NASA Technical Reports Server (NTRS)

    Craig, F. G.; Cutts, D. E.; Fennel, T. R.; Purves, B.

    1990-01-01

    The rationale and methodology used to incorporate graphics into explanations provided by an expert system for Space Station Freedom rack integration is examined. The rack integration task is typical of a class of constraint satisfaction problems for large programs where expertise from several areas is required. Graphically oriented approaches are used to explain the conclusions made by the system, the knowledge base content, and even at more abstract levels the control strategies employed by the system. The implemented architecture combines hypermedia and inference engine capabilities. The advantages of this architecture include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and allowing for more direct control of explanation depth and duration by the user. The graphical techniques employed range from simple static presentation of schematics to dynamic creation of a series of pictures presented in motion-picture style. User models control the type, amount, and order of information presented.

  6. An Integration Architecture of Virtual Campuses with External e-Learning Tools

    ERIC Educational Resources Information Center

    Navarro, Antonio; Cigarran, Juan; Huertas, Francisco; Rodriguez-Artacho, Miguel; Cogolludo, Alberto

    2014-01-01

    Technology enhanced learning relies on a variety of software architectures and platforms to provide different kinds of management service and enhanced instructional interaction. As e-learning support has become more complex, there is a need for virtual campuses that combine learning management systems with the services demanded by educational…

  7. Strategies for memory-based decision making: Modeling behavioral and neural signatures within a cognitive architecture.

    PubMed

    Fechner, Hanna B; Pachur, Thorsten; Schooler, Lael J; Mehlhorn, Katja; Battal, Ceren; Volz, Kirsten G; Borst, Jelmer P

    2016-12-01

    How do people use memories to make inferences about real-world objects? We tested three strategies based on predicted patterns of response times and blood-oxygen-level-dependent (BOLD) responses: one strategy that relies solely on recognition memory, a second that retrieves additional knowledge, and a third, lexicographic (i.e., sequential) strategy, that considers knowledge conditionally on the evidence obtained from recognition memory. We implemented the strategies as computational models within the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture, which allowed us to derive behavioral and neural predictions that we then compared to the results of a functional magnetic resonance imaging (fMRI) study in which participants inferred which of two cities is larger. Overall, versions of the lexicographic strategy, according to which knowledge about many but not all alternatives is searched, provided the best account of the joint patterns of response times and BOLD responses. These results provide insights into the interplay between recognition and additional knowledge in memory, hinting at an adaptive use of these two sources of information in decision making. The results highlight the usefulness of implementing models of decision making within a cognitive architecture to derive predictions on the behavioral and neural level. Copyright © 2016 Elsevier B.V. All rights reserved.
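
    The three candidate strategies can be contrasted with a toy sketch like the one below (Python). The recognition set, knowledge table and city pairs are invented stand-ins for the experimental materials, and the functions only caricature the decision rules rather than the authors' ACT-R implementations.

    ```python
    # Toy memory contents: which city names are recognized, and retrievable size knowledge (made up).
    RECOGNIZED = {"Berlin", "Munich", "Leipzig"}
    KNOWLEDGE  = {"Berlin": "large", "Munich": "large"}   # no entry = nothing retrievable

    def recognition_only(a, b):
        """Strategy 1: decide on recognition alone; guess when it does not discriminate."""
        ra, rb = a in RECOGNIZED, b in RECOGNIZED
        if ra != rb:
            return a if ra else b
        return None                                       # guess

    def knowledge_based(a, b):
        """Strategy 2: always try to retrieve additional knowledge about both alternatives."""
        ka, kb = KNOWLEDGE.get(a), KNOWLEDGE.get(b)
        if (ka == "large") != (kb == "large"):
            return a if ka == "large" else b
        return recognition_only(a, b)

    def lexicographic(a, b):
        """Strategy 3: consult knowledge only conditionally on the recognition evidence."""
        ra, rb = a in RECOGNIZED, b in RECOGNIZED
        if ra != rb:                                      # recognition discriminates: stop here
            return a if ra else b
        if ra and rb:                                     # both recognized: search knowledge
            return knowledge_based(a, b)
        return None                                       # neither recognized: guess

    print(lexicographic("Munich", "Hamburg"))   # recognition suffices -> Munich, no knowledge search
    print(lexicographic("Berlin", "Leipzig"))   # both recognized -> knowledge decides -> Berlin
    ```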

  8. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  9. A unified approach to the design of clinical reporting systems.

    PubMed

    Gouveia-Oliveira, A; Salgado, N C; Azevedo, A P; Lopes, L; Raposo, V D; Almeida, I; de Melo, F G

    1994-12-01

    Computer-based Clinical Reporting Systems (CRS) for diagnostic departments that use structured data entry have a number of functional and structural affinities suggesting that a common software architecture for CRS may be defined. Such an architecture should allow easy expandability and reusability of a CRS. We report the development methodology and the architecture of SISCOPE, a CRS originally designed for gastrointestinal endoscopy that is expandable and reusable. Its main components are a patient database, a knowledge base, a reports base, and screen and reporting engines. The knowledge base contains the description of the controlled vocabulary and all the information necessary to control the menu system, and is easily accessed and modified with a conventional text editor. The structure of the controlled vocabulary is formally presented as an entity-relationship diagram. The screen engine drives a dynamic user interface and the reporting engine automatically creates a medical report; both engines operate by following a set of rules and the information contained in the knowledge base. Clinical experience has shown this architecture to be highly flexible and to allow frequent modifications of both the vocabulary and the menu system. This structure provided increased collaboration among development teams, insulating the domain expert from the details of the database, and enabling him to modify the system as necessary and to test the changes immediately. The system has also been reused in several different domains.
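
    A much-reduced sketch of the screen-engine/reporting-engine idea is given below (Python). The vocabulary entries, menu structure and report phrases are invented, not SISCOPE content; the point is that both the user dialogue and the generated sentence are driven entirely by the editable knowledge base.

    ```python
    # Toy knowledge base: controlled vocabulary, menu structure and report phrases (all invented).
    KNOWLEDGE_BASE = {
        "finding": {"prompt": "Select finding",
                    "options": {"ulcer":  {"phrase": "an ulcer was observed", "follow_up": "site"},
                                "normal": {"phrase": "no abnormality was observed", "follow_up": None}}},
        "site":    {"prompt": "Select site",
                    "options": {"antrum":   {"phrase": "in the gastric antrum", "follow_up": None},
                                "duodenum": {"phrase": "in the duodenal bulb", "follow_up": None}}},
    }

    def screen_engine(kb, choices, entry="finding"):
        """Walk the menu system defined entirely by the knowledge base, consuming the user's choices."""
        selections, node, it = [], entry, iter(choices)
        while node is not None:
            menu = kb[node]                       # prompt and options both come from the knowledge base
            picked = menu["options"][next(it)]    # in the real system: a structured data-entry screen
            selections.append(picked)
            node = picked["follow_up"]
        return selections

    def reporting_engine(selections):
        """Assemble the report text from the phrases attached to the selected vocabulary terms."""
        return "On examination, " + " ".join(s["phrase"] for s in selections) + "."

    print(reporting_engine(screen_engine(KNOWLEDGE_BASE, ["ulcer", "antrum"])))
    # -> "On examination, an ulcer was observed in the gastric antrum."
    ```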

  10. A Cloud Robotics Based Service for Managing RPAS in Emergency, Rescue and Hazardous Scenarios

    NASA Astrophysics Data System (ADS)

    Silvagni, Mario; Chiaberge, Marcello; Sanguedolce, Claudio; Dara, Gianluca

    2016-04-01

    Cloud robotics and cloud services are revolutionizing not only the ICT world but also the robotics industry, giving robots more computing capability, storage and connection bandwidth while opening new scenarios that blend the physical and the digital worlds. In this vision, new IT architectures are required to manage robots, retrieve data from them and create services to interact with users. Among all robots, this work is mainly focused on flying robots, better known as drones, UAVs (Unmanned Aerial Vehicles) or RPAS (Remotely Piloted Aircraft Systems). The cloud robotics approach shifts the concept of a single local "intelligence" for every single UAV, as a unique device that carries out all computation and storage onboard, to a more powerful "centralized brain" located in the cloud. This breakthrough opens new scenarios where UAVs are agents relying on remote servers for most of their computational load and data storage, creating a network of devices in which they can share knowledge and information. Applications using UAVs are growing, as UAVs are interesting and suitable devices for environmental monitoring. Many services can be built by fetching data from UAVs, such as telemetry, video streams, pictures or sensor data; once fetched, these services, as part of the IT architecture, can be accessed via the web by other devices or shared with other UAVs. As test cases of the proposed architecture, two examples are reported. The first is a search and rescue or emergency management scenario, in which UAVs are required for a monitoring intervention. In case of emergency or aggression, the user requests the emergency service from the IT architecture, providing GPS coordinates and an identification number. The IT architecture uses a UAV (choosing among the available ones according to distance, service status, etc.) to reach him or her for monitoring and support operations. In the meantime, an officer uses the service to see the current position of the UAV, its telemetry and the video stream from its camera. Data are stored for further use and documentation and can be shared with all the involved personnel or services. The second case refers to an imaging survey. An investigation area is selected using a map or a set of coordinates by a user who can be in the field or in a management facility. The cloud system processes these data and automatically computes a flight plan that considers the survey data requirements (e.g., picture ground resolution, overlap) as well as several environmental constraints (e.g., no-fly zones, possible hazardous areas, known obstacles). Once the flight plan is loaded into the selected UAV, the mission starts. During the mission, if suitable data network coverage is available, the UAV transmits the acquired images (typically low-quality images, to limit bandwidth) and their shooting poses so that a preliminary check can be performed during the mission and survey failures minimized; if not, all data are uploaded asynchronously after the mission. The cloud servers perform all the tasks related to image processing (mosaics, ortho-photos, geo-referencing, 3D models) and data management.
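
    The dispatch step of the emergency scenario might look roughly like the following Python sketch; the fleet registry, its fields and the nearest-available-UAV criterion are assumptions made for illustration rather than details of the described service.

    ```python
    import math

    # Hypothetical fleet registry held by the cloud service (ids, positions and states are made up).
    FLEET = [
        {"id": "uav-01", "lat": 45.06, "lon": 7.66, "status": "available"},
        {"id": "uav-02", "lat": 45.10, "lon": 7.70, "status": "in-mission"},
        {"id": "uav-03", "lat": 45.02, "lon": 7.61, "status": "available"},
    ]

    def distance_km(lat1, lon1, lat2, lon2):
        """Great-circle (haversine) distance, good enough for ranking nearby UAVs."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * 6371.0 * math.asin(math.sqrt(a))

    def handle_emergency_request(user_lat, user_lon, request_id):
        """Task the closest available UAV to reach the requester and expose its data streams."""
        candidates = [u for u in FLEET if u["status"] == "available"]
        if not candidates:
            return None                                   # no UAV free: escalate to an operator
        uav = min(candidates, key=lambda u: distance_km(u["lat"], u["lon"], user_lat, user_lon))
        uav["status"] = "in-mission"
        return {"request": request_id, "uav": uav["id"], "goto": (user_lat, user_lon),
                "telemetry": f"/telemetry/{uav['id']}", "video": f"/video/{uav['id']}"}

    print(handle_emergency_request(45.05, 7.65, "REQ-042"))
    ```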

  11. Contextual cloud-based service oriented architecture for clinical workflow.

    PubMed

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Regarding the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment will be able to be integrated with the specifications promoted by the eHealth European Interoperability Framework. This paper proposes a cloud-based service-oriented architecture that implements a context management system aligned with the HL7 standard known as CCOW.

  12. Adding Learning to Knowledge-Based Systems: Taking the "Artificial" Out of AI

    Treesearch

    Daniel L. Schmoldt

    1997-01-01

    Both knowledge-based systems (KBS) development and maintenance require time-consuming analysis of domain knowledge. Where example cases exist, KBS can be built, and later updated, by incorporating learning capabilities into their architecture. This applies to both supervised and unsupervised learning scenarios. In this paper, the important issues for learning systems-...

  13. From data towards knowledge: revealing the architecture of signaling systems by unifying knowledge mining and data mining of systematic perturbation data.

    PubMed

    Lu, Songjian; Jin, Bo; Cowart, L Ashley; Lu, Xinghua

    2013-01-01

    Genetic and pharmacological perturbation experiments, such as deleting a gene and monitoring gene expression responses, are powerful tools for studying cellular signal transduction pathways. However, it remains a challenge to automatically derive knowledge of a cellular signaling system at a conceptual level from systematic perturbation-response data. In this study, we explored a framework that unifies knowledge mining and data mining towards the goal. The framework consists of the following automated processes: 1) applying an ontology-driven knowledge mining approach to identify functional modules among the genes responding to a perturbation in order to reveal potential signals affected by the perturbation; 2) applying a graph-based data mining approach to search for perturbations that affect a common signal; and 3) revealing the architecture of a signaling system by organizing signaling units into a hierarchy based on their relationships. Applying this framework to a compendium of yeast perturbation-response data, we have successfully recovered many well-known signal transduction pathways; in addition, our analysis has led to many new hypotheses regarding the yeast signal transduction system; finally, our analysis automatically organized perturbed genes as a graph reflecting the architecture of the yeast signaling system. Importantly, this framework transformed molecular findings from a gene level to a conceptual level, which can be readily translated into computable knowledge in the form of rules regarding the yeast signaling system, such as "if genes involved in the MAPK signaling are perturbed, genes involved in pheromone responses will be differentially expressed."
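
    The three automated steps can be pictured with a skeletal pipeline such as the one below (Python). The perturbation-response sets and module annotations are placeholders, and the enrichment test is reduced to a simple overlap count rather than the ontology-driven statistics used in the study.

    ```python
    from collections import defaultdict

    # Placeholder inputs: genes responding to each perturbation, and functional-module annotations.
    RESPONSES = {
        "del_STE11": {"FUS1", "AGA1", "FIG1", "CTT1"},
        "del_STE7":  {"FUS1", "FIG1", "AGA1"},
        "del_HOG1":  {"CTT1", "GPD1", "HSP12"},
    }
    MODULES = {
        "pheromone_response": {"FUS1", "AGA1", "FIG1"},
        "osmotic_stress":     {"CTT1", "GPD1", "HSP12"},
    }

    def knowledge_mining(responses, modules, min_overlap=2):
        """Step 1: map each perturbation to the functional modules over-represented in its response."""
        hits = defaultdict(set)
        for pert, genes in responses.items():
            for mod, members in modules.items():
                if len(genes & members) >= min_overlap:
                    hits[pert].add(mod)
        return hits

    def data_mining(hits):
        """Step 2: group perturbations that affect a common signal (a shared module)."""
        by_signal = defaultdict(set)
        for pert, mods in hits.items():
            for mod in mods:
                by_signal[mod].add(pert)
        return by_signal

    def build_hierarchy(by_signal):
        """Step 3: order signalling units by the size of their perturbation sets."""
        units = sorted(by_signal.items(), key=lambda kv: len(kv[1]), reverse=True)
        return [(mod, sorted(perts)) for mod, perts in units]

    print(build_hierarchy(data_mining(knowledge_mining(RESPONSES, MODULES))))
    ```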

  14. A Socio-Cognitive Approach to Knowledge Construction in Design Studio through Blended Learning

    ERIC Educational Resources Information Center

    Kocaturk, Tuba

    2017-01-01

    This paper results from an educational research project that was undertaken by the School of Architecture, at the University of Liverpool funded by the Higher Education Academy in UK. The research explored technology driven shifts in architectural design studio education, identified their cognitive effects on design learning and developed an…

  15. Polynomial Calculus: Rethinking the Role of Calculus in High Schools

    ERIC Educational Resources Information Center

    Grant, Melva R.; Crombie, William; Enderson, Mary; Cobb, Nell

    2016-01-01

    Access to advanced study in mathematics, in general, and to calculus, in particular, depends in part on the conceptual architecture of these knowledge domains. In this paper, we outline an alternative conceptual architecture for elementary calculus. Our general strategy is to separate basic concepts from the particular advanced techniques used in…

  16. Reusable Autonomy

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Obenschain, Arthur F. (Technical Monitor)

    2002-01-01

    Currently, spacecraft ground systems have a well defined and somewhat standard architecture and operations concept. Based on domain analysis studies of various control centers conducted over the years, it is clear that ground systems have core capabilities and functionality that are common across all ground systems. This observation alone supports the realization of reuse. Additionally, spacecraft ground systems are increasing in their ability to do things autonomously. They are being engineered using advanced expert systems technology to provide automated support for operators. A clearer understanding of the possible roles of agent technology is advancing the prospects of greater autonomy for these systems. Many of their functional and management tasks are or could be supported by applied agent technology, the dynamics of the ground system's infrastructure could be monitored by agents, there are intelligent agent-based approaches to user interfaces, etc. The premise of this paper is that the concepts associated with software reuse, applicable in consideration of classically-engineered ground systems, can be updated to address their application in highly agent-based realizations of future ground systems. As a somewhat simplified example, consider the following situation, involving human agents in a ground system context. Let Group A of controllers be working on Mission X. They are responsible for the command, control and health and safety of the Mission X spacecraft. Let us suppose that Mission X successfully completes its mission and is turned off. Group A could be dispersed or perhaps move to another Mission Y. In this case there would be reuse of the human agents from Mission X to Mission Y. The Group A agents perform their well-understood functions in a somewhat different but related context. There will be a learning or familiarization process that the Group A agents go through to make the new context, determined by the new Mission Y, understood. This simplified scenario highlights some of the major issues that need to be addressed when considering the situation where Group A is composed of software-based agents (not their human counterparts) and they migrate from one mission support system to another. This paper will address: definition of an agent architecture appropriate to support reuse; identification of the non-mission-specific agent capabilities required; appropriate knowledge representation schemes for mission-specific knowledge; the agent interface with mission-specific knowledge (a type of learning); development of a fully-operational group of cooperative software agents for ground system support; and the architecture and operation of a repository of reusable agents that could be the source of intelligent components for realizing an autonomous (or nearly autonomous) agent-based ground system, together with an agent-based approach to repository management and operation (an intelligent interface for human use of the repository in a ground-system development activity).

  17. Modern and Contemporary Cultural Heritage Documentation and Knowledge by Surveying and its Representation

    NASA Astrophysics Data System (ADS)

    Balletti, C.; Costa, M.; Guerra, F.; Martinello, F.; Vernier, P.

    2018-05-01

    Conservation of modern and contemporary cultural heritage, which ranges from design objects to architecture to cities and territories, is certainly a current topic and one still in development, as a process of systematic replacement of architectural elements is underway within modernity itself: the outcome of once-experimental solutions is today reproduced with contemporary materials, analogous in appearance but intimately different, especially in technological content. The paper describes the particular case of La Tour de Meudon, better known as The Tower (1966), by André Bloc, a contemporary of Le Corbusier and founder of L'Architecture d'aujourd'hui, who created habitable sculptures. All his works mark the evolution from geometric abstraction to free form, and they are still admirable testimonies of his artistic journey. His architecture and his sculpture intertwine, opening the plastic unity of form in physical space-time. The survey is a fundamental step in the knowledge of these hybrid architectures, where the structural component is hidden by its evident plasticity, as if it were a large sculpture with abstract and overlapping geometric shapes. Survey is not only an analysis of geometries: it is instrumental to the other structural and material analyses, since it provides a metric and topological basis on which to spatially locate the phenomena being studied. The integrated survey of the building (laser scanning, photogrammetry, topography) has made it possible to document the project, contributing to the definition of the actual construction characteristics and ascertaining both the material consistency and the state of conservation.

  18. Image and Morphology in Modern Theory of Architecture

    NASA Astrophysics Data System (ADS)

    Yankovskaya, Y. S.; Merenkov, A. V.

    2017-11-01

    This paper is devoted to some important and fundamental problems of modern Russian architectural theory. These problems are: methodological and technological lag; substitution of modern professional architectural theoretical knowledge with humanities concepts; and a preference for traditional historical or historical-theoretical research. One of the most promising ways forward is the formation of useful, modern, subject- (and multi-subject-) oriented concepts in architecture. Overcoming the criticism and distrust of architectural theory is possible through recognition of the important role of the subject (architect, consumer, contractor, ruler, etc.) and through orienting theory toward the practical tasks of forming the human environment in today's rapidly changing world and post-industrial society. In this article we consider the evolution of two basic concepts of the theory of architecture: the image and morphology.

  19. Building the Knowledge Base to Support the Automatic Animation Generation of Chinese Traditional Architecture

    NASA Astrophysics Data System (ADS)

    Wei, Gongjin; Bai, Weijing; Yin, Meifang; Zhang, Songmao

    We present a practice of applying the Semantic Web technologies in the domain of Chinese traditional architecture. A knowledge base consisting of one ontology and four rule bases is built to support the automatic generation of animations that demonstrate the construction of various Chinese timber structures based on the user's input. Different Semantic Web formalisms are used, e.g., OWL DL, SWRL and Jess, to capture the domain knowledge, including the wooden components needed for a given building, construction sequence, and the 3D size and position of every piece of wood. Our experience in exploiting the current Semantic Web technologies in real-world application systems indicates their prominent advantages (such as the reasoning facilities and modeling tools) as well as the limitations (such as low efficiency).
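
    One way to picture the rule bases is as mappings from a user's building specification to components, sizes and a construction order. The Python toy below makes that concrete; the component names, proportions and sequencing rule are invented and do not reflect the project's OWL/SWRL content.

    ```python
    # Toy stand-in for the ontology + rule bases: derive components, sizes and a construction order
    # for a small timber hall from a user's input (all names and proportions are hypothetical).

    def components_rule(spec):
        """Rule base 1: which wooden components a building with this bay count needs."""
        bays = spec["bays"]
        return ([f"column_{i}" for i in range((bays + 1) * 2)]
                + [f"beam_{i}" for i in range(bays)]
                + ["ridge_purlin"])

    def sizing_rule(spec, component):
        """Rule base 2: 3D size of each piece, proportional to the column-diameter module."""
        d = spec["column_diameter"]
        if component.startswith("column"):
            return (d, d, 10 * d)                       # width, depth, height
        if component.startswith("beam"):
            return (d, 1.5 * d, spec["bay_width"])
        return (d, d, spec["bays"] * spec["bay_width"])  # ridge purlin spans all bays

    def sequence_rule(components):
        """Rule base 3: construction order -- columns first, then beams, then the ridge purlin."""
        order = {"column": 0, "beam": 1, "ridge": 2}
        return sorted(components, key=lambda c: order[c.split("_")[0]])

    spec = {"bays": 3, "bay_width": 3.6, "column_diameter": 0.3}
    for c in sequence_rule(components_rule(spec)):
        print(c, sizing_rule(spec, c))
    ```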

  20. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the software community. This paper discusses the findings and TR suite informing the FM domain in best practices for FM architectural design, visibility observations, and methods employed for IV&V and mission assurance.

  1. Extensions to the Parallel Real-Time Artificial Intelligence System (PRAIS) for fault-tolerant heterogeneous cycle-stealing reasoning

    NASA Technical Reports Server (NTRS)

    Goldstein, David

    1991-01-01

    Extensions to an architecture for real-time, distributed (parallel) knowledge-based systems called the Parallel Real-time Artificial Intelligence System (PRAIS) are discussed. PRAIS strives for transparently parallelizing production (rule-based) systems, even under real-time constraints. PRAIS accomplished these goals (presented at the first annual C Language Integrated Production System (CLIPS) conference) by incorporating a dynamic task scheduler, operating system extensions for fact handling, and message-passing among multiple copies of CLIPS executing on a virtual blackboard. This distributed knowledge-based system tool uses the portability of CLIPS and common message-passing protocols to operate over a heterogeneous network of processors. Results using the original PRAIS architecture over a network of Sun 3's, Sun 4's and VAX's are presented. Mechanisms using the producer-consumer model to extend the architecture for fault-tolerance and distributed truth maintenance initiation are also discussed.
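
    A rough analogue of the fact-distribution idea is sketched below in Python, with threads standing in for CLIPS copies on separate processors and a queue standing in for the virtual blackboard; the fact format, the single toy rule and the routing are simplifications, not the PRAIS message-passing protocol.

    ```python
    import queue
    import threading

    blackboard = queue.Queue()   # virtual blackboard: facts asserted here are visible to every engine

    def rule_engine(name, inbox, assert_fact):
        """Stand-in for one CLIPS copy: consume facts, fire one toy rule, assert derived facts."""
        while True:
            fact = inbox.get()
            if fact is None:                                   # shutdown sentinel from the scheduler
                break
            if fact[0] == "sensor-reading" and fact[1] > 100:  # toy production rule
                assert_fact(("alarm", name, fact[1]))          # producer side of producer-consumer

    def dispatcher(engine_inboxes, derived):
        """Task scheduler: route blackboard facts to every engine, collect what they derive."""
        threads = [threading.Thread(target=rule_engine, args=(name, q, derived.append))
                   for name, q in engine_inboxes.items()]
        for t in threads:
            t.start()
        while not blackboard.empty():
            fact = blackboard.get()
            for q in engine_inboxes.values():                  # broadcast the fact to all engines
                q.put(fact)
        for q in engine_inboxes.values():
            q.put(None)
        for t in threads:
            t.join()

    derived = []
    blackboard.put(("sensor-reading", 120))
    blackboard.put(("sensor-reading", 80))
    dispatcher({"node-A": queue.Queue(), "node-B": queue.Queue()}, derived)
    print(derived)    # each engine derives an alarm fact for the out-of-range reading
    ```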

  2. Integrating Computing Resources: A Shared Distributed Architecture for Academics and Administrators.

    ERIC Educational Resources Information Center

    Beltrametti, Monica; English, Will

    1994-01-01

    Development and implementation of a shared distributed computing architecture at the University of Alberta (Canada) are described. Aspects discussed include design of the architecture, users' views of the electronic environment, technical and managerial challenges, and the campuswide human infrastructures needed to manage such an integrated…

  3. Architectural Design Document for the Technology Demonstration of the Joint Network Defence and Management System (JNDMS) Project

    DTIC Science & Technology

    2009-09-21

    specified by contract no. W7714-040875/001/SV. This document contains the design of the JNDMS software to the system architecture level. Other...alternative for the presentation functions. ASP, Java, ActiveX , DLL, HTML, DHTML, SOAP, .NET HTML, DHTML, XML, Jscript, VBScript, SOAP, .NET...retrieved through the network, typically by a network management console. Information is contained in a Management Information Base (MIB), which is a data

  4. Architecture for knowledge-based and federated search of online clinical evidence.

    PubMed

    Coiera, Enrico; Walther, Martin; Nguyen, Ken; Lovell, Nigel H

    2005-10-24

    It is increasingly difficult for clinicians to keep up-to-date with the rapidly growing biomedical literature. Online evidence retrieval methods are now seen as a core tool to support evidence-based health practice. However, standard search engine technology is not designed to manage the many different types of evidence sources that are available or to handle the very different information needs of various clinical groups, who often work in widely different settings. The objectives of this paper are (1) to describe the design considerations and system architecture of a wrapper-mediator approach to federated search system design, including the use of knowledge-based, meta-search filters, and (2) to analyze the implications of system design choices on performance measurements. A trial was performed to evaluate the technical performance of a federated evidence retrieval system, which provided access to eight distinct online resources, including e-journals, PubMed, and electronic guidelines. The Quick Clinical system architecture utilized a universal query language to reformulate queries internally and utilized meta-search filters to optimize search strategies across resources. We recruited 227 family physicians from across Australia who used the system to retrieve evidence in a routine clinical setting over a 4-week period. The total search time for a query was recorded, along with the duration of individual queries sent to different online resources. Clinicians performed 1662 searches over the trial. The average search duration was 4.9 +/- 3.2 s (N = 1662 searches). Mean search duration to the individual sources was between 0.05 s and 4.55 s. Average system time (i.e., system overhead) was 0.12 s. The relatively small system overhead compared to the average time it takes to perform a search for an individual source shows that the system achieves a good trade-off between performance and reliability. Furthermore, despite the additional effort required to incorporate the capabilities of each individual source (to improve the quality of search results), system maintenance requires only a small additional overhead.
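
    The wrapper-mediator pattern with per-source query reformulation can be sketched as follows (Python). The source wrappers, the meta-search filter table and the timing bookkeeping are invented for illustration; a real deployment would query remote resources over HTTP rather than call local functions.

    ```python
    import time
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical source wrappers: each reformulates the internal (universal) query for one resource.
    def pubmed_wrapper(q):
        return f"({q['condition']}) AND ({q['intervention']}) AND randomized controlled trial[pt]"

    def guideline_wrapper(q):
        return f"{q['condition']} {q['intervention']} guideline"

    SOURCES = {"PubMed": pubmed_wrapper, "Guidelines": guideline_wrapper}

    # Knowledge-based meta-search filter: question type -> sources and extra terms (assumed mapping).
    FILTERS = {"therapy": {"sources": ["PubMed", "Guidelines"], "extra_terms": "treatment"}}

    def fetch(source_name, native_query):
        """Stand-in for the remote call; a real system would issue an HTTP request here."""
        time.sleep(0.01)
        return f"{source_name} results for: {native_query}"

    def federated_search(universal_query, question_type):
        plan = FILTERS[question_type]
        enriched = dict(universal_query,
                        intervention=universal_query["intervention"] + " " + plan["extra_terms"])
        results, timings = {}, {}
        start = time.perf_counter()
        with ThreadPoolExecutor() as pool:            # fan the query out to all selected sources
            futures = {name: pool.submit(fetch, name, SOURCES[name](enriched))
                       for name in plan["sources"]}
            for name, fut in futures.items():
                results[name] = fut.result()
                timings[name] = time.perf_counter() - start
        return results, timings

    print(federated_search({"condition": "otitis media", "intervention": "antibiotics"}, "therapy"))
    ```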

  5. Trust-based information system architecture for personal wellness.

    PubMed

    Ruotsalainen, Pekka; Nykänen, Pirkko; Seppälä, Antto; Blobel, Bernd

    2014-01-01

    Modern eHealth, ubiquitous health and personal wellness systems take place in an unsecure and ubiquitous information space where no predefined trust exists. This paper presents a novel information model and an architecture for trust-based privacy management of personal health and wellness information in a ubiquitous environment. The architecture enables a person to calculate a dynamic and context-aware trust value for each service provider, and to use it to design personal privacy policies for trustworthy use of health and wellness services. For trust calculation, a novel set of measurable, context-aware and health-information-sensitive attributes is developed. The architecture enables a person to manage his or her privacy in a ubiquitous environment by formulating context-aware and service-provider-specific policies. Focus groups and information modelling were used for developing the wellness information model. A system analysis method, based on sequential steps that combine the results of the analysis of privacy and trust concerns with the selection of trust and privacy services, was used to develop the information system architecture. Its services (e.g. trust calculation, decision support, policy management and policy binding services) and the developed attributes enable a person to define situation-aware policies that regulate the way his or her wellness and health information is processed.
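
    One plausible reading of the trust-calculation and policy-binding services is a weighted score over measurable attributes that gates a personal privacy policy, roughly as in the Python sketch below; the attribute set, weights and thresholds are illustrative assumptions, not the authors' published attributes.

    ```python
    # Illustrative context-aware attributes of a service provider (values and weights are assumptions).
    ATTRIBUTE_WEIGHTS = {
        "has_privacy_policy":         0.25,
        "regulated_health_provider":  0.30,
        "data_stays_in_jurisdiction": 0.20,
        "past_interactions_ok":       0.15,
        "secure_channel":             0.10,
    }

    def trust_value(observed: dict) -> float:
        """Dynamic, context-aware trust value in [0, 1] for one service provider."""
        return sum(w for attr, w in ATTRIBUTE_WEIGHTS.items() if observed.get(attr, False))

    def personal_policy(trust: float) -> dict:
        """Bind a privacy policy to the computed trust level (thresholds are illustrative)."""
        if trust >= 0.8:
            return {"share": ["wellness", "health"], "require_consent_per_use": False}
        if trust >= 0.5:
            return {"share": ["wellness"], "require_consent_per_use": True}
        return {"share": [], "require_consent_per_use": True}

    provider = {"has_privacy_policy": True, "regulated_health_provider": True, "secure_channel": True}
    t = trust_value(provider)
    print(t, personal_policy(t))     # 0.65 -> share wellness data only, with per-use consent
    ```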

  6. Study on Global GIS architecture and its key technologies

    NASA Astrophysics Data System (ADS)

    Cheng, Chengqi; Guan, Li; Lv, Xuefeng

    2009-09-01

    Global GIS (G2IS) is a system which supports the processing of huge data volumes and direct global manipulation on a global grid based on a spheroid or ellipsoid surface. Based on the global subdivision grid (GSG), a Global GIS architecture is presented in this paper, taking advantage of computer cluster theory, space-time integration technology and virtual reality technology. The Global GIS system architecture is composed of five layers: a data storage layer, a data representation layer, a network and cluster layer, a data management layer and a data application layer. Within this architecture, a four-level protocol framework and a three-layer data management pattern are designed for the organization, management and publication of spatial information. Three kinds of core supporting technologies, namely computer cluster theory, space-time integration technology and virtual reality technology, and their application patterns in Global GIS are introduced in detail. The primary ideas of Global GIS presented in this paper indicate an important development direction for GIS.

  8. Nicephor[e]: a web-based solution for teaching forensic and scientific photography.

    PubMed

    Voisard, R; Champod, C; Furrer, J; Curchod, J; Vautier, A; Massonnet, G; Buzzini, P

    2007-04-11

    Nicephor[e] is a project funded by "Swiss Virtual Campus" and aims at creating a distant or mixed web-based learning system in forensic and scientific photography and microscopy. The practical goal is to organize series of on-line modular courses corresponding to the educational requirements of undergraduate academic programs. Additionally, this program could be used in the context of continuing educational programs. The architecture of the project is designed to guarantee a high level of knowledge in forensic and scientific photographic techniques, and to have an easy content production and the ability to create a number of different courses sharing the same content. The e-learning system Nicephor[e] consists of three different parts. The first one is a repository of learning objects that gathers all theoretical subject matter of the project such as texts, animations, images, and films. This repository is a web content management system (Typo3) that permits creating, publishing, and administrating dynamic content via a web browser as well as storing it into a database. The flexibility of the system's architecture allows for an easy updating of the content to follow the development of photographic technology. The instructor of a course can decide which modular contents need to be included in the course, and in which order they will be accessed by students. All the modular courses are developed in a learning management system (WebCT or Moodle) that can deal with complex learning scenarios, content distribution, students, tests, and interaction with instructor. Each course has its own learning scenario based on the goals of the course and the student's profile. The content of each course is taken from the content management system. It is then structured in the learning management system according to the pedagogical goals defined by the instructor. The modular courses are created in a highly interactive setting and offer autoevaluating tests to the students. The last part of the system is a digital assets management system (Extensis Portfolio). The practical portion of each course is to produce images of different marks or objects. The collection of all this material produced, indexed by the students and corrected by the instructor is essential to the development of a knowledge base of photographic techniques applied to a specific forensic subject. It represents also an extensible collection of different marks from known sources obtained under various conditions. It allows to reuse these images for creating image-based case files.

  9. Dialogic e-Learning2learn: Creating Global Digital Networks and Educational Knowledge Building Architectures across Diversity

    ERIC Educational Resources Information Center

    Sorensen, Elsebeth Korsgaard

    2007-01-01

    Purpose: The purpose of this paper is to address the challenge and potential of online higher and continuing education, of fostering and promoting, in a global perspective across time and space, democratic values working for a better world. Design/methodology/approach: The paper presents a generalized dialogic learning architecture of networked…

  10. Tutorial on architectural acoustics

    NASA Astrophysics Data System (ADS)

    Shaw, Neil; Talaske, Rick; Bistafa, Sylvio

    2002-11-01

    This tutorial is intended to provide an overview of current knowledge and practice in architectural acoustics. Topics covered will include basic concepts and history, acoustics of small rooms (small rooms for speech such as classrooms and meeting rooms, music studios, small critical listening spaces such as home theatres) and the acoustics of large rooms (larger assembly halls, auditoria, and performance halls).

  11. Expanding the Responsibility of Architectural Education: Civic Professionalism in Two Schools of Architecture

    ERIC Educational Resources Information Center

    Rinehart, Michelle A.

    2010-01-01

    There has been a renewed interest in the purposes of professional education and the teaching of civic professionalism, whereby future professionals are exposed to their responsibility to use their specialized skills and knowledge to serve the public good. Recent studies on civic purposes in professional education, however, have largely ignored the…

  12. A Model Based Framework for Semantic Interpretation of Architectural Construction Drawings

    ERIC Educational Resources Information Center

    Babalola, Olubi Oluyomi

    2011-01-01

    The study addresses the automated translation of architectural drawings from 2D Computer Aided Drafting (CAD) data into a Building Information Model (BIM), with emphasis on the nature, possible role, and limitations of a drafting language Knowledge Representation (KR) on the problem and process. The central idea is that CAD to BIM translation is a…

  13. A Critical Mapping of Practice-Based Research as Evidenced by Swedish Architectural Theses

    ERIC Educational Resources Information Center

    Buchler, Daniela; Biggs, Michael A. R.; Stahl, Lars-Henrik

    2011-01-01

    This article presents an investigation that was funded by the Swedish Institute into the role of creative practice in architectural research as evidenced in Swedish doctoral theses. The sample was mapped and analysed in terms of clusters of interest, approaches, cultures of knowledge and uses of creative practice. This allowed the identification…

  14. Automatic acquisition of domain and procedural knowledge

    NASA Technical Reports Server (NTRS)

    Ferber, H. J.; Ali, M.

    1988-01-01

    The design concept and performance of AKAS, an automated knowledge-acquisition system for the development of expert systems, are discussed. AKAS was developed using the FLES knowledge base for the electrical system of the B-737 aircraft and employs a 'learn by being told' strategy. The system comprises four basic modules, a system administration module, a natural-language concept-comprehension module, a knowledge-classification/extraction module, and a knowledge-incorporation module; details of the module architectures are explored.

  15. A resilient and secure software platform and architecture for distributed spacecraft

    NASA Astrophysics Data System (ADS)

    Otte, William R.; Dubey, Abhishek; Karsai, Gabor

    2014-06-01

    A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, and information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objectives of this layer.

  16. Management of space networks

    NASA Technical Reports Server (NTRS)

    Markley, R. W.; Williams, B. F.

    1993-01-01

    NASA has proposed missions to the Moon and Mars that reflect three areas of emphasis: human presence, exploration, and space resource development for the benefit of Earth. A major requirement for such missions is a robust and reliable communications architecture. Network management--the ability to maintain some degree of human and automatic control over the span of the network from the space elements to the end users on Earth--is required to realize such robust and reliable communications. This article addresses several of the architectural issues associated with space network management. Round-trip delays, such as the 5- to 40-min delays in the Mars case, introduce a host of problems that must be solved by delegating significant control authority to remote nodes. Therefore, management hierarchy is one of the important architectural issues. The following article addresses these concerns, and proposes a network management approach based on emerging standards that covers the needs for fault, configuration, and performance management, delegated control authority, and hierarchical reporting of events. A relatively simple approach based on standards was demonstrated in the DSN 2000 Information Systems Laboratory, and the results are described.
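
    The delegated-authority idea, where a remote node acts locally whenever the round-trip delay rules out waiting for Earth, can be caricatured as follows (Python); the node table, delay threshold and fault rule are invented for illustration.

    ```python
    # Toy node table: one-way delay to Earth in minutes and the next hop toward Earth (assumed values).
    NODES = {
        "mars_lander": {"parent": "mars_relay", "one_way_delay_min": 20.0},
        "mars_relay":  {"parent": "earth_noc",  "one_way_delay_min": 20.0},
        "leo_relay":   {"parent": "earth_noc",  "one_way_delay_min": 0.005},
    }
    MAX_REMOTE_DECISION_DELAY_MIN = 1.0   # faults needing faster reaction are handled by the local node

    def report_path(node):
        """Hierarchical event reporting: the fault notification still propagates up to Earth."""
        path = [node]
        while path[-1] in NODES:
            path.append(NODES[path[-1]]["parent"])
        return " -> ".join(path)

    def handle_fault(node, fault, deadline_min):
        """Act locally when Earth cannot answer within the deadline; otherwise defer the decision."""
        round_trip = 2 * NODES[node]["one_way_delay_min"]
        if round_trip > min(deadline_min, MAX_REMOTE_DECISION_DELAY_MIN):
            action = f"{node}: local controller reconfigures around '{fault}'"
        else:
            action = f"{node}: defers '{fault}' decision to {NODES[node]['parent']}"
        return action, report_path(node)

    print(handle_fault("mars_lander", "transponder dropout", deadline_min=5.0))
    print(handle_fault("leo_relay", "buffer overflow", deadline_min=5.0))
    ```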

  17. A Content Standard for Computational Models; Digital Rights Management (DRM) Architectures; A Digital Object Approach to Interoperable Rights Management: Finely-Grained Policy Enforcement Enabled by a Digital Object Infrastructure; LOCKSS: A Permanent Web Publishing and Access System; Tapestry of Time and Terrain.

    ERIC Educational Resources Information Center

    Hill, Linda L.; Crosier, Scott J.; Smith, Terrence R.; Goodchild, Michael; Iannella, Renato; Erickson, John S.; Reich, Vicky; Rosenthal, David S. H.

    2001-01-01

    Includes five articles. Topics include requirements for a content standard to describe computational models; architectures for digital rights management systems; access control for digital information objects; LOCKSS (Lots of Copies Keep Stuff Safe) that allows libraries to run Web caches for specific journals; and a Web site from the U.S.…

  18. Space Station Freedom power management and distribution design status

    NASA Technical Reports Server (NTRS)

    Javidi, S.; Gholdston, E.; Stroh, P.

    1989-01-01

    The design status of the power management and distribution electric power system for the Space Station Freedom is presented. The current design is a star architecture, which has been found to be the best approach for meeting the requirement to deliver 120 V dc to the user interface. The architecture minimizes mass and power losses while improving element-to-element isolation and system flexibility. The design is partitioned into three elements: energy collection, storage, and conversion; system protection and distribution; and management and control.

  19. Towards Methodologies for Building Knowledge-Based Instructional Systems.

    ERIC Educational Resources Information Center

    Duchastel, Philippe

    1992-01-01

    Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…

  20. Modelling Teaching Strategies.

    ERIC Educational Resources Information Center

    Major, Nigel

    1995-01-01

    Describes a modelling language for representing teaching strategies, based in the context of the COCA intelligent tutoring system. Examines work on meta-reasoning in knowledge-based systems and describes COCA's architecture, giving details of the language used for representing teaching knowledge. Discusses implications for future work. (AEF)

  1. Microclimate and architectural tectonic: vernacular floating house resilience in Seberang Ulu 1, Palembang

    NASA Astrophysics Data System (ADS)

    Puspitasari, P.; Kadri, T.; Indartoyo, I.; Kusumawati, L.

    2018-01-01

    This paper describes the results of preliminary research on floating houses on the Musi River, Seberang Ulu 1, Palembang, focused on studying the influence of the microclimate on the tectonics of the Rumah Rakit (floating house). The rise of the water surface due to global warming will increase the need for the floating house typology in the future. The description of the inhabitants' experiences in applying techniques to create vernacular floating houses is considered significant knowledge for developing advanced technology on the basis of local characteristics. The resilience of vernacular floating houses consists of the inhabitants' lived experience in adapting their daily activities to the characteristics of the local climate. Using a qualitative approach, the verbal information of the Rumah Rakit inhabitants is the main source in this article for exploring local knowledge. Finally, a conceptual model of the vernacular Rumah Rakit in Seberang Ulu 1, Palembang is formulated in terms of building architectural tectonics that are closely related to the local climate characteristics. This knowledge can be utilized in the rehabilitation or preservation of such architectural objects, whose existence is now endangered.

  2. Combining metric episodes with semantic event concepts within the Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS)

    NASA Astrophysics Data System (ADS)

    Kelley, Troy D.; McGhee, S.

    2013-05-01

    This paper describes the ongoing development of a robotic control architecture that is inspired by computational cognitive architectures from the discipline of cognitive psychology. The Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS) combines symbolic and sub-symbolic representations of knowledge into a unified control architecture. The new architecture leverages previous work in cognitive architectures, specifically the development of the Adaptive Character of Thought-Rational (ACT-R) and Soar. This paper details current work on learning from episodes or events. The use of episodic memory as a learning mechanism has, until recently, been largely ignored by computational cognitive architectures. This paper details work on metric-level episodic memory streams and methods for translating episodes into abstract schemas. The presentation will include research on learning through novelty and self-generated feedback mechanisms for autonomous systems.
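
    A toy rendering of turning a metric episode stream into an abstract schema is shown below (Python); the episode fields, the grid discretisation and the novelty test are illustrative guesses rather than SS-RICS internals.

    ```python
    # Metric episodes as a robot might log them: (time_s, x_m, y_m, event) -- values are made up.
    EPISODES = [
        (0.0,  0.0, 0.0, "start"),
        (5.2,  2.1, 0.0, "obstacle_seen"),
        (7.8,  2.1, 1.5, "turned_left"),
        (12.0, 4.0, 1.5, "goal_reached"),
    ]

    def abstract_schema(episodes, grid=1.0):
        """Translate a metric episode stream into an ordered, location-discretised event schema."""
        schema = []
        for _, x, y, event in episodes:
            cell = (round(x / grid), round(y / grid))     # coarse symbolic location
            step = (event, cell)
            if not schema or schema[-1] != step:          # drop immediate repetitions
                schema.append(step)
        return schema

    def is_novel(schema, known_schemas):
        """Novelty check: only store schemas not already held in long-term memory."""
        return schema not in known_schemas

    known = []
    s = abstract_schema(EPISODES)
    if is_novel(s, known):
        known.append(s)
    print(s)
    ```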

  3. An Open Specification for Space Project Mission Operations Control Architectures

    NASA Technical Reports Server (NTRS)

    Hooke, A.; Heuser, W. R.

    1995-01-01

    An 'open specification' for Space Project Mission Operations Control Architectures is under development in the Spacecraft Control Working Group of the American Institute of Aeronautics and Astronautics. This architecture identifies five basic elements incorporated in the design of similar operations systems: Data, System Management, Control Interface, Decision Support Engine, and Space Messaging Service.

  4. Enhancing Architecture-Implementation Conformance with Change Management and Support for Behavioral Mapping

    ERIC Educational Resources Information Center

    Zheng, Yongjie

    2012-01-01

    Software architecture plays an increasingly important role in complex software development. Its further application, however, is challenged by the fact that software architecture, over time, is often found not conformant to its implementation. This is usually caused by frequent development changes made to both artifacts. Against this background,…

  5. 77 FR 187 - Federal Acquisition Regulation; Transition to the System for Award Management (SAM)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-03

    ... architecture. Deletes reference to ``business partner network'' at 4.1100, Scope, which is no longer necessary...) architecture has begun. This effort will transition the Central Contractor Registration (CCR) database, the...) to the new architecture. This case provides the first step in updating the FAR for these changes, and...

  6. Applying emerging digital video interface standards to airborne avionics sensor and digital map integrations: benefits outweigh the initial costs

    NASA Astrophysics Data System (ADS)

    Kuehl, C. Stephen

    1996-06-01

    Video signal system performance can be compromised in a military aircraft cockpit management system (CMS) with the tailoring of vintage Electronics Industries Association (EIA) RS170 and RS343A video interface standards. Video analog interfaces degrade when induced system noise is present. Further signal degradation has been traditionally associated with signal data conversions between avionics sensor outputs and the cockpit display system. If the CMS engineering process is not carefully applied during the avionics video and computing architecture development, extensive and costly redesign will occur when visual sensor technology upgrades are incorporated. Close monitoring and technical involvement in video standards groups provides the knowledge-base necessary for avionic systems engineering organizations to architect adaptable and extendible cockpit management systems. With the Federal Communications Commission (FCC) in the process of adopting the Digital HDTV Grand Alliance System standard proposed by the Advanced Television Systems Committee (ATSC), the entertainment and telecommunications industries are adopting and supporting the emergence of new serial/parallel digital video interfaces and data compression standards that will drastically alter present NTSC-M video processing architectures. The re-engineering of the U.S. Broadcasting system must initially preserve the electronic equipment wiring networks within broadcast facilities to make the transition to HDTV affordable. International committee activities in technical forums like ITU-R (former CCIR), ANSI/SMPTE, IEEE, and ISO/IEC are establishing global consensus on video signal parameterizations that support a smooth transition from existing analog based broadcasting facilities to fully digital computerized systems. An opportunity exists for implementing these new video interface standards over existing video coax/triax cabling in military aircraft cockpit management systems. Reductions in signal conversion processing steps, major improvement in video noise reduction, and an added capability to pass audio/embedded digital data within the digital video signal stream are the significant performance increases associated with the incorporation of digital video interface standards. By analyzing the historical progression of military CMS developments, establishing a systems engineering process for CMS design, tracing the commercial evolution of video signal standardization, adopting commercial video signal terminology/definitions, and comparing/contrasting CMS architecture modifications using digital video interfaces; this paper provides a technical explanation on how a systems engineering process approach to video interface standardization can result in extendible and affordable cockpit management systems.

  7. A support architecture for reliable distributed computing systems

    NASA Technical Reports Server (NTRS)

    Mckendry, Martin S.

    1986-01-01

    The Clouds kernel design went through several design phases and is nearly complete. The object manager, the process manager, the storage manager, the communications manager, and the actions manager are examined.

  8. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in achieving semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.
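
    As a minimal sketch of the general approach described above (expressing the knowledge as an OWL ontology and letting a description-logic reasoner surface contradictions), the following Python fragment uses the owlready2 library. The ontology file is a placeholder, and this is not the Archeck tool itself.

    ```python
    # Sketch only: illustrates OWL-based consistency checking, not the Archeck tool.
    # Assumes owlready2 (plus a local Java runtime for its reasoner) is installed and that
    # "archetype_model.owl" is a placeholder OWL rendering of an archetype and its
    # reference-model constraints.
    from owlready2 import get_ontology, sync_reasoner, default_world

    onto = get_ontology("file://archetype_model.owl").load()

    with onto:
        # Run the bundled HermiT reasoner; contradictory restrictions (e.g., a
        # specialization violating a reference-model cardinality) surface as
        # classes equivalent to owl:Nothing.
        sync_reasoner()

    inconsistent = list(default_world.inconsistent_classes())
    if inconsistent:
        print("Modeling errors detected in:", [c.name for c in inconsistent])
    else:
        print("No inconsistencies found by the reasoner.")
    ```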

  9. Propulsion System Choices and Their Implications

    NASA Technical Reports Server (NTRS)

    Joyner, Claude R., II; Levack, Daniel J. H.; Rhodes, Russell, E.; Robinson, John W.

    2010-01-01

    In defining a space vehicle architecture, the propulsion system and related subsystem choices will have a major influence on achieving the goals and objectives desired. There are many alternatives and the choices made must produce a system that meets the performance requirements, but at the same time also provide the greatest opportunity of reaching all of the required objectives. Recognizing the above, the SPST Functional Requirements subteam has drawn on the knowledge, expertise, and experience of its members to develop insight that will effectively aid the architectural concept developer in making the appropriate choices consistent with the architecture goals. This data not only identifies many selected choices, but also, more importantly, presents the collective assessment of this subteam on the "pros" and the "cons" of these choices. The propulsion system choices with their pros and cons are presented in five major groups. A. System Integration Approach. Focused on the requirement for safety, reliability, dependability, maintainability, and low cost. B. Non-Chemical Propulsion. Focused on choice of propulsion type. C. Chemical Propulsion. Focused on propellant choice implications. D. Functional Integration. Focused on the degree of integration of the many propulsive and closely associated functions, and on the choice of the engine combustion power cycle. E. Thermal Management. Focused on propellant tank insulation and integration. Each of these groups is further broken down into subgroups, and at that level the consensus pros and cons are presented. The intended use of this paper is to provide a resource of focused material for architectural concept developers to use in designing new advanced systems including college design classes. It is also a possible source of input material for developing a model for designing and analyzing advanced concepts to help identify focused technology needs and their priorities.

  10. 7 CFR 1724.21 - Architectural services contracts.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the architect furnishes or obtains all architectural services related to the design and construction management of the facilities. (c) Reasonable modifications or additions to the terms and conditions in the...

  11. 7 CFR 1724.21 - Architectural services contracts.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the architect furnishes or obtains all architectural services related to the design and construction management of the facilities. (c) Reasonable modifications or additions to the terms and conditions in the...

  12. 7 CFR 1724.21 - Architectural services contracts.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the architect furnishes or obtains all architectural services related to the design and construction management of the facilities. (c) Reasonable modifications or additions to the terms and conditions in the...

  13. 7 CFR 1724.21 - Architectural services contracts.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the architect furnishes or obtains all architectural services related to the design and construction management of the facilities. (c) Reasonable modifications or additions to the terms and conditions in the...

  14. Advanced Ground Systems Maintenance Enterprise Architecture Project

    NASA Technical Reports Server (NTRS)

    Harp, Janicce Leshay

    2014-01-01

    The project implements an architecture for delivery of integrated health management capabilities for the 21st Century launch complex. Capabilities include anomaly detection, fault isolation, prognostics and physics-based diagnostics.

  15. GAMES II Project: a general architecture for medical knowledge-based systems.

    PubMed

    Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M

    1994-10-01

    GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. The GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol-level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques, providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians is assessing the medical value of such a framework and its related tools by using it in a practical application.

  16. Real-Time Management of Multimodal Streaming Data for Monitoring of Epileptic Patients.

    PubMed

    Triantafyllopoulos, Dimitrios; Korvesis, Panagiotis; Mporas, Iosif; Megalooikonomou, Vasileios

    2016-03-01

    A new generation of healthcare is represented by wearable health monitoring systems, which provide real-time monitoring of patients' physiological parameters. It is expected that continuous ambulatory monitoring of vital signals will improve treatment of patients and enable proactive personal health management. In this paper, we present the implementation of a multimodal real-time system for epilepsy management. The proposed methodology is based on a data streaming architecture and efficient management of a large flow of physiological parameters. The performance of this architecture is examined for varying spatial resolution of the recorded data.
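
    A minimal sketch of one way a streaming architecture can manage multimodal physiological signals: a fixed-length sliding window per channel, handed off to analysis once full. This is illustrative only and not the authors' system; channel names and window sizes are invented.

    ```python
    # Illustrative sketch of windowed handling of multimodal physiological streams;
    # not the system described above. Channel names and sample counts are made up.
    from collections import deque, defaultdict

    class StreamBuffer:
        """Keep a fixed-length sliding window per channel (e.g., EEG, ECG, accelerometer)."""

        def __init__(self, window_samples):
            self.window_samples = window_samples
            self.channels = defaultdict(lambda: deque(maxlen=window_samples))

        def push(self, channel, sample):
            self.channels[channel].append(sample)

        def ready(self, channel):
            return len(self.channels[channel]) == self.window_samples

        def window(self, channel):
            return list(self.channels[channel])

    buf = StreamBuffer(window_samples=256)
    for t in range(300):
        buf.push("eeg_fp1", 0.0)          # placeholder samples
    if buf.ready("eeg_fp1"):
        segment = buf.window("eeg_fp1")   # hand off to a seizure-detection routine
        print(len(segment))               # 256
    ```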

  17. Association mapping of brassinosteroid candidate genes and plant architecture in a diverse panel of Sorghum bicolor.

    PubMed

    Mantilla Perez, Maria B; Zhao, Jing; Yin, Yanhai; Hu, Jieyun; Salas Fernandez, Maria G

    2014-12-01

    This first association analysis between plant architecture and BR candidate genes in sorghum suggests that natural allelic variation has significant and pleiotropic effects on plant architecture phenotypes. Sorghum bicolor (L.) Moench is a self-pollinated species traditionally used as a staple crop for human consumption and as a forage crop for livestock feed. Recently, sorghum has received attention as a bioenergy crop due to its water use efficiency and biomass yield potential. Breeding for superior bioenergy-type lines requires knowledge of the genetic mechanisms controlling plant architecture. Brassinosteroids (BRs) are a group of hormones that determine plant growth, development, and architecture. Biochemical and genetic information on BRs is available from model species, but the application of that knowledge to crop species has been very limited. A candidate gene association mapping approach and a diverse sorghum collection of 315 accessions were used to assess marker-trait associations between BR biosynthesis and signaling genes and six plant architecture traits. A total of 263 single nucleotide polymorphisms (SNPs) from 26 BR genes were tested; 73 SNPs were significantly associated with the phenotypes of interest, and 18 of those were associated with more than one trait. An analysis of the phenotypic variation explained by each BR pathway revealed that the signaling pathway had a larger effect for most phenotypes (R² = 0.05-0.23). This study constitutes the first association analysis between plant architecture and BR genes in sorghum and the first LD mapping for leaf angle, stem circumference, panicle exsertion and panicle length. Markers on or close to BKI1 were associated with all phenotypes; they are therefore the most important outcome of this study and will be further validated for future application in breeding programs.
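
    For illustration, a single-marker association test can be run by regressing a phenotype on additive genotype codes (0/1/2 copies of the minor allele). The sketch below is a simplified stand-in for the association pipeline used in the study, with fabricated data.

    ```python
    # Minimal single-marker association test: regress a phenotype on additive
    # genotype codes (0/1/2 copies of the minor allele). Simplified illustration only,
    # not the mixed-model pipeline used in the study; data are fabricated.
    import numpy as np
    from scipy import stats

    genotypes = np.array([0, 1, 2, 1, 0, 2, 1, 0, 2, 1])      # one SNP, ten accessions
    leaf_angle = np.array([32.0, 35.5, 41.2, 36.8, 31.1,
                           43.0, 37.9, 30.4, 42.1, 36.0])      # degrees (made up)

    result = stats.linregress(genotypes, leaf_angle)
    print(f"effect per allele = {result.slope:.2f} degrees, "
          f"p = {result.pvalue:.3g}, R^2 = {result.rvalue**2:.2f}")
    ```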

  18. An architecture model for multiple disease management information systems.

    PubMed

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and on coordinating the members of the care team. They need a tool to support them in daily practice and to optimize the inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management. Whereas applications have been developed, it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances the reusability and time savings of information system development. The proposed architecture model has been successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half of the time was consumed, and the workflow was improved. The overall user response is positive: supportiveness during daily workflow is high, and the system empowers the case managers with better information and leads to better decision making.

  19. Thermal Control System Automation Project (TCSAP)

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.

    1991-01-01

    Information is given in viewgraph form on the Space Station Freedom (SSF) Thermal Control System Automation Project (TCSAP). Topics covered include the assembly of the External Thermal Control System (ETCS); the ETCS functional schematic; the baseline Fault Detection, Isolation, and Recovery (FDIR), including the development of a knowledge based system (KBS) for application of rule based reasoning to the SSF ETCS; TCSAP software architecture; the High Fidelity Simulator architecture; the TCSAP Runtime Object Database (RODB) data flow; KBS functional architecture and logic flow; TCSAP growth and evolution; and TCSAP relationships.

  20. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Because data and processing resources in disaster management are distributed, utilizing a Service Oriented Architecture (SOA) that takes advantage of workflows of services provides an efficient, flexible and reliable way to respond to different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing in a SOA platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow that acquires and processes remotely sensed data, detects fires and sends notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented, using web-based processing of remote sensing imagery from MODIS. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as a composition engine. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client based on open-source software was developed to manage data, metadata, processes, and authorities. An investigation of the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
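
    A hedged sketch of how such a WPS-based workflow can be invoked from a client using the OWSLib library; the endpoint URL, process identifier and input names are placeholders rather than the actual service described above.

    ```python
    # Sketch of invoking an OGC Web Processing Service with OWSLib; the endpoint URL,
    # process identifier and input names below are placeholders, not the system's actual service.
    from owslib.wps import WebProcessingService, monitorExecution

    wps = WebProcessingService("http://example.org/wps", verbose=False)

    # Discover what the service offers.
    for process in wps.processes:
        print(process.identifier, "-", process.title)

    # Chain step: ask a hypothetical fire-detection process to run on a MODIS granule.
    execution = wps.execute(
        "detect_fire_hotspots",                       # hypothetical process id
        inputs=[("modis_granule", "MOD021KM.A2013123.1020.hdf")],
    )
    monitorExecution(execution)                       # poll until the asynchronous job finishes
    print(execution.status)
    ```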

  1. Two complementary personal medication management applications developed on a common platform: case report.

    PubMed

    Ross, Stephen E; Johnson, Kevin B; Siek, Katie A; Gordon, Jeffry S; Khan, Danish U; Haverhals, Leah M

    2011-07-12

    Adverse drug events are a major safety issue in ambulatory care. Improving medication self-management could reduce these adverse events. Researchers have developed medication applications for tethered personal health records (PHRs), but little has been reported about medication applications for interoperable PHRs. Our objective was to develop two complementary personal health applications on a common PHR platform: one to assist children with complex health needs (MyMediHealth), and one to assist older adults in care transitions (Colorado Care Tablet). The applications were developed using a user-centered design approach. The two applications shared a common PHR platform based on a service-oriented architecture. MyMediHealth employed Web and mobile phone user interfaces. Colorado Care Tablet employed a Web interface customized for a tablet PC. We created complementary medication management applications tailored to the needs of distinctly different user groups using common components. Challenges were addressed in multiple areas, including how to encode medication identities, how to incorporate knowledge bases for medication images and consumer health information, how to include supplementary dosing information, how to simplify user interfaces for older adults, and how to support mobile devices for children. These prototypes demonstrate the utility of abstracting PHR data and services (the PHR platform) from applications that can be tailored to meet the needs of diverse patients. Based on the challenges we faced, we provide recommendations on the structure of publicly available knowledge resources and the use of mobile messaging systems for PHR applications.

  2. Characterization of Emergent Data Networks Among Long-Tail Data

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa; Kumar, Praveen; Hedstrom, Margaret; Myers, James; Plale, Beth; Marini, Luigi; McDonald, Robert

    2014-05-01

    Data curation underpins data-driven scientific advancement. It manages the flow of information across multiple users throughout the data life cycle and increases data sustainability and reusability. The exponential growth in data production across the Earth sciences by individuals and small research groups, termed long-tail data, increases the data-knowledge latency among related domains. It has become clear that advanced, framework-agnostic metadata and ontologies for long-tail data are required to increase datasets' visibility to each other and to provide concise, meaningful descriptions that reveal their connectivity. Despite the advances achieved by sophisticated data management models in different Earth science disciplines, it is not always straightforward to derive relationships among long-tail data. Semantic data clustering algorithms and pre-defined logic rules oriented toward predicting possible data relationships are one method to address these challenges. Our work advances the connectivity of related long-tail data by introducing the design of an ontology-based knowledge management system. In this work, we present the system architecture and its components, and illustrate how the system can be used to scrutinize the connectivity among datasets. To demonstrate the capabilities of this "data network" prototype, we implemented the approach within the Sustainable Environment Actionable Data (SEAD) environment, an open-source semantic content repository that provides an RDF database for long-tail data, and show how emergent relationships among datasets can be identified.
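
    A small sketch of how an emergent relationship between two long-tail datasets could be recorded as RDF triples, in the spirit of the prototype described above; it uses the rdflib library, and the namespace, dataset URIs and linking predicate are illustrative.

    ```python
    # Sketch of recording an emergent relationship between two long-tail datasets as RDF;
    # the namespace, URIs and linking predicate are illustrative, not the SEAD vocabulary.
    from rdflib import Graph, Namespace, Literal, URIRef
    from rdflib.namespace import RDF, DCTERMS, DCAT

    EX = Namespace("http://example.org/data-network/")   # placeholder vocabulary

    g = Graph()
    soil = URIRef("http://example.org/datasets/soil-moisture-2013")
    flux = URIRef("http://example.org/datasets/eddy-flux-tower-7")

    g.add((soil, RDF.type, DCAT.Dataset))
    g.add((flux, RDF.type, DCAT.Dataset))
    g.add((soil, DCTERMS.spatial, Literal("Sangamon River basin")))
    g.add((flux, DCTERMS.spatial, Literal("Sangamon River basin")))

    # An inferred "same study area" link that a clustering rule might emit.
    g.add((soil, EX.sharesStudyAreaWith, flux))

    print(g.serialize(format="turtle"))
    ```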

  3. Intelligent deflection routing in buffer-less networks.

    PubMed

    Haeri, Soroush; Trajković, Ljiljana

    2015-02-01

    Deflection routing is employed to ameliorate packet loss caused by contention in buffer-less architectures such as optical burst-switched networks. The main goal of deflection routing is to successfully deflect a packet based only on the limited knowledge that network nodes possess about their environment. In this paper, we present a framework that introduces intelligence to deflection routing (iDef). iDef decouples the design of the signaling infrastructure from the underlying learning algorithm. It consists of a signaling module and a decision-making module. The signaling module implements a feedback management protocol, while the decision-making module implements a reinforcement learning algorithm. We also propose several learning-based deflection routing protocols, implement them in iDef using the ns-3 network simulator, and compare their performance.
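
    To illustrate how reinforcement learning can drive a deflection decision, the toy Q-learning rule below picks an alternate output port and updates its value estimate from delivery feedback. It is a sketch of the idea only, not the iDef/ns-3 implementation; states, ports and rewards are invented.

    ```python
    # Toy Q-learning rule for picking a deflection port when the preferred output port
    # is busy; a sketch of the idea only, not the iDef ns-3 implementation.
    import random
    from collections import defaultdict

    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
    q = defaultdict(float)                 # (destination, port) -> estimated value

    def choose_port(destination, candidate_ports):
        if random.random() < EPSILON:                      # explore occasionally
            return random.choice(candidate_ports)
        return max(candidate_ports, key=lambda p: q[(destination, p)])

    def update(destination, port, reward, next_destination, next_ports):
        best_next = max(q[(next_destination, p)] for p in next_ports) if next_ports else 0.0
        q[(destination, port)] += ALPHA * (reward + GAMMA * best_next - q[(destination, port)])

    # Example: deflecting toward node "D" via ports 1-3; reward +1 if the packet was
    # eventually delivered (feedback arrives via the signaling module), -1 if it was dropped.
    port = choose_port("D", [1, 2, 3])
    update("D", port, reward=1.0, next_destination="D", next_ports=[1, 2, 3])
    ```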

  4. An Intelligent Propulsion Control Architecture to Enable More Autonomous Vehicle Operation

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Sowers, T. Shane; Simon, Donald L.; Owen, A. Karl; Rinehart, Aidan W.; Chicatelli, Amy K.; Acheson, Michael J.; Hueschen, Richard M.; Spiers, Christopher W.

    2018-01-01

    This paper describes an intelligent propulsion control architecture that coordinates with the flight control to reduce the amount of pilot intervention required to operate the vehicle. Objectives of the architecture include the ability to: automatically recognize the aircraft operating state and flight phase; configure engine control to optimize performance with knowledge of engine condition and capability; enhance aircraft performance by coordinating propulsion control with flight control; and recognize off-nominal propulsion situations and to respond to them autonomously. The hierarchical intelligent propulsion system control can be decomposed into a propulsion system level and an individual engine level. The architecture is designed to be flexible to accommodate evolving requirements, adapt to technology improvements, and maintain safety.

  5. Machine Learning for the Knowledge Plane

    DTIC Science & Technology

    2006-06-01

    this idea is to combine techniques from machine learning with new architectural concepts in networking to make the internet self-aware and self...work on the machine learning portion of the Knowledge Plane. This consisted of three components: (a) we wrote a document formulating the various

  6. Perceptual telerobotics

    NASA Technical Reports Server (NTRS)

    Ligomenides, Panos A.

    1989-01-01

    A sensory world modeling system, congruent with a human expert's perception, is proposed. The Experiential Knowledge Base (EKB) system can provide a highly intelligible communication interface for telemonitoring and telecontrol of a real time robotic system operating in space. Paradigmatic acquisition of empirical perceptual knowledge, and real time experiential pattern recognition and knowledge integration are reviewed. The cellular architecture and operation of the EKB system are also examined.

  7. A Multi Agent Based Approach for Prehospital Emergency Management.

    PubMed

    Safdari, Reza; Shoshtarian Malak, Jaleh; Mohammadzadeh, Niloofar; Danesh Shahraki, Azimeh

    2017-07-01

    To demonstrate an architecture that automates the prehospital emergency process and directs specialized care according to the situation at the right time, in order to reduce patient mortality and morbidity. The prehospital emergency process was analyzed using existing prehospital management systems and frameworks, and the extracted processes were modeled using sequence diagrams in the Rational Rose software. The system's main agents were identified and modeled via a component diagram, considering the main system actors and logically dividing business functionalities; finally, the conceptual architecture for prehospital emergency management was proposed. The proposed architecture was simulated using the AnyLogic simulation software. The AnyLogic Agent Model, State Chart and Process Model were used to model the system. Multi-agent systems (MAS) have had great success in distributed, complex and dynamic problem-solving environments, and utilizing autonomous agents provides intelligent decision-making capabilities. The proposed architecture represents prehospital management operations. The main identified agents are: EMS Center, Ambulance, Traffic Station, Healthcare Provider, Patient, Consultation Center, National Medical Record System and a quality-of-service monitoring agent. In a critical setting such as prehospital emergency care we are coping with sophisticated processes: ambulance navigation, healthcare provider and service assignment, consultation, recalling patients' past medical history through a centralized EHR system, and monitoring healthcare quality in real time. The main advantage of our work has been the use of a multi-agent system. Our future work will include implementation of the proposed architecture and evaluation of its impact on improving the quality of patient care.

  8. A Multi Agent Based Approach for Prehospital Emergency Management

    PubMed Central

    Safdari, Reza; Shoshtarian Malak, Jaleh; Mohammadzadeh, Niloofar; Danesh Shahraki, Azimeh

    2017-01-01

    Objective: To demonstrate an architecture that automates the prehospital emergency process and directs specialized care according to the situation at the right time, in order to reduce patient mortality and morbidity. Methods: The prehospital emergency process was analyzed using existing prehospital management systems and frameworks, and the extracted processes were modeled using sequence diagrams in the Rational Rose software. The system's main agents were identified and modeled via a component diagram, considering the main system actors and logically dividing business functionalities; finally, the conceptual architecture for prehospital emergency management was proposed. The proposed architecture was simulated using the AnyLogic simulation software. The AnyLogic Agent Model, State Chart and Process Model were used to model the system. Results: Multi-agent systems (MAS) have had great success in distributed, complex and dynamic problem-solving environments, and utilizing autonomous agents provides intelligent decision-making capabilities. The proposed architecture represents prehospital management operations. The main identified agents are: EMS Center, Ambulance, Traffic Station, Healthcare Provider, Patient, Consultation Center, National Medical Record System and a quality-of-service monitoring agent. Conclusion: In a critical setting such as prehospital emergency care we are coping with sophisticated processes: ambulance navigation, healthcare provider and service assignment, consultation, recalling patients' past medical history through a centralized EHR system, and monitoring healthcare quality in real time. The main advantage of our work has been the use of a multi-agent system. Our future work will include implementation of the proposed architecture and evaluation of its impact on improving the quality of patient care. PMID:28795061

  9. Thermal Management of Quantum Cascade Lasers in an individually Addressable Array Architecture

    DTIC Science & Technology

    2016-02-08

    Thermal Management of Quantum Cascade Lasers in an Individually Addressable Monolithic Array Architecture Leo Missaggia, Christine Wang, Michael...power laser systems in the mid-to-long-infrared wavelength range. By virtue of their demonstrated watt-level performance and wavelength diversity...quantum cascade laser (QCL) and amplifier devices are an excellent choice of emitter for those applications. To realize the power levels of interest

  10. Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.

    PubMed

    Blobel, B G M E; Engel, K; Pharow, P

    2006-01-01

    To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. To enable semantic interoperability, a unified process for defining and implementing the architecture, i.e. the structure and functions of the cooperating systems' components, as well as the approach to knowledge representation, i.e. the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach to semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing convergence between the competing approaches.

  11. Incorporation of Personal Single Nucleotide Polymorphism (SNP) Data into a National Level Electronic Health Record for Disease Risk Assessment, Part 2: The Incorporation of SNP into the National Health Information System of Turkey

    PubMed Central

    Beyan, Timur

    2014-01-01

    Background A personalized medicine approach provides opportunities for predictive and preventive medicine. Using genomic, clinical, environmental, and behavioral data, the tracking and management of individual wellness is possible. A prolific way to carry this personalized approach into routine practices can be accomplished by integrating clinical interpretations of genomic variations into electronic medical record (EMR)s/electronic health record (EHR)s systems. Today, various central EHR infrastructures have been constituted in many countries of the world, including Turkey. Objective As an initial attempt to develop a sophisticated infrastructure, we have concentrated on incorporating the personal single nucleotide polymorphism (SNP) data into the National Health Information System of Turkey (NHIS-T) for disease risk assessment, and evaluated the performance of various predictive models for prostate cancer cases. We present our work as a miniseries containing three parts: (1) an overview of requirements, (2) the incorporation of SNP into the NHIS-T, and (3) an evaluation of SNP data incorporated into the NHIS-T for prostate cancer. Methods For the second article of this miniseries, we have analyzed the existing NHIS-T and proposed the possible extensional architectures. In light of the literature survey and characteristics of NHIS-T, we have proposed and argued opportunities and obstacles for a SNP incorporated NHIS-T. A prototype with complementary capabilities (knowledge base and end-user applications) for these architectures has been designed and developed. Results In the proposed architectures, the clinically relevant personal SNP (CR-SNP) and clinicogenomic associations are shared between central repositories and end-users via the NHIS-T infrastructure. To produce these files, we need to develop a national level clinicogenomic knowledge base. Regarding clinicogenomic decision support, we planned to complete interpretation of these associations on the end-user applications. This approach gives us the flexibility to add/update envirobehavioral parameters and family health history that will be monitored or collected by end users. Conclusions Our results emphasized that even though the existing NHIS-T messaging infrastructure supports the integration of SNP data and clinicogenomic association, it is critical to develop a national level, accredited knowledge base and better end-user systems for the interpretation of genomic, clinical, and envirobehavioral parameters. PMID:25599817

  12. Incorporation of personal single nucleotide polymorphism (SNP) data into a national level electronic health record for disease risk assessment, part 2: the incorporation of SNP into the national health information system of Turkey.

    PubMed

    Beyan, Timur; Aydın Son, Yeşim

    2014-08-11

    A personalized medicine approach provides opportunities for predictive and preventive medicine. Using genomic, clinical, environmental, and behavioral data, the tracking and management of individual wellness is possible. A prolific way to carry this personalized approach into routine practices can be accomplished by integrating clinical interpretations of genomic variations into electronic medical record (EMR)s/electronic health record (EHR)s systems. Today, various central EHR infrastructures have been constituted in many countries of the world, including Turkey. As an initial attempt to develop a sophisticated infrastructure, we have concentrated on incorporating the personal single nucleotide polymorphism (SNP) data into the National Health Information System of Turkey (NHIS-T) for disease risk assessment, and evaluated the performance of various predictive models for prostate cancer cases. We present our work as a miniseries containing three parts: (1) an overview of requirements, (2) the incorporation of SNP into the NHIS-T, and (3) an evaluation of SNP data incorporated into the NHIS-T for prostate cancer. For the second article of this miniseries, we have analyzed the existing NHIS-T and proposed the possible extensional architectures. In light of the literature survey and characteristics of NHIS-T, we have proposed and argued opportunities and obstacles for a SNP incorporated NHIS-T. A prototype with complementary capabilities (knowledge base and end-user applications) for these architectures has been designed and developed. In the proposed architectures, the clinically relevant personal SNP (CR-SNP) and clinicogenomic associations are shared between central repositories and end-users via the NHIS-T infrastructure. To produce these files, we need to develop a national level clinicogenomic knowledge base. Regarding clinicogenomic decision support, we planned to complete interpretation of these associations on the end-user applications. This approach gives us the flexibility to add/update envirobehavioral parameters and family health history that will be monitored or collected by end users. Our results emphasized that even though the existing NHIS-T messaging infrastructure supports the integration of SNP data and clinicogenomic association, it is critical to develop a national level, accredited knowledge base and better end-user systems for the interpretation of genomic, clinical, and envirobehavioral parameters.
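
    One common and simple way to turn per-SNP clinicogenomic associations into a risk estimate is an additive log-odds score: sum the log of each published odds ratio weighted by the patient's risk-allele count. The sketch below illustrates that idea with fabricated values; it is not one of the predictive models evaluated in this miniseries.

    ```python
    # A common, simple way to turn per-SNP associations into a risk estimate:
    # sum log odds ratios weighted by risk-allele counts. Illustrative values only;
    # this is not one of the predictive models evaluated in the study.
    import math

    # rsID -> (published odds ratio per risk allele, this patient's risk-allele count 0/1/2)
    patient_snps = {
        "rs0000001": (1.30, 2),   # fabricated
        "rs0000002": (1.12, 1),   # fabricated
        "rs0000003": (0.85, 0),   # fabricated (protective allele not carried)
    }

    log_odds = sum(count * math.log(odds_ratio) for odds_ratio, count in patient_snps.values())
    relative_risk_score = math.exp(log_odds)
    print(f"aggregate odds-ratio-style score: {relative_risk_score:.2f}")
    ```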

  13. State-of-the-lagoon reports as vehicles of cross-disciplinary integration.

    PubMed

    Zaucha, Jacek; Davoudi, Simin; Slob, Adriaan; Bouma, Geiske; van Meerkerk, Ingmar; Oen, Amy Mp; Breedveld, Gijs D

    2016-10-01

    An integrative approach across disciplines is needed for sustainable lagoon and estuary management as identified by integrated coastal zone management. The ARCH research project (Architecture and roadmap to manage multiple pressures on lagoons) has taken initial steps to overcome the boundaries between disciplines and focus on cross-disciplinary integration by addressing the driving forces, challenges, and problems at various case study sites. A model was developed as a boundary-spanning activity to produce joint knowledge and understanding. The backbone of the model is formed by the interaction between the natural and human systems, including economy and governance-based subsystems. The model was used to create state-of-the-lagoon reports for 10 case study sites (lagoons and estuarine coastal areas), with a geographical distribution covering all major seas surrounding Europe. The reports functioned as boundary objects to build joint knowledge. The experiences related to the framing of the model and its subsequent implementation at the case study sites have resulted in key recommendations on how to address the challenges of cross-disciplinary work required for the proper management of complex social-ecological systems such as lagoons, estuarine areas, and other land-sea regions. Cross-disciplinary integration is initially resource intensive and time consuming; one should set aside the required resources and invest effort up front. It is crucial to create engagement among the group of researchers by focusing on a joint, appealing overall concept that will stimulate cross-sectoral thinking, and by focusing on the identified problems as a link between collected evidence and future management needs. Different methods for collecting evidence should be applied, including both quantitative (jointly agreed indicators) and qualitative (narratives) information. Cross-disciplinary integration is facilitated by functional boundary objects. Integration offers important rewards in terms of developing a better understanding and subsequently improved management of complex social-ecological systems. Integr Environ Assess Manag 2016;12:690-700. © 2016 SETAC.

  14. A scalable architecture for incremental specification and maintenance of procedural and declarative clinical decision-support knowledge.

    PubMed

    Hatsek, Avner; Shahar, Yuval; Taieb-Maimon, Meirav; Shalom, Erez; Klimov, Denis; Lunenfeld, Eitan

    2010-01-01

    Clinical guidelines have been shown to improve the quality of medical care and to reduce its costs. However, most guidelines exist in a free-text representation and, without automation, are not sufficiently accessible to clinicians at the point of care. A prerequisite for automated guideline application is a machine-comprehensible representation of the guidelines. In this study, we designed and implemented a scalable architecture to support medical experts and knowledge engineers in specifying and maintaining the procedural and declarative aspects of clinical guideline knowledge, resulting in a machine-comprehensible representation. The new framework significantly extends our previous work on the Digital electronic Guidelines Library (DeGeL). The current study designed and implemented Gesher, a graphical framework for the specification of declarative and procedural clinical knowledge. We performed three different experiments to evaluate the functionality and usability of the major aspects of the new framework: specification of procedural clinical knowledge, specification of declarative clinical knowledge, and exploration of a given clinical guideline. The subjects included clinicians and knowledge engineers (overall, 27 participants). The evaluations indicated high levels of completeness and correctness of the guideline specification process by both the clinicians and the knowledge engineers, although the best results, in the case of declarative-knowledge specification, were achieved by teams including a clinician and a knowledge engineer. The usability scores were high as well, although the clinicians' assessment was significantly lower than that of the knowledge engineers.

  15. Advanced Ground Systems Maintenance Enterprise Architecture Project

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Compiler)

    2015-01-01

    The project implements an architecture for delivery of integrated health management capabilities for the 21st Century launch complex. The delivered capabilities include anomaly detection, fault isolation, prognostics and physics based diagnostics.

  16. PELS: A Noble Architecture and Framework for a Personal E-Learning System (PELS)

    ERIC Educational Resources Information Center

    Dewan, Jahangir; Chowdhury, Morshed; Batten, Lynn

    2014-01-01

    This article presents a personal e-learning system architecture in the context of a social network environment. The main objective of a personal e-learning system is to develop individual skills on a specific subject and share resources with peers. The authors' system architecture defines the organisation and management of a personal learning…

  17. Application developer's tutorial for the CSM testbed architecture

    NASA Technical Reports Server (NTRS)

    Underwood, Phillip; Felippa, Carlos A.

    1988-01-01

    This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.

  18. 41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What basic Art-in... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy governs Federal agencies? Federal agencies must incorporate fine arts as an integral part of the total...

  19. 41 CFR 102-76.25 - What standards must Federal agencies meet in providing architectural and interior design services?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Federal agencies meet in providing architectural and interior design services? 102-76.25 Section 102-76.25...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Design and Construction § 102-76.25 What standards must Federal agencies meet in providing architectural and interior design services...

  20. 41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What basic Art-in... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy governs Federal agencies? Federal agencies must incorporate fine arts as an integral part of the total...

  1. Informatics and Decisions support in Galway Bay (SmartBay) using ERDDAP, OGC Technologies and Third Party Data Sources to Provide Services to the Marine Community.

    NASA Astrophysics Data System (ADS)

    Delaney, Conor; Gaughan, Paul; Smyth, Damian

    2013-04-01

    The global marine sector generates and consumes vast quantities of operational and forecast data on a daily basis. One of the key challenges facing the sector is the management and transformation of that data into knowledge. The Irish Marine Institute (MI) generates oceanographic and environmental data on a regular and frequent basis. This data comes from operational ocean models run on the MI's high performance computer (HPC) and from various environmental observation sensor systems. Some of the data published by the Marine Institute is brokered by the Environmental Research Division's Data Access Program (ERDDAP), a data broker built on OPeNDAP and Open Geospatial Consortium (OGC) standards. The broker provides a consistent web service interface to the data services of the Marine Institute; these services include wave, tide and weather sensors and numerical model output. An ERDDAP server publishes data in a number of standard and developer-friendly ways, including some OGC formats. The data on the MI ERDDAP server (http://erddap.marine.ie) is published as OpenData. The marine work package of the FP7-funded ENVIROFI project (http://www.envirofi.eu/) has used the ERDDAP data broker as a core resource in the development of its Marine Asset management decision Support Tool (MAST) portal and phone app. Communication between MAST and ERDDAP is via a Uniform Resource Identifier (Linked Data). A key objective of the MAST prototype is to demonstrate the potential of next-generation dynamic web-based products and services and how they can be harnessed to facilitate growth of both the marine and IT sectors. The use case driving the project is the management of ocean energy assets in the marine environment, in particular the provision of information that aids decision making about maintenance at sea. This question is common to any offshore industry, and the solution proposed here is applicable to other users of Galway Bay, Ireland. The architecture of the MAST is based on the concepts of Representational State Transfer (REST), Resource Orientated Architecture (ROA), Service Orientated Architecture (SOA), OpenData and mashups. In this paper we demonstrate the architecture of the MAST system and discuss the potential of ERDDAP technology to serve complex data in formats that are accessible to the general developer community. We also discuss the potential of next-generation web technologies and OpenData to encourage the use of valuable marine data resources.
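
    As an illustration of the developer-friendly access the abstract emphasizes, an ERDDAP tabledap dataset can be queried as CSV directly from a URL. The dataset ID and variable names below are placeholders rather than actual Marine Institute identifiers.

    ```python
    # Sketch of pulling observations from an ERDDAP tabledap endpoint as CSV with pandas.
    # The dataset ID and variable names below are placeholders, not actual Marine Institute IDs.
    import pandas as pd

    base = "https://erddap.marine.ie/erddap/tabledap"
    dataset_id = "wave_buoy_obs"                       # hypothetical dataset ID
    query = (
        "time,longitude,latitude,significant_wave_height"
        "&time>=2013-04-01T00:00:00Z&time<2013-04-02T00:00:00Z"
    )
    url = f"{base}/{dataset_id}.csv?{query}"

    # ERDDAP CSV responses carry a second header row with units; skip it.
    df = pd.read_csv(url, skiprows=[1], parse_dates=["time"])
    print(df.head())
    ```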

  2. STGT program: Ada coding and architecture lessons learned

    NASA Technical Reports Server (NTRS)

    Usavage, Paul; Nagurney, Don

    1992-01-01

    STGT (Second TDRSS Ground Terminal) is currently halfway through the System Integration Test phase (Level 4 Testing). To date, many software architecture and Ada language issues have been encountered and solved. This paper, which is the transcript of a presentation at the 3 Dec. meeting, attempts to define these lessons plus others learned regarding software project management and risk management issues, training, performance, reuse, and reliability. Observations are included regarding the use of particular Ada coding constructs, software architecture trade-offs during the prototyping, development and testing stages of the project, and dangers inherent in parallel or concurrent systems, software, hardware, and operations engineering.

  3. Towards integration of clinical decision support in commercial hospital information systems using distributed, reusable software and knowledge components.

    PubMed

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2001-12-01

    Clinicians' acceptance of clinical decision support depends on its workflow-oriented, context-sensitive accessibility and availability at the point of care, integrated into the Electronic Patient Record (EPR). Commercially available Hospital Information Systems (HIS) often focus on administrative tasks and mostly do not provide additional knowledge-based functionality. Their traditionally monolithic and closed software architecture encumbers integration of, and interaction with, external software modules. Our aim was to develop methods and interfaces to integrate knowledge sources into two different commercial hospital information systems in order to provide the best decision support possible within the context of the available patient data. An existing, proven standalone scoring system for acute abdominal pain was supplemented with a communication interface. In both HIS we defined data entry forms and developed individual, reusable mechanisms for data exchange with external software modules. We designed an additional knowledge support front end which controls data exchange between the HIS and the knowledge modules. Finally, we added guidelines and algorithms to the knowledge library. Despite some major drawbacks, resulting mainly from the HIS' closed software architectures, we showed, by way of example, how external knowledge support can be integrated almost seamlessly into different commercial HIS. This paper describes the prototypical design and current implementation and discusses our experiences.

  4. Anchorage regional ITS architecture : summary report

    DOT National Transportation Integrated Search

    2004-10-14

    The Municipality of Anchorage (MOA) initiated the development of a regional Intelligent : Transportation System (ITS) architecture to manage implementation of a range of technologies that will improve transportation within the municipality. ITS is th...

  5. Space station needs, attributes and architectural options study. Volume 4: Architectural options, subsystems, technology and programmatics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Space station architectural options, habitability considerations and subsystem analyses, technology, and programmatics are reviewed. The methodology employed for conceiving and defining space station concepts is presented. The architectures conceived with this approach are described, along with their supporting rationale, within this portion of the report. The habitability considerations and subsystem analyses describe the human factors associated with space station operations and include subsections covering (1) data management, (2) communications and tracking, (3) environmental control and life support, (4) manipulator systems, (5) resupply, (6) pointing, (7) thermal management and (8) interface standardization. A consolidated matrix of subsystem technology issues related to meeting the mission needs of a 1990's-era space station is presented. Within the programmatics portion, a brief description of costing and program strategies is outlined.

  6. Medical Data Architecture (MDA) Project Status

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Middour, C.; Gurram, M.; Wolfe, S.; Marker, N.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2018-01-01

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current in-flight medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more medically autonomous crew than the current paradigm. The medical system requirements are being developed in parallel with the exploration mission architecture and vehicle design. ExMC has recognized that in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight to medical conditions. This medical data architecture will provide the necessary functionality to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products supported by current prototype development will directly inform exploration medical system requirements.

  7. Medical Data Architecture Project Status

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Middour, C.; Gurram, M.; Wolfe, S.; Marker, N.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2018-01-01

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current in-flight medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more medically autonomous crew than the current paradigm. The medical system requirements are being developed in parallel with the exploration mission architecture and vehicle design. ExMC has recognized that in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight to medical conditions. This medical data architecture will provide the necessary functionality to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products supported by current prototype development will directly inform exploration medical system requirements.

  8. SAMS--a systems architecture for developing intelligent health information systems.

    PubMed

    Yılmaz, Özgün; Erdur, Rıza Cenk; Türksever, Mustafa

    2013-12-01

    In this paper, SAMS, a novel architecture for developing intelligent health information systems, is proposed, and strategies for developing such systems are discussed. Systems conforming to this architecture will be able to store patients' electronic health records using OWL ontologies, share patient records among different hospitals, and provide physicians with expertise to assist them in making decisions. The system is intelligent because it is rule-based, makes use of rule-based reasoning, and has the ability to learn and evolve. The learning capability is provided by extracting rules from decisions previously given by physicians and adding the extracted rules to the system. The proposed system is novel and original in all of these aspects. As a case study, a system conforming to the SAMS architecture is implemented for use by dentists in the dental domain, and its use is described with a scenario. For evaluation, the developed dental information system will be used and tried by a group of dentists. The development of this system demonstrates the applicability of the SAMS architecture. By getting decision support from a system derived from this architecture, the cognitive gap between experienced and inexperienced physicians can be compensated for: patient satisfaction can be improved, inexperienced physicians are supported in decision making, and personnel can expand their knowledge. A physician can diagnose a case he or she has never diagnosed before using this system, and because general domain knowledge is stored in the system, the personnel's need for medical guideline documents is reduced.
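
    The rule-learning loop described above (extract rules from physicians' past decisions, then reuse them for decision support) can be illustrated with a minimal sketch. The record structure, dental findings and the support threshold below are hypothetical assumptions, not taken from the SAMS paper, which additionally uses OWL ontologies for storage and sharing.

```python
# Minimal sketch of rule-based decision support in the spirit of SAMS:
# rules are "learned" from previously recorded physician decisions and then
# applied to new patient records. All names and thresholds are hypothetical.
from collections import Counter
from typing import FrozenSet, List, Tuple

Rule = Tuple[FrozenSet[str], str]  # (set of findings) -> diagnosis

def extract_rules(past_cases: List[Tuple[FrozenSet[str], str]],
                  min_support: int = 2) -> List[Rule]:
    """Turn repeated (findings -> diagnosis) decisions into rules."""
    counts = Counter(past_cases)
    return [case for case, n in counts.items() if n >= min_support]

def suggest(findings: FrozenSet[str], rules: List[Rule]) -> List[str]:
    """Fire every rule whose conditions are satisfied by the findings."""
    return [dx for conditions, dx in rules if conditions <= findings]

if __name__ == "__main__":
    history = [
        (frozenset({"tooth_pain", "swelling"}), "acute_apical_abscess"),
        (frozenset({"tooth_pain", "swelling"}), "acute_apical_abscess"),
        (frozenset({"sensitivity_to_cold"}), "reversible_pulpitis"),
    ]
    rules = extract_rules(history)
    print(suggest(frozenset({"tooth_pain", "swelling", "fever"}), rules))
```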

  9. The elements of a comprehensive education for future architectural acousticians

    NASA Astrophysics Data System (ADS)

    Wang, Lily M.

    2005-04-01

    Curricula for students who seek to become consultants of architectural acoustics or researchers in the field are few in the United States and in the world. This paper will present the author's opinions on the principal skills a student should obtain from a focused course of study in architectural acoustics. These include: (a) a solid command of math and wave theory, (b) fluency with digital signal processing techniques and sound measurement equipment, (c) expertise in using architectural acoustic software with an understanding of its limitations, (d) knowledge of building mechanical systems, (e) an understanding of human psychoacoustics, and (f) an appreciation for the artistic aspects of the discipline. Additionally, writing and presentation skills should be emphasized and participation in professional societies encouraged. Armed with such abilities, future architectural acousticians will advance the field significantly.

  10. The computational structural mechanics testbed architecture. Volume 4: The global-database manager GAL-DBM

    NASA Technical Reports Server (NTRS)

    Wright, Mary A.; Regelbrugge, Marc E.; Felippa, Carlos A.

    1989-01-01

    This is the fourth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 4 describes the nominal-record data management component of the NICE software. It is intended for all users.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    End-to-end interoperability and workflows from building architecture design to one or more simulations may, in one aspect, comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
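
    The pattern in this record (a data definition step that creates the table schema, plus data-management services over the resulting database) can be sketched briefly. The table names, columns and service methods below are illustrative assumptions, not the actual BIM data model described in the record.

```python
# Illustrative sketch: a data-definition step that creates a table schema for
# building entities, plus a small data-management service over it.
# Table and column names are hypothetical.
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS building (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS zone (
    id INTEGER PRIMARY KEY,
    building_id INTEGER REFERENCES building(id),
    name TEXT NOT NULL,
    floor_area_m2 REAL
);
"""

class BuildingDataService:
    """Minimal data-management service over the schema above."""
    def __init__(self, path: str = ":memory:") -> None:
        self.conn = sqlite3.connect(path)
        self.conn.executescript(DDL)

    def add_building(self, name: str) -> int:
        cur = self.conn.execute("INSERT INTO building (name) VALUES (?)", (name,))
        self.conn.commit()
        return cur.lastrowid

    def add_zone(self, building_id: int, name: str, area: float) -> int:
        cur = self.conn.execute(
            "INSERT INTO zone (building_id, name, floor_area_m2) VALUES (?, ?, ?)",
            (building_id, name, area))
        self.conn.commit()
        return cur.lastrowid

    def zones_for_simulation(self, building_id: int):
        """Hand zone data to a downstream simulation step."""
        return self.conn.execute(
            "SELECT name, floor_area_m2 FROM zone WHERE building_id = ?",
            (building_id,)).fetchall()

if __name__ == "__main__":
    svc = BuildingDataService()
    b = svc.add_building("Office Block A")
    svc.add_zone(b, "Lobby", 120.0)
    print(svc.zones_for_simulation(b))
```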

  12. SKA Telescope Manager (TM): status and architecture overview

    NASA Astrophysics Data System (ADS)

    Natarajan, Swaminathan; Barbosa, Domingos; Barraca, Joao P.; Bridger, Alan; Choudhury, Subhrojyoti R.; Di Carlo, Matteo; Dolci, Mauro; Gupta, Yashwant; Guzman, Juan; Van den Heever, Lize; Le Roux, Gerhard; Nicol, Mark; Patil, Mangesh; Smareglia, Riccardo; Swart, Paul; Thompson, Roger; Vrcic, Sonja; Williams, Stewart

    2016-07-01

    The SKA radio telescope project is building two telescopes, SKA-Low in Australia and SKA-Mid in South Africa respectively. The Telescope Manager is responsible for the observations lifecycle and for monitoring and control of each instrument, and is being developed by an international consortium. The project is currently in the design phase, with the Preliminary Design Review having been successfully completed, along with re-baselining to match project scope to available budget. This report presents the status of the Telescope Manager work, key architectural challenges and our approach to addressing them.

  13. Research and application of knowledge resources network for product innovation.

    PubMed

    Li, Chuan; Li, Wen-qiang; Li, Yan; Na, Hui-zhen; Shi, Qian

    2015-01-01

    In order to enhance the knowledge service capabilities of a product innovation design service platform, a method is proposed for acquiring knowledge resources that support product innovation from the Internet and for actively pushing knowledge to users. Through ontology-based knowledge modeling for product innovation, an integrated architecture for the knowledge resources network is put forward. The technology for acquiring network knowledge resources based on a focused crawler and web services is studied. Active knowledge push is provided to users through user behavior analysis and knowledge evaluation in order to improve users' enthusiasm for participating in the platform. Finally, an application example illustrates the effectiveness of the method.
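
    A minimal sketch of the two mechanisms named above, a focused crawler that keeps only topic-relevant pages and an active-push step that matches harvested resources to user interest profiles, is given below. It uses only the Python standard library; the in-memory page store, keyword lists and relevance threshold are invented for illustration.

```python
# Sketch of a focused crawl plus knowledge push. The crawler follows links
# only from pages whose relevance score passes a threshold; the push step
# recommends kept pages to users whose interest terms they mention.
import re
from html.parser import HTMLParser
from typing import Dict, List, Set

class LinkExtractor(HTMLParser):
    def __init__(self) -> None:
        super().__init__()
        self.links: List[str] = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

def relevance(html: str, terms: Set[str]) -> float:
    """Crude topic score: term frequency over page length, tags stripped."""
    words = re.sub(r"<[^>]+>", " ", html).lower().split()
    return sum(words.count(t) for t in terms) / max(len(words), 1)

def focused_crawl(seed: str, pages: Dict[str, str],
                  terms: Set[str], threshold: float = 0.01) -> List[str]:
    frontier, seen, kept = [seed], set(), []
    while frontier:
        url = frontier.pop(0)
        if url in seen or url not in pages:
            continue
        seen.add(url)
        html = pages[url]
        if relevance(html, terms) >= threshold:
            kept.append(url)
            parser = LinkExtractor()
            parser.feed(html)
            frontier.extend(parser.links)   # follow links of relevant pages only
    return kept

def push(kept: List[str], interests: Dict[str, Set[str]],
         pages: Dict[str, str]) -> Dict[str, List[str]]:
    """Recommend each kept page to users whose interest terms it mentions."""
    return {user: [u for u in kept if relevance(pages[u], terms) > 0]
            for user, terms in interests.items()}

if __name__ == "__main__":
    pages = {
        "a": "<p>gear design innovation</p><a href='b'></a>",
        "b": "<p>bearing selection for gear boxes</p>",
    }
    kept = focused_crawl("a", pages, {"gear", "design"})
    print(push(kept, {"designer1": {"bearing"}}, pages))
```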

  14. Knowledge Management: A Skeptic's Guide

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2006-01-01

    A viewgraph presentation discussing knowledge management is shown. The topics include: 1) What is Knowledge Management? 2) Why Manage Knowledge? The Presenting Problems; 3) What Gets Called Knowledge Management? 4) Attempts to Rethink Assumptions about Knowledge; 5) What is Knowledge? 6) Knowledge Management and Institutional Memory; 7) Knowledge Management and Culture; 8) To solve a social problem, it's easier to call for cultural rather than organizational change; 9) Will the Knowledge Management Effort Succeed? and 10) Backup: Metrics for Valuing Intellectual Capital, i.e., Knowledge.

  15. Multiscale Interactive Communication: Inside and Outside Thun Castle

    NASA Astrophysics Data System (ADS)

    Massari, G. A.; Luce, F.; Pellegatta, C.

    2011-09-01

    The applications of informatics to architecture have become, for professionals, a powerful tool for managing analytical phases and project activities and, for the general public, new ways of communication that can directly relate present, past and future facts. Museums in historic buildings, their installations and the recent experiences of eco-museums located throughout the territory provide a privileged field of experimentation for technical and digital representation. On the one hand, the safeguarding and functional adaptation of buildings use 3D computer graphics models that are, in effect, spatially related databases: in them the results of archival, artistic-historical, diagnostic and technological-structural studies, as well as the assumptions and feasibility of interventions, are ordered, viewed and interpreted. On the other hand, the communication of objects and knowledge linked to collective memory relies on interactive maps and hypertext systems that provide access to true virtual museums; at the architectural scale this produces a kind of multimedia extension of the exhibition hall, while at the landscape scale the result is a previously unavailable instrument of cultural development: works that are separated in direct perception find, in a zenith view of the map, a synthetic relation tied to both spatial parameters and temporal interpretations.

  16. TACIT: An open-source text analysis, crawling, and interpretation tool.

    PubMed

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.
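
    The plugin-driven idea can be sketched with a small registry pattern. TACIT itself is built on an Eclipse/Java plugin architecture; the sketch below only illustrates the general design in Python, and the plugin names are made up.

```python
# Sketch of a plugin-driven analysis pipeline: analyses register themselves
# against a common interface and are invoked by name, so new methods can be
# added without touching the pipeline itself. Plugin names are illustrative.
from typing import Callable, Dict, List

PLUGINS: Dict[str, Callable[[List[str]], dict]] = {}

def register(name: str):
    """Decorator that adds an analysis function to the plugin registry."""
    def wrap(fn: Callable[[List[str]], dict]):
        PLUGINS[name] = fn
        return fn
    return wrap

@register("word_count")
def word_count(docs: List[str]) -> dict:
    return {"total_words": sum(len(d.split()) for d in docs)}

@register("vocabulary")
def vocabulary(docs: List[str]) -> dict:
    return {"unique_words": len({w.lower() for d in docs for w in d.split()})}

def run(names: List[str], docs: List[str]) -> dict:
    """Run the requested plugins over a corpus and merge their outputs."""
    return {name: PLUGINS[name](docs) for name in names}

if __name__ == "__main__":
    corpus = ["online interaction leaves digital residue",
              "text analysis at scale needs good tooling"]
    print(run(["word_count", "vocabulary"], corpus))
```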

  17. Managing Complex IT Security Processes with Value Based Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2009-01-01

    Current trends indicate that IT security measures will need to expand greatly to counter an increasingly sophisticated, well-funded and/or economically motivated threat space. Traditional risk management approaches provide an effective method for guiding courses of action for assessment and mitigation investments. However, such approaches, no matter how popular, demand very detailed knowledge about the IT security domain and the enterprise/cyber architectural context. Typically, the critical nature and/or high stakes require careful consideration and adaptation of a balanced approach that provides reliable and consistent methods for rating vulnerabilities. As reported in earlier works, the Cyberspace Security Econometrics System provides a comprehensive measure of the reliability, security and safety of a system that accounts for the criticality of each requirement as a function of one or more stakeholders' interests in that requirement. This paper advocates a dependability measure that acknowledges the aggregate structure of complex system specifications and accounts for variations by stakeholder, by specification component, and by verification and validation impact.
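
    As a rough numeric illustration of a stakeholder-weighted dependability measure, the sketch below computes an expected loss per stakeholder from a stakes table and per-requirement violation probabilities. This is a deliberate simplification with invented figures, not the published econometrics model.

```python
# Toy sketch: each stakeholder assigns a stake (cost of failure) to each
# requirement, each requirement has an estimated probability of being
# violated, and the expected loss per stakeholder is the stake-weighted sum.
# All matrices and probabilities are purely illustrative.
from typing import Dict

stakes: Dict[str, Dict[str, float]] = {   # $ lost if the requirement fails
    "operator": {"confidentiality": 50_000, "availability": 200_000},
    "customer": {"confidentiality": 150_000, "availability": 20_000},
}
violation_prob: Dict[str, float] = {      # assumed per-year probability
    "confidentiality": 0.02,
    "availability": 0.10,
}

def expected_loss(stakes_row: Dict[str, float]) -> float:
    return sum(stakes_row[req] * violation_prob[req] for req in stakes_row)

if __name__ == "__main__":
    for stakeholder, row in stakes.items():
        print(f"{stakeholder}: expected annual loss ${expected_loss(row):,.0f}")
```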

  18. Review of integrated digital systems: evolution and adoption

    NASA Astrophysics Data System (ADS)

    Fritz, Lawrence W.

    The factors that are influencing the evolution of photogrammetric and remote sensing technology to transition into fully integrated digital systems are reviewed. These factors include societal pressures for new, more timely digital products from the Spatial Information Sciences and the adoption of rapid technological advancements in digital processing hardware and software. Current major developments in leading government mapping agencies of the USA, such as the Digital Production System (DPS) modernization programme at the Defense Mapping Agency, and the Automated Nautical Charting System II (ANCS-II) programme and Integrated Digital Photogrammetric Facility (IDPF) at NOAA/National Ocean Service, illustrate the significant benefits to be realized. These programmes are examples of different levels of integrated systems that have been designed to produce digital products. They provide insights into the management complexities to be considered for very large integrated digital systems. In recognition of computer industry trends, a knowledge-based architecture for managing the complexity of the very large spatial information systems of the future is proposed.

  19. Space Station needs, attributes and architectural options. Volume 2, book 2, part 2, Task 2: Information management system

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Missions to be performed, station operations and functions to be carried out, and technologies anticipated during the time frame of the space station were examined in order to determine the scope of the overall information management system for the space station. This system comprises: (1) the data management system which includes onboard computer related hardware and software required to assume and exercise control of all activities performed on the station; (2) the communication system for both internal and external communications; and (3) the ground segment. Techniques used to examine the information system from a functional and performance point of view are described as well as the analyses performed to derive the architecture of both the onboard data management system and the system for internal and external communications. These architectures are then used to generate a conceptual design of the onboard elements in order to determine the physical parameters (size/weight/power) of the hardware and software. The ground segment elements are summarized.

  20. Space Station needs, attributes and architectural options. Volume 2, book 2, part 2, Task 2: Information management system

    NASA Astrophysics Data System (ADS)

    1983-04-01

    Missions to be performed, station operations and functions to be carried out, and technologies anticipated during the time frame of the space station were examined in order to determine the scope of the overall information management system for the space station. This system comprises: (1) the data management system which includes onboard computer related hardware and software required to assume and exercise control of all activities performed on the station; (2) the communication system for both internal and external communications; and (3) the ground segment. Techniques used to examine the information system from a functional and performance point of view are described as well as the analyses performed to derive the architecture of both the onboard data management system and the system for internal and external communications. These architectures are then used to generate a conceptual design of the onboard elements in order to determine the physical parameters (size/weight/power) of the hardware and software. The ground segment elements are summarized.

  1. Hybrid knowledge systems

    NASA Technical Reports Server (NTRS)

    Subrahmanian, V. S.

    1994-01-01

    An architecture called the hybrid knowledge system (HKS) is described that can interoperate between a specification of the control laws describing a physical system; a collection of databases, knowledge bases and/or other data structures reflecting information about the world in which the controlled physical system resides; observations (e.g. sensor information) from the external world; and actions that must be taken in response to external observations.

  2. Computers for symbolic processing

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Lowrie, Matthew B.; Li, Guo-Jie

    1989-01-01

    A detailed survey on the motivations, design, applications, current status, and limitations of computers designed for symbolic processing is provided. Symbolic processing computations are performed at the word, relation, or meaning levels, and the knowledge used in symbolic applications may be fuzzy, uncertain, indeterminate, and ill represented. Various techniques for knowledge representation and processing are discussed from both the designers' and users' points of view. The design and choice of a suitable language for symbolic processing and the mapping of applications into a software architecture are then considered. The process of refining the application requirements into hardware and software architectures is treated, and state-of-the-art sequential and parallel computers designed for symbolic processing are discussed.

  3. Text-based discovery in biomedicine: the architecture of the DAD-system.

    PubMed

    Weeber, M; Klein, H; Aronson, A R; Mork, J G; de Jong-van den Berg, L T; Vos, R

    2000-01-01

    Current scientific research takes place in highly specialized contexts, with poor communication between disciplines as a likely consequence. Knowledge from one discipline may be useful for another without researchers knowing it. As scientific publications are a condensation of this knowledge, literature-based discovery tools may help the individual scientist explore new, useful domains. We report on the development of the DAD-system, a concept-based Natural Language Processing system for PubMed citations that provides the biomedical researcher with such a tool. We describe the general architecture and illustrate its operation by a simulation of a well-known text-based discovery: the favorable effects of fish oil on patients suffering from Raynaud's disease [1].
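
    The discovery simulated here follows the classic open-discovery chain: a start concept A co-occurs with intermediates B, which co-occur with candidates C that never appear with A directly. The sketch below implements that chain over a toy set of citation concept lists; the concept names are illustrative and are not output of the DAD-system.

```python
# Sketch of open literature-based discovery (the fish-oil / Raynaud pattern):
# collect concepts B co-occurring with A, then candidates C co-occurring with
# some B but never directly with A. The toy "citations" are made up.
from typing import Dict, List, Set

def cooccurring(concept: str, citations: List[Set[str]]) -> Set[str]:
    return {c for rec in citations if concept in rec for c in rec} - {concept}

def discover(a: str, citations: List[Set[str]]) -> Dict[str, Set[str]]:
    """Map each candidate C (never seen with A) to the B concepts linking it to A."""
    b_set = cooccurring(a, citations)
    direct = b_set | {a}
    links: Dict[str, Set[str]] = {}
    for b in b_set:
        for c in cooccurring(b, citations) - direct:
            links.setdefault(c, set()).add(b)
    return links

if __name__ == "__main__":
    citations = [
        {"raynaud_disease", "blood_viscosity"},
        {"raynaud_disease", "platelet_aggregation"},
        {"fish_oil", "blood_viscosity"},
        {"fish_oil", "platelet_aggregation"},
    ]
    print(discover("raynaud_disease", citations))
```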

  4. A reinforcement learning-based architecture for fuzzy logic control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1992-01-01

    This paper introduces a new method for learning to refine a rule-based fuzzy logic controller. A reinforcement learning technique is used in conjunction with a multilayer neural network model of a fuzzy controller. The approximate reasoning based intelligent control (ARIC) architecture proposed here learns by updating its prediction of the physical system's behavior and fine tunes a control knowledge base. Its theory is related to Sutton's temporal difference (TD) method. Because ARIC has the advantage of using the control knowledge of an experienced operator and fine tuning it through the process of learning, it learns faster than systems that train networks from scratch. The approach is applied to a cart-pole balancing system.
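
    The temporal-difference signal mentioned above can be shown in isolation. The sketch below is a bare TD(0) value update on a toy cart-pole-style trajectory; it is not ARIC's neural/fuzzy network, and the states, rewards and learning constants are assumptions for illustration.

```python
# Bare TD(0) value update: the prediction for the current state is nudged
# toward the reward plus the discounted prediction for the next state.
from collections import defaultdict

def td0_update(values, state, reward, next_state, alpha=0.1, gamma=0.95):
    """V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s)); returns the TD error."""
    td_error = reward + gamma * values[next_state] - values[state]
    values[state] += alpha * td_error
    return td_error

if __name__ == "__main__":
    V = defaultdict(float)
    # Toy trajectory: the pole stays up (reward 0) until it falls (reward -1).
    trajectory = [("upright", 0.0, "tilting"), ("tilting", -1.0, "fallen")]
    for _ in range(50):
        for s, r, s_next in trajectory:
            td0_update(V, s, r, s_next)
    print(dict(V))
```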

  5. Portable inference engine: An extended CLIPS for real-time production systems

    NASA Technical Reports Server (NTRS)

    Le, Thach; Homeier, Peter

    1988-01-01

    The present C-Language Integrated Production System (CLIPS) architecture has not been optimized to deal with the constraints of real-time production systems. Matching in CLIPS is based on the Rete Net algorithm, whose assumption of working memory stability might fail to be satisfied in a system subject to real-time dataflow. Further, the CLIPS forward-chaining control mechanism with a predefined conflict resolution strategy may not effectively focus the system's attention on situation-dependent current priorities, or appropriately address different kinds of knowledge which might appear in a given application. Portable Inference Engine (PIE) is a production system architecture based on CLIPS which attempts to create a more general tool while addressing the problems of real-time expert systems. Features of the PIE design include a modular knowledge base, a modified Rete Net algorithm, a bi-directional control strategy, and multiple user-defined conflict resolution strategies. Problems associated with real-time applications are analyzed and an explanation is given for how the PIE architecture addresses these problems.
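
    The pluggable conflict-resolution idea can be sketched as a recognize-act loop that accepts the resolution strategy as a parameter. The rules, priorities and facts below are invented and do not use CLIPS or PIE syntax; they only show how swapping the strategy changes which matched rule fires first.

```python
# Recognize-act cycle with a pluggable conflict-resolution strategy.
from typing import Callable, List, NamedTuple, Set

class Rule(NamedTuple):
    name: str
    conditions: Set[str]
    action: str          # fact asserted when the rule fires
    priority: int = 0

def by_priority(matches: List[Rule]) -> Rule:
    return max(matches, key=lambda r: r.priority)

def by_specificity(matches: List[Rule]) -> Rule:
    return max(matches, key=lambda r: len(r.conditions))

def run(rules: List[Rule], facts: Set[str],
        resolve: Callable[[List[Rule]], Rule], max_cycles: int = 10) -> List[str]:
    """Return the names of fired rules, in firing order."""
    fired = []
    for _ in range(max_cycles):
        matches = [r for r in rules
                   if r.conditions <= facts and r.action not in facts]
        if not matches:
            break
        chosen = resolve(matches)          # conflict resolution picks one rule
        fired.append(chosen.name)
        facts = facts | {chosen.action}
    return fired

if __name__ == "__main__":
    rules = [
        Rule("alarm", {"temp_high"}, "raise_alarm", priority=5),
        Rule("log", {"temp_high", "sensor_ok"}, "log_reading", priority=1),
    ]
    facts = {"temp_high", "sensor_ok"}
    print(run(rules, facts, by_priority))     # alarm fires first
    print(run(rules, facts, by_specificity))  # log fires first
```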

  6. RT-Syn: A real-time software system generator

    NASA Technical Reports Server (NTRS)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs not only by reducing development time, but more importantly by facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which currently dominate life-cycle costs.

  7. A multimedia Anatomy Browser incorporating a knowledge base and 3D images.

    PubMed Central

    Eno, K.; Sundsten, J. W.; Brinkley, J. F.

    1991-01-01

    We describe a multimedia program for teaching anatomy. The program, called the Anatomy Browser, displays cross-sectional and topographical images, with outlines around structures and regions of interest. The user may point to these structures and retrieve text descriptions, view symbolic relationships between structures, or view spatial relationships by accessing 3-D graphics animations from videodiscs produced specifically for this program. The software also helps students exercise what they have learned by asking them to identify structures by name and location. The program is implemented in a client-server architecture, with the user interface residing on a Macintosh, while images, data, and a growing symbolic knowledge base of anatomy are stored on a fileserver. This architecture allows us to develop practical tutorial modules that are in current use, while at the same time developing the knowledge base that will lead to more intelligent tutorial systems. PMID:1807699

  8. Creating an Organic Knowledge-Building Environment within an Asynchronous Distributed Learning Context.

    ERIC Educational Resources Information Center

    Moller, Leslie; Prestera, Gustavo E.; Harvey, Douglas; Downs-Keller, Margaret; McCausland, Jo-Ann

    2002-01-01

    Discusses organic architecture and suggests that learning environments should be designed and constructed using an organic approach, so that learning is not viewed as a distinct human activity but incorporated into everyday performance. Highlights include an organic knowledge-building model; information objects; scaffolding; discourse action…

  9. Technology advances and market forces: Their impact on high performance architectures

    NASA Technical Reports Server (NTRS)

    Best, D. R.

    1978-01-01

    Reasonable projections into future supercomputer architectures and technology require an analysis of the computer industry market environment, the current capabilities and trends within the component industry, and the research activities on computer architecture in the industrial and academic communities. Management, programmer, architect, and user must cooperate to increase the efficiency of supercomputer development efforts. Care must be taken to match the funding, compiler, architecture and application with greater attention to testability, maintainability, reliability, and usability than supercomputer development programs of the past.

  10. IRAF and STSDAS under the new ALPHA architecture

    NASA Technical Reports Server (NTRS)

    Zarate, N. R.

    1992-01-01

    Digital's next generation RISC architecture, known as ALPHA, presents many IRAF system portability questions and challenges to both site managers and end users. DEC promises to support the ULTRIX, VMS, and OSF/1 operating systems, which should allow IRAF to be ported to the new architecture at either the program executable level (using VEST), or at the source level, where IRAF can be tuned for greater performance. These notes highlight some of the details of porting IRAF to OpenVMS on the ALPHA architecture.

  11. Invocation oriented architecture for agile code and agile data

    NASA Astrophysics Data System (ADS)

    Verma, Dinesh; Chan, Kevin; Leung, Kin; Gkelias, Athanasios

    2017-05-01

    In order to address the unique requirements of sensor information fusion in a tactical coalition environment, we are proposing a new architecture - one based on the concept of invocations. An invocation is a combination of a software code and a piece of data, both managed using techniques from Information Centric networking. This paper will discuss limitations of current approaches, present the architecture for an invocation oriented architecture, illustrate how it works with an example scenario, and provide reasons for its suitability in a coalition environment.

  12. A Review of Enterprise Architecture Use in Defence

    DTIC Science & Technology

    2014-09-01

    • dictionary of terms; • architecture description language; • architectural information (pertaining both to specific projects and higher level...)

  13. Emulation of Industrial Control Field Device Protocols

    DTIC Science & Technology

    2013-03-01

    platforms such as the Arduino (based on the Atmel AVR architecture) or popular PIC architecture-based devices, which are programmed for specific functions... confidence intervals for the mean. Based on these results, extensive knowledge of the specific implementations of the protocols or timing profiles of the...

  14. Long-term knowledge acquisition using contextual information in a memory-inspired robot architecture

    NASA Astrophysics Data System (ADS)

    Pratama, Ferdian; Mastrogiovanni, Fulvio; Lee, Soon Geul; Chong, Nak Young

    2017-03-01

    In this paper, we present a novel cognitive framework allowing a robot to form memories of relevant traits of its perceptions and to recall them when necessary. The framework is based on two main principles: on the one hand, we propose an architecture inspired by current knowledge in human memory organisation; on the other hand, we integrate such an architecture with the notion of context, which is used to modulate the knowledge acquisition process when consolidating memories and forming new ones, as well as with the notion of familiarity, which is employed to retrieve proper memories given relevant cues. Although much research has been carried out, which exploits Machine Learning approaches to provide robots with internal models of their environment (including objects and occurring events therein), we argue that such approaches may not be the right direction to follow if a long-term, continuous knowledge acquisition is to be achieved. As a case study scenario, we focus on both robot-environment and human-robot interaction processes. In case of robot-environment interaction, a robot performs pick and place movements using the objects in the workspace, at the same time observing their displacement on a table in front of it, and progressively forms memories defined as relevant cues (e.g. colour, shape or relative position) in a context-aware fashion. As far as human-robot interaction is concerned, the robot can recall specific snapshots representing past events using both sensory information and contextual cues upon request by humans.
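
    The notions of context-modulated consolidation and familiarity-based recall can be sketched as a small episodic memory store. The trait names, contexts and the familiarity weighting below are assumptions for illustration, not the paper's actual memory model.

```python
# Sketch: memories are consolidated together with the context in which they
# were formed, and retrieval ranks stored episodes by a simple familiarity
# score (cue overlap plus a bonus for a matching context).
from typing import Dict, List, Tuple

Episode = Dict[str, str]   # e.g. {"object": "cup", "colour": "red", ...}

class EpisodicMemory:
    def __init__(self) -> None:
        self.episodes: List[Tuple[Episode, str]] = []   # (traits, context)

    def consolidate(self, traits: Episode, context: str) -> None:
        self.episodes.append((traits, context))

    def recall(self, cues: Episode, context: str, top_k: int = 1):
        def familiarity(item: Tuple[Episode, str]) -> float:
            traits, ctx = item
            overlap = sum(1 for k, v in cues.items() if traits.get(k) == v)
            return overlap + (0.5 if ctx == context else 0.0)
        return sorted(self.episodes, key=familiarity, reverse=True)[:top_k]

if __name__ == "__main__":
    mem = EpisodicMemory()
    mem.consolidate({"object": "cup", "colour": "red", "position": "left"},
                    context="table_setting")
    mem.consolidate({"object": "cup", "colour": "blue", "position": "right"},
                    context="cleanup")
    print(mem.recall({"colour": "red"}, context="table_setting"))
```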

  15. Neural simulations on multi-core architectures.

    PubMed

    Eichner, Hubert; Klug, Tobias; Borst, Alexander

    2009-01-01

    Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing.
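
    A coarse-grained version of the parallelization idea, distributing independent compartmental-model integrations across worker processes, is sketched below. The cited work goes further, splitting single cells across cores with user-transparent load balancing; the passive-membrane parameters here are toy values.

```python
# Coarse-grained sketch of distributing independent compartmental-model
# integrations across cores with a process pool.
from multiprocessing import Pool

def simulate_neuron(args):
    """Forward-Euler integration of a passive membrane: C dV/dt = -(V - E)/R + I."""
    i_inj, steps = args
    v, e_rest, r_m, c_m, dt = -65.0, -65.0, 10.0, 1.0, 0.1   # toy units
    for _ in range(steps):
        dv = (-(v - e_rest) / r_m + i_inj) / c_m
        v += dt * dv
    return v

if __name__ == "__main__":
    jobs = [(i * 0.2, 10_000) for i in range(8)]   # 8 neurons, varying input
    with Pool() as pool:
        final_voltages = pool.map(simulate_neuron, jobs)
    print([round(v, 2) for v in final_voltages])
```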

  16. Neural Simulations on Multi-Core Architectures

    PubMed Central

    Eichner, Hubert; Klug, Tobias; Borst, Alexander

    2009-01-01

    Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing. PMID:19636393

  17. Architectural switches in plant thylakoid membranes.

    PubMed

    Kirchhoff, Helmut

    2013-10-01

    Recent progress in elucidating the structure of higher plants photosynthetic membranes provides a wealth of information. It allows generation of architectural models that reveal well-organized and complex arrangements not only on whole membrane level, but also on the supramolecular level. These arrangements are not static but highly responsive to the environment. Knowledge about the interdependency between dynamic structural features of the photosynthetic machinery and the functionality of energy conversion is central to understanding the plasticity of photosynthesis in an ever-changing environment. This review summarizes the architectural switches that are realized in thylakoid membranes of green plants.

  18. Advanced control architecture for autonomous vehicles

    NASA Astrophysics Data System (ADS)

    Maurer, Markus; Dickmanns, Ernst D.

    1997-06-01

    An advanced control architecture for autonomous vehicles is presented. The hierarchical architecture consists of four levels: a vehicle level, a control level, a rule-based level and a knowledge-based level. A special focus is on forms of internal representation, which have to be chosen adequately for each level. The control scheme is applied to VaMP, a Mercedes passenger car which autonomously performs missions on German freeways. VaMP perceives the environment with its sense of vision and conventional sensors. It controls its actuators for locomotion and attention focusing. Modules for perception, cognition and action are discussed.
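
    The four-level decomposition can be sketched as a chain of functions in which each level refines the decision of the level above it, from mission-level knowledge down to an actuator command. The level names follow the abstract, but the behaviour inside each level is invented for illustration.

```python
# Toy sketch of a four-level hierarchy: knowledge -> rule -> control -> vehicle.
def knowledge_level(mission: str) -> str:
    """Mission-level reasoning produces an abstract driving goal."""
    return "drive_to_exit" if mission == "reach_destination" else "hold_lane"

def rule_level(goal: str, obstacle_ahead: bool) -> str:
    """Situation-dependent rules turn the goal into a manoeuvre."""
    if obstacle_ahead:
        return "change_lane"
    return "keep_lane" if goal in ("hold_lane", "drive_to_exit") else "stop"

def control_level(manoeuvre: str, lateral_offset_m: float) -> float:
    """Simple proportional lateral command toward the desired lane centre."""
    target = 3.5 if manoeuvre == "change_lane" else 0.0
    return 0.4 * (target - lateral_offset_m)

def vehicle_level(command: float) -> str:
    """Lowest level: hand the command to the steering actuator."""
    return f"actuator command: steer {command:+.2f} (toy units)"

if __name__ == "__main__":
    goal = knowledge_level("reach_destination")
    manoeuvre = rule_level(goal, obstacle_ahead=True)
    print(vehicle_level(control_level(manoeuvre, lateral_offset_m=0.2)))
```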

  19. Chromosome Territories

    PubMed Central

    Cremer, Thomas; Cremer, Marion

    2010-01-01

    Chromosome territories (CTs) constitute a major feature of nuclear architecture. In a brief statement, the possible contribution of nuclear architecture studies to the field of epigenomics is considered, followed by a historical account of the CT concept and the final compelling experimental evidence of a territorial organization of chromosomes in all eukaryotes studied to date. Present knowledge of nonrandom CT arrangements, of the internal CT architecture, and of structural interactions with other CTs is provided as well as the dynamics of CT arrangements during cell cycle and postmitotic terminal differentiation. The article concludes with a discussion of open questions and new experimental strategies to answer them. PMID:20300217

  20. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on this idea is described. The approach is illustrated through an implemented example, and its advantages and limitations are discussed.
