Sample records for interoperability solutions

  1. Clinical data interoperability based on archetype transformation.

    PubMed

    Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2011-10-01

    The semantic interoperability of health information systems is a major challenge for improving the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies designed to satisfy the needs of a particular scenario. Most such solutions cannot be easily adapted to new scenarios, so more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare record standards based on the dual model architecture, and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of model-driven engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. A framework for semantic interoperability in healthcare: a service oriented architecture based on health informatics standards.

    PubMed

    Ryan, Amanda; Eklund, Peter

    2008-01-01

    Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
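
    A hedged sketch of what this combination can look like in practice: the snippet below assembles an HL7 V3-style XML observation carrying a SNOMED CT code, using only Python's standard library. The element and attribute names are simplified illustrations, not the full RIM-derived V3 schema.

    ```python
    # Illustrative sketch: a clinical finding wrapped in an HL7 V3-style
    # XML observation carrying a SNOMED CT code. Element names are
    # simplified; a real V3 message follows the full RIM-derived schema.
    import xml.etree.ElementTree as ET

    def make_observation(snomed_code: str, display: str, value: str) -> str:
        obs = ET.Element("observation", classCode="OBS", moodCode="EVN")
        code = ET.SubElement(obs, "code")
        code.set("code", snomed_code)
        code.set("codeSystem", "2.16.840.1.113883.6.96")  # SNOMED CT OID
        code.set("displayName", display)
        ET.SubElement(obs, "value").text = value
        return ET.tostring(obs, encoding="unicode")

    # 271649006 is the SNOMED CT concept for "systolic blood pressure"
    print(make_observation("271649006", "Systolic blood pressure", "120"))
    ```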

  3. The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.

    PubMed

    Ryan, Amanda; Eklund, Peter

    2010-01-01

    Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus to facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability defined by the HL7 EHR Interoperability Work Group in their definitive white paper "Coming to Terms". A prototype HSB system implemented on the open-source Mule ESB is outlined and discussed, followed by a clinically based example.
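
    The ESB pattern the HSB builds on can be illustrated in a few lines. The sketch below is a toy publish/route bus in Python with invented message types and handlers; the actual HSB runs on the Mule ESB with far richer routing and transformation.

    ```python
    # Minimal sketch of the service-bus pattern behind the HSB: producers
    # drop messages on the bus, and the bus routes them to registered
    # endpoint services by message type. Names are illustrative, not the
    # Mule configuration used in the paper.
    class ServiceBus:
        def __init__(self):
            self.routes = {}                 # message type -> handlers

        def register(self, msg_type, handler):
            self.routes.setdefault(msg_type, []).append(handler)

        def publish(self, msg_type, payload):
            for handler in self.routes.get(msg_type, []):
                handler(payload)

    bus = ServiceBus()
    bus.register("lab_result", lambda m: print("EHR store:", m))
    bus.register("lab_result", lambda m: print("notify clinician:", m))
    bus.publish("lab_result", {"patient": "123", "test": "HbA1c", "value": 7.9})
    ```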

  4. Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Annoni, Alessandro

    2013-04-01

    Global sustainability research requires multi-disciplinary efforts to address the key research challenges and to increase our understanding of the complex relationships between environment and society. For this reason, dependence on the interoperability of ICT systems is growing rapidly; yet, although some relevant technological improvements can be observed, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects: the more popular definitions do not address all the challenges involved in realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because solutions for interoperability between different interoperable solutions must then be envisaged. In this lecture the following definition is used: "interoperability is the ability to exchange information and to use it". The main challenges in addressing multi-disciplinary interoperability are presented, and a set of proposed approaches and solutions briefly introduced.

  5. Interoperable and standard e-Health solution over Bluetooth.

    PubMed

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards appears to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution, comprising the ISO/IEEE 11073 standard for the interoperability of medical devices in the patient environment and the EN 13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.
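
    To make the end-to-end flow concrete, here is a minimal Python sketch of normalizing a device measurement into an EHR-style entry. The term codes, field names and archetype identifier are illustrative assumptions, not the actual ISO/IEEE 11073 nomenclature or EN 13606 reference model.

    ```python
    # Minimal sketch of the end-to-end flow the paper describes: a
    # personal health device reports a measurement under a device-level
    # term code, which is normalized into a simplified EN 13606-like EHR
    # entry. Codes and field names are illustrative, made-up stand-ins.
    from dataclasses import dataclass

    DEVICE_TERMS = {0x4BB8: ("oxygen_saturation", "%")}  # invented subset

    @dataclass
    class EhrEntry:            # simplified stand-in for an EN 13606 ENTRY
        archetype_id: str
        name: str
        value: float
        units: str

    def device_to_ehr(term_code: int, value: float) -> EhrEntry:
        name, units = DEVICE_TERMS[term_code]
        return EhrEntry(archetype_id=f"CEN-EN13606-ENTRY.{name}.v1",
                        name=name, value=value, units=units)

    print(device_to_ehr(0x4BB8, 97.0))
    ```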

  6. The role of architecture and ontology for interoperability.

    PubMed

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain hereby includes not only individually tailored technical systems, but also sensors and actuators. For enabling the corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach, mastering the challenges of ontology coordination.

  7. ECHO Services: Foundational Middleware for a Science Cyberinfrastructure

    NASA Technical Reports Server (NTRS)

    Burnett, Michael

    2005-01-01

    This viewgraph presentation describes ECHO, an interoperability middleware solution. It uses open, XML-based APIs and supports net-centric architectures and solutions. ECHO has a set of interoperable registries for both data (metadata) and services, and provides user accounts and a common infrastructure for the registries. It is built upon a layered architecture with an extensible infrastructure for supporting community-unique protocols. It has been operational since November 2002 and is available as open source.

  8. A cloud-based approach for interoperable electronic health records (EHRs).

    PubMed

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). The lack of data interoperability standards and solutions has been a major obstacle to the exchange of healthcare data between different stakeholders. We propose an EHR system, the cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through the use of a generic design methodology, which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
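
    The dual-model idea behind CHISTAR's generic design methodology can be sketched in a few lines: a generic reference model stores name/value nodes, while an archetype separately declares the clinical attributes an instance must carry. All class and attribute names below are illustrative, not CHISTAR's actual model.

    ```python
    # Sketch of the dual-model idea: a generic reference model holds
    # untyped name/value nodes, and an archetype validates that an
    # instance carries the required clinical attributes.
    from dataclasses import dataclass, field

    @dataclass
    class Node:                    # generic reference-model data structure
        name: str
        value: object = None
        children: dict = field(default_factory=dict)

        def add(self, child: "Node") -> None:
            self.children[child.name] = child

    @dataclass
    class Archetype:               # clinical constraints, kept out of code
        required: set

        def validate(self, node: Node) -> bool:
            return self.required <= set(node.children)

    bp = Node("blood_pressure")
    bp.add(Node("systolic", 120))
    bp.add(Node("diastolic", 80))
    print(Archetype({"systolic", "diastolic"}).validate(bp))  # True
    ```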

  9. Tool and data interoperability in the SSE system

    NASA Technical Reports Server (NTRS)

    Shotton, Chuck

    1988-01-01

    Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.

  10. Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands

    NASA Astrophysics Data System (ADS)

    Scheurleer, J.; Koken, Ph; Wessel, R.

    2014-03-01

    Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands and solutions for unambiguous multi-system workflows. Specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained by in-depth interviews at 3 Dutch RO institutes that already operate in more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were on a strategic and semantic level. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own, or with tailor-made vendor solutions. There was little knowledge about or utilization of solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.

  11. Data interoperability software solution for emergency reaction in the Europe Union

    NASA Astrophysics Data System (ADS)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. The spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite the improvements made in recent years, however, EMSs have still not solved the problems arising from cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account the cultural and linguistic issues of different countries. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction) that provides a flexible, extensible solution to the mediation issues. Web services have been adopted as the implementation technology because this paradigm has the most significant academic and industrial visibility and attraction. The contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and emergency first responders: the Netherlands-Germany border fire.
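
    A minimal sketch of the mediation idea, assuming invented vocabulary entries and concept URIs: each EMS maps its local terms to a shared EMERGEL-style concept, through which a message can be translated across the border.

    ```python
    # Sketch of ontology-based mediation in the spirit of EMERGEL/DISASTER:
    # each EMS maps its local vocabulary to a shared concept, so a term
    # can be translated across systems. Concept URIs are invented here.
    NL_TO_CONCEPT = {"brandweer": "emergel:FireBrigade"}
    CONCEPT_TO_DE = {"emergel:FireBrigade": "Feuerwehr"}

    def mediate(term_nl: str) -> str:
        """Translate a Dutch EMS term to its German counterpart
        via the shared ontology concept."""
        return CONCEPT_TO_DE[NL_TO_CONCEPT[term_nl]]

    print(mediate("brandweer"))  # Feuerwehr
    ```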

  12. A SOA-Based Platform to Support Clinical Data Sharing.

    PubMed

    Gazzarata, R; Giannini, B; Giacomini, M

    2017-01-01

    The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the "Interoperable" Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with the LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the "Interoperable" Tier, the current solution actually covers the "Connected" Tier, due to local hospital policy restrictions.

  13. Resource allocation in shared spectrum access communications for operators with diverse service requirements

    NASA Astrophysics Data System (ADS)

    Kibria, Mirza Golam; Villardi, Gabriel Porto; Ishizu, Kentaro; Kojima, Fumihide; Yano, Hiroyuki

    2016-12-01

    In this paper, we study inter-operator spectrum sharing and intra-operator resource allocation in shared spectrum access communication systems and propose efficient dynamic solutions to address both the inter-operator and intra-operator resource allocation optimization problems. For inter-operator spectrum sharing, we present two competent approaches, namely subcarrier gain-based sharing and fragmentation-based sharing, which carry out fair and flexible allocation of the available shareable spectrum among the operators subject to certain well-defined sharing rules, traffic demands, and channel propagation characteristics. The subcarrier gain-based spectrum sharing scheme is more efficient in terms of achieved throughput, whereas fragmentation-based sharing is more attractive in terms of computational complexity. For intra-operator resource allocation, we consider a resource allocation problem with users' dissimilar service requirements, where the operator simultaneously supports users with delay-constrained and non-delay-constrained service requirements. This optimization problem is a non-convex mixed-integer non-linear program, which is computationally very expensive, and its complexity grows exponentially with the number of integer variables. We propose a less complex and efficient suboptimal solution based on exact linearization, linear approximation, and convexification techniques for the non-linear and/or non-convex objective functions and constraints. Extensive simulation-based performance analysis validates the efficiency of the proposed solution.
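
    The subcarrier gain-based sharing idea can be illustrated with a toy greedy allocator: each shareable subcarrier goes to the operator with the highest channel gain, subject to a per-operator quota standing in for the agreed sharing rules. The gains and quotas below are made-up numbers, and the paper's actual schemes are more elaborate.

    ```python
    # Toy sketch of subcarrier gain-based inter-operator sharing.
    def share_subcarriers(gains, quota):
        """gains[k][op] = channel gain of operator `op` on subcarrier k."""
        alloc, used = {}, {op: 0 for op in quota}
        for k, g in enumerate(gains):
            # operators still below quota, best gain wins the subcarrier
            feasible = [op for op in g if used[op] < quota[op]]
            best = max(feasible, key=lambda op: g[op])
            alloc[k] = best
            used[best] += 1
        return alloc

    gains = [{"A": 0.9, "B": 0.4}, {"A": 0.2, "B": 0.7}, {"A": 0.6, "B": 0.5}]
    print(share_subcarriers(gains, quota={"A": 2, "B": 1}))
    # {0: 'A', 1: 'B', 2: 'A'}
    ```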

  14. ECITE: A Testbed for Assessment of Technology Interoperability and Integration with Architecture Components

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Keiser, K.; Law, E.; Yang, C. P.; Djorgovski, S. G.

    2016-12-01

    ECITE (EarthCube Integration and Testing Environment) provides both cloud-based computational testing resources and an Assessment Framework for Technology Interoperability and Integration. NSF's EarthCube program is funding the development of cyberinfrastructure building-block components as technologies to address Earth science research problems. These EarthCube building blocks need to support integration and interoperability objectives to work towards a coherent cyberinfrastructure architecture for the program. ECITE is being developed to provide capabilities to test and assess interoperability and integration across funded EarthCube technology projects. EarthCube-defined criteria for interoperability and integration are applied to use cases coordinating science problems with technology solutions. The Assessment Framework facilitates planning, execution and documentation of the technology assessments for review by the EarthCube community. This presentation describes the components of ECITE and examines the methodology of crosswalking between science and technology use cases.

  15. Ocean Data Interoperability Platform (ODIP): developing a common framework for global marine data management

    NASA Astrophysics Data System (ADS)

    Glaves, H. M.

    2015-12-01

    In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high-quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS, etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures, an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.

  16. DIMP: an interoperable solution for software integration and product data exchange

    NASA Astrophysics Data System (ADS)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends in manufacturing business, leading to a world-wide decentralisation of resources not only amongst individual departments within one company but also amongst business partners. However, despite the development and improvement of the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integration solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform (DIMP), which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

  17. An Ontological Solution to Support Interoperability in the Textile Industry

    NASA Astrophysics Data System (ADS)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.

  18. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    NASA Astrophysics Data System (ADS)

    Glaves, Helen; Schaap, Dick

    2016-04-01

    The increasingly ocean basin level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN) with each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions often using different standards, data formats, technologies etc. that make integration of marine data from these regional systems for the purposes of basin level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. The potential impact for the individual regional data infrastructures of implementing these prototype interoperability solutions is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project which is focussed on regional marine data systems in Europe, Australia, the USA and, more recently, Canada.

  19. A Shovel-Ready Solution to Fill the Nursing Data Gap in the Interdisciplinary Clinical Picture.

    PubMed

    Keenan, Gail M; Lopez, Karen Dunn; Sousa, Vanessa E C; Stifter, Janet; Macieira, Tamara G R; Boyd, Andrew D; Yao, Yingwei; Herdman, T Heather; Moorhead, Sue; McDaniel, Anna; Wilkie, Diana J

    2018-01-01

    To critically evaluate the 2014 American Academy of Nursing (AAN) call-to-action plan for generating interoperable nursing data. Healthcare literature. The AAN's plan will not generate the nursing data needed to participate in big-data science initiatives in the short term, because Logical Observation Identifiers Names and Codes and Systematized Nomenclature of Medicine - Clinical Terms are not yet ripe for generating interoperable data, while well-tested, viable alternatives exist. The authors present recommendations for revisions to the AAN's plan and an evidence-based alternative for generating interoperable nursing data in the near term. These revisions can ultimately lead to the terminology goals proposed in the AAN's plan in the long term. © 2017 NANDA International, Inc.

  20. Improved inter-layer prediction for light field content coding with display scalability

    NASA Astrophysics Data System (ADS)

    Conti, Caroline; Ducla Soares, Luís; Nunes, Paulo

    2016-09-01

    Light field imaging based on microlens arrays - also known as plenoptic, holoscopic and integral imaging - has recently risen as a feasible and promising technology due to its ability to support functionalities not straightforwardly available in conventional imaging systems, such as post-production refocusing and depth-of-field changing. However, to gradually reach the consumer market and to provide interoperability with current 2D and 3D representations, a display-scalable coding solution is essential. In this context, this paper proposes an improved display-scalable light field codec comprising a three-layer hierarchical coding architecture (previously proposed by the authors) that provides interoperability with 2D (Base Layer) and 3D stereo and multiview (First Layer) representations, while the Second Layer supports the complete light field content. To further improve the compression performance, novel exemplar-based inter-layer coding tools are proposed here for the Second Layer, namely: (i) an inter-layer reference picture construction relying on an exemplar-based optimization algorithm for texture synthesis, and (ii) a direct prediction mode based on exemplar texture samples from lower layers. Experimental results show that the proposed solution performs better than the tested benchmark solutions, including the authors' previous scalable codec.

  21. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO/TC 211 have done yeoman service promoting a standards-centric approach to manage the interoperability challenges that organizations face today. Organizations face many specific challenges when adopting interoperability patterns. One approach, mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used, and to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition, and the results therefore sometimes fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms, are built on fundamental "open" principles, and align with popular mainstream patterns. We seek to explore data-, metadata- and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new; the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that we take for granted in our everyday use of technology.

  22. Flexible solution for interoperable cloud healthcare systems.

    PubMed

    Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena

    2012-01-01

    It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to medical data, a solution based on cloud computing (CC) is investigated; CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of medical staff and hospital administrators, because they can easily configure the local system and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.
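
    A hedged sketch of the configuration idea behind such a control: local database fields are mapped onto CDA/CCD-style document paths, and the mapping drives message generation. The table, field and path names below are illustrative assumptions, not the paper's actual configuration.

    ```python
    # Sketch: hospital staff map local database fields onto HL7 CDA/CCD
    # sections, and the mapping drives document generation. All names
    # are invented for illustration.
    FIELD_MAP = {
        ("patients", "family_name"): "recordTarget/patient/name/family",
        ("labs", "glucose"):         "component/observation[glucose]/value",
    }

    def to_cda_paths(row: dict, table: str) -> dict:
        """Project one database row onto the configured CDA paths."""
        return {path: row[field]
                for (tbl, field), path in FIELD_MAP.items()
                if tbl == table and field in row}

    print(to_cda_paths({"glucose": 5.4}, table="labs"))
    ```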

  23. Ontology Mappings to Improve Learning Resource Search

    ERIC Educational Resources Information Center

    Gasevic, Dragan; Hatala, Marek

    2006-01-01

    This paper proposes an ontology mapping-based framework that allows searching for learning resources using multiple ontologies. Present applications of ontologies in e-learning use various ontologies (e.g., domain, curriculum, context), but they do not provide a solution for making e-learning systems based on different ontologies interoperate. The…

  24. ACR Imaging IT Reference Guide: Image Sharing: Evolving Solutions in the Age of Interoperability

    PubMed Central

    Erickson, Bradley J.; Choy, Garry

    2014-01-01

    Interoperability is a major focus of the quickly evolving world of Health Information Technology. Easy, yet secure and confidential, exchange of imaging exams and the associated reports must be part of the solutions that are implemented. The availability of historical exams is essential to providing a quality interpretation and reducing inappropriate utilization of imaging services. Today, exchange of imaging exams is most often achieved via a CD. We describe the virtues of this solution as well as the challenges that have surfaced. Internet and cloud-based technologies employed for many consumer services can provide a better solution, vendors are making these solutions available, and standards for internet-based exchange are emerging. Just as radiology converged on DICOM as a standard to store and view images, we need a common exchange standard. We review the existing standards and how they are organized into useful workflows through Integrating the Healthcare Enterprise (IHE) profiles; IHE and standards development processes are discussed. Healthcare and the domain of radiology must stay current with quickly evolving internet standards. The successful use of the “cloud” will depend upon both the technologies we discuss and the policies put into place around these technologies; we discuss both aspects. The radiology community must lead the way and provide a solution that works for radiologists and clinicians in the Electronic Medical Record (EMR). Lastly, we describe the features we believe radiologists should consider when adding internet-based exchange solutions to their practice. PMID:25467903

  25. Archetype-based electronic health records: a literature review and evaluation of their applicability to health data interoperability and access.

    PubMed

    Wollersheim, Dennis; Sari, Anny; Rahayu, Wenny

    Health Information Managers (HIMs) are responsible for overseeing health information. The change management necessary during the transition to electronic health records (EHRs) is substantial and ongoing. Archetype-based EHRs are a core health information system component which solves many of the problems that arise during this period of change. Archetypes are models of clinical content, and they have many beneficial properties: they are interoperable, both between settings and through time; they are more amenable to change than conventional paradigms; and their design is congruent with clinical practice. This paper is an overview of the current archetype literature relevant to Health Information Managers. The literature was sourced from the English-language sections of ScienceDirect, IEEE Xplore, PubMed, Google Scholar, the ACM Digital Library and other databases on the usage of archetypes for electronic health record storage, looking at the current areas of archetype research, appropriate usage, and future research. We also used reference lists from the cited papers, papers referenced by the openEHR website, and the recommendations of experts in the area. Criteria for inclusion were (a) that studies covered archetype research and (b) that they were studies of archetype use, archetype system design, or archetype effectiveness. The 47 papers included show wide and increasing worldwide archetype usage in a variety of medical domains. Most of the papers noted that archetypes are an appropriate solution for future-proof and interoperable medical data storage. We conclude that archetypes are a suitable solution for the complex problem of electronic health record storage and interoperability.

  26. A novel web-enabled healthcare solution on health vault system.

    PubMed

    Liao, Lingxia; Chen, Min; Rodrigues, Joel J P C; Lai, Xiaorong; Vuong, Son

    2012-06-01

    Complicated Electronic Medical Record (EMR) systems have created problems regarding easy implementation and interoperability for Web-enabled healthcare solutions, which are normally provided by independent healthcare givers with limited IT knowledge and interest. An EMR system with a well-designed and user-friendly interface, such as the Microsoft HealthVault system, used as the back-end platform of a Web-enabled healthcare application, is one approach to dealing with these problems. This paper analyzes the patient-oriented Web-enabled healthcare service application as the new trend in moving healthcare delivery from hospital/clinic-centric to patient-centric, the current e-healthcare applications, and the main back-end EMR systems. We then present a novel web-enabled healthcare solution based on the Microsoft HealthVault EMR system to meet customers' needs such as low total cost, easy development and maintenance, and good interoperability. A sample system is given to show how the solution can be fulfilled, evaluated, and validated. We expect that this paper will provide a deep understanding of the available EMR systems, leading to insights for new solutions and approaches driving the next generation of EMR systems.

  27. The RICORDO approach to semantic interoperability for biomedical data and models: strategy, standards and solutions

    PubMed Central

    2011-01-01

    Background: The practice and research of medicine generate considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education (i.e. the PPME communities), are facing considerable operational and technical obstacles in sharing data and models. Findings: We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions: RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109

  28. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems.

    PubMed

    Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel

    2013-03-01

    The success of a Clinical Decision Support System (CDSS) greatly depends on its capability of being integrated into Health Information Systems (HIS). Several proposals have been published to date to permit CDSSs to gather patient data from HISs. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS resultant knowledge. We propose a solution for facilitating semantic interoperability for rule-based CDSSs, focusing on standardized input and output documents conforming to an HL7-CDA wrapper. We define the HL7-CDA restrictions in an HL7-CDA implementation guide. Patient data and rule inference results are mapped respectively to and from the CDSS by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can have clinical and legal validity. The proposed solution is being applied in a CDSS that provides patient-specific recommendations for the care management of outpatients with diabetes mellitus. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
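
    The binding method can be sketched as follows, assuming an invented binding entry and a toy document: an XPath-like location pairs a CDA data element with a rule-engine variable name, so patient data can be pulled into the CDSS. LOINC 4548-4 (hemoglobin A1c) is used as the example code; the paper's actual binding file format is not reproduced here.

    ```python
    # Sketch of the binding idea: pair CDA locations with rule-engine
    # variable names, then extract inputs from a (toy) document.
    import xml.etree.ElementTree as ET

    BINDINGS = {"hba1c": ".//observation[@code='4548-4']/value"}

    def extract_inputs(cda_xml: str) -> dict:
        root = ET.fromstring(cda_xml)
        return {var: root.find(path).get("value")
                for var, path in BINDINGS.items()}

    doc = ("<doc><observation code='4548-4'>"
           "<value value='7.9'/></observation></doc>")
    print(extract_inputs(doc))  # {'hba1c': '7.9'}
    ```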

  29. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    PubMed

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats. We propose an adaptive mediation system called ARIEN (AdapteR Interoperability ENgine), which arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and a Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of the mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process between different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in conversions between the CDA and vMR standards using a pattern-oriented approach based on the MBO. The proposed mediation system improves the overall communication process between HISs, providing accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.

  30. The solution of the target assignment problem in command and control decision-making behaviour simulation

    NASA Astrophysics Data System (ADS)

    Li, Ni; Huai, Wenqing; Wang, Shaodan

    2017-08-01

    C2 (command and control) is understood to be a critical military component in meeting an increasing demand for rapid information gathering and real-time decision making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework is proposed that specifies a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (coalition battle management language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Unlike most WTA problem descriptions, sensors are here considered to be available detection resources, and the relationship constraints between weapons and sensors are also taken into account, which brings the problem much closer to practical application. A modified differential evolution (MDE) algorithm was developed to solve this high-dimensional optimisation problem, obtaining an optimal assignment plan with high efficiency. In the case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification, and the new optimisation solution was used to solve the WTA problem efficiently and successfully.
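
    As a rough illustration of the optimisation side, the sketch below solves a tiny WTA instance with a plain differential evolution loop over random-key encodings; the paper's MDE adds modifications and weapon-sensor constraints that are not reproduced here, and the kill probabilities are made up.

    ```python
    # Toy WTA via basic differential evolution (DE/rand/1/bin).
    # P[w][t] = kill probability of weapon w against target t.
    import random

    P = [[0.7, 0.2], [0.4, 0.5], [0.1, 0.9]]     # 3 weapons, 2 targets

    def fitness(keys):
        # decode random keys to one target per weapon, then score the
        # total expected target survival (lower is better)
        assign = [int(k * len(P[0])) % len(P[0]) for k in keys]
        surv = [1.0] * len(P[0])
        for w, t in enumerate(assign):
            surv[t] *= 1 - P[w][t]
        return sum(surv)

    pop = [[random.random() for _ in P] for _ in range(20)]
    for _ in range(200):
        for i in range(len(pop)):
            a, b, c = random.sample(
                [p for j, p in enumerate(pop) if j != i], 3)
            trial = [min(max(a[d] + 0.8 * (b[d] - c[d]), 0), 0.999)
                     if random.random() < 0.9 else pop[i][d]
                     for d in range(len(P))]
            if fitness(trial) < fitness(pop[i]):   # greedy selection
                pop[i] = trial

    best = min(pop, key=fitness)
    print([int(k * 2) for k in best], fitness(best))
    ```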

  31. Improving Patient Safety with X-Ray and Anesthesia Machine Ventilator Synchronization: A Medical Device Interoperability Case Study

    NASA Astrophysics Data System (ADS)

    Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup

    When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing standards efforts on interoperability frameworks in developing functional and non-functional requirements, and illustrates the potential patient-safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
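
    The use case can be sketched as a simple interlock, with simulated stand-in devices: the exposure is taken during a ventilator pause, and a watchdog guarantees the ventilator restarts even if the resume command never arrives. This is a toy illustration, not the prototype's actual implementation.

    ```python
    # Sketch of a plug-and-play safety interlock for the x-ray/ventilator
    # use case, using simulated devices.
    import threading

    class Ventilator:
        def __init__(self): self.running = True
        def pause(self):    self.running = False; print("ventilator paused")
        def resume(self):
            if not self.running:
                self.running = True; print("ventilator resumed")

    def synchronized_exposure(vent, expose, max_pause_s=5.0):
        # watchdog: resume no matter what happens during the exposure
        watchdog = threading.Timer(max_pause_s, vent.resume)
        vent.pause()
        watchdog.start()
        try:
            expose()             # exposure happens while lungs are still
        finally:
            watchdog.cancel()
            vent.resume()

    synchronized_exposure(Ventilator(), lambda: print("x-ray exposure"))
    ```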

  32. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    PubMed

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper are to characterize and examine potential approaches to interoperability, including openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, the possibilities for their incorporation into the eHealth environment, and the identification of the main success factors in the field that are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper presents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded in information retrieval, with a special focus on researching and charting existing experience in the field and on sources, both electronic and written, covering interoperability concepts and related implementation issues. The paper tries to answer the following complementary inquiries: 1. scrutiny of the potential approaches that could alleviate the pertinent interoperability issues in the Slovenian eHealth context; 2. analysis of the possibilities (requirements) for their inclusion in the construction of individual eHealth solutions; 3. identification and charting of the main success factors in the interoperability field that critically influence the efficient development and implementation of eHealth projects. The insights provided and the success factors identified could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable an eHealth-based transformation of the health system, especially in countries that are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.

  33. Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements

    NASA Astrophysics Data System (ADS)

    Allison, M. L.; Richard, S. M.; Patten, K.

    2014-12-01

    The geosciences are leading the development of free, interoperable open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG) in compliance with international standards and protocols, that provides easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of NGDS serves as a continent-scale operational demonstration of the expanded OneGeology vision of providing access to all geoscience data worldwide. USGIN is designed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data-sharing systems can download and implement a free software package, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open-source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as the National Geothermal Data System (NGDS), officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
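
    As an illustration of the standards-based access USGIN enables, the snippet below builds an OGC WFS 2.0 GetFeature request from standard key-value parameters. The endpoint URL and feature type are placeholders, not real NGDS addresses.

    ```python
    # Building a standard OGC WFS 2.0 GetFeature request URL. The
    # endpoint and typeNames value are illustrative placeholders.
    from urllib.parse import urlencode

    def getfeature_url(endpoint: str, type_name: str,
                       max_features: int = 10) -> str:
        params = {
            "service": "WFS",
            "version": "2.0.0",
            "request": "GetFeature",
            "typeNames": type_name,
            "count": max_features,   # WFS 2.0 name for the row limit
        }
        return f"{endpoint}?{urlencode(params)}"

    print(getfeature_url("https://example.org/wfs", "gsml:GeologicUnit"))
    ```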

  34. Data Modeling Challenges of Advanced Interoperability.

    PubMed

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

    Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  35. OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.

    PubMed

    Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk

    2018-02-23

    Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It has recently become apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. In particular, computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet it also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA), as well as an introduction to the technical specification implementing the SOMDA paradigm, which is currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; finally, some security aspects of a SOMDA are discussed.
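
    A generic sketch of the SOMDA idea, with toy stand-ins rather than the IEEE 11073 SDC protocol itself: each device exposes its capabilities as a discoverable service, and clients subscribe to state changes instead of integrating vendor-specific interfaces.

    ```python
    # Toy service-oriented device architecture: discovery via a registry,
    # state propagation via subscriptions. Not the actual SDC protocol.
    class DeviceService:
        def __init__(self, name, metrics):
            self.name, self.metrics, self.subscribers = name, dict(metrics), []

        def subscribe(self, callback):
            self.subscribers.append(callback)

        def update(self, metric, value):
            self.metrics[metric] = value
            for notify in self.subscribers:      # push state to consumers
                notify(self.name, metric, value)

    registry = {}                                # discovery: name -> service

    pump = DeviceService("infusion_pump", {"rate_ml_h": 50})
    registry[pump.name] = pump

    found = registry["infusion_pump"]            # client-side discovery
    found.subscribe(lambda dev, m, v: print(f"{dev}: {m} = {v}"))
    found.update("rate_ml_h", 75)                # prints the notification
    ```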

  36. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Record (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of information and knowledge. However, the impact of such standards is limited by the scarce availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been designed generically, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. These good properties are based on the combination of standards for the representation of generic user interfaces with model-driven engineering techniques.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, John; Halbgewachs, Ron; Chavez, Adrian

    The manner in which control systems are designed and operated in the energy sector is undergoing some of the most significant changes in its history, due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike to meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPsec, a widely used Internet Protocol suite for defining Virtual Private Networks, or "tunnels", to communicate securely through untrusted public and private networks. The IPsec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor-intensive and requires a significant amount of security expertise from the end user. Scale this effort to a large number of devices operating over a wide geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. Such single-vendor solutions may inadvertently lock utilities into proprietary and closed systems.
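
    The IPsec example lends itself to a small illustration. The hedged Python sketch below, with invented parameter sets rather than a real IKE negotiation, shows why configuration is labor-intensive: a tunnel only comes up if every parameter category has a non-empty intersection between the two vendors' proposals.

      # Toy illustration of the interoperability burden described above:
      # two vendors' IPsec proposals only interoperate if their parameter
      # sets intersect. Names are illustrative, not a real IKE exchange.

      vendor_a = {
          "ike_version": {"IKEv1", "IKEv2"},
          "encryption":  {"aes128", "aes256"},
          "integrity":   {"sha1", "sha256"},
          "dh_group":    {14, 19},
      }
      vendor_b = {
          "ike_version": {"IKEv2"},
          "encryption":  {"aes256", "3des"},
          "integrity":   {"sha256"},
          "dh_group":    {19, 20},
      }

      def negotiate(a, b):
          agreed = {}
          for param in a:
              common = a[param] & b[param]
              if not common:
                  raise ValueError(f"no common value for {param}: tunnel fails")
              agreed[param] = sorted(common, key=str)[0]
          return agreed

      print(negotiate(vendor_a, vendor_b))
      # e.g. {'ike_version': 'IKEv2', 'encryption': 'aes256', ...}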

  1. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and with industry through Intergraph. CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license for ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services; 2) VDOT's state transportation data and GIS infrastructure; and 3) BTS/DOT's national transportation data. The project aims to: 1) develop and deploy operational OGC WFS 1.1 interfaces at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) build a WFS service that returns data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme, in the format given by the OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrate the OGC WFS with CEOSR's clearinghouse nodes; 4) establish a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develop WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The WFS is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
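
    For illustration, a WFS GetFeature request of the kind the abstract describes can be composed as a simple key-value-pair URL. In this hedged Python sketch the endpoint, feature type name and bounding box are placeholders, not the project's actual service.

      # Hedged sketch: composing an OGC WFS 1.1 GetFeature request for a
      # transportation feature type. Endpoint and type name are placeholders.
      from urllib.parse import urlencode

      endpoint = "https://example.org/wfs"        # placeholder service URL
      params = {
          "service": "WFS",
          "version": "1.1.0",
          "request": "GetFeature",
          "typename": "transport:Roads",          # hypothetical feature type
          "outputFormat": "text/xml; subtype=gml/3.1.1",
          "maxFeatures": "10",
          "bbox": "-77.5,38.7,-77.0,39.1,EPSG:4326",
      }
      print(f"{endpoint}?{urlencode(params)}")
      # The response would be a GML feature collection that any WFS-aware
      # client can parse, which is the interoperability point of the record.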

  2. Cohort Selection and Management Application Leveraging Standards-based Semantic Interoperability and a Groovy DSL

    PubMed Central

    Bucur, Anca; van Leeuwen, Jasper; Chen, Njin-Zu; Claerhout, Brecht; de Schepper, Kristof; Perez-Rey, David; Paraiso-Medina, Sergio; Alonso-Calvo, Raul; Mehta, Keyur; Krykwinski, Cyril

    2016-01-01

    This paper describes a new Cohort Selection application implemented to support streamlining the definition phase of multi-centric clinical research in oncology. Our approach aims at both ease of use and precision in defining the selection filters expressing the characteristics of the desired population. The application leverages our standards-based Semantic Interoperability Solution and a Groovy DSL to provide high expressiveness in the definition of filters and flexibility in their composition into complex selection graphs including splits and merges. Widely-adopted ontologies such as SNOMED-CT are used to represent the semantics of the data and to express concepts in the application filters, facilitating data sharing and collaboration on joint research questions in large communities of clinical users. The application supports patient data exploration and efficient collaboration in multi-site, heterogeneous and distributed data environments. PMID:27570644
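
    The application itself uses a Groovy DSL; purely to illustrate the idea of composable selection filters, here is a language-neutral Python sketch in which the filter helpers and the SNOMED CT-style codes are illustrative.

      # Toy sketch of composable cohort-selection filters. The helper names
      # are invented; the real system expresses this through a Groovy DSL
      # and supports full selection graphs with splits and merges.

      patients = [
          {"id": 1, "codes": {"254837009"}, "age": 54},   # SNOMED-style code
          {"id": 2, "codes": {"254837009"}, "age": 71},
          {"id": 3, "codes": {"363358000"}, "age": 60},
      ]

      def has_code(code):
          return lambda p: code in p["codes"]

      def age_between(lo, hi):
          return lambda p: lo <= p["age"] <= hi

      def all_of(*filters):                 # merge filters by intersection
          return lambda p: all(f(p) for f in filters)

      cohort = [p["id"] for p in patients
                if all_of(has_code("254837009"), age_between(50, 65))(p)]
      print(cohort)   # [1]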

  3. Joint Command and Control: Integration Not Interoperability

    DTIC Science & Technology

    2013-03-01

    separate computer and communication equipment. Besides having to engineer interoperability, the Services also must determine the level of... effects. Determines force responsiveness and allocates resources. This thesis argues Joint military operations will never be fully integrated as... processes and systems. Secondly, the limited depth of discussion risks implying (or the reader inferring) the solution is more straightforward than

  4. Toward interoperable bioscience data

    PubMed Central

    Sansone, Susanna-Assunta; Rocca-Serra, Philippe; Field, Dawn; Maguire, Eamonn; Taylor, Chris; Hofmann, Oliver; Fang, Hong; Neumann, Steffen; Tong, Weida; Amaral-Zettler, Linda; Begley, Kimberly; Booth, Tim; Bougueleret, Lydie; Burns, Gully; Chapman, Brad; Clark, Tim; Coleman, Lee-Ann; Copeland, Jay; Das, Sudeshna; de Daruvar, Antoine; de Matos, Paula; Dix, Ian; Edmunds, Scott; Evelo, Chris T; Forster, Mark J; Gaudet, Pascale; Gilbert, Jack; Goble, Carole; Griffin, Julian L; Jacob, Daniel; Kleinjans, Jos; Harland, Lee; Haug, Kenneth; Hermjakob, Henning; Ho Sui, Shannan J; Laederach, Alain; Liang, Shaoguang; Marshall, Stephen; McGrath, Annette; Merrill, Emily; Reilly, Dorothy; Roux, Magali; Shamu, Caroline E; Shang, Catherine A; Steinbeck, Christoph; Trefethen, Anne; Williams-Jones, Bryn; Wolstencroft, Katherine; Xenarios, Ioannis; Hide, Winston

    2012-01-01

    To make full use of research data, the bioscience community needs to adopt technologies and reward mechanisms that support interoperability and promote the growth of an open ‘data commoning’ culture. Here we describe the prerequisites for data commoning and present an established and growing ecosystem of solutions using the shared ‘Investigation-Study-Assay’ framework to support that vision. PMID:22281772

  5. A generic architecture for an adaptive, interoperable and intelligent type 2 diabetes mellitus care system.

    PubMed

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Schulz, Stefan

    2015-01-01

    Chronic diseases such as Type 2 Diabetes Mellitus (T2DM) constitute a big burden to the global health economy. T2DM care management requires a multi-disciplinary and multi-organizational approach. Because of different languages and terminologies, education, experiences, skills, etc., such an approach presents a special interoperability challenge. The solution is a flexible, scalable, business-controlled, adaptive, knowledge-based, intelligent system following a systems-oriented, architecture-centric, ontology-based and policy-driven approach. The architecture of real systems is described using the basics and principles of the Generic Component Model (GCM). For representing the functional aspects of a system, the Business Process Modeling Notation (BPMN) is used. The resulting system architecture is presented using a GCM graphical notation, class diagrams and BPMN diagrams. The architecture-centric approach considers the compositional nature of the real-world system and its functionalities, guarantees coherence, and supports correct inferences. The level of generality provided in this paper facilitates use-case-specific adaptations of the system. In this way, intelligent, adaptive and interoperable T2DM care systems can be derived from the model, as presented in another publication.

  6. Research on key technologies for data-interoperability-based metadata, data compression and encryption, and their application

    NASA Astrophysics Data System (ADS)

    Yu, Xu; Shao, Quanqin; Zhu, Yunhai; Deng, Yuejin; Yang, Haijun

    2006-10-01

    With the development of informatization and the separation between data management departments and application departments, spatial data sharing has become one of the most important objectives of spatial information infrastructure construction; spatial metadata management, data transmission security and data compression are the key technologies for realizing it. This paper discusses the key technologies for metadata based on data interoperability; studies data compression algorithms such as the adaptive Huffman, LZ77 and LZ78 algorithms; and applies digital signature techniques to spatial data, which can not only identify the transmitter of the data but also detect in a timely manner whether the data have been tampered with during network transmission. Based on an analysis of symmetric encryption algorithms, including 3DES and AES, and the asymmetric encryption algorithm RSA, combined with a hash algorithm, an improved mixed encryption method for spatial data is presented. Digital signature technology and digital watermarking technology are also discussed. A new solution for spatial data network distribution is then put forward, which adopts a three-layer architecture. Based on this framework, we present a spatial data network distribution system that is efficient and safe, and prove the feasibility and validity of the proposed solution.
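
    The compress-then-protect pipeline the abstract describes can be sketched with the Python standard library alone. Note the hedge: a real deployment would use AES/RSA hybrid encryption and a true digital signature from a cryptography library; the HMAC below merely marks where integrity protection sits relative to compression.

      # Stdlib-only sketch of a compress-then-protect pipeline. Real spatial
      # data distribution would use AES/RSA and a true digital signature;
      # HMAC stands in here only to show where integrity protection sits.
      import zlib, hashlib, hmac, os

      shared_key = os.urandom(32)             # stand-in for a negotiated key

      def pack(payload: bytes) -> bytes:
          compressed = zlib.compress(payload, 9)
          tag = hmac.new(shared_key, compressed, hashlib.sha256).digest()
          return tag + compressed             # tag travels with the data

      def unpack(blob: bytes) -> bytes:
          tag, compressed = blob[:32], blob[32:]
          expected = hmac.new(shared_key, compressed, hashlib.sha256).digest()
          if not hmac.compare_digest(tag, expected):
              raise ValueError("spatial data tampered with in transit")
          return zlib.decompress(compressed)

      gml = b"<gml:Point><gml:pos>38.9 -77.0</gml:pos></gml:Point>" * 100
      blob = pack(gml)
      assert unpack(blob) == gml
      print(f"{len(gml)} bytes -> {len(blob)} bytes on the wire")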

  7. Capturing Essential Information to Achieve Safe Interoperability

    PubMed Central

    Weininger, Sandy; Jaffe, Michael B.; Rausch, Tracy; Goldman, Julian M.

    2016-01-01

    In this article we describe the role of “clinical scenario” information to assure the safety of interoperable systems, as well as the system’s ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, thereby improving the communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system using a documentation tool called medical device interface data sheets (MDIDS) and mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration to achieve safe interoperability. Portions of the analysis of a clinical scenario for a “Patient-controlled analgesia safety interlock” are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a Learning Health System to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions to address vexing patient safety problems and enable clinical innovation. A web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™, is described. PMID:27387840

  8. Capturing Essential Information to Achieve Safe Interoperability.

    PubMed

    Weininger, Sandy; Jaffe, Michael B; Rausch, Tracy; Goldman, Julian M

    2017-01-01

    In this article, we describe the role of "clinical scenario" information to assure the safety of interoperable systems, as well as the system's ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, thereby improving the communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system using a documentation tool called medical device interface data sheets and mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration to achieve safe interoperability. Portions of the analysis of a clinical scenario for a "patient-controlled analgesia safety interlock" are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a learning health system to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions to address vexing patient safety problems and enable clinical innovation. A Web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™ (MD PnP Program), is described.

  9. Maturity Model for Advancing Smart Grid Interoperability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Mark; Widergren, Steven E.; Mater, J.

    2013-10-28

    Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement on how things join at their interfaces. The quality of the agreements and the alignment of the parties involved present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC), sponsored by the United States Department of Energy, is supporting an effort to use concepts from the capability maturity models used in the software industry to advance the interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experience gained with its use.

  10. Analysis model for personal eHealth solutions and services.

    PubMed

    Mykkänen, Juha; Tuomainen, Mika; Luukkonen, Irmeli; Itälä, Timo

    2010-01-01

    In this paper, we present a framework for analysing and assessing various features of personal wellbeing information management services and solutions, such as personal health records and citizen-oriented eHealth services. The model is based on general functional and interoperability standards for personal health management applications and on generic frameworks for different aspects of analysis. It has been developed and used in the MyWellbeing project in Finland to provide a baseline for the research, development and comparison of many different personal wellbeing and health management solutions, and to support the development of the unified "Coper" concept for citizen empowerment.

  11. Towards an Approach of Semantic Access Control for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Hu, Luokai; Ying, Shi; Jia, Xiangyang; Zhao, Kai

    With the development of cloud computing, mutual understandability among distributed Access Control Policies (ACPs) has become an important issue in the security field of cloud computing. Semantic Web technology provides a solution for the semantic interoperability of heterogeneous applications. In this paper, we analyze existing access control methods and present a new Semantic Access Control Policy Language (SACPL) for describing ACPs in cloud computing environments. An Access Control Oriented Ontology System (ACOOS) is designed as the semantic basis of SACPL. The ontology-based SACPL language can effectively solve the interoperability issue of distributed ACPs. This study enriches the research applying Semantic Web technology to the security field, and provides a new way of thinking about access control in cloud computing.
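
    The core mechanism, a policy written against an ontology concept also covering resources typed with its subconcepts, can be sketched briefly. The Python below is illustrative only; the concept hierarchy and policy format are invented and much simpler than ACOOS/SACPL.

      # Illustrative sketch of ontology-based policy matching: a policy
      # granted on a concept also covers resources typed with any of its
      # subconcepts. ACOOS/SACPL themselves are far richer.

      subclass_of = {                      # tiny stand-in for an ontology
          "MRIScan": "MedicalImage",
          "MedicalImage": "PatientRecord",
      }

      def is_a(concept, ancestor):
          while concept is not None:
              if concept == ancestor:
                  return True
              concept = subclass_of.get(concept)
          return False

      policies = [
          {"role": "physician", "action": "read", "resource": "PatientRecord"},
      ]

      def permitted(role, action, resource_type):
          return any(p["role"] == role and p["action"] == action
                     and is_a(resource_type, p["resource"]) for p in policies)

      print(permitted("physician", "read", "MRIScan"))   # True, via the ontology
      print(permitted("nurse", "read", "MRIScan"))       # False: no policy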

  12. eHealth integration and interoperability issues: towards a solution through enterprise architecture.

    PubMed

    Adenuga, Olugbenga A; Kekwaletswe, Ray M; Coleman, Alfred

    2015-01-01

    Investments in healthcare information and communication technology (ICT) and health information systems (HIS) continue to increase. This is creating immense pressure on healthcare ICT and HIS to deliver on, and show the significance of, such investments in technology. This study finds that integration and interoperability problems contribute largely to the failure of ICT and HIS investments in healthcare, pointing to the need for a healthcare architecture for eHealth. The study proposes an eHealth architectural model that accommodates requirements based on healthcare needs, system, implementer, and hardware requirements. The model is adaptable and accommodates the developers' and users' views that such systems hold high potential to change traditional organizational design, intelligence, and decision-making.

  13. Building the Synergy between Public Sector and Research Data Infrastructures

    NASA Astrophysics Data System (ADS)

    Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea

    2014-05-01

    INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure to give cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, and monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently, about 300K), while all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, made provisions to access government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability. The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is also actively involved in the initiatives promoting cross-sector re-use in INSPIRE, and in sustainable approaches to address the evolution of technologies - in particular, how to support Linked Data in INSPIRE and the use of global persistent identifiers. It is evident that government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel, and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which JRC is involved is that government and research data are not two separate worlds. Government data are commonly used as a basis to create scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to address the interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities, and the sharing of experiences and practices.

  14. On the Execution Control of HLA Federations using the SISO Space Reference FOM

    NASA Technical Reports Server (NTRS)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2017-01-01

    In the Space domain, the High Level Architecture (HLA) is one of the reference standards for Distributed Simulation. However, for the different organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA) and their industrial partners, it is difficult to implement HLA simulators (called Federates) able to interact and interoperate in the context of a distributed HLA simulation (called a Federation). The lack of a common FOM (Federation Object Model) for the Space domain is one of the main reasons that precludes a-priori interoperability between heterogeneous federates. To fill this gap, a Product Development Group (PDG) was recently established in the Simulation Interoperability Standards Organization (SISO) with the aim of providing a Space Reference FOM (SRFOM) for international collaboration on Space systems simulations. Members of the PDG come from several countries and contribute experiences from projects within NASA, ESA and other organizations. Participants represent government, academia and industry. The paper presents an overview of the ongoing Space Reference FOM standardization initiative, focusing on the solution provided for managing the execution of an SRFOM-based Federation.

  15. Design and Implementation of e-Health System Based on Semantic Sensor Network Using IETF YANG.

    PubMed

    Jin, Wenquan; Kim, Do Hyeun

    2018-02-20

    Healthcare services can now be delivered effectively to patients anytime and anywhere using e-Health systems. e-Health systems are developed through Information and Communication Technologies (ICT) that involve sensors, mobiles, and web-based applications for the delivery of healthcare services and information. Remote healthcare is an important purpose of the e-Health system. Usually, the e-Health system includes heterogeneous sensors from diverse manufacturers producing data in different formats. Device interoperability and data normalization are challenging tasks that need research attention. Several solutions are proposed in the literature based on manual interpretation through explicit programming. However, programmatically implementing the interpretation of the data sender and data receiver in the e-Health system for data transmission is counterproductive, as modification will be required for each new device added into the system. In this paper, an e-Health system with a Semantic Sensor Network (SSN) is proposed to address the device interoperability issue. In the proposed system, we have used IETF YANG for modeling the semantic e-Health data to represent the information of e-Health sensors. This modeling scheme helps in provisioning semantic interoperability between devices and expressing the sensing data in a user-friendly manner. For this purpose, we have developed an ontology for e-Health data that supports different styles of data formats. The ontology is defined in YANG for provisioning semantic interpretation of sensing data in the system by constructing meta-models of e-Health sensors. The proposed approach assists in the auto-configuration of e-Health sensors and querying the sensor network with semantic interoperability support for the e-Health system.
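
    A minimal sketch of the normalization idea follows; it is not YANG, and the device registrations are hypothetical, but it shows how a per-device meta-model turns heterogeneous payloads into one semantic representation, so that adding a device means adding a mapping rather than new parsing code.

      # Minimal sketch (not IETF YANG itself): a per-sensor meta-model maps
      # each vendor's field names and units onto one semantic representation.
      # Device registrations are hypothetical.

      meta_models = {
          "vendorA-bp": {"sys": ("systolic", "mmHg"), "dia": ("diastolic", "mmHg")},
          "vendorB-bp": {"SBP": ("systolic", "mmHg"), "DBP": ("diastolic", "mmHg")},
      }

      def normalize(device_type, raw):
          model = meta_models[device_type]
          return {sem: {"value": raw[field], "unit": unit}
                  for field, (sem, unit) in model.items()}

      print(normalize("vendorA-bp", {"sys": 120, "dia": 80}))
      print(normalize("vendorB-bp", {"SBP": 118, "DBP": 76}))
      # Both yield the same semantic structure despite different raw formats.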

  16. Design and Implementation of e-Health System Based on Semantic Sensor Network Using IETF YANG

    PubMed Central

    Kim, Do Hyeun

    2018-01-01

    Healthcare services can now be delivered effectively to patients anytime and anywhere using e-Health systems. e-Health systems are developed through Information and Communication Technologies (ICT) that involve sensors, mobiles, and web-based applications for the delivery of healthcare services and information. Remote healthcare is an important purpose of the e-Health system. Usually, the e-Health system includes heterogeneous sensors from diverse manufacturers producing data in different formats. Device interoperability and data normalization are challenging tasks that need research attention. Several solutions are proposed in the literature based on manual interpretation through explicit programming. However, programmatically implementing the interpretation of the data sender and data receiver in the e-Health system for data transmission is counterproductive, as modification will be required for each new device added into the system. In this paper, an e-Health system with a Semantic Sensor Network (SSN) is proposed to address the device interoperability issue. In the proposed system, we have used IETF YANG for modeling the semantic e-Health data to represent the information of e-Health sensors. This modeling scheme helps in provisioning semantic interoperability between devices and expressing the sensing data in a user-friendly manner. For this purpose, we have developed an ontology for e-Health data that supports different styles of data formats. The ontology is defined in YANG for provisioning semantic interpretation of sensing data in the system by constructing meta-models of e-Health sensors. The proposed approach assists in the auto-configuration of e-Health sensors and querying the sensor network with semantic interoperability support for the e-Health system. PMID:29461493

  17. DbMap: improving database interoperability issues in medical software using a simple, Java-Xml based solution.

    PubMed Central

    Karadimas, H.; Hemery, F.; Roland, P.; Lepage, E.

    2000-01-01

    In medical software development, the use of databases plays a central role. However, most databases have heterogeneous encodings and data models. Dealing with these variations directly in application code is error-prone and reduces the potential reuse of the produced software. Several approaches to overcome these limitations have been proposed in the medical database literature and are reviewed here. We present a simple solution based on a Java library and a central metadata description file in XML. This development approach offers several benefits across software design and development cycles, the main one being simplicity of maintenance. PMID:11079915
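
    The library described is Java-based; the hedged Python sketch below reproduces only the central pattern, a single XML metadata description that maps logical field names onto a particular database's columns, so that application code stays database-agnostic. The table and column names are invented.

      # Illustration of the central DbMap idea in Python rather than Java:
      # one XML metadata file describes how logical fields map onto a given
      # database's encoding, so application code never hardcodes it.
      import xml.etree.ElementTree as ET

      metadata_xml = """
      <mapping table="PATIENT">
        <field logical="patient_id" column="PAT_ID" type="int"/>
        <field logical="birth_date" column="DOB"    type="date"/>
      </mapping>
      """

      root = ET.fromstring(metadata_xml)
      mapping = {f.get("logical"): f.get("column") for f in root.iter("field")}

      def build_select(logical_fields):
          cols = ", ".join(mapping[f] for f in logical_fields)
          return f"SELECT {cols} FROM {root.get('table')}"

      print(build_select(["patient_id", "birth_date"]))
      # SELECT PAT_ID, DOB FROM PATIENT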

  18. Requirements and Solutions for Personalized Health Systems.

    PubMed

    Blobel, Bernd; Ruotsalainen, Pekka; Lopez, Diego M; Oemig, Frank

    2017-01-01

    Organizational, methodological and technological paradigm changes enable a precise, personalized, predictive, preventive and participative approach to health and social services, supported by multiple actors from different domains at diverse levels of knowledge and skills. Interoperability has to advance beyond Information and Communication Technologies (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The paper introduces and compares personalized health definitions, summarizes requirements and principles for pHealth systems, and considers intelligent interoperability. It addresses knowledge representation and harmonization, decision intelligence, and usability as crucial issues in pHealth. On this basis, a system-theoretical, ontology-based, policy-driven reference architecture model for open and intelligent pHealth ecosystems, and its transformation into an appropriate ICT design and implementation, is proposed.

  19. Applicability of IHE/Continua components for PHR systems: learning from experiences.

    PubMed

    Urbauer, Philipp; Sauermann, Stefan; Frohner, Matthias; Forjan, Mathias; Pohn, Birgit; Mense, Alexander

    2015-04-01

    Capturing personal health data using smartphones, PCs or other devices, and reusing those data in personal health records (PHR), is becoming more and more attractive for modern, health-conscious populations. This paper analyses interoperability specifications targeting standards-based communication between computer systems and personal health devices (e.g. blood pressure monitors) in healthcare, from initiatives such as Integrating the Healthcare Enterprise (IHE) and the Continua Health Alliance, driven by industry and healthcare professionals. Furthermore, it identifies certain contradictions and gaps in the specifications and suggests possible solutions. Despite these shortcomings, the specifications allow fully functional implementations of PHR systems. Henceforth, both big business and small and medium-sized enterprises (SMEs) can actively contribute to the widespread use of large-scale interoperable PHR systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Secure and interoperable communication infrastructures for PPDR organisations

    NASA Astrophysics Data System (ADS)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded to provide it in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations, which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks, is presented. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.

  1. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments

    DTIC Science & Technology

    2011-12-01

    Task Based Approach to Planning.” Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability... Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organization... (MSDL).” Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization

  2. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    PubMed

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  3. Lowering Entry Barriers for Multidisciplinary Cyber(e)-Infrastructures

    NASA Astrophysics Data System (ADS)

    Nativi, S.

    2012-04-01

    Multidisciplinarity is more and more important for studying the Earth System and addressing Global Changes. Multidisciplinary cyber(e)-infrastructures are an important instrument to achieve that. In recent years, several European, US and international initiatives have been started to build multidisciplinary infrastructures, including: the Infrastructure for Spatial Information in the European Community (INSPIRE), the Global Monitoring for Environment and Security (GMES), the Data Observation Network for Earth (DataOne), and the Global Earth Observation System of Systems (GEOSS). The majority of these initiatives are developing service-based digital infrastructures that ask scientific Communities (i.e. disciplinary Users and data Producers) to implement a set of standards for information interoperability. For scientific Communities, this has represented an entry barrier which has, in several cases, proved to be high. In fact, neither data Producers nor Users seem willing to invest precious resources in becoming experts in interoperability solutions; they are, on the contrary, focused on developing disciplinary and thematic capacities. An important research topic is therefore lowering the entry barriers to joining multidisciplinary cyber(e)-infrastructures. This presentation will introduce a new approach to achieving the multidisciplinary interoperability underpinning multidisciplinary infrastructures while lowering the present entry barriers for both Users and data Producers. This is called the Brokering approach: it extends the service-based paradigm by introducing a new Brokering layer, or cloud, which is in charge of managing all the interoperability complexity (e.g. data discovery, access, and use), thus easing Users' and Producers' burden. This approach was successfully experimented with in the framework of several European FP7 projects and in GEOSS.
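
    The division of labor under the Brokering approach can be sketched compactly: the broker alone speaks each catalog's dialect and harmonizes the answers, so neither Users nor Producers must implement the other side. In this hedged Python sketch both "catalogs" are stand-ins with invented record shapes, not real CSW or OpenSearch clients.

      # Hedged sketch of a discovery broker. The two "catalog" functions are
      # stand-ins for heterogeneous service interfaces, not real clients.

      def catalog_csw(query):        # pretend CSW-style source
          return [{"ttl": "Sea surface temperature", "id": "csw:42"}]

      def catalog_opensearch(terms): # pretend OpenSearch-style source
          return [{"title": "SST anomalies", "link": "os:99"}]

      class Broker:
          """Fans one discovery request out and harmonizes the answers."""
          def search(self, text):
              results = []
              for rec in catalog_csw({"anytext": text}):
                  results.append({"title": rec["ttl"], "source": rec["id"]})
              for rec in catalog_opensearch(text.split()):
                  results.append({"title": rec["title"], "source": rec["link"]})
              return results

      # A user issues one query; the broker absorbs both dialects.
      print(Broker().search("sea surface temperature"))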

  4. Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research

    NASA Astrophysics Data System (ADS)

    Schaap, D.; Thijsse, P.; Glaves, H.

    2017-12-01

    Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems that serve discrete, and often discipline-specific, user communities. The plethora of marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems. The potential impact of implementing these solutions for the individual marine data infrastructures is also being evaluated, to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project.

  5. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    PubMed

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, IoT devices created by different manufacturers currently follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
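
    To make the tasking idea concrete, the hedged sketch below posts a Task against a TaskingCapability through a SensorThings-style endpoint. The server URL is a placeholder and the payload is a simplified reading of the tasking idea; consult the standard and a concrete server for the exact contract.

      # Hedged sketch of tasking a device through a SensorThings-style
      # endpoint. URL is a placeholder; payload shape is a simplified
      # reading of posting a Task against a TaskingCapability.
      import json
      from urllib import request

      task = {
          "taskingParameters": {"state": "on"},       # uniform control verb
          "TaskingCapability": {"@iot.id": 1},        # capability to invoke
      }
      req = request.Request(
          "https://example.org/SensorThings/v1.1/Tasks",  # placeholder server
          data=json.dumps(task).encode(),
          headers={"Content-Type": "application/json"},
          method="POST",
      )
      # request.urlopen(req)  # uncomment against a live, writable endpoint
      print(json.dumps(task, indent=2))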

  6. Digital document imaging systems: An overview and guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This is an aid to NASA managers in planning the selection of a Digital Document Imaging System (DDIS) as a possible solution for document information processing and storage. Intended to serve as a manager's guide, this document contains basic information on digital imaging systems, technology, equipment standards, issues of interoperability and interconnectivity, and issues related to selecting appropriate imaging equipment based upon well defined needs.

  7. Design of a terminal solution for integration of in-home health care devices and services towards the Internet-of-Things

    NASA Astrophysics Data System (ADS)

    Pang, Zhibo; Zheng, Lirong; Tian, Junzhe; Kao-Walter, Sharon; Dubrova, Elena; Chen, Qiang

    2015-01-01

    In-home health care services based on the Internet-of-Things promise to resolve the challenges caused by the ageing of the population. But existing research is rather scattered and shows a lack of interoperability. In this article, a business-technology co-design methodology is proposed for cross-boundary integration of in-home health care devices and services. In this framework, three key elements of a solution (business model, device and service integration architecture, and information system integration architecture) are organically integrated and aligned. In particular, a cooperative Health-IoT ecosystem is formulated, and the information systems of all stakeholders are integrated in a cooperative health cloud as well as extended to patients' homes through the in-home health care station (IHHS). Design principles of the IHHS include the reuse of the 3C platform, certification of the Health Extension, interoperability and extendibility, convenient and trusted software distribution, standardised and secure electronic health care record handling, effective service composition and efficient data fusion. These principles are applied to the design of an IHHS solution called iMedBox. The detailed device and service integration architecture and the hardware and software architecture are presented and verified by an implemented prototype. Quantitative performance analysis and field trials have confirmed the feasibility of the proposed design methodology and solution.

  8. Future Interoperability of Camp Protection Systems (FICAPS)

    NASA Astrophysics Data System (ADS)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS project was established as a project of the European Defence Agency based on an initiative of Germany and France. The goal of this project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the Camp Protection Systems and equipment of the different nations involved in multinational missions. These guidelines shall allow for: • real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); • quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering operational and logistic costs and making the system highly available; • enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the HMI, Human Machine Interface) without costly and time-consuming validation and testing on the system level (validation and testing can be done on the equipment level). Four scenarios were identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the guideline document, French and German demonstration systems, based on existing national assets, were realized. Demonstrations showing the capabilities enabled by the defined interoperability requirements with respect to the operational scenarios were performed, including remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even though the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other force protection and ISR (Intelligence, Surveillance, Reconnaissance) tasks, due not only to its flexibility but also to the chosen interfacing.

  9. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    NASA Astrophysics Data System (ADS)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When it is achieved seamlessly, efficiency is increased over the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners, or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws on concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  10. Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?

    NASA Astrophysics Data System (ADS)

    Krisnadhi, A. A.

    2016-12-01

    Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms led to the development of huge foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes common data providers to avoid them. This problem is especially acute in a domain as diverse as the geosciences, as it is virtually impossible to reach agreement on the semantics of many terms (e.g., there are hundreds of definitions of forest used throughout the world). To overcome this challenge, a modular ontology architecture has emerged in recent years, fueled, among other things, by advances in ontology design pattern research. Each ontology pattern models only one key notion. It can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled. This leads to increased reusability. Furthermore, an ontology formed out of such modules has improved understandability over large, monolithic ontologies. Semantic interoperability in this architecture is not achieved by enforcing the use of the same vocabulary, but rather by promoting alignment to the same ontology patterns. In this work, we elaborate on how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through it. Building the solution upon semantic technologies such as Linked Data and the Web Ontology Language (OWL), we demonstrate how a data integration solution based on this idea can be realized over different data repositories.
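
    The pattern-alignment idea can be shown in a few lines: each provider keeps its own vocabulary but supplies an alignment of its fields onto the roles of a shared pattern. Everything in this Python sketch (the pattern, the field names, the records) is invented for illustration; a real solution would express the alignments in OWL over Linked Data.

      # Sketch of pattern-based alignment: providers keep their own terms
      # but map them onto the same pattern's roles, so a query written
      # against the pattern works over both. All names are invented.

      PATTERN = ("Observation", {"feature", "result", "time"})

      provider_x = {"woodland_pct": "result", "site": "feature", "when": "time"}
      provider_y = {"forestCover": "result", "plotId": "feature", "date": "time"}

      def to_pattern(record, alignment):
          return {role: record[field] for field, role in alignment.items()}

      rec_x = {"woodland_pct": 62, "site": "S-11", "when": "2016-07-01"}
      rec_y = {"forestCover": 58, "plotId": "P-3", "date": "2016-07-02"}

      # Both records become comparable instances of the same pattern,
      # without either provider adopting the other's vocabulary.
      print(to_pattern(rec_x, provider_x))
      print(to_pattern(rec_y, provider_y))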

  11. Secure E-Business applications based on the European Citizen Card

    NASA Astrophysics Data System (ADS)

    Zipfel, Christian; Daum, Henning; Meister, Gisela

    The introduction of ID cards enhanced with electronic authentication services opens up the possibility of using them for identification and authentication in e-business applications. To avoid incompatible national solutions, the specification of the European Citizen Card aims at defining interoperable services for such use cases. In particular, the specified device authentication methods can help to eliminate security problems with current e-business and online banking applications.

  12. A future-proof architecture for telemedicine using loose-coupled modules and HL7 FHIR.

    PubMed

    Gøeg, Kirstine Rosenbeck; Rasmussen, Rune Kongsgaard; Jensen, Lasse; Wollesen, Christian Møller; Larsen, Søren; Pape-Haugaard, Louise Bilenberg

    2018-07-01

    Most telemedicine solutions are proprietary and disease-specific, which causes a heterogeneous and silo-oriented system landscape with limited interoperability. Solving the interoperability problem would require a strong focus on data integration and standardization in telemedicine infrastructures. Our objective was to suggest a future-proof architecture that consists of small, loosely coupled modules to allow flexible integration with new and existing services, and that uses international standards to allow high re-usability of modules and interoperability in the health IT landscape. We identified the core features of our future-proof architecture as the following: (1) to provide extended functionality, the system should be designed as a core with modules; database handling and implementation of security protocols are modules, improving flexibility compared to other frameworks. (2) To ensure loosely coupled modules, the system should implement an inversion-of-control mechanism. (3) A focus on ease of implementation requires that the system use HL7 FHIR (Fast Healthcare Interoperability Resources) as the primary standard, because it is based on web technologies. We evaluated the feasibility of our architecture by developing an open-source implementation of the system called ORDS. ORDS is written in TypeScript and makes use of the Express framework and HL7 FHIR DSTU2. The code is distributed on GitHub. All modules have been tested unit-wise, but end-to-end testing awaits our first clinical example implementations. Our study showed that highly adaptable and yet interoperable core frameworks for telemedicine can be designed and implemented. Future work includes implementation of a clinical use case and evaluation. Copyright © 2018 Elsevier B.V. All rights reserved.
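
    The ORDS implementation itself is TypeScript on Express; as a language-neutral illustration of the two architectural features named above, the hedged Python sketch below injects store and authorization modules into a core (inversion of control) and passes a minimal FHIR-style Observation between them. The module wiring is invented; only the Observation shape and the LOINC heart-rate code follow FHIR conventions.

      # Illustrative sketch (not the ORDS codebase): a core that receives
      # its database and security modules via inversion of control,
      # exchanging a minimal FHIR-style Observation resource.

      fhir_observation = {
          "resourceType": "Observation",
          "status": "final",
          "code": {"coding": [{"system": "http://loinc.org", "code": "8867-4",
                               "display": "Heart rate"}]},
          "valueQuantity": {"value": 72, "unit": "beats/minute"},
      }

      class Core:
          def __init__(self, store, authorize):  # dependencies injected, not built
              self.store, self.authorize = store, authorize

          def receive(self, user, resource):
              if not self.authorize(user):
                  raise PermissionError(user)
              self.store(resource)

      stored = []
      core = Core(store=stored.append, authorize=lambda u: u == "clinician")
      core.receive("clinician", fhir_observation)
      print(stored[0]["code"]["coding"][0]["display"])   # Heart rate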

  13. Research into alternative network approaches for space operations

    NASA Technical Reports Server (NTRS)

    Kusmanoff, Antone L.; Barton, Timothy J.

    1990-01-01

    The main goal is to resolve the interoperability problem of applications employing the DOD TCP/IP (Department of Defense Transmission Control Protocol/Internet Protocol) family of protocols on a CCITT/ISO-based network. The objective is to allow them to communicate over the CCITT/ISO-protocol GPLAN (General Purpose Local Area Network) without modification to the users' application programs. There were two primary assumptions associated with the solution that was actually realized. The first is that the solution had to allow for future movement to the exclusive use of the CCITT/ISO standards. The second is that the solution had to be software-transparent to the currently installed TCP/IP and CCITT/ISO user application programs.

  14. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    PubMed Central

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, IoT devices created by different manufacturers currently follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759

  15. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    PubMed

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  16. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    PubMed Central

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2008-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service Oriented Architecture (SSOA) for cancer research by the National Cancer Institute’s cancer Biomedical Informatics Grid (caBIG™). PMID:17512259

  17. Interoperability challenges in river discharge modelling: A cross domain application scenario

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin

    2018-06-01

    River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes the definition and standardization of domain-specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of GEOSS.

  18. A common layer of interoperability for biomedical ontologies based on OWL EL.

    PubMed

    Hoehndorf, Robert; Dumontier, Michel; Oellrich, Anika; Wimalaratne, Sarala; Rebholz-Schuhmann, Dietrich; Schofield, Paul; Gkoutos, Georgios V

    2011-04-01

    Ontologies are essential in biomedical research due to their ability to semantically integrate content from different scientific databases and resources. Their application improves capabilities for querying and mining biological knowledge. An increasing number of ontologies are being developed for this purpose, and considerable effort is invested in formally defining them in order to represent their semantics explicitly. However, current biomedical ontologies do not yet facilitate data integration and interoperability, since reasoning over these ontologies is very complex and cannot be performed efficiently, or is even impossible. We propose the use of less expressive subsets of ontology representation languages to enable efficient reasoning and to achieve the goal of genuine interoperability between ontologies. We present and evaluate EL Vira, a framework that transforms OWL ontologies into the OWL EL subset, thereby enabling the use of tractable reasoning. We illustrate which OWL constructs and inferences are kept and lost following the conversion and demonstrate the performance gain of reasoning, indicated by the significant reduction of processing time. We applied EL Vira to the open biomedical ontologies and provide a repository of ontologies resulting from this conversion. EL Vira creates a common layer of ontological interoperability that, for the first time, enables the creation of software solutions that can employ biomedical ontologies to perform inferences and answer complex queries to support scientific analyses. The EL Vira software is available from http://el-vira.googlecode.com and converted OBO ontologies and their mappings are available from http://bioonto.gen.cam.ac.uk/el-ont.
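
    The core idea, keeping only the constructs expressible in the OWL EL profile and dropping the rest, can be illustrated with a small sketch. This is not EL Vira's implementation; the axiom model below is invented for the example, but the keep/lose split reflects the published OWL EL profile (existential restrictions and intersections are allowed, universal restrictions, negation and union are not).

```typescript
// Illustrative sketch of an OWL-to-OWL EL conversion: partition axioms into
// an EL-expressible core and the constructs that would be lost.
type AxiomKind =
  | "SubClassOf"
  | "EquivalentClasses"
  | "ObjectIntersectionOf"
  | "ObjectSomeValuesFrom" // existential restriction: allowed in EL
  | "ObjectAllValuesFrom"  // universal restriction: NOT allowed in EL
  | "ObjectComplementOf"   // negation: NOT allowed in EL
  | "ObjectUnionOf";       // disjunction: NOT allowed in EL

interface Axiom { kind: AxiomKind; expression: string; }

const EL_SUPPORTED: ReadonlySet<AxiomKind> = new Set<AxiomKind>([
  "SubClassOf", "EquivalentClasses",
  "ObjectIntersectionOf", "ObjectSomeValuesFrom",
]);

// Mirrors the "kept and lost" analysis described in the abstract.
function toEL(axioms: Axiom[]): { kept: Axiom[]; lost: Axiom[] } {
  const kept = axioms.filter((a) => EL_SUPPORTED.has(a.kind));
  const lost = axioms.filter((a) => !EL_SUPPORTED.has(a.kind));
  return { kept, lost };
}
```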

  19. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    PubMed Central

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; McGilchrist, Mark; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross-border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform, accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries, demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of cross-border semantic integration of data. PMID:27570649

  20. CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype

    NASA Technical Reports Server (NTRS)

    Lucord, Steve; Martinez, Lindolfo

    2009-01-01

    We are entering a new era in space exploration. Reduced operating budgets require innovative solutions to leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service-oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service-oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitoring and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components, and the prototype will leverage the capabilities of existing legacy systems: various custom applications and middleware solutions will be combined into one system providing the illusion of a set of homogeneous services. This paper documents our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results and potential cost-saving benefits.

  1. PACS viewer interoperability for teleconsultation based on DICOM

    NASA Astrophysics Data System (ADS)

    Salant, Eliot; Shani, Uri

    2000-05-01

    Real-time teleconsultation in radiology enables physicians to perform same-time consultation between remote peers, based on medical images. Since digital medical images are commonly viewed on PACS workstations, it is possible to use one of several methods for remote sharing of the computer screen, for instance software products such as Microsoft NetMeeting or IBM SameTime. However, the amount of image data transmitted can be very high, since even minute changes to an image's window/level require re-transmitting the entire image again and again, which is too inefficient. When the problem is restricted to identical hardware and software from a single vendor, it is easier to develop a solution that employs a proprietary, specialized protocol to coordinate the visualization process. We previously developed such a solution, which demonstrated an excellent performance advantage by transmitting only the graphical events between the machines rather than the image pixels. However, it did not inter-operate with other viewers: it worked only on X11/Motif systems, and only between compatible versions of the same viewer application. Our purpose in this paper is to enable interoperability between viewers on different platforms and from different vendors. We distinguish three parts: session control, audiovisual (multimedia) data exchange, and medical image sharing. We deal only with the third component, assuming the use of existing standards for the first two. After a session between two or more parties is established, and optional audiovisual data channels are set up, the medical consultation is considered as the coordinated exchange of medical image contents. The contents exchange protocol has two main requirements. In the first stage, the parties negotiate the actual set of capabilities to be used during the consultation, using a formal description of these capabilities; capabilities that one station lacks (such as specific image processing algorithms) can be 'borrowed' from the other. In the second stage, when interaction starts, the protocol should assume that the graphical user interfaces of the stations, as well as their working procedures, may differ. During the consultation, data is exchanged based on DICOM for the data model of medical image folders and the data format of image objects.
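
    A minimal sketch of message types for the two-stage protocol described above: a capability negotiation message followed by graphical-event messages, so that only a few numbers cross the wire instead of re-sent image pixels. All names here are invented for illustration; this is not a published protocol.

```typescript
// Hypothetical message schema for a DICOM-based teleconsultation session.
interface CapabilityOffer {
  type: "capabilities";
  imageProcessing: string[]; // e.g. ["window-level", "zoom", "annotate"]
}

interface WindowLevelEvent {
  type: "window-level";
  sopInstanceUid: string; // DICOM identifier of the shared image
  windowCenter: number;
  windowWidth: number;
}

type ConsultationMessage = CapabilityOffer | WindowLevelEvent;

// A window/level change is a handful of bytes, not a re-sent image.
function encode(msg: ConsultationMessage): string {
  return JSON.stringify(msg);
}

console.log(encode({
  type: "window-level",
  sopInstanceUid: "1.2.840.10008.example",
  windowCenter: 40,
  windowWidth: 400,
}));
```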

  2. Reminiscing about 15 years of interoperability efforts

    DOE PAGES

    Van de Sompel, Herbert; Nelson, Michael L.

    2015-11-01

    Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.
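
    For readers unfamiliar with OAI-PMH, a harvesting request is just an HTTP GET with protocol-defined arguments. The verb and argument names below are part of the actual OAI-PMH specification; the repository URL is a placeholder.

```typescript
// Minimal OAI-PMH harvest: fetch the first page of Dublin Core records.
const endpoint = "https://repository.example.org/oai"; // hypothetical

async function listRecords(): Promise<string> {
  const url = `${endpoint}?verb=ListRecords&metadataPrefix=oai_dc`;
  const response = await fetch(url);
  // OAI-PMH responses are XML; a real harvester would parse the records
  // and follow any resumptionToken to page through the full set.
  return response.text();
}

listRecords().then((xml) => console.log(xml.slice(0, 200)));
```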

  3. Telemonitoring systems interoperability challenge: an updated review of the applicability of ISO/IEEE 11073 standards for interoperability in telemonitoring.

    PubMed

    Galarraga, M; Serrano, L; Martinez, I; de Toledo, P; Reynolds, Melvin

    2007-01-01

    Advances in Information and Communication Technologies (ICT) are bringing new opportunities and use cases in the field of systems and Personal Health Devices used for the telemonitoring of citizens in home or mobile scenarios. This review arises from the need to identify robust technical telemonitoring solutions that are both open and interoperable. These systems demand standardized solutions to be cost-effective and to take advantage of standardized operation and interoperability. The fundamental challenge is thus to design plug-and-play devices that, either as individual elements or as components, can be incorporated in a simple way into different telecare systems, perhaps configuring a personal user network. Moreover, there is increasing market pressure from companies not traditionally involved in medical markets asking for a standard for Personal Health Devices, foreseeing a vast demand for telemonitoring, wellness, Ambient Assisted Living (AAL) and e-health applications. However, the newly emerging situations imply very strict requirements for the protocols involved in the communication. The ISO/IEEE 11073 family of standards is adapting in order to face this challenge and appears to be the best-positioned set of international standards to reach this goal. This work presents an updated survey of these standards, tracking the changes that are being made, and aims to serve as a starting point for those who want to familiarize themselves with them.

  4. Design and Implement AN Interoperable Internet of Things Application Based on AN Extended Ogc Sensorthings Api Standard

    NASA Astrophysics Data System (ADS)

    Huang, C. Y.; Wu, C. H.

    2016-06-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked into multiple closed ecosystems that we call IoT silos. A common industrial remedy is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a large number of customized connectors. Hence, we believe the ultimate solution to the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API supports a comprehensive conceptual model and query functionalities. The first version of the SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, i.e., the sensing capability. Besides sensing, IoT devices can also be controlled via the Internet, i.e., the tasking capability. Since the tasking capability was not included in the first version of the standard, this research aims at defining a tasking capability profile and integrating it with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices that use different communication protocols, thereby achieving an interoperable Internet of Things infrastructure.
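
    To make the idea of a declarative "Tasking Capability Description" concrete, the sketch below shows one plausible shape for such a JSON document: a listing of the controllable parameters a device exposes, which the service can translate into the device's proprietary protocol. All field names are invented for the example; the published standard uses SWE Common encodings for tasking parameters.

```typescript
// Illustrative (not normative) shape of a tasking capability description.
interface TaskingParameter {
  name: string;
  type: "boolean" | "integer" | "text";
  allowedValues?: Array<string | number>;
}

interface TaskingCapabilityDescription {
  deviceId: string;
  protocol: string; // the proprietary protocol a connector speaks
  parameters: TaskingParameter[];
}

// A hypothetical smart lamp advertising two controllable parameters.
const hueLamp: TaskingCapabilityDescription = {
  deviceId: "lamp-42",
  protocol: "philips-hue-rest", // hypothetical connector identifier
  parameters: [
    { name: "state", type: "text", allowedValues: ["on", "off"] },
    { name: "brightness", type: "integer" },
  ],
};

console.log(JSON.stringify(hueLamp, null, 2));
```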

  5. Creating personalised clinical pathways by semantic interoperability with electronic health records.

    PubMed

    Wang, Hua-Qiong; Li, Jing-Song; Zhang, Yi-Fan; Suzuki, Muneou; Araki, Kenji

    2013-06-01

    There is a growing realisation that clinical pathways (CPs) are vital for improving the treatment quality of healthcare organisations. However, treatment personalisation is one of the main challenges when implementing CPs, and inadequate dynamic adaptability restricts their practicality. The purpose of this study is to improve the practicality of CPs using semantic interoperability between knowledge-based CPs and semantic electronic health records (EHRs). The SPARQL Protocol and RDF Query Language (SPARQL) is used to gather patient information from semantic EHRs. The gathered patient information is entered into the CP ontology, represented in the Web Ontology Language (OWL). Then, after reasoning over rules described in the Semantic Web Rule Language (SWRL) within the Jena semantic framework, we adjust the standardised CPs to meet each patient's practical needs. A CP for acute appendicitis is used as an example to illustrate how CP customisation is achieved through semantic interoperability between knowledge-based CPs and semantic EHRs. A personalised care plan is generated by comprehensively analysing the patient's personal allergy history and past medical history, which are stored in semantic EHRs. Additionally, by monitoring the patient's clinical information, an exception is recorded and handled during CP execution. The execution results of the example show that the solutions we present are technically feasible. This study contributes towards improving the clinical personalised practicality of standardised CPs, and establishes the foundation for future work on the research and development of an independent CP system. Copyright © 2013 Elsevier B.V. All rights reserved.
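
    As a sketch of the first step, gathering patient information from a semantic EHR via SPARQL, the query below retrieves a patient's allergy history. The SPARQL protocol details (content types) are standard; the endpoint URL and the vocabulary (`ehr:hasAllergy`, `ehr:patientId`) are invented for the example.

```typescript
// Querying a hypothetical semantic EHR SPARQL endpoint for allergies.
const sparqlEndpoint = "https://ehr.example.org/sparql"; // hypothetical

const query = `
  PREFIX ehr: <http://example.org/ehr#>
  SELECT ?allergy WHERE {
    ?patient ehr:patientId "12345" ;
             ehr:hasAllergy ?allergy .
  }`;

async function fetchAllergies(): Promise<unknown> {
  const response = await fetch(sparqlEndpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/sparql-query", // SPARQL 1.1 Protocol
      Accept: "application/sparql-results+json",
    },
    body: query,
  });
  return response.json(); // standard SPARQL JSON results format
}

fetchAllergies().then(console.log);
```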

  6. Sustainability of Open-Source Software Organizations as Underpinning for Sustainable Interoperability on Large Scales

    NASA Astrophysics Data System (ADS)

    Fulker, D. W.; Gallagher, J. H. R.

    2015-12-01

    OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with the solutions listed in the (PA001) session description (federation, rigid standards and brokering/mediation), the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., one with no software sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access; skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.

  7. Electronic health records and cardiac implantable electronic devices: new paradigms and efficiencies.

    PubMed

    Slotwiner, David J

    2016-10-01

    The anticipated advantages of electronic health records (EHRs), improved efficiency and the ability to share information across the healthcare enterprise, have so far failed to materialize. There is growing recognition that interoperability holds the key to unlocking the greatest value of EHRs. Health information technology (HIT) systems, including EHRs, must be able both to share data and to interpret the shared data. This requires a controlled vocabulary with explicit definitions (data elements) as well as protocols to communicate the context in which each data element is being used (syntactic structure). Cardiac implantable electronic devices (CIEDs) provide a clear example of the challenges faced by clinicians when data are not interoperable. The proprietary data formats created by each CIED manufacturer, as well as the multiple sources of data generated by CIEDs (hospital, office, remote monitoring, acute care setting), make it challenging to aggregate even a single patient's data into an EHR. The Heart Rhythm Society and CIED manufacturers have collaborated to develop and implement international standards-based specifications for interoperability that provide an end-to-end solution, enabling structured data to be communicated from a CIED to a report generation system, EHR, research database, referring physician, registry, patient portal, and beyond. EHR and other health information technology vendors have been slow to implement these tools, in large part because there have been no financial incentives for them to do so. It is incumbent upon us, as clinicians, to insist that the tools of interoperability be a prerequisite for the purchase of any and all health information technology systems.

  8. Ocean Data Interoperability Platform (ODIP): Developing a Common Framework for Marine Data Management on a Global Scale

    NASA Astrophysics Data System (ADS)

    Glaves, H. M.; Schaap, D.

    2014-12-01

    As marine research becomes increasingly multidisciplinary in its approach, there has been a corresponding rise in the demand for large quantities of high-quality interoperable data. A number of regional initiatives are already addressing this requirement through the establishment of e-infrastructures to improve the discovery and access of marine data. Projects such as Geo-Seas and SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and IMOS in Australia have implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these regional initiatives has been developed to address its own requirements and independently of other regions. To establish a common framework for marine data management on a global scale, there is a need to develop interoperability solutions that can be implemented across these initiatives. Through a series of workshops attended by the relevant domain specialists, the Ocean Data Interoperability Platform (ODIP) project has identified areas of commonality between the regional infrastructures and used these as the foundation for the development of three prototype interoperability solutions addressing: the use of brokering services to provide access to the data available in the regional data discovery and access services, including via the GEOSS portal; the development of interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) portal; and the establishment of a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE). These prototypes will be used to underpin the development of a common global approach to the management of marine data which can be promoted to the wider marine research community. ODIP is a community-led project that is currently focussed on regional initiatives in Europe, the USA and Australia but which is seeking to expand this framework to include other regional marine data infrastructures.

  9. An open repositories network development for medical teaching resources.

    PubMed

    Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius

    2010-01-01

    The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing and reutilization of those data. We present the development of an open repositories network taking into account both the syntactic and semantic interoperability of the different repositories and based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on the one hand, on the LOM standard for the description of resources and on MeSH for their indexing and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.

  10. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination

    PubMed Central

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-01-01

    Motivation: Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two interoperable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation: Opal and Mica are two standalone but interoperable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features: Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability: Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122

  11. MENTOR: an enabler for interoperable intelligent systems

    NASA Astrophysics Data System (ADS)

    Sarraipa, João; Jardim-Goncalves, Ricardo; Steiger-Garcao, Adolfo

    2010-07-01

    A community whose knowledge organisation is based on ontologies will enable an increase in the computational intelligence of its information systems. However, due to the worldwide diversity of communities, a large number of knowledge representation elements that are not semantically coincident have appeared to represent the same segment of reality, becoming a barrier to business communications. Even if a domain community uses the same kind of technologies in its information systems, such as ontologies, this does not resolve their semantic differences. In order to solve this interoperability problem, a solution is to use a reference ontology as an intermediary in the communications between the community enterprises and the outside, while allowing the enterprises to keep their own ontology and semantics unchanged internally. This work proposes MENTOR, a methodology to support the development of a common reference ontology for a group of organisations sharing the same business domain. The methodology is based on the mediator ontology (MO) concept, which assists the semantic transformations between each enterprise's ontology and the reference one. The MO enables each organisation to keep its own terminology, glossary and ontological structures, while providing seamless communication and interaction with the others.

  12. Network Function Virtualization (NFV) based architecture to address connectivity, interoperability and manageability challenges in Internet of Things (IoT)

    NASA Astrophysics Data System (ADS)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Faris Ismail, Ahmad

    2017-11-01

    IoT aims to interconnect sensors and actuators built into devices (also known as Things) so that they can share data and control each other, improving existing processes and making people's lives better. IoT aims to connect all physical devices, such as fridges, cars, utilities, buildings and cities, so that they can take advantage of the small pieces of information collected by each of these devices and derive more complex decisions. However, these devices are heterogeneous in nature because of varying vendor support, connectivity options and protocol suites. The heterogeneity of such devices makes it difficult for them to leverage each other's capabilities in the traditional IoT architecture. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployments. Finally, the paper proposes a new architecture based on NFV to address these problems.

  13. HL7 and DICOM based integration of radiology departments with healthcare enterprise information systems.

    PubMed

    Blazona, Bojan; Koncar, Miroslav

    2007-12-01

    Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern healthcare information systems. However, this requirement represents one of the major challenges for Information and Communication Technology (ICT) solutions, as systems today use diverse technologies and proprietary protocols and communication standards which are often not interoperable. Among the main producers of clinical information in healthcare settings are Radiology Information Systems (RIS), which communicate using the widely adopted DICOM (Digital Imaging and COmmunications in Medicine) standard but in very few cases can efficiently integrate information of interest with other systems. In this context we identified the HL7 standard as the world's leading medical ICT standard, envisioned to provide the umbrella for medical data semantic interoperability, which among other things represents the cornerstone of Croatia's National Integrated Healthcare Information System (IHCIS). The aim was to explore the ability to integrate and exchange RIS-originated data with Hospital Information Systems based on HL7's CDA (Clinical Document Architecture) standard. We explored the ability of the HL7 CDA specifications and methodology to address the needs of RIS integration into HL7-based healthcare information systems. We introduced the use of WADO services interconnected to the IHCIS and, finally, CDA rendering in widely used Internet browsers. The outcome of our pilot work confirms our original assumption that the HL7 standard is able to bring radiology data into integrated healthcare systems. Uniform DICOM-to-CDA translation scripts and business processes within the IHCIS are desirable and cost-effective with regard to the use of supporting IHCIS services aligned to SOA.
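
    The WADO service mentioned above retrieves DICOM objects over plain HTTP. The parameter names below (requestType, studyUID, seriesUID, objectUID, contentType) follow the DICOM WADO-URI specification; the host and the UID values are placeholders.

```typescript
// Building a WADO-URI request, e.g. to surface a RIS image referenced
// from a CDA document inside an integrated healthcare system.
function wadoUrl(studyUID: string, seriesUID: string, objectUID: string): string {
  const params = new URLSearchParams({
    requestType: "WADO",
    studyUID,
    seriesUID,
    objectUID,
    contentType: "image/jpeg", // ask the server to render for a browser
  });
  return `https://pacs.example.org/wado?${params.toString()}`; // hypothetical host
}

console.log(wadoUrl("1.2.840.1", "1.2.840.1.2", "1.2.840.1.2.3"));
```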

  14. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    NASA Astrophysics Data System (ADS)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    The well-known heterogeneity and fragmentation of the data models, formats and controlled vocabularies of environmental data limit potential data users in utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation were identified during the development of the specifications: (1) Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes; to achieve cross-domain harmonisation, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model) and managed in a common UML repository. (2) Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content; since mapping to a harmonised vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonised terms, utilising either the extensibility options or additional terminological attributes. (3) Encoding: currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard encoding; however, since the conceptual models are independent of concrete encodings, it is also possible to derive other encodings (e.g. based on RDF). (4) Registers provide unique and persistent identifiers for a number of different types of information items (e.g. terms from a controlled vocabulary or units of measure) and allow their consistent management and versioning; by using these identifiers in data, references to specific information items can be made unique and unambiguous. It is important that these interoperability solutions are not developed in isolation, for Europe only. This was recognised from the beginning, and therefore international standards have been taken into account and widely referred to in INSPIRE. This mutual cooperation with international standardisation activities needs to be maintained or even extended; for example, where INSPIRE has gone beyond existing standards, the INSPIRE interoperability solutions should be introduced to the international standardisation initiatives. However, in some cases it is difficult to choose the appropriate international organisation or standardisation body (e.g. where several organisations overlap in scope) or to achieve international agreements that accept European specifics. Furthermore, the development of the INSPIRE specifications (to be legally adopted in 2013) is only the beginning of the effort to make environmental data interoperable. Their actual implementation by data providers across Europe, as well as rapid developments in the earth sciences (e.g. new simulation models and scientific advances) and in ICT technology, will lead to requests for changes. It is therefore crucial to ensure the long-term sustainable maintenance and further development of the proposed infrastructure. This task cannot be achieved by the INSPIRE coordination team of the European Commission alone; it is crucial to closely involve relevant (where possible, umbrella) organisations in the earth sciences, who can provide the necessary domain knowledge and expert networks.
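
    The "registers" pillar can be made concrete with a short sketch: dereferencing a persistent code-list identifier to retrieve its definition. The INSPIRE registry base URL below is the real public registry, but the specific item path and the assumption that the registry honours JSON content negotiation are illustrative.

```typescript
// Resolving a persistent code-list identifier against a register.
const registryRoot = "https://inspire.ec.europa.eu/codelist/"; // real registry root

async function resolveCodeListItem(path: string): Promise<unknown> {
  // Registries typically serve several representations of one identifier;
  // here we ask for JSON instead of the default HTML page (whether this
  // particular registry honours Accept-based negotiation is an assumption).
  const response = await fetch(registryRoot + path, {
    headers: { Accept: "application/json" },
  });
  return response.json();
}

resolveCodeListItem("LayerTypeValue").then(console.log);
```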

  15. Efficient Results in Semantic Interoperability for Health Care. Findings from the Section on Knowledge Representation and Management.

    PubMed

    Soualmia, L F; Charlet, J

    2016-11-10

    To summarize excellent current research in the field of Knowledge Representation and Management (KRM) within the health and medical care domain. We provide a synopsis of the 2016 IMIA selected articles as well as a related synthetic overview of current and future activities in the field. A first step of the selection was performed through MEDLINE querying with a list of MeSH descriptors completed by a list of terms adapted to the KRM section. The second step of the selection was completed by the two section editors, who separately evaluated the set of 1,432 articles. The third step of the selection consisted of a collective work that merged the evaluation results to retain 15 articles for peer review. The selection and evaluation process for this Yearbook's section on Knowledge Representation and Management yielded four excellent and interesting articles regarding semantic interoperability for health care, achieved by gathering heterogeneous sources (knowledge and data) and auditing ontologies. In the first article, the authors present a solution based on standards and Semantic Web technologies to access distributed and heterogeneous datasets in the domain of breast cancer clinical trials. The second article describes a knowledge-based recommendation system that relies on ontologies and Semantic Web rules in the context of dietary management of chronic diseases. The third article relates to concept recognition and text mining used to derive a common model of human diseases and a phenotypic network of common diseases. In the fourth article, the authors highlight the need for auditing SNOMED CT and propose a crowd-based method for ontology engineering. The current research activities further illustrate the continuous convergence of Knowledge Representation and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care by proposing solutions to cope with the problem of semantic interoperability. Indeed, there is a need for powerful tools able to manage and interpret complex, large-scale and distributed datasets and knowledge bases, but also for user-friendly tools developed for clinicians in their daily practice.

  16. Policy-Based Negotiation Engine for Cross-Domain Interoperability

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Chow, Edward T.

    2012-01-01

    A successful policy negotiation scheme for Policy-Based Management (PBM) has been implemented. Policy negotiation is the process of determining the "best" communication policy that all of the parties involved can agree on. Specifically, the problem is how to reconcile the various (and possibly conflicting) communication protocols used by different divisions. The solution must use protocols available to all parties involved, and should attempt to do so in the best way possible. Which protocols are commonly available, and what the definition of "best" is will be dependent on the parties involved and their individual communications priorities.
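
    A toy version of the negotiation described above: each party advertises the protocols it supports with a local preference weight; the negotiator keeps only the protocols common to all parties and picks the one with the highest combined preference. This is purely illustrative, not the implemented engine, and "best" is deliberately simplified to a summed score.

```typescript
// Minimal policy-negotiation sketch over advertised protocol preferences.
interface PartyPolicy {
  party: string;
  preferences: Map<string, number>; // protocol -> priority (higher is better)
}

function negotiate(parties: PartyPolicy[]): string | null {
  if (parties.length === 0) return null;
  // Keep only protocols available to every party involved.
  const common = [...parties[0].preferences.keys()].filter((p) =>
    parties.every((party) => party.preferences.has(p))
  );
  if (common.length === 0) return null; // no agreement possible
  // Score a protocol by summing each party's preference for it.
  const score = (p: string) =>
    parties.reduce((s, party) => s + (party.preferences.get(p) ?? 0), 0);
  return common.reduce((best, p) => (score(p) > score(best) ? p : best));
}

const result = negotiate([
  { party: "A", preferences: new Map([["https", 3], ["ftp", 1]]) },
  { party: "B", preferences: new Map([["https", 2], ["sftp", 3]]) },
]);
console.log(result); // "https": the only protocol both parties support
```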

  17. Connectivity, interoperability and manageability challenges in internet of things

    NASA Astrophysics Data System (ADS)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Ismail, Ahmad Faris

    2017-09-01

    The vision of the Internet of Things (IoT) is interconnectivity between sensors, actuators, people and processes. IoT exploits connectivity between physical objects like fridges, cars, utilities, buildings and cities to enhance people's lives through automation and data analytics. However, this sudden increase in connected heterogeneous IoT devices takes a huge toll on the existing Internet infrastructure and introduces new challenges for researchers to embark upon. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployment. The paper finally concludes that the IoT architecture and network infrastructure need to be re-engineered from the ground up, so that IoT solutions can be safely and efficiently deployed.

  18. Software support in automation of medicinal product evaluations.

    PubMed

    Juric, Radmila; Shojanoori, Reza; Slevin, Lindi; Williams, Stephen

    2005-01-01

    Medicinal product evaluation is one of the most important tasks undertaken by government health departments and their regulatory authorities in every country in the world. Automation and adequate software support are critical to improving the efficiency and interoperation of regulatory systems across the world. In this paper we propose a software solution that supports the automation of (i) the submission of licensing applications, and (ii) the evaluation of submitted licensing applications according to regulatory authorities' procedures. The novelty of our solution is in allowing licensing applications to be submitted in any country in the world and evaluated according to any evaluation procedure (which can be chosen by either regulatory authorities or pharmaceutical companies). Consequently, submission and evaluation procedures become interoperable, and the associated data repositories/databases can be shared between various countries and regulatory authorities.

  19. Design and study of geosciences data share platform :platform framework, data interoperability, share approach

    NASA Astrophysics Data System (ADS)

    Lu, H.; Yi, D.

    2010-12-01

    Deep exploration is one of the important approaches to geoscience research. We have pursued it since the 1980s and have accumulated a large volume of data. Researchers usually integrate data from both space exploration and deep exploration to study geological structures and represent the Earth's subsurface, and they analyse and interpret results on the basis of the integrated data. Because the exploration approaches differ, the resulting data are heterogeneous, and obtaining and combining these data has long been a source of difficulty for researchers. The problem of data sharing and interaction had to be solved during the development of the SinoProbe research project. Through a study of well-known domestic and international exploration projects and geoscience data platforms, we explore solutions for data sharing and interaction. Based on SOA, we present a deep-exploration data sharing framework comprising three levels: the data level handles data storage and the integration of heterogeneous data; the middle level provides data services for geophysics, geochemistry, etc. by means of web services, and supports various application compositions through GIS middleware and the Eclipse RCP; and the interaction level gives professional and non-professional users access to data of different accuracies. The framework adopts the GeoSciML data interaction approach. GeoSciML is a geoscience information markup language, an application of the OpenGIS Consortium's (OGC) Geography Markup Language (GML). It maps heterogeneous data into a common Earth reference frame and enables interoperation. In this article we discuss how to integrate heterogeneous data and share them within the SinoProbe project.

  20. A multi-service data management platform for scientific oceanographic products

    NASA Astrophysics Data System (ADS)

    D'Anca, Alessandro; Conte, Laura; Nassisi, Paola; Palazzo, Cosimo; Lecci, Rita; Cretì, Sergio; Mancini, Marco; Nuzzo, Alessandra; Mirto, Maria; Mannarini, Gianandrea; Coppini, Giovanni; Fiore, Sandro; Aloisio, Giovanni

    2017-02-01

    An efficient, secure and interoperable data platform solution has been developed in the TESSA project to provide fast navigation and access to the data stored in the data archive, as well as standards-based metadata management support. The platform mainly targets scientific users and high-level situational sea awareness services, such as decision support systems (DSS). The datasets are accessible through three main components: the Data Access Service (DAS), the Metadata Service and the Complex Data Analysis Module (CDAM). The DAS allows access to data stored in the archive by providing interfaces for different protocols and services for downloading, variable selection, data subsetting and map generation. The Metadata Service is the heart of the information system for the TESSA products and completes the overall infrastructure for data and metadata management. This component enables data search and discovery and addresses interoperability by exploiting widely adopted standards for geospatial data. Finally, the CDAM represents the back-end of the TESSA DSS, performing on-demand complex data analysis tasks.

  1. Relevance of health level 7 clinical document architecture and integrating the healthcare enterprise cross-enterprise document sharing profile for managing chronic wounds in a telemedicine context.

    PubMed

    Finet, Philippe; Gibaud, Bernard; Dameron, Olivier; Le Bouquin Jeannès, Régine

    2016-03-01

    The number of patients with complications associated with chronic diseases increases with the ageing population. In particular, complex chronic wounds raise the re-admission rate in hospitals. In this context, the implementation of a telemedicine application in Basse-Normandie, France, contributes to reducing hospital stays and transport. This application requires new collaboration among general practitioners, private duty nurses and hospital staff. However, the main constraint mentioned by the users of this system is the lack of interoperability between the information system of this application and the various partners' information systems. To improve medical data exchange, the authors propose a new implementation based on the introduction of interoperable clinical documents and a digital document repository for managing the sharing of documents between the telemedicine application users. They then show that this technical solution is suitable for any telemedicine application and any document sharing system in a healthcare facility or network.

  2. k-Means Based Fingerprint Segmentation with Sensor Interoperability

    NASA Astrophysics Data System (ADS)

    Yang, Gongping; Zhou, Guang-Tong; Yin, Yilong; Yang, Xiukun

    2010-12-01

    A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originating from a certain sensor, and thus their performance is significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to an algorithm's ability to adapt to raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases obtained from various sensors.
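
    The clustering step can be sketched directly: represent each block by its CMV vector and split blocks into two clusters with 2-means. Feature extraction, centroid initialisation strategy and the morphological postprocessing are omitted; this is an illustration of the general technique, not the authors' exact algorithm.

```typescript
// 2-means clustering of fingerprint blocks described by CMV features.
type CMV = [number, number, number]; // coherence, mean, variance

function dist2(a: CMV, b: CMV): number {
  return a.reduce((s, v, i) => s + (v - b[i]) ** 2, 0);
}

function kmeans2(blocks: CMV[], iterations = 20): number[] {
  if (blocks.length < 2) return blocks.map(() => 0);
  // Naive initialisation from the first and last block (a sketch only).
  let centroids: CMV[] = [blocks[0], blocks[blocks.length - 1]];
  let labels: number[] = [];
  for (let it = 0; it < iterations; it++) {
    // Assignment step: each block joins its nearest centroid.
    labels = blocks.map((b) =>
      dist2(b, centroids[0]) <= dist2(b, centroids[1]) ? 0 : 1
    );
    // Update step: recompute each centroid as the mean of its members.
    centroids = [0, 1].map((c) => {
      const members = blocks.filter((_, i) => labels[i] === c);
      if (members.length === 0) return centroids[c];
      return [0, 1, 2].map(
        (d) => members.reduce((s, m) => s + m[d], 0) / members.length
      ) as CMV;
    });
  }
  return labels; // 0/1 label per block (e.g. foreground vs background)
}
```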

  3. Towards an ontology for data quality in integrated chronic disease management: a realist review of the literature.

    PubMed

    Liaw, S T; Rahimi, A; Ray, P; Taggart, J; Dennis, S; de Lusignan, S; Jalaludin, B; Yeo, A E T; Talaei-Khoei, A

    2013-01-01

    Effective use of routine data to support integrated chronic disease management (CDM) and population health depends on underlying data quality (DQ) and, for cross-system use of data, on semantic interoperability. An ontological approach to DQ is a potential solution, but research in this area is limited and fragmented. Our objective was to identify mechanisms, including ontologies, to manage DQ in integrated CDM, and to determine whether improved DQ enables better measurement of health outcomes. We conducted a realist review of English-language studies (January 2001-March 2011) that addressed data quality, used ontology-based approaches and were relevant to CDM. We screened 245 papers, excluded 26 duplicates, 135 on abstract review and 31 on full-text review, leaving 61 papers for critical appraisal. Of the 33 papers that examined ontologies in chronic disease management, 13 defined data quality and 15 used ontologies for DQ. Most saw DQ as a multidimensional construct, the most used dimensions being completeness, accuracy, correctness, consistency and timeliness. The majority of studies reported tool design and development (80%), implementation (23%), and descriptive evaluations (15%). Ontological approaches were used to address semantic interoperability, decision support, flexibility of information management and integration/linkage, and complexity of information models. DQ lacks a consensus conceptual framework and definition. DQ and ontological research is relatively immature, with few rigorous evaluation studies published. Ontology-based applications could support automated processes to address DQ and semantic interoperability in repositories of routinely collected data to deliver integrated CDM. We advocate moving to ontology-based design of information systems to enable more reliable use of routine data to measure health mechanisms and impacts. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Ocean Data Interoperability Platform (ODIP): developing a common global framework for marine data management through international collaboration

    NASA Astrophysics Data System (ADS)

    Glaves, Helen

    2015-04-01

    Marine research is rapidly moving away from traditional discipline specific science to a wider ecosystem level approach. This more multidisciplinary approach to ocean science requires large amounts of good quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures being established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates that there is interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.

  5. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    NASA Astrophysics Data System (ADS)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumed to be a message from some other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach excludes the assumption of awareness of co-existence between the two interoperating systems. It thus establishes links between the research on interoperability of systems and on intelligent software agents, one of the systems' digital identities.

  6. Employing Semantic Technologies for the Orchestration of Government Services

    NASA Astrophysics Data System (ADS)

    Sabol, Tomáš; Furdík, Karol; Mach, Marián

    The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, to reduce the administrative burden on citizens and enterprises - these are only a few reasons why the eGovernment paradigm has shifted from a supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). At the EU level, interoperability is explicitly addressed as one of the four main challenges, for example in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.

  7. Case Studies of Ecological Integrative Information Systems: The Luquillo and Sevilleta Information Management Systems

    NASA Astrophysics Data System (ADS)

    San Gil, Inigo; White, Marshall; Melendez, Eda; Vanderbilt, Kristin

    The thirty-year-old United States Long Term Ecological Research Network has developed extensive metadata to document its scientific data. Standard and interoperable metadata is a core component of the data-driven analytical solutions developed by this research network. Content management systems offer an affordable solution for rapid deployment of metadata-centered information management systems. We developed a customized integrative metadata management system based on the Drupal content management system technology. Building on knowledge and experience with the Sevilleta and Luquillo Long Term Ecological Research sites, we successfully deployed the first two medium-scale customized prototypes. In this paper, we describe the vision behind our Drupal-based information management instances, and list the features offered through these Drupal-based systems. We also outline plans to expand the information services offered through these metadata-centered management systems. We conclude with the growing list of participants deploying similar instances.

  8. IHE cross-enterprise document sharing for imaging: interoperability testing software

    PubMed Central

    2010-01-01

    Background With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results In this paper we describe software used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties. PMID:20858241

  9. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    PubMed

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe software used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties.

  10. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
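
    To make the format-customization idea concrete, the hypothetical sketch below posts a tiny BioC collection to an Argo-like annotation service and asks for RDF back; the endpoint URL and the informat/outformat parameter names are assumptions, not the actual Argo API:

```python
# Hypothetical client for a format-configurable text-mining web service.
# The endpoint and parameter names are invented; only the idea of choosing
# input/output interchange formats per request comes from the article.
import requests

BIOC_DOC = """<collection><document><id>d1</id>
<passage><offset>0</offset><text>p53 phosphorylation</text></passage>
</document></collection>"""

resp = requests.post(
    "https://example.org/argo-like/annotate",         # placeholder endpoint
    params={"informat": "bioc", "outformat": "rdf"},  # assumed parameter names
    data=BIOC_DOC.encode("utf-8"),
    headers={"Content-Type": "application/xml"},
    timeout=30,
)
resp.raise_for_status()
print(resp.text)  # annotations serialized in the requested output format
```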

  11. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  12. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination.

    PubMed

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-10-01

    Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two interoperable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Opal and Mica are two standalone but interoperable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Opal and Mica are open-source and freely available at www.obiba.org under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association

  13. Local, regional and national interoperability in hospital-level systems architecture.

    PubMed

    Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J

    2007-01-01

    Interoperability of applications in health care must meet various needs of patients, health professionals, organizations and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but need approaches which combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications at the local hospital level. We discuss the systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities are identified, with a medium-term focus which acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot entirely rely on common standards and frameworks but must be locally adapted and complemented.

  14. EPA Scientific Knowledge Management Assessment and ...

    EPA Pesticide Factsheets

    A series of activities has been conducted by a core group of EPA scientists from across the Agency. The activities were initiated in 2012, and the focus was to increase the reuse and interoperability of science software at EPA. The need for increased reuse and interoperability is linked to the increased complexity of environmental assessments in the 21st century. This complexity is manifest in the form of problems that require integrated multi-disciplinary solutions. To enable the means to develop these solutions (i.e., science software systems), it is necessary to integrate software developed by disparate groups representing a variety of science domains. Thus, reuse and interoperability become imperative. This report briefly describes the chronology of activities conducted by the group of scientists to provide context for the primary purpose of this report, that is, to describe the proceedings and outcomes of the latest activity, a workshop entitled “Workshop on Advancing US EPA integration of environmental and information sciences”. The EPA has been lagging in digital maturity relative to the private sector and even other government agencies. This report helps begin the process of improving the agency’s use of digital technologies, especially in the areas of efficiency and transparency. This report contributes to SHC 1.61.2.

  15. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability through a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type of interoperability most directly enabled by an information model.

  16. Warthog: A MOOSE-Based Application for the Direct Code Coupling of BISON and PROTEUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.; Slattery, Stuart; Billings, Jay Jay

    The Nuclear Energy Advanced Modeling and Simulation (NEAMS) program from the Department of Energy's Office of Nuclear Energy provides a robust toolkit for the modeling and simulation of current and future advanced nuclear reactor designs. This toolkit provides these technologies organized across product lines: two divisions targeted at fuels and end-to-end reactor modeling, and a third for integration, coupling, and high-level workflow management. The Fuels Product Line and the Reactor Product Line provide advanced computational technologies that serve each respective field well; however, their current lack of integration presents a major impediment to future improvements of simulation solution fidelity. There is a desire for the capability to mix and match tools across Product Lines in an effort to utilize the best from both to improve NEAMS modeling and simulation technologies. This report details a new effort to provide this Product Line interoperability through the development of a new application called Warthog. This application couples the BISON Fuel Performance application from the Fuels Product Line and the PROTEUS Core Neutronics application from the Reactors Product Line in an effort to utilize the best from all parts of the NEAMS toolkit and improve the overall solution fidelity of nuclear fuel simulations. To achieve this, Warthog leverages as much prior work from the NEAMS program as possible, and in doing so enables interoperability between the disparate MOOSE and SHARP frameworks and the libMesh and MOAB mesh data formats. This report describes this work in full. We begin with a detailed look at the individual NEAMS framework technologies used and developed in the various Product Lines, and the current status of their interoperability. We then introduce the Warthog application: its overall architecture and the ways it leverages the best existing tools from across the NEAMS toolkit to enable BISON-PROTEUS integration. Furthermore, we show how Warthog leverages a tool known as DataTransferKit to seamlessly enable the transfer of solution data between disparate frameworks and mesh formats. To end, we demonstrate tests for the direct software coupling of BISON and PROTEUS using Warthog, and discuss current impediments and solutions to the construction of physically realistic input models for this coupled BISON-PROTEUS system.

  17. Distributed heterogeneous inspecting system and its middleware-based solution.

    PubMed

    Huang, Li-can; Wu, Zhao-hui; Pan, Yun-he

    2003-01-01

    There are many cases when an organization needs to monitor the data and operations of its supervised departments, especially those departments which are not owned by this organization and are managed by their own information systems. A Distributed Heterogeneous Inspecting System (DHIS) is the system an organization uses to monitor its supervised departments by inspecting their information systems. In DHIS, the inspected systems are generally distributed, heterogeneous, and constructed by different companies. DHIS has three key processes: abstracting core data sets and core operation sets, collecting these sets, and inspecting the collected sets. In this paper, we present the concept and mathematical definition of DHIS, a metadata method for solving the interoperability problem, a security strategy for data transfer, and a middleware-based solution for DHIS. We also describe an example of the inspecting system at the Wenzhou Customs.

  18. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it, separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous geoprocessing workflows.
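
    As a sketch of how a workflow resource might be published under this approach, the snippet below POSTs an Atom entry describing a BPEL workflow to a hypothetical AtomPub collection; the collection URI, category term and referenced workflow document are all invented:

```python
# Sketch of AtomPub-style publication of a workflow resource. The collection
# URI, category term and the referenced BPEL document are placeholders.
import requests

ATOM_NS = "http://www.w3.org/2005/Atom"
entry = f"""<?xml version="1.0"?>
<entry xmlns="{ATOM_NS}">
  <title>NO2 plume processing workflow</title>
  <category term="geoprocessing-workflow"/>
  <content type="application/xml" src="https://example.org/workflows/no2.bpel"/>
</entry>"""

resp = requests.post(
    "https://example.org/workflow-collection",  # assumed AtomPub collection
    data=entry.encode("utf-8"),
    headers={"Content-Type": "application/atom+xml;type=entry"},
    timeout=30,
)
print(resp.status_code, resp.headers.get("Location"))  # expect 201 + member URI
```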

  19. Interoperability in Personalized Adaptive Learning

    ERIC Educational Resources Information Center

    Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin

    2006-01-01

    Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…

  20. Geospatial Brokering - Challenges and Future Directions

    NASA Astrophysics Data System (ADS)

    White, C. E.

    2012-12-01

    An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols (e.g., OGC CSW, REST, and OpenSearch) to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight into future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
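
    As one concrete piece of the protocol mix, OpenSearch discovery boils down to filling in a URL template advertised by the broker; the sketch below does this for an assumed template (the catalog URL and parameter set are illustrative, not a specific broker's interface):

```python
# Sketch of OpenSearch-style query construction against an assumed URL
# template; real brokers advertise their template in a description document.
from urllib.parse import quote

TEMPLATE = ("https://example.org/catalog/opensearch"
            "?q={searchTerms}&start={startIndex?}&format=atom")

def fill(template: str, **params: str) -> str:
    """Substitute template parameters; strip the unused optional one."""
    url = template
    for key, value in params.items():
        url = url.replace("{%s}" % key, quote(value))
        url = url.replace("{%s?}" % key, quote(value))
    return url.replace("&start={startIndex?}", "")  # drop if not supplied

print(fill(TEMPLATE, searchTerms="landsat bathymetry"))
# https://example.org/catalog/opensearch?q=landsat%20bathymetry&format=atom
```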

  1. Multisensor interoperability for persistent surveillance and FOB protection with multiple technologies during the TNT exercise at Camp Roberts, California

    NASA Astrophysics Data System (ADS)

    Murarka, Naveen; Chambers, Jon

    2012-06-01

    Multiple sensors, providing actionable intelligence to the war fighter, often have difficulty interoperating with each other. Northrop Grumman (NG) is dedicated to solving these problems and providing complete solutions for persistent surveillance. In August 2011, NG was invited to participate in the Tactical Network Topology (TNT) Capabilities Based Experimentation at Camp Roberts, CA to demonstrate integrated system capabilities providing Forward Operating Base (FOB) protection. This experiment was an opportunity to leverage previous efforts from NG's Rotorcraft Avionics Innovation Laboratory (RAIL) to integrate five prime systems with widely different capabilities. The five systems included a Hostile Fire and Missile Warning Sensor System, the SCORPION II Unattended Ground Sensor system, the Smart Integrated Vehicle Area Network (SiVAN), the STARLite Synthetic Aperture Radar (SAR)/Ground Moving Target Indications (GMTI) radar system, and a vehicle with a Target Location Module (TLM) and Laser Designation Module (LDM). These systems were integrated with each other and with a Tactical Operations Center (TOC) equipped with RaptorX and Falconview providing a Common Operational Picture (COP) via Cursor on Target (CoT) messages. This paper discusses this exercise, and the lessons learned, from integrating these five prime systems for persistent surveillance and FOB protection.
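
    For readers unfamiliar with Cursor on Target, a CoT message is a small XML event carrying identity, type, time and position; the sketch below builds one such event (the uid, type code and coordinates are made up, and this is not code from the exercise):

```python
# Illustrative generator for a Cursor on Target (CoT) event. All values are
# invented; the vendor integrations from the exercise are not reproduced.
import datetime
import xml.etree.ElementTree as ET

def cot_event(uid: str, cot_type: str, lat: float, lon: float) -> bytes:
    now = datetime.datetime.now(datetime.timezone.utc)
    stale = now + datetime.timedelta(minutes=5)  # message validity window
    event = ET.Element("event", {
        "version": "2.0", "uid": uid, "type": cot_type,
        "time": now.isoformat(), "start": now.isoformat(),
        "stale": stale.isoformat(), "how": "m-g",
    })
    ET.SubElement(event, "point", {
        "lat": str(lat), "lon": str(lon),
        "hae": "0.0", "ce": "10.0", "le": "10.0",
    })
    return ET.tostring(event)

print(cot_event("SCORPION-II-007", "a-f-G-E-S", 35.76, -120.69).decode())
```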

  2. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to the geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially, and maybe additionally temporally, indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (MLs) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards, and how it applies and extends services towards interoperability in the Earth sciences.

  3. Meeting People's Needs in a Fully Interoperable Domotic Environment

    PubMed Central

    Miori, Vittorio; Russo, Dario; Concordia, Cesare

    2012-01-01

    The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes ‘invisible’, as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users' quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today's incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space. PMID:22969322

  4. Meeting people's needs in a fully interoperable domotic environment.

    PubMed

    Miori, Vittorio; Russo, Dario; Concordia, Cesare

    2012-01-01

    The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes 'invisible', as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users' quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today's incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.

  5. An integrative solution for managing, tracing and citing sensor-related information

    NASA Astrophysics Data System (ADS)

    Koppe, Roland; Gerchow, Peter; Macario, Ana; Schewe, Ingo; Rehmcke, Steven; Düde, Tobias

    2017-04-01

    In a data-driven scientific world, the need to capture information on sensors used in the data acquisition process has become increasingly important. Following the recommendations of the Open Geospatial Consortium (OGC), we started by adopting the SensorML standard for describing platforms, devices and sensors. However, it soon became obvious to us that understanding, implementing and filling such standards costs significant effort and cannot be expected from every scientist individually. So we developed a web-based sensor management solution (https://sensor.awi.de) for describing platforms, devices and sensors as a hierarchy of systems which supports tracing changes to a system while hiding complexity. Each platform contains devices where each device can have sensors associated with specific identifiers, contacts, events, related online resources (e.g. manufacturer factsheets, calibration documentation, data processing documentation), sensor output parameters and geo-location. In order to better understand and address real-world requirements, we have closely interacted with field-going scientists in the context of the key national infrastructure project "FRontiers in Arctic marine Monitoring ocean observatory" (FRAM) during the software development. We learned that not only the lineage of observations is crucial for scientists but also alert services using value ranges, flexible output formats and information on data providers (e.g. FTP sources). Most importantly, persistent and citable versions of sensor descriptions are required for traceability and reproducibility, allowing seamless integration with existing information systems, e.g. PANGAEA. Within the context of the EU-funded Ocean Data Interoperability Platform project (ODIP II) and in cooperation with 52north we are providing near real-time data via Sensor Observation Services (SOS) along with sensor descriptions based on our sensor management solution. ODIP II also aims to develop a harmonized SensorML profile for the marine community, which we will adopt in our solution as soon as it is available. In this presentation we will show our sensor management solution, which is embedded in our data flow framework to offer out-of-the-box interoperability with existing information systems and standards. In addition, we will present real-world examples and challenges related to the description and traceability of sensor metadata.
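
    As a sketch of the consumer side of such services, the request below asks an OGC SOS 2.0 endpoint for a SensorML description using the KVP binding; the service URL and procedure identifier are placeholders, not the AWI endpoints:

```python
# Sketch of an OGC SOS 2.0 DescribeSensor request (KVP binding). The service
# URL and the procedure identifier are placeholders.
import requests

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "DescribeSensor",
    "procedure": "https://example.org/sensors/ctd-42",  # assumed identifier
    "procedureDescriptionFormat": "http://www.opengis.net/sensorml/2.0",
}
resp = requests.get("https://example.org/sos", params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:200])  # start of the SensorML document for the device
```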

  6. Semantics-Based Interoperability Framework for the Geosciences

    NASA Astrophysics Data System (ADS)

    Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.

    2008-12-01

    Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges such as the forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require integration of multiple databases associated with global enterprises, implicit semantic-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements which permit expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concepts behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services. This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors such as processes that may influence our understanding of "why" certain events happen. We emphasize the need to go from analysis of data to concepts related to scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, Integration and Analysis) and SEDRE (Semantically-Enabled Data Registration Engine), which utilize ontologies for semantic interoperability and integration.
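
    The inter-ontology relationships mentioned above can be stated as a handful of RDF triples; the rdflib sketch below aligns a made-up EPONT-style class with a made-up SWEET-style concept (the namespaces and class names are invented stand-ins, not the published ontologies):

```python
# Sketch of an inter-ontology alignment expressed in RDF/OWL with rdflib.
# Namespaces and class names are invented stand-ins for EPONT and SWEET.
from rdflib import Graph, Namespace, RDF, OWL

EPONT = Namespace("https://example.org/epont#")
SWEET = Namespace("https://example.org/sweet/phenomena#")

g = Graph()
g.bind("epont", EPONT)
g.bind("sweet", SWEET)
g.add((EPONT.VolcanicEruption, RDF.type, OWL.Class))
g.add((EPONT.VolcanicEruption, OWL.equivalentClass, SWEET.Eruption))

print(g.serialize(format="turtle"))  # reasoners can now treat the two as one
```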

  7. Transportation communications interoperability : phase 2, resource evaluation.

    DOT National Transportation Integrated Search

    2006-12-01

    Based on the Arizona Department of Transportation's (ADOT) previous SPR-561 Needs Assessment study, this report continues the efforts to enhance radio interoperability between Department of Public Safety (DPS) Highway Patrol officers and ...

  8. Semantically Interoperable XML Data

    PubMed Central

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789
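
    The core mechanism, linking schema elements to ontology concepts, can be illustrated in a few lines; in the sketch below an xs:appinfo carries the concept URI and a client extracts it (the schema fragment and concept URI are invented, not the paper's annotation syntax):

```python
# Sketch of semantic annotation in XML Schema: an xs:appinfo links an
# element to an ontology concept. Fragment and URI are illustrative only.
import xml.etree.ElementTree as ET

XSD = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="tumorGrade" type="xs:string">
    <xs:annotation>
      <xs:appinfo source="https://example.org/onto#TumorGrade"/>
    </xs:annotation>
  </xs:element>
</xs:schema>"""

ns = {"xs": "http://www.w3.org/2001/XMLSchema"}
root = ET.fromstring(XSD)
for elem in root.findall("xs:element", ns):
    appinfo = elem.find("xs:annotation/xs:appinfo", ns)
    print(elem.get("name"), "->", appinfo.get("source"))
# tumorGrade -> https://example.org/onto#TumorGrade
```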

  9. Towards semantic interoperability for electronic health records.

    PubMed

    Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam

    2007-01-01

    In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to briefly describe this approach and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. We analysed current approaches to EHR systems, terminology and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability, both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.

  10. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    PubMed

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  11. Achieving interoperability for metadata registries using comparative object modeling.

    PubMed

    Park, Yu Rang; Kim, Ju Han

    2010-01-01

    Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard for the MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to work around semantic and structural limitations of ISO/IEC 11179. As a result it is difficult to address interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter, supporting both the integrated metadata object model and organization-specific MDR formats.
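
    To give a feel for such an exchange format, the sketch below serializes one data element (identifier, name, definition, value domain) to XML; the element and attribute names, and the publicId value, are an assumed format for illustration, not the USHIK or caDSR schema:

```python
# Sketch of an XSD-style exchange record for one ISO/IEC 11179 data element.
# Element/attribute names and the publicId value are assumptions.
import xml.etree.ElementTree as ET

de = ET.Element("DataElement", {"publicId": "2179589", "version": "1.0"})
ET.SubElement(de, "LongName").text = "Patient Gender Code"
ET.SubElement(de, "Definition").text = "Code identifying the patient's gender."
vd = ET.SubElement(de, "ValueDomain", {"datatype": "CHARACTER"})
for code in ("M", "F", "U"):
    ET.SubElement(vd, "PermissibleValue").text = code

ET.dump(de)  # serialized record ready for registration in another MDR
```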

  12. An HLA-Based Approach to Quantify Achievable Performance for Tactical Edge Applications

    DTIC Science & Technology

    2011-05-01

    in: Proceedings of the 2002 Fall Simulation Interoperability Workshop, 02F-SIW-068, Nov 2002. [16] P. Knight, et al. "WBT RTI Independent...Benchmark Tests: Design, Implementation, and Updated Results", in: Proceedings of the 2002 Spring Simulation Interoperability Workshop, 02S-SIW-081, March...Interoperability Workshop, 98F-SIW-085, Nov 1998. [18] S. Ferenci and R. Fujimoto. "RTI Performance on Shared Memory and Message Passing Architectures", in

  13. Solving Identity Management and Interoperability Problems at Pan-European Level

    NASA Astrophysics Data System (ADS)

    Sánchez García, Sergio; Gómez Oliva, Ana

    In a globalized digital world, it is essential for persons and entities to have a recognized and unambiguous electronic identity that allows them to communicate with one another. The management of this identity by public administrations is an important challenge that becomes even more crucial when interoperability among the public administrations of different countries becomes necessary, as persons and entities have different credentials depending on their own national legal frameworks. More specifically, different credentials and legal frameworks cause interoperability problems that prevent reliable access to public services in cross-border scenarios like today's European Union. Work in this doctoral thesis analyzes the problem in detail by studying existing proposals (basically in Europe), proposing improvements to defined architectures and performing practical work to test the viability of solutions. Moreover, this thesis also addresses the long-standing security problem of identity delegation, which is especially important in complex and heterogeneous service delivery environments like those mentioned above. This is a position paper.

  14. Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.

    PubMed

    Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P

    2016-11-14

    The integration of heterogeneous electronic health records systems by building an interoperable nationwide electronic health record system provides indisputable benefits in health care, such as superior health information quality, prevention of medical errors and cost savings. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches, the centralized architecture and the distributed architecture. The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach taking into account the characteristics of the Greek national healthcare system along with the national public administration data communication network infrastructure, for achieving EHR integration with acceptable implementation cost.

  15. The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology

    NASA Astrophysics Data System (ADS)

    Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.

    2015-12-01

    This work presents the informatics platform developed to implement the National Hydrological Operative Information System of Italy. In particular, the presentation will focus on the governing aspects of the cloud infrastructure and brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l'Ambiente dell'Emilia-Romagna), and CNR-IIA (National Research Council of Italy), designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces, in near real time. ISPRA has deployed and governs this system. The presentation will introduce and discuss the technological barriers to interoperability as well as social and policy ones. The adopted solutions will be described, outlining the sustainability challenges and benefits.

  16. A Semantic Cooperation and Interoperability Platform for the European Chambers of Commerce

    NASA Astrophysics Data System (ADS)

    Missikoff, Michele; Taglino, Francesco

    The LD-CAST project aims at developing a semantic cooperation and interoperability platform for the European Chambers of Commerce. Some of the key issues that this platform addresses are: the variety and number of different kinds of resources (i.e., business processes, concrete services) that combine to achieve a business service; the diversity of cultural and procedural models emerging when composing articulated cross-country services; and the limited possibility of reusing similar services in different contexts (for instance, supporting the same service between different countries: an Italian-Romanian cooperation is different from an Italian-Polish one). The objective of the LD-CAST platform, and in particular of the semantic services provided therein, is to address the above problems with flexible solutions. We aim at introducing high levels of flexibility both at the time of development of business processes and concrete services (i.e., operational services offered by service providers) and later, with the possibility of dynamically binding concrete services (c-services) to the selected business process (BP) according to user needs. To this end, an approach based on semantic services and a reference ontology has been proposed.

  17. The GEOSS solution for enabling data interoperability and integrative research.

    PubMed

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have focused on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standard models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.

  18. Lemnos interoperable security project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halbgewachs, Ronald D.

    2010-03-01

    With the Lemnos framework, interoperability of control security equipment is straightforward. To obtain interoperability between proprietary security appliance units, one or both vendors must now write cumbersome 'translation code.' If one party changes something, the translation code 'breaks.' The Lemnos project is developing and testing a framework that uses widely available security functions and protocols like IPsec (to form a secure communications channel) and Syslog (to exchange security log messages). Using this model, security appliances from two or more different vendors can clearly and securely exchange information, helping to better protect the total system. Simplify regulatory compliance in a complicated security environment by leveraging the Lemnos framework. As an electric utility, are you struggling to implement the NERC CIP standards and other regulations? Are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs. The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols like IPsec and Syslog to create a secure communications channel between appliances in order to exchange security data.
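
    The Syslog half of such a channel is simple to exercise; the sketch below forwards a security event to a peer collector using Python's standard library (the collector address is a placeholder, and the IPsec tunnel the framework assumes underneath is not shown):

```python
# Sketch of exchanging a security log message over Syslog, as in the Lemnos
# channel. The peer address is a placeholder; IPsec transport is assumed to
# be provided underneath and is not shown here.
import logging
import logging.handlers

logger = logging.getLogger("lemnos-demo")
logger.setLevel(logging.INFO)
handler = logging.handlers.SysLogHandler(
    address=("192.0.2.10", 514),  # placeholder peer collector (UDP)
    facility=logging.handlers.SysLogHandler.LOG_AUTH,
)
logger.addHandler(handler)

logger.info("ipsec-sa-established peer=192.0.2.10 spi=0x2a")
```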

  19. Understanding dental CAD/CAM for restorations--dental milling machines from a mechanical engineering viewpoint. Part B: labside milling machines.

    PubMed

    Lebon, Nicolas; Tapie, Laurent; Duret, Francois; Attal, Jean-Pierre

    2016-01-01

    Nowadays, dental numerically controlled (NC) milling machines are available for dental laboratories (labside solution) and dental production centers. This article provides a mechanical engineering approach to NC milling machines to help dental technicians understand the involvement of technology in digital dentistry practice. The technical and economic criteria are described for four labside and two production center dental NC milling machines available on the market. The technical criteria are focused on the capacities of the embedded technologies of milling machines to mill prosthetic materials and various restoration shapes. The economic criteria are focused on investment cost and interoperability with third-party software. The clinical relevance of the technology is discussed through the accuracy and integrity of the restoration. It can be asserted that dental production center milling machines offer a wider range of materials and types of restoration shapes than labside solutions, while labside solutions offer a wider range than chairside solutions. The accuracy and integrity of restorations may be improved as a function of the embedded technologies provided. However, the more complex the technical solutions available, the more skilled the user must be. Investment cost and interoperability with third-party software increase according to the quality of the embedded technologies implemented. Each private dental practice may decide which fabrication option to use depending on the scope of the practice.

  20. Ontology method for 3DGIS modeling

    NASA Astrophysics Data System (ADS)

    Sun, Min; Chen, Jun

    2006-10-01

    Data modeling remains a difficult problem in 3DGIS, and no satisfactory solution has been provided to date, for a variety of reasons. In this paper, a new solution named the "ontology method" is proposed. Traditional GIS modeling methods focus mainly on geometrical modeling, i.e., they try to abstract geometric primitives for object representation; this kind of modeling proves awkward in the 3DGIS modeling process. The ontology method begins by establishing a set of ontologies at different levels. The essential difference of this method is that it swaps the positions that 'spatial data' and 'attribute data' hold in the 2DGIS modeling process when modeling for 3DGIS. The ontology method has advantages on many fronts: an ontology-based system readily supports interoperation for communication and data mining for knowledge deduction, among other benefits.

  1. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    NASA Technical Reports Server (NTRS)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open, inter-operable software development and software reuse.

  2. Reuse and Interoperability of Avionics for Space Systems

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.

    2007-01-01

    The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product line approach to avionics, the space industry still lacks similar standards and common practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards is outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.

  3. Biometric identity management for standard mobile medical networks.

    PubMed

    Egner, Alexandru; Soceanu, Alexandru; Moldoveanu, Florica

    2012-01-01

    The explosion of healthcare costs over the last decade has prompted the ICT industry to respond with solutions for reducing costs while improving healthcare quality. The ISO/IEEE 11073 family of standards recently released is the first step towards interoperability of mobile medical devices used in patient environments. The standards do not, however, tackle security problems, such as identity management, or the secure exchange of medical data. This paper proposes an enhancement of the ISO/IEEE 11073-20601 protocol with an identity management system based on biometry. The paper describes a novel biometric-based authentication process, together with the biometric key generation algorithm. The proposed extension of the ISO/IEEE 11073-20601 is also presented.

  4. Moving Beyond the 10,000 Ways That Don't Work

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Arctur, D. K.; Rueda, C.

    2009-12-01

    From his research into developing light bulb filaments, Thomas Edison provided us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one of those ventures that is difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) for publishing data and its descriptions, and different vocabularies for naming things (e.g., parameters, sensor types). Taming this heterogeneity of technologies is accomplished not only by adopting standards, but also by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison can provide us a hint. Prototypes and test beds are essential to achieving interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which is advancing marine data integration and re-use by promoting community solutions, adopted this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration grew into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the ocean-observing community to advance the interoperability of ocean observing systems using OGC standards. Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms. Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically-enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned, in particular the strategy of experimenting with technologies to advance standards for publishing data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaboration among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.

  5. Smart Grid Interoperability Maturity Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Levinson, Alex; Mater, J.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  6. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    PubMed

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing this heterogeneity by following open standards is a necessary step towards communicating with a wide variety of IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
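
    Of the four protocols, the OGC SensorThings API is the most web-native: it exposes entities such as Datastreams and Observations over REST/JSON with OData-style query options. The sketch below shows the kind of read request a client would issue; the base URL is a placeholder, not a real service.

    ```python
    import json
    import urllib.request

    # Read the five most recent Observations of a Datastream from an OGC
    # SensorThings API service. The endpoint URL is a placeholder.
    BASE = "https://example.org/SensorThingsService/v1.0"
    url = (f"{BASE}/Datastreams(1)/Observations"
           "?$top=5&$orderby=phenomenonTime%20desc")

    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)

    for obs in payload.get("value", []):
        print(obs["phenomenonTime"], obs["result"])
    ```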

  7. Clinical data integration model. Core interoperability ontology for research using primary care data.

    PubMed

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trial databases, and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European primary care. TRANSFoRm utilizes a unified structural/terminological interoperability framework based on the local-as-view mediation paradigm. Such an approach requires the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirements analysis, no existing ontology focusing on primary care research was identified, so we designed a realist ontology based on Basic Formal Ontology to support our framework, in collaboration with the various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as the core ontology of such an approach, enables simplicity and consistency of design across a heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
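
    As a rough illustration of how an eligibility criterion can be expressed against a mediated ontology, the toy below queries a hand-built RDF graph with SPARQL via rdflib. The class and property names under the `ex` namespace are invented for the example; the real CDIM ontology and its terminologies are not reproduced here.

    ```python
    from rdflib import Graph, Namespace, RDF, Literal

    # Toy graph queried in the local-as-view spirit of CDIM. All names under
    # the `ex` namespace are invented for illustration.
    ex = Namespace("http://example.org/cdim#")
    g = Graph()
    g.add((ex.obs1, RDF.type, ex.ClinicalObservation))
    g.add((ex.obs1, ex.hasCode, Literal("C10")))      # e.g. a primary care code
    g.add((ex.obs1, ex.aboutPatient, ex.patient42))

    # An eligibility criterion ("patients with an observation coded C10")
    # expressed as a SPARQL query against the mediated model.
    q = """
    SELECT ?patient WHERE {
      ?obs a ex:ClinicalObservation ;
           ex:hasCode "C10" ;
           ex:aboutPatient ?patient .
    }
    """
    for row in g.query(q, initNs={"ex": ex}):
        print(row.patient)
    ```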

  8. Application-Level Interoperability Across Grids and Clouds

    NASA Astrophysics Data System (ADS)

    Jha, Shantenu; Luckow, Andre; Merzky, Andre; Erdely, Miklos; Sehgal, Saurabh

    Application-level interoperability is defined as the ability of an application to utilize multiple distributed heterogeneous resources. Such interoperability is becoming increasingly important with growing volumes of data, multiple sources of data, and multiple resource types. The primary aim of this chapter is to understand the different ways in which application-level interoperability can be provided across distributed infrastructure. We achieve this by (i) using the canonical wordcount application, based on an enhanced version of MapReduce that scales out across clusters, clouds, and HPC resources, (ii) establishing how SAGA enables the execution of the wordcount application using MapReduce and other programming models, such as Sphere, concurrently, and (iii) demonstrating the scale-out of ensemble-based biomolecular simulations across multiple resources. We show user-level control of the relative placement of compute and data, and provide simple performance measures and analysis of SAGA-MapReduce when using multiple, different, heterogeneous infrastructures concurrently for the same problem instance. Finally, we discuss Azure and some of the system-level abstractions that it provides, and show how it is used to support ensemble-based biomolecular simulations.
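
    For readers unfamiliar with the canonical example, the snippet below reduces wordcount to its map/shuffle/reduce phases in plain Python. It is not the SAGA-MapReduce API; it only shows the structure that SAGA distributes across clusters, clouds, and HPC resources.

    ```python
    from collections import defaultdict
    from itertools import chain

    def map_phase(chunk: str):
        # Emit a (word, 1) pair for every word in this chunk of input.
        return [(word, 1) for word in chunk.split()]

    def shuffle(pairs):
        # Group intermediate values by key, as the framework would do
        # between the distributed map and reduce phases.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        return {key: sum(values) for key, values in groups.items()}

    chunks = ["to be or not to be", "to interoperate is to scale"]
    counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, chunks))))
    print(counts)  # e.g. {'to': 4, 'be': 2, ...}
    ```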

  9. MRML: an extensible communication protocol for interoperability and benchmarking of multimedia information retrieval systems

    NASA Astrophysics Data System (ADS)

    Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.

    2000-10-01

    While in the area of relational databases interoperability is ensured by common communication protocols (e.g., ODBC/JDBC using SQL), Content-Based Image Retrieval Systems (CBIRS) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short-term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems. Further, MRML allows researchers to preserve their freedom in extending their systems as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently of each other, achieving a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded from our site.
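
    The sketch below builds an MRML-style query envelope with Python's standard XML tooling. Element and attribute names are illustrative guesses at the flavor of the language, not the actual MRML schema, which should be consulted for the exact vocabulary.

    ```python
    import xml.etree.ElementTree as ET

    # Wrap a content-based "query by example" in an MRML-style XML envelope.
    # All element and attribute names below are invented for illustration.
    mrml = ET.Element("mrml", {"session-id": "demo-1"})
    query = ET.SubElement(mrml, "query-step", {"result-size": "10"})
    example = ET.SubElement(query, "query-by-example")
    example.set("image-url", "http://example.org/images/42.jpg")

    print(ET.tostring(mrml, encoding="unicode"))
    ```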

  10. Nuclear Forensic Lab Interoperability and Criminal Investigation

    DTIC Science & Technology

    2014-08-01


  11. Compatibility Between Metadata Standards: Import Pipeline of CDISC ODM to the Samply.MDR.

    PubMed

    Kock-Schoppenhauer, Ann-Kristin; Ulrich, Hannes; Wagen-Zink, Stefanie; Duhm-Harbeck, Petra; Ingenerf, Josef; Neuhaus, Philipp; Dugas, Martin; Bruland, Philipp

    2018-01-01

    The establishment of a digital healthcare system is a national and community task. The Federal Ministry of Education and Research in Germany is funding consortia, consisting among others of university hospitals, participating in the "Medical Informatics Initiative". The exchange of medical data between research institutions necessitates a place where meta-information about these data is made accessible. Within these consortia, different metadata registry solutions were chosen. To promote interoperability between these solutions, we examined whether the portal of Medical Data Models is suitable for managing and communicating metadata and relevant information across the different data integration centres of the Medical Informatics Initiative and beyond. Apart from the MDM portal, ISO 11179-based systems such as the Samply.MDR as well as openEHR-based solutions are going to be applied. In this paper, we focus on the creation of a mapping model between the CDISC ODM standard and the Samply.MDR import format. In summary, it can be stated that the mapping model is feasible and promotes exchangeability between different metadata registry approaches.
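
    A first step in such a mapping pipeline is extracting data element definitions (ItemDefs) from an ODM file. The sketch below assumes the CDISC ODM 1.3 namespace and uses a simple dict as a stand-in for the Samply.MDR import format, which is not reproduced here.

    ```python
    import xml.etree.ElementTree as ET

    ODM_NS = {"odm": "http://www.cdisc.org/ns/odm/v1.3"}  # assumed ODM 1.3

    odm_xml = """<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
      <Study OID="S1"><MetaDataVersion OID="MDV1" Name="v1">
        <ItemDef OID="I.SBP" Name="Systolic blood pressure" DataType="integer"/>
      </MetaDataVersion></Study>
    </ODM>"""

    root = ET.fromstring(odm_xml)
    for item in root.iterfind(".//odm:ItemDef", ODM_NS):
        # Stand-in record; the real Samply.MDR import format differs.
        record = {
            "designation": item.get("Name"),
            "datatype": item.get("DataType"),
            "source_oid": item.get("OID"),
        }
        print(record)
    ```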

  12. Design challenges and gaps in standards in developing an interoperable zero footprint DI thin client for use in image-enabled electronic health record solutions

    NASA Astrophysics Data System (ADS)

    Agrawal, Arun; Koff, David; Bak, Peter; Bender, Duane; Castelli, Jane

    2015-03-01

    The deployment of regional and national Electronic Health Record solutions has been a focus of many countries throughout the past decade. A major challenge for these deployments has been support for ubiquitous image viewing. More specifically, these deployments require an imaging solution that can work over the Internet, leverage any point-of-service device (desktop, tablet, phone), and access imaging data from any source seamlessly. Whereas standards exist to enable ubiquitous image viewing, few if any solutions exist that leverage these standards and meet the challenge. Rather, most currently available web-based DI viewing solutions are either proprietary or require special plugins. We developed a true zero-footprint, browser-based DI viewing solution based on the Web Access to DICOM Objects (WADO) and Cross-enterprise Document Sharing for Imaging (XDS-I.b) standards, to a) demonstrate that a truly ubiquitous image viewer can be deployed, and b) identify the gaps in the current standards and the design challenges in developing such a solution. The objective was to develop a viewer that works on all modern browsers on both desktop and mobile devices. The implementation allows the basic viewing functionalities of scroll, zoom, pan, and (limited) window leveling. The major gap identified in the current DICOM WADO standard is the lack of support for any kind of 3D reconstruction or MPR views. Other design challenges explored include considerations related to optimizing the solution for response time and a low memory footprint.
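
    The retrieval side of such a viewer reduces to parameterized HTTP GETs. The sketch below shows a WADO-URI style request for a server-rendered JPEG of a single DICOM instance; the server URL and the UIDs are placeholders.

    ```python
    import urllib.parse
    import urllib.request

    # WADO-URI style request for one instance rendered as JPEG. Server URL
    # and UIDs are placeholders for illustration only.
    params = urllib.parse.urlencode({
        "requestType": "WADO",
        "studyUID": "1.2.840.113619.2.1",      # placeholder UIDs
        "seriesUID": "1.2.840.113619.2.1.1",
        "objectUID": "1.2.840.113619.2.1.1.1",
        "contentType": "image/jpeg",
    })
    url = f"https://pacs.example.org/wado?{params}"

    with urllib.request.urlopen(url) as resp:
        jpeg_bytes = resp.read()  # hand off to, e.g., an HTML5 canvas
    ```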

  13. Using ontological inference and hierarchical matchmaking to overcome semantic heterogeneity in remote sensing-based biodiversity monitoring

    NASA Astrophysics Data System (ADS)

    Nieland, Simon; Kleinschmit, Birgit; Förster, Michael

    2015-05-01

    Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.

  14. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran

    PubMed Central

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential for utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time was less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports accorded with immunization guidelines, and the calculated next-visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452

  15. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    PubMed

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential for utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time was less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports accorded with immunization guidelines, and the calculated next-visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information.
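
    In the SOA pattern described above, one IIS invokes another's web service to retrieve an immunization history. The sketch below shows what such a synchronous call might look like with the Python `zeep` SOAP client; the WSDL location and the operation name `GetImmunizationHistory` are assumptions, and the real payload would be an HL7 v3 XML document.

    ```python
    from zeep import Client

    # One IIS calling another IIS's SOAP service. The WSDL URL, operation
    # name, and parameter are assumptions for illustration only.
    client = Client("https://iis.example.org/immunization?wsdl")
    history = client.service.GetImmunizationHistory(patientId="INF-0042")

    # Assuming a list-like response of administered doses.
    for dose in history:
        print(dose)
    ```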

  16. Setting the scene for the future: implications of key legal regulations for the development of e-health interoperability in the EU.

    PubMed

    Kautsch, Marcin; Lichoń, Mateusz; Matuszak, Natalia

    2017-10-01

    E-health has experienced dynamic development across the European Union in recent years and enjoys support from the European Commission, which seeks to achieve interoperability of national healthcare systems in order to facilitate free movement. Differences between member states in legal regulations, cultural approaches, and technological solutions may hinder this process. This study compares the legal standing of e-health in Denmark, Poland, Spain and the UK, along with key legal acts and their implications. An academic literature review, together with an analysis of materials found through desk research (reports, legal acts, press articles, governmental web pages and so on), was performed in order to identify aspects relevant to e-health interoperability. The approach to the legal regulation of e-health differs substantially by country, as do the procedures each country has developed regarding patients' consent to the processing of their data, their rights to access and amend their medical data, data confidentiality, and the types of electronic health records. The principles governing the assignment of responsibility for data protection also differ. These legal and technological differences must be reconciled if interoperability of European national e-health systems is to be achieved. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Leveraging the Semantic Web for Adaptive Education

    ERIC Educational Resources Information Center

    Kravcik, Milos; Gasevic, Dragan

    2007-01-01

    In the area of technology-enhanced learning, reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches to overcoming these problems--one attempts to do so via standards, the other by means of the Semantic Web. In practice, these approaches…

  18. SU-E-T-218: The IHE-RO Helper Tool: Demonstrating the Connectivity Issues Solved by IHE-RO.

    PubMed

    Kapoor, Rishabh; Yeung, Daniel; Kumar, Sabari Ajay; Alex, Daley; Kapur, Priyanka; Palta, Jatinder

    2012-06-01

    To develop a Web-based application (IHE-RO Helper) to allow comprehensive review of the interconnectivity and interoperability of various radiotherapy devices, as established through testing sanctioned by the Integrating the Healthcare Enterprise-Radiation Oncology (IHE-RO) initiative. IHE-RO is an initiative sponsored by ASTRO to improve the way computer-based systems in radiation oncology share information using well-defined data exchange standards (DICOM/HL7). At the IHE-RO Connectathon events over the last 4 years, 11 vendors with 14 different products have successfully tested and identified solutions to connectivity problems in treatment planning, simulation, and delivery. Because the test results are highly technical, the interconnectivity issues among RT devices may be overlooked by end users. The IHE-RO Helper tool is designed to operate in simple clinical terms, with queries and presentations organized by treatment techniques and clinical features familiar to practitioners. For example, if you are planning to purchase a treatment planning system capable of generating particular plans (e.g., stereotactic treatments) and are concerned whether the TPS can successfully transfer such data to your treatment management system (TMS) and subsequently to your treatment delivery system (TDS), the IHE-RO Helper can identify the connectivity requirements and list the vendors that have successfully passed an IHE-RO Connectathon and validated their solutions against those specific requirements. The IHE-RO Helper tool provides a graphical and textual user interface to effectively demonstrate the solved interconnectivity problems between TPS, TMS and TDS. A report is also provided that explains the interconnectivity problems and their solutions. The IHE-RO Helper is an effective tool for clearly identifying vendor products that are IHE-RO compliant, thereby encouraging vendor participation in testing and validation. Such a tool will be invaluable in the procurement of new equipment to ensure a priori interoperability with anticipated RT devices deployed in the clinic. This research and development project is supported by the Bankhead-Coley Cancer Research Program grant # RC1-09BW-09-26833. © 2012 American Association of Physicists in Medicine.

  19. Filling the gaps between tools and users: a tool comparator, using protein-protein interaction as an example.

    PubMed

    Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi

    2008-01-01

    Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.

  20. Communication security in open health care networks.

    PubMed

    Blobel, B; Pharow, P; Engel, K; Spiegel, V; Krohn, R

    1999-01-01

    To fulfil the shared care paradigm, health care networks providing open systems interoperability are needed. Such communicating and co-operating health information systems, which deal with sensitive personal medical information across organisational, regional, national or even international boundaries, require appropriate security solutions. Based on a generic security model, an open approach for securing EDI protocols such as HL7, EDIFACT, XDT or XML has been developed within the European MEDSEC project. The approach covers both securing the message itself for transport over an insecure network and transporting the unprotected information via secure channels (SSL, TLS, etc.). Regarding EDI, an open and widely usable security solution has been specified and practically implemented for the examples of secure mailing and secure file transfer (FTP), by wrapping the sensitive information expressed in the corresponding protocols. The results are currently being prepared for standardisation.
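
    The second of the two options, transporting an otherwise unprotected EDI message over a secure channel, is straightforward to sketch with Python's standard library. Host, port, and the truncated HL7-style payload below are placeholders; the first option would instead sign or encrypt the payload itself before sending.

    ```python
    import socket
    import ssl

    # Send an (otherwise unprotected) EDI payload over a TLS channel.
    # Host, port, and payload are placeholders for illustration.
    payload = b"MSH|^~\\&|SENDER|RECEIVER|..."  # truncated example message

    context = ssl.create_default_context()
    with socket.create_connection(("edi.example.org", 4443)) as raw:
        with context.wrap_socket(raw, server_hostname="edi.example.org") as tls:
            tls.sendall(payload)
    ```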

  1. Interoperable Archetypes With a Three Folded Terminology Governance.

    PubMed

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology and, in doing so, improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in EPR systems, research reports conflicting results on the use of structuring and standardization as measures of success. To elucidate this, the paper focuses on the effort to establish a national repository of openEHR-based archetypes in Norway, where clinical terminology could be included with a three-folded benefit for interoperability.

  2. The Next Generation of Interoperability Agents in Healthcare

    PubMed Central

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-01-01

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), an intelligent, agent-based platform that ensures interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems; the functions provided by the systems that handle interoperability cannot be allowed to fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents, through an interface that allows the user to create new agents easily and to monitor their activities in real time, is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in the Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351

  3. The role of markup for enabling interoperability in health informatics.

    PubMed

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we distinguish a number of different forms that interoperability can take and show how they are realized in a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early-stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion of the future possibilities that big data and further standardization will enable.

  4. An Access Control and Trust Management Framework for Loosely-Coupled Multidomain Environments

    ERIC Educational Resources Information Center

    Zhang, Yue

    2010-01-01

    Multidomain environments where multiple organizations interoperate with each other are becoming a reality as can be seen in emerging Internet-based enterprise applications. Access control to ensure secure interoperation in such an environment is a crucial challenge. A multidomain environment can be categorized as "tightly-coupled" and…

  5. RuleML-Based Learning Object Interoperability on the Semantic Web

    ERIC Educational Resources Information Center

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  6. A Problem-Solving Environment for Biological Network Informatics: Bio-Spice

    DTIC Science & Technology

    2007-06-01

    user an environment to access software tools. The Dashboard is built upon the NetBeans Integrated Development Environment (IDE), an open source Java...based integration platform was demonstrated. During the subsequent six month development cycle, the first version of the NetBeans based Bio-SPICE...frameworks (OAA, NetBeans, and Systems Biology Workbench (SBW)[15]), it becomes possible for Bio-SPICE tools to truly interoperate. This interoperation

  7. Design and implementation of a CORBA-based genome mapping system prototype.

    PubMed

    Hu, J; Mungall, C; Nicholson, D; Archibald, A L

    1998-01-01

    CORBA (Common Object Request Broker Architecture), as an open standard, is considered to be a good solution for the development and deployment of applications in distributed heterogeneous environments. This technology can be applied in the bioinformatics area to enhance utilization, management and interoperation between biological resources. This paper investigates issues in developing CORBA applications for genome mapping information systems in the Internet environment with emphasis on database connectivity and graphical user interfaces. The design and implementation of a CORBA prototype for an animal genome mapping database are described. The prototype demonstration is available via: http://www.ri.bbsrc.ac.uk/ark_corba/. jian.hu@bbsrc.ac.uk

  8. Socially Aware Heterogeneous Wireless Networks

    PubMed Central

    Kosmides, Pavlos; Adamopoulou, Evgenia; Demestichas, Konstantinos; Theologou, Michael; Anagnostou, Miltiades; Rouskas, Angelos

    2015-01-01

    The development of smart cities has been the epicentre of many researchers' efforts during the past decade. One of the key requirements for smart city networks is mobility, which is why stable, reliable and high-quality wireless communications are needed in order to connect people and devices. Most research efforts so far have used different kinds of wireless and sensor networks, making interoperability rather difficult to accomplish in smart cities. One common solution proposed in the recent literature is the use of software defined networks (SDNs) to enhance interoperability among the various heterogeneous wireless networks. In addition, SDNs can take advantage of the data retrieved from available sensors and use them as part of the intelligent decision-making process conducted during the resource allocation procedure. In this paper, we propose an architecture combining heterogeneous wireless networks with social networks using SDNs. Specifically, we exploit the information retrieved from location-based social networks regarding users' locations and attempt to predict areas that will be crowded, using specially-designed machine learning techniques. By recognizing potentially crowded areas, we can provide mobile operators with recommendations about areas requiring datacell activation or deactivation. PMID:26110402
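
    As a toy stand-in for the crowd-prediction step, the sketch below trains a classifier on synthetic features that one might derive from location-based social network check-ins. The feature set, labels, and choice of random forest are illustrative, not the authors' specially-designed techniques.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    # Invented features per area: [checkins_last_hour, nearby_events, hour_of_day]
    X = rng.integers(0, 100, size=(200, 3))
    y = (X[:, 0] + 10 * X[:, 1] > 80).astype(int)  # synthetic "crowded" label

    model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    # Predicted crowding could drive datacell activation recommendations.
    print(model.predict([[60, 3, 18]]))
    ```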

  9. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    PubMed Central

    2011-01-01

    Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework. PMID:21806842

  10. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications.

    PubMed

    Katayama, Toshiaki; Wilkinson, Mark D; Vos, Rutger; Kawashima, Takeshi; Kawashima, Shuichi; Nakao, Mitsuteru; Yamamoto, Yasunori; Chun, Hong-Woo; Yamaguchi, Atsuko; Kawano, Shin; Aerts, Jan; Aoki-Kinoshita, Kiyoko F; Arakawa, Kazuharu; Aranda, Bruno; Bonnal, Raoul Jp; Fernández, José M; Fujisawa, Takatomo; Gordon, Paul Mk; Goto, Naohisa; Haider, Syed; Harris, Todd; Hatakeyama, Takashi; Ho, Isaac; Itoh, Masumi; Kasprzyk, Arek; Kido, Nobuhiro; Kim, Young-Joo; Kinjo, Akira R; Konishi, Fumikazu; Kovarskaya, Yulia; von Kuster, Greg; Labarga, Alberto; Limviphuvadh, Vachiranee; McCarthy, Luke; Nakamura, Yasukazu; Nam, Yunsun; Nishida, Kozo; Nishimura, Kunihiro; Nishizawa, Tatsuya; Ogishima, Soichi; Oinn, Tom; Okamoto, Shinobu; Okuda, Shujiro; Ono, Keiichiro; Oshita, Kazuki; Park, Keun-Joon; Putnam, Nicholas; Senger, Martin; Severin, Jessica; Shigemoto, Yasumasa; Sugawara, Hideaki; Taylor, James; Trelles, Oswaldo; Yamasaki, Chisato; Yamashita, Riu; Satoh, Noriyuki; Takagi, Toshihisa

    2011-08-02

    The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.

  11. Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2007-01-01

    This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and secure interoperation between sensor webs owned and controlled by various entities. Selected terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability between integrated, intelligent sensor webs, both among terrestrial assets and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.

  12. [Barriers to Digitalisation of Healthcare in Germany: A Survey of Experts].

    PubMed

    Nohl-Deryk, Pascal; Brinkmann, Jesaja Kenneth; Gerlach, Ferdinand Michael; Schreyögg, Jonas; Achelrod, Dmitrij

    2018-01-04

    Digital health is a growing area of healthcare with huge potential. Nevertheless, the degree of digitalization in German healthcare is low, both by international comparison and compared with other German industries. Despite political efforts, certain barriers seem to strongly impede the digitalization process in healthcare. We surveyed 18 representative healthcare experts from various sectors in semi-structured interviews on barriers to, and solutions for, digital health. Thematic analysis following Braun and Clarke was used for interpretation. The interviewees identified both stakeholder-specific barriers and barriers cutting across stakeholders. Self-regulatory bodies and the medical profession were found to lack the willingness and organizational structure for digitalization. Lack of evidence and missing interoperability were identified as primary obstacles, while current legislation and financial regulations were rarely mentioned. Infrastructure expansion and interoperability, in particular, would require coordinated state intervention. Positive communication about the possibilities and benefits of digital solutions was also considered important. A strong political will and an overarching strategy, accompanied by a communication concept, seem necessary for digital health to succeed. Regarding legislation, binding specifications, deadlines and sanctions may be needed for self-regulatory bodies, while users should be involved in the development process at an early stage and positive incentives created for using digital solutions. © Georg Thieme Verlag KG Stuttgart · New York.

  13. BabeLO--An Extensible Converter of Programming Exercises Formats

    ERIC Educational Resources Information Center

    Queiros, R.; Leal, J. P.

    2013-01-01

    In the last two decades, there was a proliferation of programming exercise formats that hinders interoperability in automatic assessment. In the lack of a widely accepted standard, a pragmatic solution is to convert content among the existing formats. BabeLO is a programming exercise converter providing services to a network of heterogeneous…

  14. Cross-domain Collaborative Research and People Interoperability: Beyond Knowledge Representation Frameworks

    NASA Astrophysics Data System (ADS)

    Fox, P. A.; Diviacco, P.; Busato, A.

    2016-12-01

    Geo-scientific research collaboration commonly faces complex systems in which multiple skills and competences are needed at the same time. The efficacy of such collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw on domains that are far from each other, and researchers need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration, however, also relies on the interoperability of the people! Smaller, synchronous, face-to-face settings are known to enhance people interoperability, but changing the setting, whether geographically or temporally, or increasing the team's size, diversity and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven necessary and sufficient for achieving multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating a fluid perspective in which science is largely a social construct conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces, built from the different perspectives and ways of thinking of actors with different points of view, approaches and aims, are proposed as a means of enhancing people interoperability in computer-based settings. The contribution details this approach of augmenting knowledge representation frameworks and interfacing them with the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st century. A web-based collaborative portal that integrates both approaches has been developed and will be presented, and we will report on initial tests, which have encouraging results.

  15. Ambulatory measurement of the scapulohumeral rhythm: intra- and inter-operator agreement of a protocol based on inertial and magnetic sensors.

    PubMed

    Parel, I; Cutti, A G; Fiumana, G; Porcellini, G; Verni, G; Accardo, A P

    2012-04-01

    To measure the scapulohumeral rhythm (SHR) in outpatient settings, the motion analysis protocol named ISEO (INAIL Shoulder and Elbow Outpatient protocol) was developed, based on inertial and magnetic sensors. To complete the sensor-to-segment calibration, ISEO requires the involvement of an operator for sensor placement and for positioning the patient's arm in a predefined posture. Since this can affect the measure, this study aimed at quantifying ISEO intra- and inter-operator agreement. Forty subjects were considered, together with two operators, A and B. Three measurement sessions were completed for each subject: two by A and one by B. In each session, the humerus and scapula rotations were measured during sagittal and scapular plane elevation movements. ISEO intra- and inter-operator agreement were assessed by computing, between sessions, the: (1) similarity of the scapulohumeral patterns through the Coefficient of Multiple Correlation (CMC(2)), both considering and excluding the difference of the initial value of the scapula rotations between two sessions (inter-session offset); (2) 95% Smallest Detectable Difference (SDD(95)) in scapula range of motion. Results for CMC(2) showed that the intra- and inter-operator agreement is acceptable (median≥0.85, lower-whisker ≥ 0.75) for most of the scapula rotations, independently from the movement and the inter-session offset. The only exception is the agreement for scapula protraction-retraction and for scapula medio-lateral rotation during abduction (inter-operator), which is acceptable only if the inter-session offset is removed. SDD(95) values ranged from 4.4° to 8.6° for the inter-operator and between 4.9° and 8.5° for the intra-operator agreement. In conclusion, ISEO presents a high intra- and inter-operator agreement, particularly with the scapula inter-session offset removed. Copyright © 2011 Elsevier B.V. All rights reserved.
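
    For readers unfamiliar with the agreement statistics used here, a smallest detectable difference is conventionally derived from the standard error of measurement. The sketch below assumes the common formula SDD95 = 1.96 · √2 · SEM, with SEM derived from the between-session standard deviation and reliability (ICC); the paper's exact computation may differ.

    ```python
    import math

    # Conventional smallest-detectable-difference computation (assumed form).
    def sdd95(sd_between_sessions: float, icc: float) -> float:
        sem = sd_between_sessions * math.sqrt(1.0 - icc)
        return 1.96 * math.sqrt(2.0) * sem

    # e.g. a 5 degree between-session SD at ICC = 0.9 gives about 4.4 degrees
    print(round(sdd95(5.0, 0.9), 1))
    ```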

  16. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    PubMed

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed across multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources is a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from relational databases to the Web Ontology Language (OWL). The proposed service makes use of an algorithm that can transform several data models from different domains, mainly by deploying inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
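
    The core of such a transformation is mechanical: each table becomes an owl:Class and each column a datatype property. The sketch below shows this one direction using rdflib; the table definition is invented, and the service's inheritance rules are omitted.

    ```python
    from rdflib import Graph, Namespace, RDF, RDFS
    from rdflib.namespace import OWL, XSD

    # Minimal table-to-ontology sketch: a relational table becomes an
    # owl:Class, each column an owl:DatatypeProperty. Names are invented.
    ex = Namespace("http://example.org/hospital#")
    g = Graph()
    g.bind("ex", ex)

    table = {"name": "Patient", "columns": {"name": XSD.string, "age": XSD.integer}}

    cls = ex[table["name"]]
    g.add((cls, RDF.type, OWL.Class))
    for col, dtype in table["columns"].items():
        prop = ex[f"{table['name']}_{col}"]
        g.add((prop, RDF.type, OWL.DatatypeProperty))
        g.add((prop, RDFS.domain, cls))
        g.add((prop, RDFS.range, dtype))

    print(g.serialize(format="turtle"))
    ```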

  17. Scientific Digital Libraries, Interoperability, and Ontologies

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  18. 49 CFR 232.603 - Design, interoperability, and configuration management requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: 1999; Revised 2002, 2007); (3) AAR S-4220, “ECP Cable-Based Brake DC Power Supply—Performance...; Revised: 2004); (7) AAR S-4260, “ECP Brake and Wire Distributed Power Interoperability Test Procedures...) Approval. A freight train or freight car equipped with an ECP brake system and equipment covered by the AAR...

  19. On the feasibility of interoperable schemes in hand biometrics.

    PubMed

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  20. On the Feasibility of Interoperable Schemes in Hand Biometrics

    PubMed Central

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714
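
    Image-level smoothing of the kind proposed above can be sketched in a few lines: blur away device-specific high-frequency detail, then rescale intensities to a common range. The kernel width here is an assumption, and the paper's feature-level smoothing is not shown.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def normalize_hand_image(img: np.ndarray, sigma: float = 2.0) -> np.ndarray:
        # Damp device-specific detail, then rescale so images from, say, a
        # flat scanner and a webcam become more directly comparable.
        smoothed = gaussian_filter(img.astype(float), sigma=sigma)
        return (smoothed - smoothed.min()) / (np.ptp(smoothed) + 1e-9)

    img = np.random.rand(128, 128)  # stand-in for an acquired hand image
    print(normalize_hand_image(img).shape)
    ```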

  1. Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine

    PubMed Central

    King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas

    2014-01-01

    Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials over one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators. PMID:24748993
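
    A network data specification of this kind boils down to an agreed wire format for command and state packets. The sketch below packs a hypothetical fixed-size datagram, a sequence number, a timestamp, and a 6-DOF pose, with Python's struct module; the actual ITP field layout is not given in the abstract, so every field here is an assumption.

    ```python
    import struct

    # Hypothetical fixed-layout datagram in the spirit of the Interoperable
    # Telerobotics Protocol: sequence number, timestamp, then x y z roll
    # pitch yaw in network byte order. All fields are assumptions.
    ITP_FMT = "!Id6d"

    packet = struct.pack(ITP_FMT, 1, 1700000000.0,
                         0.10, 0.20, 0.30, 0.0, 0.0, 1.57)
    seq, ts, *pose = struct.unpack(ITP_FMT, packet)
    print(seq, ts, pose)
    ```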

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Dave; Stephan, Eric G.; Wang, Weimin

    Through its Building Technologies Office (BTO), the United States Department of Energy's Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO's mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected-buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concept of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  3. IHE based interoperability - benefits and challenges.

    PubMed

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor. However, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience, as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should be given the possibility to realize their specific needs within the boundaries of IHE profiles. Security is so far an optional element, which is one of IHE's greatest omissions. An integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  4. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    PubMed Central

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step towards communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards to resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
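
    As a concrete taste of one of the four standards, the OGC SensorThings API exposes sensing resources as linked REST entities; below is a minimal sketch of querying a hypothetical endpoint for Things and their Datastreams (the base URL is invented; the entity and link names follow the SensorThings specification):

        import requests

        # Hypothetical SensorThings endpoint; "Things", "Datastreams" and the
        # @iot.navigationLink convention come from the OGC SensorThings API.
        BASE = "http://example.org/SensorThingsService/v1.0"

        things = requests.get(f"{BASE}/Things", timeout=10).json()["value"]
        for thing in things:
            # Each Thing links to its Datastreams via a navigation URL.
            ds_url = thing["Datastreams@iot.navigationLink"]
            for ds in requests.get(ds_url, timeout=10).json()["value"]:
                print(thing["name"], "->", ds["name"])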

  5. [The role of Integrating the Healthcare Enterprise (IHE) in telemedicine].

    PubMed

    Bergh, B; Brandner, A; Heiß, J; Kutscha, U; Merzweiler, A; Pahontu, R; Schreiweis, B; Yüksekogul, N; Bronsch, T; Heinze, O

    2015-10-01

    Telemedicine systems are already used today in a variety of areas to improve patient care. The lack of standardization in those solutions leads to a lack of interoperability between the systems. Internationally accepted standards can help to solve this lack of system interoperability. With Integrating the Healthcare Enterprise (IHE), a worldwide initiative of users and vendors is working on the use of defined standards for specific use cases by describing those use cases in so-called IHE Profiles. The aim of this work is to determine how telemedicine applications can be implemented using IHE profiles. Based on a literature review, exemplary telemedicine applications are described and the technical abilities of IHE Profiles are evaluated. These IHE Profiles are examined for their usability and are then evaluated in exemplary telemedicine application architectures. Several IHE Profiles can be identified as useful for intersectoral patient records (e.g. the PEHR at Heidelberg), as well as for point-to-point communication where no patient record is involved. In the area of patient records, the IHE Profile "Cross-Enterprise Document Sharing (XDS)" is often used. Point-to-point communication can be supported using the IHE profile "Cross-Enterprise Document Media Interchange (XDM)". IHE-based telemedicine applications offer caregivers the possibility to be informed about their patients using data from intersectoral patient records, and reusing the standardized interfaces in other scenarios offers potential savings.

  6. Electronic Health Record Application Support Service Enablers.

    PubMed

    Neofytou, M S; Neokleous, K; Aristodemou, A; Constantinou, I; Antoniou, Z; Schiza, E C; Pattichis, C S; Schizas, C N

    2015-08-01

    There is a huge need for open source software solutions in the healthcare domain, given the flexibility, interoperability and resource savings they offer. In this context, this paper presents the development of three open source libraries - Specific Enablers (SEs) for eHealth applications - that were developed under the European project titled "Future Internet Social and Technological Alignment Research" (FI-STAR), funded under the "Future Internet Public Private Partnership" (FI-PPP) program. The three SEs developed under the Electronic Health Record Application Support Service Enablers (EHR-EN) correspond to: a) an Electronic Health Record enabler (EHR SE), b) a patient summary enabler based on the EU project "European patient Summary Open Source services" (epSOS SE) supporting patient mobility and the offering of interoperable services, and c) a Picture Archiving and Communications System (PACS) enabler (PACS SE) based on the dcm4che open source system for the support of medical imaging functionality. The EHR SE follows the HL7 Clinical Document Architecture (CDA) V2.0 and supports the Integrating the Healthcare Enterprise (IHE) profiles (recently awarded in Connectathon 2015). These three FI-STAR platform enablers are designed to facilitate the deployment of innovative applications and value-added services in the health care sector. They can be downloaded from the FI-STAR catalogue website. Work in progress focuses on validation and evaluation scenarios to prove and demonstrate the usability, applicability and adaptability of the proposed enablers.

  7. Requirements Development for Interoperability Simulation Capability for Law Enforcement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holter, Gregory M.

    2004-05-19

    The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed for feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders. This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.

  8. An Open Source Extensible Smart Energy Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rankin, Linda

    Aggregated distributed energy resources are the subject of much interest in the energy industry and are expected to play an important role in meeting our future energy needs by changing how we use, distribute and generate electricity. This energy future includes an increased amount of energy from renewable resources, load management techniques to improve resiliency and reliability, and distributed energy storage and generation capabilities that can be managed to meet the needs of the grid as well as individual customers. These energy assets are commonly referred to as Distributed Energy Resources (DER). DERs rely on a means to communicate information between an energy provider and multitudes of devices. Today, DER control systems are typically vendor-specific, using custom hardware and software solutions. As a result, customers are locked into communication transport protocols, applications, tools, and data formats. Today’s systems are often difficult to extend to meet new application requirements, resulting in stranded assets when business requirements or energy management models evolve. By partnering with industry advisors and researchers, a DER research platform called the Smart Energy Framework (SEF) was developed. The hypothesis of this research was that an open source Internet of Things (IoT) framework could play a role in creating a commodity-based ecosystem for DER assets that would reduce costs and provide interoperable products. SEF is based on the AllJoyn™ IoT open source framework. The demonstration system incorporated DER assets, specifically batteries and smart water heaters. To verify the behavior of the distributed system, models of water heaters and batteries were also developed. An IoT interface for communicating between the assets and a control server was defined. This interface supports a series of “events” and telemetry reporting, similar to those defined by current smart grid communication standards. The results of this effort demonstrated the feasibility and application potential of using IoT frameworks for the creation of commodity-based DER systems. All of the identified commodity-based system requirements were met by the AllJoyn framework. By having commodity solutions, small vendors can enter the market and the cost of implementation for all parties is reduced. Utilities and aggregators can choose from multiple interoperable products, reducing the risk of stranded assets. Based on this research it is recommended that interfaces based on existing smart grid communication protocol standards be created for these emerging IoT frameworks. These interfaces should be standardized as part of the IoT framework, allowing for interoperability testing and certification. Similarly, IoT frameworks are introducing application-level security. This type of security is needed for protecting applications and platforms and will be important moving forward. The recommendation is that, along with DER-based data model interfaces, platform and application security requirements also be prescribed when IoT devices support DER applications.
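
    The abstract notes that the IoT interface "supports a series of 'events' and telemetry reporting"; the sketch below shows what such an asset-side interface could look like. The event and field names are invented for illustration and are not the actual SEF/AllJoyn interface.

        from dataclasses import dataclass, field
        from typing import Callable, Dict

        @dataclass
        class DerAsset:
            """Hypothetical DER asset endpoint: reacts to control events and
            reports telemetry to a control server."""
            name: str
            handlers: Dict[str, Callable[[dict], None]] = field(default_factory=dict)

            def on(self, event, handler):
                """Register a handler for a named control event."""
                self.handlers[event] = handler

            def dispatch(self, event, payload):
                """Deliver an incoming event (no-op if unhandled)."""
                self.handlers.get(event, lambda p: None)(payload)

            def telemetry(self):
                """Periodic state report, analogous to smart grid telemetry."""
                return {"asset": self.name, "soc_pct": 74.0, "power_kw": -1.2}

        heater = DerAsset("water_heater_01")
        heater.on("shed_load", lambda p: print("shedding for", p["duration_s"], "s"))
        heater.dispatch("shed_load", {"duration_s": 600})
        print(heater.telemetry())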

  9. The Research Data Alliance

    NASA Astrophysics Data System (ADS)

    Fontaine, K. S.

    2015-12-01

    The Research Data Alliance (RDA) is an international organization created in 2012 to provide researchers with a forum for identifying and removing barriers to data sharing. Since then, RDA has gained over 3000 individual members, over three dozen organizational members, 47 Interest Groups, and 17 Working Groups, all focused on research data sharing. Interoperability is one instantiation of data sharing, but not the only barrier to overcome. Technology limitations, discipline-specific cultures that do not support sharing, a lack of best practices, and a lack of good definitions are only a few items on a long list of situations preventing researchers from sharing their data. This presentation will cover how RDA has grown, some details on how the first eight solutions contribute to interoperability and sharing, and a sneak peek at what's in the pipeline.

  10. An IMS-Based Middleware Solution for Energy-Efficient and Cost-Effective Mobile Multimedia Services

    NASA Astrophysics Data System (ADS)

    Bellavista, Paolo; Corradi, Antonio; Foschini, Luca

    Mobile multimedia services have recently become of extreme industrial relevance due to the advances in both wireless client devices and multimedia communications. That has motivated important standardization efforts, such as the IP Multimedia Subsystem (IMS) to support session control, mobility, and interoperability in all-IP next generation networks. Notwithstanding the central role of IMS in novel mobile multimedia, the potential of IMS-based service composition for the development of new classes of ready-to-use, energy-efficient, and cost-effective services is still widely unexplored. The paper proposes an original solution for the dynamic and standard-compliant redirection of incoming voice calls towards WiFi-equipped smart phones. The primary design guideline is to reduce energy consumption and service costs for the final user by automatically switching from the 3G to the WiFi infrastructure whenever possible. The proposal is fully compliant with the IMS standard and exploits the recently released IMS presence service to update device location and current communication opportunities. The reported experimental results point out that our solution, in a simple way and with full compliance with state-of-the-art industrially-accepted standards, can significantly increase battery lifetime without negative effects on call initiation delay.

  11. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    NASA Astrophysics Data System (ADS)

    Schaap, D.

    2015-12-01

    Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently, the project was continued as ODIP 2 for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. To this end, it facilitates an organised dialogue between the key infrastructure representatives by means of publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; and establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE).

  12. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Glaves, Helen

    2016-04-01

    Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently, the project was continued as ODIP II for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. To this end, it facilitates an organised dialogue between the key infrastructure representatives by means of publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; 3. establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE).
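
    For orientation, an OGC Sensor Observation Service is queried with plain key-value HTTP requests; below is a minimal sketch against a hypothetical endpoint (the URL is invented; the parameters follow OGC SOS 2.0 key-value-pair encoding):

        import requests

        SOS = "http://example.org/sos"  # hypothetical SOS endpoint
        params = {
            "service": "SOS",
            "request": "GetCapabilities",
            "acceptVersions": "2.0.0",
        }
        # The returned capabilities document (XML) lists the sensors and
        # observation offerings the service exposes.
        xml = requests.get(SOS, params=params, timeout=10).text
        print(xml[:200])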

  13. The role of basic data registers in cross-border interconnection of eHealth solutions.

    PubMed

    Kregar, Mirjana; Marčun, Tomaž; Dovžan, Irma; Cehovin, Lojzka

    2011-01-01

    The increasingly close international business cooperation in the areas of production, trade, transport and activities such as tourism and education is promoting the mobility of people. This increases the need for the provision of health care services across borders. In order to provide safe, effective and ever higher-quality treatment in these cases as well, it is necessary to ensure that data accompany patients even when they travel to other regions, countries or continents. eHealth solutions are one of the key tools for achieving such objectives. When building these solutions, it is necessary to take into account the different aspects and limitations brought about by the differences between the environments where such treatment of a patient takes place. In the debates on the various types of cross-border interoperability of eHealth solutions, it is necessary to highlight the need for suitable management and interconnection of the data registers that form the basis of every information system: data on patients, health care service providers and basic code tables. It is necessary to promote well-arranged, high-quality data in the patient's domestic environment and the best possible options for transferring and using those data in the foreign environment where the patient is receiving medical care at a particular moment. Many of the discussions dealing with conditions for the interoperability of health care information systems actually start with questions of how to ensure the interconnectivity of basic data registers.

  14. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed at a growing pace, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.

  15. Development and application of a real-time testbed for multiagent system interoperability: A case study on hierarchical microgrid control

    DOE PAGES

    Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.

    2016-08-10

    This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed at a growing pace, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and data distribution service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
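
    Schematically, the framework wraps IEC 61850 logical-node data in FIPA ACL-style messages carried over DDS publish-subscribe topics. The sketch below uses an in-memory bus to stand in for the DDS middleware; the message fields echo FIPA ACL conventions (performative, sender), and "MMXU1.TotW" is an IEC 61850-style reference for measured total active power, used here purely illustratively.

        from collections import defaultdict
        from dataclasses import dataclass

        @dataclass
        class AclMessage:
            performative: str  # FIPA speech act, e.g. "inform", "request"
            sender: str
            topic: str         # stands in for a DDS topic name
            content: dict      # e.g. an IEC 61850 logical-node reading

        class Bus:
            """Toy publisher-subscriber bus standing in for DDS middleware."""
            def __init__(self):
                self.subs = defaultdict(list)

            def subscribe(self, topic, callback):
                self.subs[topic].append(callback)

            def publish(self, msg):
                for callback in self.subs[msg.topic]:
                    callback(msg)

        bus = Bus()
        bus.subscribe("MMXU1.TotW", lambda m: print(m.sender, "->", m.content))
        bus.publish(AclMessage("inform", "feeder_agent",
                               "MMXU1.TotW", {"magnitude_kw": 412.5}))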

  16. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    ERIC Educational Resources Information Center

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…

  17. Recent advances on terrain database correlation testing

    NASA Astrophysics Data System (ADS)

    Sakude, Milton T.; Schiavone, Guy A.; Morelos-Borja, Hector; Martin, Glenn; Cortes, Art

    1998-08-01

    Terrain database correlation is a major requirement for interoperability in distributed simulation. There are numerous situations in which terrain database correlation problems can occur that, in turn, lead to a lack of interoperability in distributed training simulations. Examples are the use of different run-time terrain databases derived from inconsistent source data, the use of different resolutions, and the use of different data models between databases for both terrain and culture data. IST has been developing a suite of software tools, named ZCAP, to address terrain database interoperability issues. In this paper we discuss recent enhancements made to this suite, including improved algorithms for sampling and calculating line-of-sight, an improved method for measuring terrain roughness, and the application of a sparse matrix method to the terrain remediation solution developed at the Visual Systems Lab of the Institute for Simulation and Training. We review the application of some of these new algorithms to the terrain correlation measurement processes. The application of these new algorithms improves our support for very large terrain databases, and provides the capability for performing test replications to estimate the sampling error of the tests. With this set of tools, a user can quantitatively assess the degree of correlation between large terrain databases.
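
    Two of the measures mentioned, line-of-sight calculation and terrain roughness, are easy to illustrate on a gridded heightfield. In the sketch below the roughness metric (RMS of neighbouring elevation differences) is one common choice rather than ZCAP's actual algorithm, and the line-of-sight test is a simple sampled ray check:

        import numpy as np

        def roughness(z):
            """RMS of elevation differences to east/north grid neighbours."""
            dz_x = np.diff(z, axis=1)
            dz_y = np.diff(z, axis=0)
            return float(np.sqrt(np.mean(np.r_[dz_x.ravel()**2,
                                               dz_y.ravel()**2])))

        def line_of_sight(z, a, b, eye=2.0):
            """Sample terrain along a->b; True if no intermediate cell blocks
            the ray drawn at eye height above the two endpoints."""
            (r0, c0), (r1, c1) = a, b
            n = int(max(abs(r1 - r0), abs(c1 - c0))) + 1
            rows = np.linspace(r0, r1, n).round().astype(int)
            cols = np.linspace(c0, c1, n).round().astype(int)
            ray = np.linspace(z[r0, c0] + eye, z[r1, c1] + eye, n)
            return bool(np.all(z[rows, cols][1:-1] <= ray[1:-1]))

        rng = np.random.default_rng(1)
        z = rng.random((50, 50)) * 10.0  # toy heightfield, metres
        print("roughness:", round(roughness(z), 3))
        print("LOS (0,0)->(49,49):", line_of_sight(z, (0, 0), (49, 49)))

    Correlation testing then amounts to computing such measures on two databases covering the same area and comparing the results, with replications to estimate the sampling error.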

  18. Food product tracing technology capabilities and interoperability.

    PubMed

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise identify the supply-chain partner who provided the information prior to sharing this information with product tracing technology providers. The 9 traceability solution providers who agreed to participate in this project have their systems deployed in a wide range of sectors within the food industry including, but not limited to, livestock, dairy, produce, fruits, seafood, meat, and pork; as well as in pharmaceutical, automotive, retail, and other industries. Some have also been implemented across the globe including Canada, China, USA, Norway, and the EU, among others. This broad commercial use ensures that the findings of this work are applicable to a broad spectrum of the food system. Six of the 9 participants successfully completed the data entry phase of this test. To verify successful data entry for these 6, a demo or screenshots of the data set from each system's user interface was requested. Only 4 of the 6 were able to provide us with this evidence for verification. Of the 6 that completed data entry and moved on to the scenarios phase of the test, 5 were able to provide us with the responses to the scenarios. Time metrics were useful for evaluating the scalability and usability of each technology. Scalability was derived from the time it took to enter the nonstandardized data set into the system (ranges from 7 to 11 d). Usability was derived from the time it took to query the scenarios and provide the results (from a few hours to a week). 
    Time was measured as the number of days it took participants to respond after we supplied all the information they would need to successfully execute each test/scenario. Two of the technology solution providers successfully implemented and participated in a proof-of-concept interoperable framework during Year 2 of this study. While not required, they also demonstrated this interoperability capability on the FSMA-mandated food product tracing pilots for the U.S. FDA. This has significant real-world impact, since the demonstration of interoperability enables the U.S. FDA to obtain evidence on the importance and impact of data-sharing moving forward. Another real-world accomplishment is the modification or upgrade of commercial technology solutions to enhance or implement interoperability. As these systems get deployed by clients in the food industry, interoperability will no longer be an afterthought but will be built into their traceability systems. In turn, industry and regulators will better understand the capabilities of the currently available technologies, and the technology provider community will identify ways in which their systems may be further developed to increase interoperability and utility. © 2013 Institute of Food Technologists®

  19. Metadata behind the Interoperability of Wireless Sensor Networks

    PubMed Central

    Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability. PMID:22412330

  20. Metadata behind the Interoperability of Wireless Sensor Networks.

    PubMed

    Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
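
    A contextualising rule in this sense maps observed metadata onto a context status; the toy sketch below assumes a node context with invented metadata fields (battery level, time since last heard from), not the authors' actual rule set:

        def node_context(meta):
            """Toy contextualising rules: classify a node's status from
            its metadata elements."""
            if meta["seconds_since_heard"] > 300:
                return "unreachable"
            if meta["battery_pct"] < 15:
                return "degraded"
            return "operational"

        # A bridging rule would then decide how to keep interoperating,
        # e.g. re-routing around nodes whose context is not "operational".
        for meta in ({"battery_pct": 80, "seconds_since_heard": 12},
                     {"battery_pct": 9, "seconds_since_heard": 30},
                     {"battery_pct": 60, "seconds_since_heard": 900}):
            print(node_context(meta))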

  1. Sharing and interoperation of Digital Dongying geospatial data

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an

    2006-10-01

    The Digital Dongying project was put forward by Dongying city, Shandong province, and approved by the Ministry of Information Industry, the Ministry of Science and Technology and the Ministry of Construction, P.R. China, in 2002. After five years of construction, Dongying's level of informatization has become advanced. To advance the building of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards were drawn up and put into practice. In addition, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed: a highly integrated platform combining WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM, etc. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of "Digital Dongying" geospatial data have come into practice and good results have been obtained. A strong leadership group, however, remains necessary for data sharing and interoperation.

  2. Systems 2020: Strategic Initiative

    DTIC Science & Technology

    2010-08-29

    research areas that enable agile, assured, efficient, and scalable systems engineering approaches to support the development of these systems. … To increase development efficiency and ensure flexible solutions in the field, systems engineers need powerful, agile, interoperable, and scalable … design and development will be transformed as a result of Systems 2020, along with complementary enabling acquisition practice improvements initiated in …

  3. Delivery of QTIiv2 Question Types

    ERIC Educational Resources Information Center

    Wills, Gary B.; Davis, Hugh C.; Gilbert, Lester; Hare, Jonathon; Howard, Yvonne; Jeyes, Steve; Millard, David; Sherratt, Robert

    2009-01-01

    The IMS Question and Test Interoperability (QTI) standard identifies 16 different question types which may be used in online assessment. While some partial implementations exist, the R2Q2 project has developed a complete solution that renders and responds to all 16 question types as specified. In addition, care has been taken in the R2Q2 project…

  4. Applying an Archetype-Based Approach to Electroencephalography/Event-Related Potential Experiments in the EEGBase Resource.

    PubMed

    Papež, Václav; Mouček, Roman

    2017-01-01

    The purpose of this study is to investigate the feasibility of applying openEHR (an archetype-based approach for electronic health records representation) to modeling data stored in EEGBase, a portal for experimental electroencephalography/event-related potential (EEG/ERP) data management. The study evaluates the reuse of existing openEHR archetypes and proposes a set of new archetypes together with the openEHR templates covering the domain. The main goals of the study are to (i) link existing EEGBase data/metadata and openEHR archetype structures and (ii) propose a new openEHR archetype set describing the EEG/ERP domain, since this set of archetypes currently does not exist in public repositories. The main methodology is based on the determination of the concepts obtained from EEGBase experimental data and metadata that are expressible structurally by the openEHR reference model and semantically by openEHR archetypes. In addition, templates as the third openEHR resource allow us to define constraints over archetypes. Clinical Knowledge Manager (CKM), a public openEHR archetype repository, was searched for the archetypes matching the determined concepts. According to the search results, the archetypes already existing in CKM were applied and the archetypes not existing in the CKM were newly developed. openEHR archetypes support linkage to external terminologies. To increase the semantic interoperability of the new archetypes, binding with the existing odML electrophysiological terminology was assured. Further, to increase structural interoperability, other current solutions besides EEGBase were also considered during the development phase. Finally, a set of templates using the selected archetypes was created to meet EEGBase requirements. A set of eleven archetypes that encompass the domain of experimental EEG/ERP measurements was identified. Of these, six were reused without changes, one was extended, and four were newly created. All archetypes were arranged in the templates reflecting the EEGBase metadata structure. A mechanism of odML terminology referencing was proposed to assure semantic interoperability of the archetypes. The openEHR approach was found to be useful not only for clinical purposes but also for experimental data modeling.
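
    Schematically, an openEHR template assembles archetypes and constrains them for one concrete use. The toy structure below is only an illustration: the archetype identifiers follow the standard openEHR naming style but are invented for the EEG/ERP domain, as is the validation helper.

        # Toy template aggregating archetypes for an EEG/ERP experiment record.
        template = {
            "template_id": "eegbase_erp_experiment",
            "archetypes": [
                {"id": "openEHR-EHR-OBSERVATION.eeg_recording.v1",
                 "constraints": {"sampling_rate_hz": {"min": 128, "max": 2048}}},
                {"id": "openEHR-EHR-CLUSTER.electrode_placement.v1",
                 "constraints": {"placement_system": ["10-20", "10-10"]}},
            ],
            # Binding to an external terminology, as done with odML here.
            "terminology_bindings": {"electrode": "odML:electrode"},
        }

        def uses_only_allowed_archetypes(record, tmpl):
            """Check a record instance against the template's archetype set."""
            allowed = {a["id"] for a in tmpl["archetypes"]}
            return set(record["archetypes"]) <= allowed

        record = {"archetypes": ["openEHR-EHR-OBSERVATION.eeg_recording.v1"]}
        print(uses_only_allowed_archetypes(record, template))  # True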

  5. Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.

    PubMed

    Blobel, B G M E; Engel, K; Pharow, P

    2006-01-01

    To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. the structure and functions of the cooperating systems' components, as well as the approach to knowledge representation, i.e. the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing convergence between the competing approaches.

  6. Open architecture design and approach for the Integrated Sensor Architecture (ISA)

    NASA Astrophysics Data System (ADS)

    Moulton, Christine L.; Krzywicki, Alan T.; Hepp, Jared J.; Harrell, John; Kogut, Michael

    2015-05-01

    Integrated Sensor Architecture (ISA) is designed in response to stovepiped integration approaches. The design, based on the principles of Service Oriented Architectures (SOA) and Open Architectures, addresses the problem of integration, and is not designed for specific sensors or systems. The use of SOA and Open Architecture approaches has led to a flexible, extensible architecture. Using these approaches, and supported with common data formats, open protocol specifications, and Department of Defense Architecture Framework (DoDAF) system architecture documents, an integration-focused architecture has been developed. ISA can help move the Department of Defense (DoD) from costly stovepipe solutions to a more cost-effective plug-and-play design to support interoperability.

  7. What do we know about developing patient portals? a systematic literature review.

    PubMed

    Otte-Trojel, Terese; de Bont, Antoinette; Rundall, Thomas G; van de Klundert, Joris

    2016-04-01

    Numerous articles have reported on the development of patient portals, including development problems and solutions. We review these articles to inform future patient portal development efforts and to provide a summary of the evidence base that can guide future research. We performed a systematic review of relevant literature to answer 5 questions: (1) What categories of problems related to patient portal development have been defined? (2) What causal factors have been identified by problem analysis and diagnosis? (3) What solutions have been proposed to ameliorate these causal factors? (4) Which proposed solutions have been implemented and in which organizational contexts? (5) Have implemented solutions been evaluated and what learning has been generated? Through searches on PubMed, ScienceDirect and LISTA, we included 109 articles. We identified 5 main problem categories: achieving patient engagement, provider engagement, appropriate data governance, security and interoperability, and a sustainable business model. Further, we identified key factors contributing to these problems as well as solutions proposed to ameliorate them. While about half (45) of the 109 articles proposed solutions, fewer than half of these solutions (18) were implemented, and even fewer (5) were evaluated to generate learning about their effects. Few studies systematically report on the patient portal development processes. As a result, the review does not provide an evidence base for portal development. Our findings support a set of recommendations for advancement of the evidence base: future research should build on existing evidence, draw on principles from design sciences conveyed in the problem-solving cycle, and seek to produce evidence within various different organizational contexts. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. What do we know about developing patient portals? a systematic literature review

    PubMed Central

    de Bont, Antoinette; Rundall, Thomas G; van de Klundert, Joris

    2016-01-01

    Objective Numerous articles have reported on the development of patient portals, including development problems and solutions. We review these articles to inform future patient portal development efforts and to provide a summary of the evidence base that can guide future research. Materials and Methods We performed a systematic review of relevant literature to answer 5 questions: (1) What categories of problems related to patient portal development have been defined? (2) What causal factors have been identified by problem analysis and diagnosis? (3) What solutions have been proposed to ameliorate these causal factors? (4) Which proposed solutions have been implemented and in which organizational contexts? (5) Have implemented solutions been evaluated and what learning has been generated? Through searches on PubMed, ScienceDirect and LISTA, we included 109 articles. Results We identified 5 main problem categories: achieving patient engagement, provider engagement, appropriate data governance, security and interoperability, and a sustainable business model. Further, we identified key factors contributing to these problems as well as solutions proposed to ameliorate them. While about half (45) of the 109 articles proposed solutions, fewer than half of these solutions (18) were implemented, and even fewer (5) were evaluated to generate learning about their effects. Discussion Few studies systematically report on the patient portal development processes. As a result, the review does not provide an evidence base for portal development. Conclusion Our findings support a set of recommendations for advancement of the evidence base: future research should build on existing evidence, draw on principles from design sciences conveyed in the problem-solving cycle, and seek to produce evidence within various different organizational contexts. PMID:26335985

  9. EVA safety: Space suit system interoperability

    NASA Technical Reports Server (NTRS)

    Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.

    1995-01-01

    The results and recommendations of the International Academy of Astronautics Extravehicular Activities (IAA EVA) Committee's work are presented. The IAA EVA protocols and operations were analyzed with a view to harmonization procedures and the standardization of safety-critical and operationally important interfaces. The key role of EVA, and how to improve the situation based on the identified space suit system interoperability deficiencies, were considered.

  10. Using Fast Healthcare Interoperability Resources (FHIR) for the Integration of Risk Minimization Systems in Hospitals.

    PubMed

    Saleh, Kutaiba; Stucke, Stephan; Uciteli, Alexandr; Faulbrück-Röhr, Sebastian; Neumann, Juliane; Tahar, Kais; Ammon, Danny; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Portheine, Frank; Herre, Heinrich; Kaeding, André; Specht, Martin

    2017-01-01

    With the growing strain on medical staff and the increasing complexity of patient care, the risk of medical errors rises. In this work we present the use of Fast Healthcare Interoperability Resources (FHIR) as the communication standard for the integration of an ontology- and agent-based system to identify risks across medical processes in a clinical environment.
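
    FHIR exchanges typed JSON resources over a REST API. As a minimal sketch of how a risk-identification system could surface a detected risk, the snippet below posts a standard FHIR Flag resource to a hypothetical server (the base URL and patient identifier are invented):

        import requests

        FHIR_BASE = "http://example.org/fhir"  # hypothetical FHIR endpoint

        # Flag is a standard FHIR resource type used to alert clinicians
        # to a noted risk for a given patient.
        flag = {
            "resourceType": "Flag",
            "status": "active",
            "code": {"text": "Elevated medication-interaction risk"},
            "subject": {"reference": "Patient/123"},
        }

        resp = requests.post(
            f"{FHIR_BASE}/Flag", json=flag,
            headers={"Content-Type": "application/fhir+json"}, timeout=10)
        print(resp.status_code, resp.headers.get("Location"))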

  11. Achieving control and interoperability through unified model-based systems and software engineering

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  12. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm.

    PubMed

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.

  13. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm

    PubMed Central

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Objective Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration. PMID:23571850
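
    The practical point of the GIM is that consuming tools resolve both coding and structure through one uniform lookup. The sketch below uses an in-memory mapping table to stand in for the LexEVS/CTS2 service; the codes and model paths are invented for illustration.

        # Toy stand-in for the terminology/mapping service: one lookup
        # resolves a source-specific code to the unified model's concept,
        # so query tools never touch source idiosyncrasies directly.
        CODE_MAP = {
            ("READ", "C10.."): "GIM:DiabetesMellitus",
            ("ICD-10", "E11"): "GIM:DiabetesMellitus",
        }
        MODEL_PATH = {"GIM:DiabetesMellitus": "Patient.condition.diagnosis"}

        def resolve(system, code):
            """Map a (terminology, code) pair to the GIM concept and the
            place in the unified structural model where it lives."""
            concept = CODE_MAP[(system, code)]
            return concept, MODEL_PATH[concept]

        print(resolve("READ", "C10.."))   # both resolve to the same target
        print(resolve("ICD-10", "E11"))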

  14. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  15. A distributed framework for health information exchange using smartphone technologies.

    PubMed

    Abdulnabi, Mohamed; Al-Haiqi, Ahmed; Kiah, M L M; Zaidan, A A; Zaidan, B B; Hussain, Muzammil

    2017-05-01

    Nationwide health information exchange (NHIE) continues to be a persistent concern for government agencies, despite the many efforts and the conceived benefits of sharing patient data among healthcare providers. Difficulties in ensuring global connectivity and interoperability, and concerns about security, have always hampered the government from successfully deploying NHIE. By looking at NHIE from a fresh perspective and bearing in mind the pervasiveness and power of modern mobile platforms, this paper proposes a new approach to NHIE that builds on the notion of consumer-mediated HIE, albeit without the focus on central health record banks. With the growing acceptance of smartphones as reliable, indispensable, and most personal devices, we suggest taking the concept of mobile personal health records (PHRs installed on smartphones) to the next level. We envision mPHRs that take the form of distributed storage units for health information, under the full control and direct possession of patients, who can have ready access to their personal data whenever needed. However, for the actual exchange of data with health information systems managed by healthcare providers, the latter have to be interoperable with patient-carried mPHRs. The computer industry solved a similar problem of interoperability between peripheral devices and operating systems long ago. We borrow from that solution the idea of providing special interfaces between mPHRs and provider systems. This interface enables the two entities to communicate with no change to either end. The design and operation of the proposed approach are explained. Additional pointers on potential implementations are provided, and issues that pertain to any solution to implement NHIE are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
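
    The borrowed driver-style idea amounts to an abstract contract that both sides program against; a schematic sketch follows (the interface and method names are invented, not taken from the paper):

        from abc import ABC, abstractmethod

        class MPHRInterface(ABC):
            """Hypothetical contract between a patient-carried mPHR and a
            provider system, analogous to a device-driver interface."""

            @abstractmethod
            def read_records(self, since):
                ...

            @abstractmethod
            def write_record(self, record):
                ...

        class SmartphonePHR(MPHRInterface):
            def __init__(self):
                self._store = []

            def read_records(self, since):
                return [r for r in self._store if r["date"] >= since]

            def write_record(self, record):
                self._store.append(record)

        # A provider system codes only against MPHRInterface, so any
        # compliant mPHR implementation plugs in with no change to either end.
        phr = SmartphonePHR()
        phr.write_record({"date": "2017-01-05", "type": "lab", "value": "HbA1c 6.1%"})
        print(phr.read_records(since="2017-01-01"))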

  16. VLT instruments: industrial solutions for non-scientific detector systems

    NASA Astrophysics Data System (ADS)

    Duhoux, P.; Knudstrup, J.; Lilley, P.; Di Marcantonio, P.; Cirami, R.; Mannetta, M.

    2014-07-01

    Recent improvements in industrial vision technology and products, together with the increasing need for high-performance, cost-efficient technical detectors for astronomical instrumentation, have led ESO, with the contribution of INAF, to evaluate this trend and elaborate ad hoc solutions which are interoperable and compatible with the evolution of VLT standards. The ESPRESSO spectrograph shall be the first instrument deploying this technology. ESO's Technical CCD (hereafter TCCD) requirements are extensive and demanding. A lightweight, low-maintenance, rugged and high-performance TCCD camera product or family of products is required which can operate in the extreme environmental conditions present at ESO's observatories with minimum maintenance and minimal downtime. In addition, the camera solution needs to be interchangeable between different technical roles, e.g. slit viewing, pupil and field stabilization, with excellent performance characteristics under a wide range of observing conditions together with ease of use for the end user. Interoperability is enhanced by conformance to recognized electrical, mechanical and software standards. Technical requirements and evaluation criteria for the TCCD solution are discussed in more detail. A software architecture has been adopted which facilitates easy integration with TCCDs from different vendors. The communication with the devices is implemented by means of dedicated adapters, allowing usage of the same core framework (business logic). Preference has been given to cameras with an Ethernet interface, using standard TCP/IP-based communication. While the preferred protocol is the industrial standard GigE Vision, not all vendors supply cameras with this interface, hence proprietary socket-based protocols are also acceptable with the provision of a validated Linux-compliant API. A fundamental requirement of the TCCD software is that it shall allow for seamless integration with the existing VLT software framework. ESPRESSO is a fiber-fed, cross-dispersed echelle spectrograph that will be located in the Combined-Coudé Laboratory of the VLT at the Paranal Observatory in Chile. It will be able to operate either using the light of any of the Unit Telescopes (UTs) or using the incoherently combined light of up to four UTs. The stabilization of the incoming beam is achieved by dedicated piezo systems controlled via active loops closed on 4 + 4 dedicated TCCDs for the stabilization of the pupil image and of the field, with a frequency goal of 3 Hz on a 2nd to 3rd magnitude star. An additional, ninth TCCD system shall be used as an exposure meter. In this paper we will present the technical CCD solution for future VLT instruments.
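
    The arrangement described above, one shared core framework with vendor specifics behind dedicated adapters, is essentially the classic adapter pattern. The Python sketch below is only illustrative: the class names and host are invented, and frame acquisition is stubbed rather than a real GigE Vision or proprietary vendor API.

    ```python
    from abc import ABC, abstractmethod

    class CameraAdapter(ABC):
        """Contract the shared core framework programs against; one adapter
        per vendor protocol (all names here are invented for illustration)."""

        @abstractmethod
        def connect(self, host: str) -> None: ...

        @abstractmethod
        def acquire_frame(self) -> bytes: ...

    class GigEVisionAdapter(CameraAdapter):
        """Stand-in for a camera speaking the GigE Vision standard (stubbed)."""
        def connect(self, host: str) -> None:
            print(f"GigE Vision control channel to {host}")
        def acquire_frame(self) -> bytes:
            return b"\x00" * 1024  # placeholder frame payload

    class VendorSocketAdapter(CameraAdapter):
        """Stand-in for a proprietary TCP protocol behind a vendor Linux API."""
        def connect(self, host: str) -> None:
            print(f"proprietary socket session to {host}")
        def acquire_frame(self) -> bytes:
            return b"\xff" * 1024

    def stabilization_loop(camera: CameraAdapter, cycles: int = 3) -> None:
        """Core ('business') logic is identical whichever adapter is plugged in."""
        for _ in range(cycles):
            frame = camera.acquire_frame()
            print(f"processing {len(frame)}-byte frame")  # centroid, piezo command...

    for cam in (GigEVisionAdapter(), VendorSocketAdapter()):
        cam.connect("tccd-01.example")
        stabilization_loop(cam, cycles=1)
    ```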

  17. UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.

    DTIC Science & Technology

    1986-10-06

    crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances in which...Scheduler Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a...interoperability is fundamental. A number of operational crises have occurred where interoperable communications or the lack of interoperable

  18. Interoperability after deployment: persistent challenges and regional strategies in Denmark.

    PubMed

    Kierkegaard, Patrick

    2015-04-01

    The European Union has identified Denmark as one of the countries that have the potential to provide leadership and inspiration for other countries in eHealth implementation and adoption. However, Denmark has historically struggled to facilitate data exchange between its public hospitals' electronic health records (EHRs). Furthermore, state-led projects failed to adequately address the challenges of interoperability after deployment. Changes in the organizational setup and division of responsibilities concerning the future of eHealth implementations in hospitals took place, granting the Danish regions full responsibility for all hospital systems, specifically the consolidation of EHRs to one system per region. The regions reduced the number of different EHRs to six systems by 2014. Additionally, the first version of the National Health Record was launched to provide health care practitioners with an overview of a patient's data stored in all EHRs across the regions and within the various health sectors. The governance of national eHealth implementation plays a crucial role in the development and diffusion of interoperable technologies. Changes in the organizational setup and the redistribution of responsibilities between the Danish regions and the state play a pivotal role in producing viable and coherent solutions in a timely manner. Interoperability initiatives are best managed on a regional level or by the authorities responsible for the provision of local health care services. Cross-regional communication is essential during the initial phases of planning in order to set a common goal for countrywide harmonization, coherence and collaboration. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  19. Towards technical interoperability in telemedicine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  20. Providing interoperability of eHealth communities through peer-to-peer networks.

    PubMed

    Kilic, Ozgur; Dogac, Asuman; Eichelberg, Marco

    2010-05-01

    Providing an interoperability infrastructure for Electronic Healthcare Records (EHRs) is on the agenda of many national and regional eHealth initiatives. Two important integration profiles have been specified for this purpose, namely, the "Integrating the Healthcare Enterprise (IHE) Cross-enterprise Document Sharing (XDS)" and the "IHE Cross Community Access (XCA)." IHE XDS describes how to share EHRs in a community of healthcare enterprises and IHE XCA describes how EHRs are shared across communities. However, the current version of the IHE XCA integration profile does not address some of the important challenges of cross-community exchange environments. The first challenge is scalability. If every community that joins the network needs to connect to every other community, i.e., a pure peer-to-peer network, this solution will not scale. Furthermore, each community may use a different coding vocabulary for the same metadata attribute, in which case, the target community cannot interpret the query involving such an attribute. Yet another important challenge is that each community may (and typically will) have a different patient identifier domain. Querying for the patient identifiers in the target community using patient demographic data may create patient privacy concerns. In this paper, we address each of these challenges and show how they can be handled effectively in a superpeer-based peer-to-peer architecture.
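
    As a rough illustration of the superpeer idea (invented names; not the authors' implementation), the sketch below keeps a registry of community gateways and recodes query vocabularies in flight, so each community connects once to the superpeer rather than pairwise to every other community.

    ```python
    class Superpeer:
        """Hedged sketch: communities register with a superpeer instead of
        peering pairwise (O(n) links rather than O(n^2)), and the superpeer
        translates coded metadata between community vocabularies."""

        def __init__(self):
            self.gateways = {}    # community id -> query callable
            self.vocab_maps = {}  # (source, target) -> {source code: target code}

        def register(self, community_id, gateway):
            self.gateways[community_id] = gateway

        def add_vocab_map(self, source, target, mapping):
            self.vocab_maps[(source, target)] = mapping

        def route_query(self, source, target, query):
            """Forward a metadata query, recoding attribute values so the target
            community can interpret them (patient ids would need a separate,
            privacy-preserving cross-referencing step)."""
            mapping = self.vocab_maps.get((source, target), {})
            recoded = {attr: mapping.get(value, value) for attr, value in query.items()}
            return self.gateways[target](recoded)

    hub = Superpeer()
    hub.register("community-b", lambda q: [f"documents matching {q}"])
    hub.add_vocab_map("community-a", "community-b", {"RAD": "radiology"})
    print(hub.route_query("community-a", "community-b", {"classCode": "RAD"}))
    ```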

  1. Integration of design and manufacturing in a virtual enterprise using enterprise rules, intelligent agents, STEP, and work flow

    NASA Astrophysics Data System (ADS)

    Gilman, Charles R.; Aparicio, Manuel; Barry, J.; Durniak, Timothy; Lam, Herman; Ramnath, Rajiv

    1997-12-01

    An enterprise's ability to deliver new products quickly and efficiently to market is critical for competitive success. While manufacturers recognize the need for speed and flexibility to compete in this marketplace, companies do not have the time or capital to move to new automation technologies. The National Industrial Information Infrastructure Protocols Consortium's Solutions for MES Adaptable Replicable Technology (NIIIP SMART) subgroup is developing an information infrastructure to enable integration and interoperation among Manufacturing Execution Systems (MES) and Enterprise Information Systems within an enterprise or among enterprises. The goal of these developments is an adaptable, affordable, reconfigurable, integratable manufacturing system. Key innovative aspects of NIIIP SMART are: (1) Design of an industry standard object model that represents the diverse aspects of MES. (2) Design of a distributed object network to support real-time information sharing. (3) Product data exchange based on STEP and EXPRESS (ISO 10303). (4) Application of workflow and knowledge management technologies to enact manufacturing and business procedures and policy. (5) Application of intelligent agents to support emergent factories. This paper illustrates how these technologies have been incorporated into the NIIIP SMART system architecture to enable the integration and interoperation of existing tools and future MES applications in a 'plug and play' environment.

  2. Ambient assisted living healthcare frameworks, platforms, standards, and quality attributes.

    PubMed

    Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard

    2014-03-04

    Ambient Assisted Living (AAL) is an emerging multi-disciplinary field aiming at exploiting information and communication technologies in personal healthcare and telehealth systems for countering the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality-of-service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey for collecting usage data and the current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features, ignoring many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, we found that more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems are needed to achieve more interoperable and synergetic AAL solutions.

  3. Ambient Assisted Living Healthcare Frameworks, Platforms, Standards, and Quality Attributes

    PubMed Central

    Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard

    2014-01-01

    Ambient Assisted Living (AAL) is an emerging multi-disciplinary field aiming at exploiting information and communication technologies in personal healthcare and telehealth systems for countering the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality-of-service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey for collecting usage data and the current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features, ignoring many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, we found that more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems are needed to achieve more interoperable and synergetic AAL solutions. PMID:24599192

  4. The interoperability force in the ERP field

    NASA Astrophysics Data System (ADS)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems have first been identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from the classical ERP as an information system integrator to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.

  5. Endpoint Naming for Space Delay/Disruption Tolerant Networking

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Burleigh, Scott; Scott, Keith

    2010-01-01

    Delay/Disruption Tolerant Networking (DTN) provides solutions to space communication challenges such as disconnections when orbiters lose line-of-sight with landers, long propagation delays over interplanetary links, and other operational constraints. DTN is critical to enabling the future space internetworking envisioned by NASA. Interoperability with international partners is essential and standardization is progressing through both the CCSDS and the IETF.

  6. Military Interoperable Digital Hospital Testbed (MIDHT)

    DTIC Science & Technology

    2010-07-01

    solutions to optimize healthcare resources for rural communities and identify lessons learned and best practices that benefit both the global MHS...providers and three CHS facilities on their business practices and process flows. Research initiatives will focus on the impact of an electronic...strategic goals and the Nationwide Health Information Network (NHIN). The MIDHT will continue to identify lessons learned/best practices that benefit

  7. Next-generation digital camera integration and software development issues

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Peters, Ken; Hecht, Richard

    1998-04-01

    This paper investigates the complexities associated with the development of next-generation digital cameras due to requirements in connectivity and interoperability. Each successive generation of digital camera improves drastically in cost, performance, resolution, image quality and interoperability features. This is being accomplished by advancements in a number of areas: research, silicon, standards, etc. As the capabilities of these cameras increase, so do the requirements for both hardware and software. Today, there are two single-chip camera solutions on the market: the Motorola MPC 823 and the LSI DCAM-101. Real-time constraints for a digital camera may be defined by the maximum time allowable between capture of images. Constraints in the design of an embedded digital camera include processor architecture, memory, processing speed and the real-time operating system. This paper will present the LSI DCAM-101, a single-chip digital camera solution. It will present an overview of the architecture and the challenges in hardware and software for supporting streaming video in such a complex device. Issues presented include the development of the data-flow software architecture, testing and integration on this complex silicon device. The strategy for optimizing performance on the architecture will also be presented.

  8. The importance of using open source technologies and common standards for interoperability within eHealth: Perspectives from the Millennium Villages Project

    PubMed Central

    Borland, Rob; Barasa, Mourice; Iiams-Hauser, Casey; Velez, Olivia; Kaonga, Nadi Nina; Berg, Matt

    2013-01-01

    The purpose of this paper is to illustrate the importance of using open source technologies and common standards for interoperability when implementing eHealth systems and illustrate this through case studies, where possible. The sources used to inform this paper draw from the implementation and evaluation of the eHealth Program in the context of the Millennium Villages Project (MVP). As the eHealth Team was tasked to deploy an eHealth architecture, the Millennium Villages Global-Network (MVG-Net), across all fourteen of the MVP sites in Sub-Saharan Africa, the team recognized the need for standards and uniformity but also realized that context would be an important factor. Therefore, the team decided to utilize open source solutions. The MVP implementation of MVG-Net provides a model for those looking to implement informatics solutions across disciplines and countries. Furthermore, there are valuable lessons learned that the eHealth community can benefit from. By sharing lessons learned and developing an accessible, open-source eHealth platform, we believe that we can more efficiently and rapidly achieve the health-related and collaborative Millennium Development Goals (MDGs). PMID:22894051

  9. A Framework for Integration of Heterogeneous Medical Imaging Networks

    PubMed Central

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and is widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile conceived especially for medical imaging exchange: Cross-Enterprise Document Sharing for Imaging (XDS-I). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID:25279021

  10. A framework for integration of heterogeneous medical imaging networks.

    PubMed

    Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos

    2014-01-01

    Medical imaging is of increasing importance in medical diagnosis and treatment support. Much of this is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and is widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile conceived especially for medical imaging exchange: Cross-Enterprise Document Sharing for Imaging (XDS-I). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I), but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS.
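
    Dicoogle's actual SDK is Java, so the following Python sketch only illustrates the plugin idea the abstract describes: protocol handlers, standard or private, register against a single federation entry point, and a query fans out across all of them. All names are invented.

    ```python
    from typing import Callable, Dict, Iterable

    class QueryPluginRegistry:
        """Illustrative registry: protocol handlers (standard or private) share
        a single federated query entry point. Names are invented for this sketch."""

        def __init__(self) -> None:
            self._plugins: Dict[str, Callable[[str], Iterable[dict]]] = {}

        def register(self, protocol: str, handler: Callable[[str], Iterable[dict]]) -> None:
            self._plugins[protocol] = handler

        def federated_query(self, expression: str):
            """Fan the query out to every registered protocol; merge the results."""
            for protocol, handler in self._plugins.items():
                for match in handler(expression):
                    yield {"protocol": protocol, **match}

    registry = QueryPluginRegistry()
    registry.register("dicom", lambda q: [{"sop_uid": "1.2.3", "query": q}])
    registry.register("xds-i", lambda q: [{"document_id": "doc-42", "query": q}])
    print(list(registry.federated_query("Modality=CT")))
    ```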

  11. WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation

    NASA Astrophysics Data System (ADS)

    Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry

    An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.

  12. Inter-operator Reliability of Magnetic Resonance Image-Based Computational Fluid Dynamics Prediction of Cerebrospinal Fluid Motion in the Cervical Spine.

    PubMed

    Martin, Bryn A; Yiallourou, Theresia I; Pahlavian, Soroush Heidari; Thyagaraj, Suraj; Bunck, Alexander C; Loth, Francis; Sheffer, Daniel B; Kröger, Jan Robert; Stergiopulos, Nikolaos

    2016-05-01

    For the first time, inter-operator dependence of MRI-based computational fluid dynamics (CFD) modeling of cerebrospinal fluid (CSF) in the cervical spinal subarachnoid space (SSS) is evaluated. In vivo MRI flow measurements and anatomy MRI images were obtained at the cervico-medullary junction of a healthy subject and a Chiari I malformation patient. 3D anatomies of the SSS were reconstructed by manual segmentation by four independent operators for both cases. CFD results were compared at nine axial locations along the SSS in terms of hydrodynamic and geometric parameters. Intraclass correlation (ICC) assessed the inter-operator agreement for each parameter over the axial locations and coefficient of variance (CV) compared the percentage of variance for each parameter between the operators. Greater operator dependence was found for the patient (0.19 < ICC < 0.99) near the craniovertebral junction compared to the healthy subject (ICC > 0.78). For the healthy subject, hydraulic diameter and Womersley number had the least variance (CV = ~2%). For the patient, peak diastolic velocity and Reynolds number had the smallest variance (CV = ~3%). These results show a high degree of inter-operator reliability for MRI-based CFD simulations of CSF flow in the cervical spine for healthy subjects and a lower degree of reliability for patients with Type I Chiari malformation.

  13. Inter-Operator Dependence of Magnetic Resonance Image-Based Computational Fluid Dynamics Prediction of Cerebrospinal Fluid Motion in the Cervical Spine

    PubMed Central

    Martin, Bryn A.; Yiallourou, Theresia I.; Pahlavian, Soroush Heidari; Thyagaraj, Suraj; Bunck, Alexander C.; Loth, Francis; Sheffer, Daniel B.; Kröger, Jan Robert; Stergiopulos, Nikolaos

    2015-01-01

    For the first time, inter-operator dependence of MRI-based computational fluid dynamics (CFD) modeling of cerebrospinal fluid (CSF) in the cervical spinal subarachnoid space (SSS) is evaluated. In vivo MRI flow measurements and anatomy MRI images were obtained at the cervico-medullary junction of a healthy subject and a Chiari I malformation patient. 3D anatomies of the SSS were reconstructed by manual segmentation by four independent operators for both cases. CFD results were compared at nine axial locations along the SSS in terms of hydrodynamic and geometric parameters. Intraclass correlation (ICC) assessed the inter-operator agreement for each parameter over the axial locations and coefficient of variance (CV) compared the percentage of variance for each parameter between the operators. Greater operator dependence was found for the patient (0.19 < ICC < 0.99) near the craniovertebral junction compared to the healthy subject (ICC > 0.78). For the healthy subject, hydraulic diameter and Womersley number had the least variance (CV = ~2%). For the patient, peak diastolic velocity and Reynolds number had the smallest variance (CV = ~3%). These results show a high degree of inter-operator reliability for MRI-based CFD simulations of CSF flow in the cervical spine for healthy subjects and a lower degree of reliability for patients with Type I Chiari malformation. PMID:26446009

  14. Enabling Interoperable and Selective Data Sharing among Social Networking Sites

    NASA Astrophysics Data System (ADS)

    Shin, Dongwan; Lopes, Rodrigo

    With the widespread use of social networking (SN) sites, and even the introduction of a social component in non-social oriented services, there is growing concern over user privacy in general, and over how to handle and share user profiles across SN sites in particular. Although there have been several proprietary or open source-based approaches to unifying the creation of third-party applications, the availability and retrieval of user profile information are still limited to the site where the third-party application is run, mostly devoid of support for data interoperability. In this paper we propose an approach to enabling interoperable and selective data sharing among SN sites. To support selective data sharing, we discuss an authenticated dictionary (ADT)-based credential which enables a user to share only a subset of her information certified by external SN sites with applications running on an SN site. For interoperable data sharing, we propose an extension to the OpenSocial API so that it can provide an open source-based framework for allowing the ADT-based credential to be used seamlessly among different SN sites.
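
    The abstract leaves the authenticated-dictionary construction abstract; one common way to realize such a credential is a Merkle hash tree, where the SN site signs only the root and the user discloses a single attribute with a short sibling path. The sketch below is a minimal illustration under that assumption; all names are ours, not the authors' design.

    ```python
    import hashlib

    def h(*parts: bytes) -> bytes:
        return hashlib.sha256(b"".join(parts)).digest()

    def build_levels(leaves):
        """All levels of a Merkle tree, bottom-up (odd levels duplicate the tail)."""
        levels = [leaves]
        while len(levels[-1]) > 1:
            lvl = levels[-1]
            if len(lvl) % 2:
                lvl = lvl + [lvl[-1]]
            levels.append([h(lvl[i], lvl[i + 1]) for i in range(0, len(lvl), 2)])
        return levels

    def prove(levels, index):
        """Sibling path letting a verifier recompute the root from a single leaf."""
        path = []
        for lvl in levels[:-1]:
            if len(lvl) % 2:
                lvl = lvl + [lvl[-1]]
            path.append((lvl[index ^ 1], index % 2))  # (sibling, leaf-is-right-child?)
            index //= 2
        return path

    def verify(root, leaf, path):
        node = leaf
        for sibling, is_right in path:
            node = h(sibling, node) if is_right else h(node, sibling)
        return node == root

    # A profile certified once by an SN site; later only 'age' is disclosed.
    profile = [("name", "alice"), ("age", "29"), ("city", "sf"), ("mail", "a@x.org")]
    leaves = [h(f"{k}={v}".encode()) for k, v in profile]
    levels = build_levels(leaves)
    root = levels[-1][0]          # in the full scheme, the SN site signs this root
    proof = prove(levels, 1)
    assert verify(root, h(b"age=29"), proof)   # verifier learns age, nothing else
    ```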

  15. 802.16e System Profile for NASA Extra-Vehicular Activities

    NASA Technical Reports Server (NTRS)

    Foore, Lawrence R.; Chelmins, David T.; Nguyen, Hung D.; Downey, Joseph A.; Finn, Gregory G.; Cagley, Richard E.; Bakula, Casey J.

    2009-01-01

    This report identifies an 802.16e system profile that is applicable to a lunar surface wireless network, and specifically to meeting extra-vehicular activity (EVA) data flow requirements. EVA suit communication needs are addressed. Design-driving operational scenarios are considered. These scenarios are then used to identify a configuration of the 802.16e system (system profile) that meets EVA requirements but also aims to make the radio realizable within EVA constraints. Limitations of this system configuration are highlighted. An overview and development status are presented for Toyon Research Corporation's development of an 802.16e-compatible modem under NASA's Small Business Innovative Research (SBIR) Program. This modem is based on the recommended system profile developed as part of this report. Last, a path forward is outlined that presents an evolvable solution for the EVA radio system and lunar surface radio networks. This solution is based on a custom link layer and an 802.16e-compliant physical layer conforming to the identified system profile, with a later progression to a fully interoperable 802.16e system.

  16. Interoperating Cloud-based Virtual Farms

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Colamaria, F.; Colella, D.; Casula, E.; Elia, D.; Franco, A.; Lusso, S.; Luparello, G.; Masera, M.; Miniello, G.; Mura, D.; Piano, S.; Vallero, S.; Venaruzzo, M.; Vino, G.

    2015-12-01

    The present work aims at optimizing the use of computing resources available at the Italian grid Tier-2 sites of the ALICE experiment at CERN LHC by making them accessible to interactive distributed analysis, thanks to modern solutions based on cloud computing. The scalability and elasticity of the computing resources via dynamic ("on-demand") provisioning is essentially limited by the size of the computing site, reaching the theoretical optimum only in the asymptotic case of infinite resources. The main challenge of the project is to overcome this limitation by federating different sites through a distributed cloud facility. Storage capacities of the participating sites are seen as a single federated storage area, avoiding the need to mirror data across them: high data-access efficiency is guaranteed by location-aware analysis software and storage interfaces, in a way that is transparent from an end-user perspective. Moreover, interactive analysis on the federated cloud reduces the execution time with respect to grid batch jobs. The tests of the investigated solutions for both cloud computing and distributed storage on wide area networks will be presented.

  17. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  18. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Treesearch

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this paper is to provide a conceptual framework for the session: “The role of web-based Geographic Information Systems in supporting sustainable management.” The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  19. KAPSE Interface Team (KIT) Public Report. Volume 7

    DTIC Science & Technology

    1989-10-01

    e E&V STATUS - Ray Szymanski (Wright-Patterson) was announced as the new chairman of the Evaluation and Validation Team. Two RFP’s are coming, for...are language interoperability problems (implementations using multiple languages where interoperability problems are experienced such as transferring...visitors were introduced. Ray Szymanski, the Evaluation & Validation Team Leader, is replacing Jinny Castor from Wright-Patterson Air Force Base. Dr

  20. Provision of Training for the IT Industry: The ELEVATE Project

    NASA Astrophysics Data System (ADS)

    Paraskakis, Iraklis; Konstantinidis, Andreas; Bouras, Thanassis; Perakis, Kostas; Pantelopoulos, Stelios; Hatziapostolou, Thanos

    This paper presents ELEVATE, which aims to deliver an innovative training, educational and certification environment integrating the application software to be taught with the training procedure. ELEVATE aspires to address the training needs of software development SMEs; the proposed solution is based on three basic notions: to provide competence training that is tailored to the needs of the individual trainee, to allow the trainee to carry out authentic activities as well as problem-based learning that draws from real-life scenarios, and finally to allow for the assessment and certification of the skills and competences acquired. In order to achieve the desired results, the ELEVATE architecture utilises an Interactive Interoperability Layer, an Intelligent Personalization Trainer, and a Training, Evaluation & Certification component. The ELEVATE pedagogical model is based on blended learning, the e-Training component (an intelligent system that provides tailored training) and Learning 2.0.

  1. Maturity model for enterprise interoperability

    NASA Astrophysics Data System (ADS)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
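
    As a toy illustration of how a maturity assessment can aggregate, the snippet below assumes five levels and a conservative lowest-level rule; both the level names (borrowed from one published formulation of such models) and the aggregation rule are assumptions, since the abstract defines neither.

    ```python
    # Both the level names and the min-aggregation rule are assumptions; the
    # paper's actual model and assessment methodology are not in the abstract.
    LEVELS = ["Unprepared", "Defined", "Aligned", "Organized", "Adapted"]

    def maturity(scores):
        """Conservative aggregation: the enterprise sits at the lowest level
        achieved across the assessed interoperability concerns (0..4 each)."""
        return LEVELS[min(scores.values())]

    print(maturity({"business": 3, "process": 2, "data": 1}))  # -> "Defined"
    ```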

  2. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational grid and the OGC Web service protocols, the advantages offered by the Grid technology - such as providing secure interoperability between distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain represented by the OGC Web services with the Grid domain represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as complex geospatial functionality, the high-power computation and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of the geographical models. In accordance with Service-Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main functionality is visible from the enviroGRIDS Portal and, consequently, from end-user applications such as decision-maker- and citizen-oriented applications. The enviroGRIDS portal is the user's single entry point into the system and presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/

  3. Electronic health records and support for primary care teamwork.

    PubMed

    O'Malley, Ann S; Draper, Kevin; Gourevitch, Rebecca; Cross, Dori A; Scholle, Sarah Hudson

    2015-03-01

    Consensus that enhanced teamwork is necessary for efficient and effective primary care delivery is growing. We sought to identify how electronic health records (EHRs) facilitate and pose challenges to primary care teams as well as how practices are overcoming these challenges. Practices in this qualitative study were selected from those recognized as patient-centered medical homes via the National Committee for Quality Assurance 2011 tool, which included a section on practice teamwork. We interviewed 63 respondents, ranging from physicians to front-desk staff, from 27 primary care practices that varied in size, type, geography, and population size. EHRs were found to facilitate communication and task delegation in primary care teams through instant messaging, task management software, and the ability to create evidence-based templates for symptom-specific data collection from patients by medical assistants and nurses (which can offload work from physicians). Areas where respondents felt that EHR functionalities were weakest and posed challenges to teamwork included the lack of integrated care manager software and care plans in EHRs, poor practice registry functionality and interoperability, and inadequate ease of tracking patient data in the EHR over time. Practices developed solutions for some of the challenges they faced when attempting to use EHRs to support teamwork but wanted more permanent vendor and policy solutions for other challenges. EHR vendors in the United States need to work alongside practicing primary care teams to create more clinically useful EHRs that support dynamic care plans, integrated care management software, more functional and interoperable practice registries, and greater ease of data tracking over time. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  4. Applying standards to ICT models, tools and data in Europe to improve river basin networks and spread innovation on water sector

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier

    2015-04-01

    This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the data already available at the European level, and also with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards, in particular the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax) but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking each data element to a controlled, shared vocabulary. In Europe, INSPIRE defines a shared vocabulary and its associated links to an ontology; for hydrographical information this can be used as a baseline. • Organizational: harmonizing policy aspects. This level of interoperability deals with the operational methodologies and procedures that organizations use to administer their own data and processing capabilities and to share those capabilities with others. This layer is addressed by the adoption of common policy briefs that facilitate both robust protocols and the flexibility to interact with others. • Data visualization: making data easy to see. The WMS and WMTS standards are the most commonly used geographic information visualization standards for sharing information in web portals. Our solution will incorporate a quality extension of these standards for visualizing data quality as nested layers linked to the different data sets. In the presented approach, the use of standards is twofold: the tools and products should leverage standards wherever possible to ensure interoperability between solution providers, and the platform itself must utilize standards as much as possible, to allow, for example, integration with other systems through open APIs or the description of available items.
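
    To make the syntactic and visualization levels concrete: OGC requests are plain, standardized key-value URLs, so a conforming client needs only a few lines. In the sketch below the endpoint and layer name are hypothetical, while the parameter names follow the WMS 1.3.0 standard.

    ```python
    from urllib.parse import urlencode

    # Hypothetical endpoint and layer; the parameter names are standard WMS 1.3.0.
    WMS_ENDPOINT = "https://example.org/geoserver/wms"

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "hydro:river_basins",    # illustrative layer name
        "CRS": "EPSG:4326",
        "BBOX": "35.0,-10.0,60.0,40.0",    # lat/lon axis order in WMS 1.3.0
        "WIDTH": "800",
        "HEIGHT": "600",
        "FORMAT": "image/png",
    }
    print(f"{WMS_ENDPOINT}?{urlencode(params)}")  # any WMS client can render this
    ```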

  5. Integrating semantic dimension into openEHR archetypes for the management of cerebral palsy electronic medical records.

    PubMed

    Ellouze, Afef Samet; Bouaziz, Rafik; Ghorbel, Hanen

    2016-10-01

    Integrating the semantic dimension into clinical archetypes is necessary when modeling medical records. First, it enables semantic interoperability; second, it allows semantic activities to be applied to clinical data and provides a higher design quality for Electronic Medical Record (EMR) systems. However, to obtain these advantages, designers need to use archetypes that cover the semantic features of the clinical concepts involved in their specific applications. In fact, most archetypes filed within open repositories are expressed in the Archetype Definition Language (ADL), which defines only the syntactic structure of clinical concepts, weakening semantic activities on the EMR content in the semantic web environment. This paper focuses on the modeling of an EMR prototype for infants affected by Cerebral Palsy (CP), using the dual model approach and integrating semantic web technologies. Such modeling provides better delivery of quality of care and ensures semantic interoperability between the information systems of all involved therapies. First, data to be documented are identified and collected from the involved therapies. Subsequently, data are analyzed and arranged into archetypes expressed in accordance with ADL. During this step, open archetype repositories are explored in order to find suitable archetypes. Then, ADL archetypes are transformed into archetypes expressed in OWL-DL (Web Ontology Language - Description Logic). Finally, we construct an ontological source related to these archetypes, enabling their annotation to facilitate data extraction and making it possible to exercise semantic activities on such archetypes. The main result is the integration of the semantic dimension into EMRs modeled in accordance with the archetype approach. The feasibility of our solution is shown through the development of a prototype, baptized "CP-SMS", which ensures semantic exploitation of the CP EMR. This prototype provides the following features: (i) creation of CP EMR instances and their checking against a knowledge base which we have constructed through interviews with domain experts, (ii) translation of the initial CP ADL archetypes into CP OWL-DL archetypes, (iii) creation of an ontological source which we can use to annotate the obtained archetypes, and (iv) enrichment of the ontological source and integration of semantic relations, hence fueling the ontology with new concepts, ensuring consistency and eliminating ambiguity between concepts. The degree of semantic interoperability that can be reached between EMR systems depends strongly on the quality of the archetypes used. Thus, the integration of the semantic dimension into the archetype modeling process is crucial. By creating an ontological source and annotating archetypes, we create a supportive platform ensuring semantic interoperability between archetype-based EMR systems. Copyright © 2016. Published by Elsevier Inc.
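
    Since the abstract does not reproduce the paper's concrete ADL-to-OWL-DL mapping rules, the following rdflib sketch only illustrates the general shape such a translation can take, under an invented namespace and invented archetype names: an archetype node becomes an OWL class, its node id is kept as an annotation, and a nested element is expressed as an existential restriction.

    ```python
    from rdflib import BNode, Graph, Literal, Namespace
    from rdflib.namespace import OWL, RDF, RDFS

    # Hypothetical namespace and class names; not the paper's actual mapping.
    CP = Namespace("http://example.org/cp-archetypes#")

    g = Graph()
    g.bind("cp", CP)

    # An archetype node such as OBSERVATION[at0000] becomes an OWL class,
    # keeping the ADL node id as an annotation for traceability.
    g.add((CP.GrossMotorAssessment, RDF.type, OWL.Class))
    g.add((CP.GrossMotorAssessment, RDFS.label, Literal("Gross motor assessment")))
    g.add((CP.GrossMotorAssessment, CP.adlNodeId, Literal("at0000")))

    # A nested ELEMENT is expressed in OWL-DL as an existential restriction.
    g.add((CP.GMFCSLevel, RDF.type, OWL.Class))
    g.add((CP.hasElement, RDF.type, OWL.ObjectProperty))
    restriction = BNode()
    g.add((restriction, RDF.type, OWL.Restriction))
    g.add((restriction, OWL.onProperty, CP.hasElement))
    g.add((restriction, OWL.someValuesFrom, CP.GMFCSLevel))
    g.add((CP.GrossMotorAssessment, RDFS.subClassOf, restriction))

    print(g.serialize(format="turtle"))
    ```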

  6. Simplifying HL7 Version 3 messages.

    PubMed

    Worden, Robert; Scott, Philip

    2011-01-01

    HL7 Version 3 offers a semantically robust method for healthcare interoperability but has been criticized as overly complex to implement. This paper reviews initiatives to simplify HL7 Version 3 messaging and presents a novel approach based on semantic mapping. From user-defined mapping definitions, precise transforms between simple and full messages are generated automatically. Systems can be interfaced with the simple messages and achieve interoperability with full Version 3 messages through the transforms. This reduces the cost of HL7 interfacing and will encourage better uptake of HL7 Version 3 and CDA.
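
    A minimal sketch of the generated-transform idea: a declarative mapping table drives the expansion of a flat "simple" message into the nested "full" form. The mapping table and element paths below are illustrative stand-ins, not the paper's actual transforms or the real HL7 V3 RIM structure.

    ```python
    import xml.etree.ElementTree as ET

    # Illustrative mapping from flat 'simple message' fields to nested paths;
    # the real generated transforms and V3 structure would be far richer.
    MAPPING = {
        "patient_id":  "recordTarget/patientRole/id",
        "given_name":  "recordTarget/patientRole/patient/name/given",
        "family_name": "recordTarget/patientRole/patient/name/family",
    }

    def expand(simple: dict) -> ET.Element:
        """Build the deeply nested 'full' message from the flat 'simple' one."""
        root = ET.Element("ClinicalDocument")
        for field, path in MAPPING.items():
            node = root
            for tag in path.split("/"):
                child = node.find(tag)
                if child is None:              # reuse shared ancestors, create rest
                    child = ET.SubElement(node, tag)
                node = child
            node.text = str(simple[field])
        return root

    msg = expand({"patient_id": "123", "given_name": "Ada", "family_name": "Byron"})
    print(ET.tostring(msg, encoding="unicode"))
    ```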

  7. Securing the Global Airspace System Via Identity-Based Security

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2015-01-01

    Current telecommunications systems have very good security architectures that include authentication and authorization as well as accounting. These three features enable an edge system to obtain access into a radio communication network, request specific Quality-of-Service (QoS) requirements and ensure proper billing for service. Furthermore, the links are secure. Widely used telecommunication technologies are Long Term Evolution (LTE) and Worldwide Interoperability for Microwave Access (WiMAX). This paper provides a system-level view of network-centric operations for the global airspace system and the problems and issues with deploying new technologies into the system. The paper then focuses on applying the basic security architectures of commercial telecommunication systems and deployment of federated Authentication, Authorization and Accounting systems to provide a scalable, evolvable, reliable and maintainable solution to enable a globally deployable identity-based secure airspace system.

  8. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  9. A common type system for clinical natural language processing

    PubMed Central

    2013-01-01

    Background: One challenge in reusing clinical data stored in electronic medical records is that these data are heterogenous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. Results: We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. Conclusions: We have created a type system that targets deep semantics, thereby allowing for NLP systems to encapsulate knowledge from text and share it alongside heterogenous clinical data sources. Rather than surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types. PMID:23286462

  10. A common type system for clinical natural language processing.

    PubMed

    Wu, Stephen T; Kaggal, Vinod C; Dligach, Dmitriy; Masanz, James J; Chen, Pei; Becker, Lee; Chapman, Wendy W; Savova, Guergana K; Liu, Hongfang; Chute, Christopher G

    2013-01-03

    One challenge in reusing clinical data stored in electronic medical records is that these data are heterogenous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. We have created a type system that targets deep semantics, thereby allowing for NLP systems to encapsulate knowledge from text and share it alongside heterogenous clinical data sources. Rather than surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types.
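
    As a hedged sketch of what a "common type" can look like, the snippet below normalizes an NLP mention into a typed record whose coded concept is the point of interoperability with structured sources; the field names are invented for illustration and are not the actual cTAKES or CEM type definitions.

    ```python
    from dataclasses import dataclass
    from typing import Optional, Tuple

    # Illustrative types only; the real type system is defined in UIMA/cTAKES.
    @dataclass
    class CodedConcept:
        scheme: str            # e.g. "RxNorm" or "SNOMED CT"
        code: str
        preferred_term: str

    @dataclass
    class MedicationMention:
        span: Tuple[int, int]          # character offsets in the source note
        concept: CodedConcept          # normalized meaning: the interoperability
        dose: Optional[str] = None     # point with structured data sources
        negated: bool = False

    m = MedicationMention(span=(104, 111),
                          concept=CodedConcept("RxNorm", "1191", "aspirin"),
                          dose="81 mg")
    print(m)
    ```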

  11. Cloud access to interoperable IVOA-compliant VOSpace storage

    NASA Astrophysics Data System (ADS)

    Bertocco, S.; Dowler, P.; Gaudet, S.; Major, B.; Pasian, F.; Taffoni, G.

    2018-07-01

    Handling, processing and archiving the huge amount of data produced by the new generation of experiments and instruments in Astronomy and Astrophysics are among the more exciting challenges to address in designing the future data management infrastructures and computing services. We investigated the feasibility of a data management and computation infrastructure, available world-wide, with the aim of merging the FAIR data management provided by IVOA standards with the efficiency and reliability of a cloud approach. Our work involved the Canadian Advanced Network for Astronomy Research (CANFAR) infrastructure and the European EGI federated cloud (EFC). We designed and deployed a pilot data management and computation infrastructure that provides IVOA-compliant VOSpace storage resources and wide access to interoperable federated clouds. In this paper, we detail the main user requirements covered, the technical choices and the implemented solutions and we describe the resulting Hybrid cloud Worldwide infrastructure, its benefits and limitations.

  12. An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.

    PubMed

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2010-10-01

    The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.

  13. Interoperability of Electronic Health Records: A Physician-Driven Redesign.

    PubMed

    Miller, Holly; Johns, Lucy

    2018-01-01

    PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.

  14. Privacy-Driven Design of Learning Analytics Applications: Exploring the Design Space of Solutions for Data Sharing and Interoperability

    ERIC Educational Resources Information Center

    Hoel, Tore; Chen, Weiqin

    2016-01-01

    Studies have shown that issues of privacy, control of data, and trust are essential to implementation of learning analytics systems. If these issues are not addressed appropriately, systems will tend to collapse due to a legitimacy crisis, or they will not be implemented in the first place due to resistance from learners, their parents, or their…

  15. Low-earth-orbit Satellite Internet Protocol Communications Concept and Design

    NASA Technical Reports Server (NTRS)

    Slywezak, Richard A.

    2004-01-01

    This report presents a design concept for a low-Earth-orbit end-to-end Internet-Protocol- (IP-) based mission. The goal is to maintain an up-to-date communications infrastructure that makes communications seamless with the protocols used in terrestrial computing. It is based on the premise that the use of IPs will permit greater interoperability while also reducing costs and providing users the ability to retrieve data directly from the satellite. However, implementing an IP-based solution also has a number of challenges, since wireless communications have different characteristics than wired communications. This report outlines the design of a low-Earth-orbit end-to-end IP-based mission; the ideas and concepts of Space Internet architectures and networks are beyond the scope of this document. The findings of this report show that an IP-based mission is plausible and would provide benefits to the user community, but the outstanding issues must be resolved before a design can be implemented.

  16. Bridging data models and terminologies to support adverse drug event reporting using EHR data.

    PubMed

    Declerck, G; Hussain, S; Daniel, C; Yuksel, M; Laleci, G B; Twagirumukiza, M; Jaulent, M-C

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The SALUS project aims at building an interoperability platform and a dedicated toolkit to enable secondary use of electronic health record (EHR) data for post-marketing drug surveillance. An important component of this toolkit is a drug-related adverse event (AE) reporting system designed to facilitate and accelerate the reporting process through automatic prepopulation mechanisms. We demonstrate the SALUS approach for establishing syntactic and semantic interoperability for AE reporting. Standard (e.g., HL7 CDA-CCD) and proprietary EHR data models are mapped to the E2B(R2) data model via the SALUS Common Information Model. Terminology mapping and terminology reasoning services are designed to ensure the automatic conversion of source EHR terminologies (e.g., ICD-9-CM, ICD-10, LOINC or SNOMED CT) to the target terminology MedDRA, which is expected in AE reporting forms. A validated set of terminology mappings is used to ensure the reliability of the reasoning mechanisms. The percentage of data elements of a standard E2B report that can be completed automatically has been estimated for two pilot sites. In the best scenario (i.e., when the available fields in the EHR have actually been filled), only 36% (pilot site 1) and 38% (pilot site 2) of E2B data elements remain to be filled manually. In addition, most of these data elements need not be filled in every report. The SALUS platform's interoperability solutions enable partial automation of the AE reporting process, which could contribute to improving current spontaneous reporting practices and to reducing under-reporting, currently one of the major obstacles in the acquisition of pharmacovigilance data.
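    The prepopulation idea can be sketched as a terminology lookup that fills E2B reaction elements from source EHR codes; the mapping entries and element names below are illustrative stand-ins, not the SALUS mapping set:

    ```python
    # Sketch of prepopulation: convert source EHR codes to MedDRA and fill the
    # corresponding E2B data elements. Codes, mappings and element names are
    # illustrative placeholders.
    ICD10_TO_MEDDRA = {
        "R51": ("10019211", "Headache"),                 # hypothetical entries
        "T78.4": ("10013661", "Drug hypersensitivity"),
    }

    def prepopulate_e2b(ehr_codes: list[str]) -> list[dict]:
        """Return draft E2B 'reaction' sections for codes we can map automatically."""
        reactions = []
        for code in ehr_codes:
            if code in ICD10_TO_MEDDRA:
                meddra_code, meddra_term = ICD10_TO_MEDDRA[code]
                reactions.append({
                    "reactionmeddrapt": meddra_term,      # E2B(R2)-style element
                    "reactionmeddraptcode": meddra_code,  # names, used illustratively
                })
            # unmapped codes are left for the reporter to complete manually
        return reactions

    print(prepopulate_e2b(["R51", "Z00.0"]))
    ```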

  17. Applications Based on Service-Oriented Architecture (SOA) in the Field of Home Healthcare

    PubMed Central

    Sanmartin, Paul; Jimeno, Miguel

    2017-01-01

    This article presents a literature review of applications developed in the health industry that focus on patient care from home and implement a service-oriented architecture (SOA) design. Throughout this work, the applicability of the concept of the Internet of Things (IoT) in the field of telemedicine and health care in general is evaluated. It also provides an introduction to the concept of SOA and its main features, with a brief emphasis on security aspects. As the central theme, different solutions found in the health industry are described, especially those whose goal is health care at home; the main component of these solutions is body sensor networks. Finally, the literature is analyzed from the perspectives of functionality, security implementation and semantic interoperability, to give a better understanding of what has been done and which research paths are likely to be studied in the future. PMID:28757557

  18. Applications Based on Service-Oriented Architecture (SOA) in the Field of Home Healthcare.

    PubMed

    Avila, Karen; Sanmartin, Paul; Jabba, Daladier; Jimeno, Miguel

    2017-07-25

    This article presents a literature review of applications developed in the health industry that focus on patient care from home and implement a service-oriented architecture (SOA) design. Throughout this work, the applicability of the concept of the Internet of Things (IoT) in the field of telemedicine and health care in general is evaluated. It also provides an introduction to the concept of SOA and its main features, with a brief emphasis on security aspects. As the central theme, different solutions found in the health industry are described, especially those whose goal is health care at home; the main component of these solutions is body sensor networks. Finally, the literature is analyzed from the perspectives of functionality, security implementation and semantic interoperability, to give a better understanding of what has been done and which research paths are likely to be studied in the future.

  19. Designing learning management system interoperability in semantic web

    NASA Astrophysics Data System (ADS)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated in depth. Moodle is used to design the interoperability: several Moodle database tables are enhanced and some features are added. Semantic web interoperability is provided by exploiting ontologies in the content materials. The ontology is further used as a search tool to match users' queries against available courses, as sketched below. It is concluded that LMS interoperability in the semantic web is feasible.
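    A small sketch of the ontology-as-search idea using rdflib, with an invented namespace and toy triples standing in for the Moodle course ontology:

    ```python
    # Courses are annotated with topic concepts; a query term is matched
    # against the concept graph. Namespace and triples are invented.
    from rdflib import Graph, Namespace, Literal

    LMS = Namespace("http://example.org/lms#")
    g = Graph()
    g.add((LMS.Course101, LMS.hasTopic, LMS.SemanticWeb))
    g.add((LMS.Course102, LMS.hasTopic, LMS.Databases))
    g.add((LMS.SemanticWeb, LMS.label, Literal("semantic web")))
    g.add((LMS.Databases, LMS.label, Literal("databases")))

    def find_courses(term: str):
        q = """
            SELECT ?course WHERE {
                ?course lms:hasTopic ?topic .
                ?topic lms:label ?label .
                FILTER(CONTAINS(LCASE(?label), LCASE(?term)))
            }"""
        return [row.course for row in
                g.query(q, initNs={"lms": LMS},
                        initBindings={"term": Literal(term)})]

    print(find_courses("semantic"))   # -> [.../lms#Course101]
    ```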

  20. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2017-10-01

    Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership". Principal Investigator: ... Reporting period: 21 Sept 2016 - 20 Sept 2017. Only fragments of the summary are recoverable: the project promotes safety and efficiency through interoperable medical technologies and played a leadership role in interoperability safety standards (AAMI, AAMI/UL Joint ...).

  1. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  2. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    NASA Astrophysics Data System (ADS)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  3. A loosely coupled framework for terminology controlled distributed EHR search for patient cohort identification in clinical research.

    PubMed

    Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N

    2012-01-01

    Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on the terminology controlled approach to enable the interoperation between the search interface and heterogeneous data sources. Software components interoperate via common terminology service and abstract criteria model so as to promote component reuse and incremental system evolution.

  4. Opportunities for Next Generation BML: Semantic C-BML

    DTIC Science & Technology

    2014-06-01

    Only fragments of this report are recoverable, mainly citations: Simulation Interoperability Workshop, 05S-SIW-007, San Diego, CA, 2005; [11] Schade, Ulrich, Bastian Haarmann, and Michael R. Hieb, "A Grammar for ..."; "... Language (C-BML) Product Development Group," Paper 06F-SIW-003, in Proceedings of the Fall Simulation Interoperability Workshop; and a note that the work is based on a specification provided by the C-BML product development group (Blais, Curtis, et al.; SISO Fall 2011 SIW) and provides insight into the process.

  5. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues that require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality is to use real-time, up-to-date air quality information gathered by spatially distributed sensors in mega cities, employing sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The presented system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time, and the use of this standard framework makes it possible to overcome interoperability challenges. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is examined for emergency situations and, if necessary, air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure to present an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and notifying registered users by warning e-mails in emergency cases. An air quality monitoring portal retrieves and visualizes sensor observations through the interoperable framework, and the system can retrieve SOS observations using WPS in a cascaded service-chaining pattern to monitor trends in sensor observations over time.
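    As an illustration of the access pattern such a system relies on, a client could query an OGC SOS 2.0 endpoint with the KVP binding as below; the service URL, offering and observed property are placeholders, not the Tehran deployment:

    ```python
    # Sketch of pulling observations from an OGC SOS 2.0 endpoint (KVP binding).
    import requests

    SOS_URL = "https://example.org/sos"   # hypothetical SOS endpoint

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "urn:example:offering:tehran_co",        # placeholder
        "observedProperty": "urn:example:property:CO",       # placeholder
        "responseFormat": "http://www.opengis.net/om/2.0",   # O&M 2.0 XML
    }
    resp = requests.get(SOS_URL, params=params, timeout=60)
    resp.raise_for_status()
    print(resp.text[:500])   # O&M XML; parse with an XML library in real use
    ```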

  6. Personalized-detailed clinical model for data interoperability among clinical standards.

    PubMed

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung

    2013-08-01

    Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners seeking to provision telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among the different healthcare standards with which they comply. Relating healthcare organization data to these heterogeneous standards is necessary to achieve data-level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that create the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider the transformation of instances of the electronic health record (EHR) standards openEHR and HL7 CDA using P-DCM: P-DCM concepts associated with openEHR and HL7 CDA support the transformation of instances between these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, providing the basis for deriving instances for the HL7 CDA and openEHR standards. As proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and of monitoring the daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mapping generation with the openEHR and HL7 CDA standards; customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards, which would lead to effective utilization of resources and allow timely information exchange among healthcare systems.

  7. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    PubMed Central

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir

    2013-01-01

    Abstract Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners seeking to provision telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among the different healthcare standards with which they comply. Relating healthcare organization data to these heterogeneous standards is necessary to achieve data-level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that create the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. Materials and Methods: We consider the transformation of instances of the electronic health record (EHR) standards openEHR and HL7 CDA using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA support the transformation of instances between these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, providing the basis for deriving instances for the HL7 CDA and openEHR standards. Results: As proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and of monitoring the daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mapping generation with the openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. Conclusions: The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems. PMID:23875730

  8. Standard-compliant real-time transmission of ECGs: harmonization of ISO/IEEE 11073-PHD and SCP-ECG.

    PubMed

    Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José

    2009-01-01

    Ambient assisted living and integrated care in an aging society are based on the vision of a lifelong Electronic Health Record, calling for interoperability of healthcare information systems and medical devices. For medical devices this aim can be achieved by the consistent implementation of harmonized international interoperability standards. The ISO/IEEE 11073 (x73) family of standards is a reference standard for medical device interoperability. In its Personal Health Device (PHD) version several devices have been included, but an ECG device specialization is not yet available. On the other hand, the SCP-ECG standard for short-term diagnostic ECGs (EN 1064) has recently been approved as the international standard ISO/IEEE 11073-91064:2009. In this paper, the relationships between a proposed x73-PHD model for an ECG device and the fields of the SCP-ECG standard are investigated. A proof-of-concept implementation of the proposed x73-PHD ECG model is also presented, identifying open issues to be addressed by standards development for wider adoption of the x73-PHD standards.

  9. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2010-10-01

    Only fragments of this report are recoverable: a presentation, "Impact of ARRA/HITECH on Device Connectivity: Safe? Effective? Say what?!" by Todd Cooper, President, Breakthrough Solutions, with Philips Medical Systems mentioned alongside; a discussion of a system that could notify the physician when, say, one of the devices comes disconnected in the high-vibration environment of the plane; and a note that electronic record-keeping promises to be an improvement over previous methods, eliminating problems such as illegible handwriting and records ...

  10. Introduction to Command, Control and Communications (C3) Through Comparative Case Analysis

    DTIC Science & Technology

    1990-03-01

    ... enhancing the process of learning from experience. Case study allows the student to apply concepts, theories, and techniques to an actual incident within ... part of the thesis describes selected principles and concepts of C3 related to communication management, interoperability, command structure and ... The solutions to the cases require applying the principles and concepts presented in the first part. The four cases are: (1) the Iran hostage rescue ...

  11. Human Interoperability: Experimentation to Understand & Improve the Human Component of Complex Systems

    DTIC Science & Technology

    2008-09-30

    Only fragments of this report are recoverable: a note that privacy laws (the Health Insurance Portability and Accountability Act of 1996) prohibit health care providers from sharing certain information about patients, and a reference to the Posse Comitatus Act (1878) ...; plus citation fragments: Carley, K.M., and Svoboda, D.M. (1996), Modeling Organizational Adaptation as a Simulated Annealing Process, Sociological ...; Privacy: Interdisciplinary Frameworks and Solutions, Hershey, PA: IGI Global; Salas, E., Sims, D.E., & Burke, C.S. (2005), Is there a big five in ...

  12. The Microbial Resource Research Infrastructure MIRRI: Strength through Coordination

    PubMed Central

    Stackebrandt, Erko; Schüngel, Manuela; Martin, Dunja; Smith, David

    2015-01-01

    Microbial resources have been recognized as essential raw materials for the advancement of health and later for biotechnology, agriculture, food technology and for research in the life sciences, as their enormous abundance and diversity offer an unparalleled source of unexplored solutions. Microbial domain biological resource centres (mBRC) provide live cultures and associated data to foster and support the development of basic and applied science in countries worldwide and especially in Europe, where the density of highly advanced mBRCs is high. The not-for-profit and distributed project MIRRI (Microbial Resource Research Infrastructure) aims to coordinate access to hitherto individually managed resources by developing a pan-European platform which takes the interoperability and accessibility of resources and data to a higher level. Providing a wealth of additional information and linking to datasets such as literature, environmental data, sequences and chemistry will enable researchers to select organisms suitable for their research and enable innovative solutions to be developed. The current independent policies and managed processes will be adapted by partner mBRCs to harmonize holdings, services, training, and accession policy and to share expertise. The infrastructure will improve access to enhanced quality microorganisms in an appropriate legal framework and to resource-associated data in a more interoperable way. PMID:27682123

  13. Integrating the healthcare enterprise in radiation oncology plug and play--the future of radiation oncology?

    PubMed

    Abdel-Wahab, May; Rengan, Ramesh; Curran, Bruce; Swerdloff, Stuart; Miettinen, Mika; Field, Colin; Ranjitkar, Sunita; Palta, Jatinder; Tripuraneni, Prabhakar

    2010-02-01

    To describe the processes and benefits of Integrating the Healthcare Enterprise in Radiation Oncology (IHE-RO). The IHE-RO process includes five basic steps. The first step is to identify common interoperability issues encountered in radiation treatment planning and the delivery process. IHE-RO committees then partner with vendors to develop solutions (integration profiles) to interoperability problems. The broad application of these integration profiles across a variety of vendor platforms is tested annually at the Connectathon event. Seamless integration and transfer of patient data are then demonstrated by vendors to potential users at the public demonstration event. Institutions can then incorporate completed integration profiles into requests for proposals and vendor contracts when purchasing new equipment. Vendors can publish IHE integration statements to document the integration profiles supported by their products. As a result, users can reference integration profiles in requests for proposals, simplifying the systems acquisition process. These IHE-RO solutions are now available in many of the commercial radiation oncology-related treatment planning, delivery, and information systems, and they are implemented at cancer care sites around the world. IHE-RO serves an important purpose for the radiation oncology community at large. Copyright 2010 Elsevier Inc. All rights reserved.

  14. The Microbial Resource Research Infrastructure MIRRI: Strength through Coordination.

    PubMed

    Stackebrandt, Erko; Schüngel, Manuela; Martin, Dunja; Smith, David

    2015-11-18

    Microbial resources have been recognized as essential raw materials for the advancement of health and later for biotechnology, agriculture, food technology and for research in the life sciences, as their enormous abundance and diversity offer an unparalleled source of unexplored solutions. Microbial domain biological resource centres (mBRC) provide live cultures and associated data to foster and support the development of basic and applied science in countries worldwide and especially in Europe, where the density of highly advanced mBRCs is high. The not-for-profit and distributed project MIRRI (Microbial Resource Research Infrastructure) aims to coordinate access to hitherto individually managed resources by developing a pan-European platform which takes the interoperability and accessibility of resources and data to a higher level. Providing a wealth of additional information and linking to datasets such as literature, environmental data, sequences and chemistry will enable researchers to select organisms suitable for their research and enable innovative solutions to be developed. The current independent policies and managed processes will be adapted by partner mBRCs to harmonize holdings, services, training, and accession policy and to share expertise. The infrastructure will improve access to enhanced quality microorganisms in an appropriate legal framework and to resource-associated data in a more interoperable way.

  15. Integrated multi-sensor package (IMSP) for unmanned vehicle operations

    NASA Astrophysics Data System (ADS)

    Crow, Eddie C.; Reichard, Karl; Rogan, Chris; Callen, Jeff; Seifert, Elwood

    2007-10-01

    This paper describes recent efforts to develop integrated multi-sensor payloads for small robotic platforms for improved operator situational awareness and ultimately for greater robot autonomy. The focus is on enhancements to perception through integration of electro-optic, acoustic, and other sensors for navigation and inspection. The goals are to provide easier control and operation of the robot through fusion of multiple sensor outputs, to improve interoperability of the sensor payload package across multiple platforms through the use of open standards and architectures, and to reduce integration costs by embedded sensor data processing and fusion within the sensor payload package. The solutions investigated in this project to be discussed include: improved capture, processing and display of sensor data from multiple, non-commensurate sensors; an extensible architecture to support plug and play of integrated sensor packages; built-in health, power and system status monitoring using embedded diagnostics/prognostics; sensor payload integration into standard product forms for optimized size, weight and power; and the use of the open Joint Architecture for Unmanned Systems (JAUS)/ Society of Automotive Engineers (SAE) AS-4 interoperability standard. This project is in its first of three years. This paper will discuss the applicability of each of the solutions in terms of its projected impact to reducing operational time for the robot and teleoperator.

  16. M and S supporting unmanned autonomous systems (UAxS) concept development and experimentation

    NASA Astrophysics Data System (ADS)

    Biagini, Marco; Scaccianoce, Alfio; Corona, Fabio; Forconi, Sonia; Byrum, Frank; Fowler, Olivia; Sidoran, James L.

    2017-05-01

    The development of the next generation of multi-domain unmanned semi- and fully autonomous C4ISR systems involves a multitude of security concerns and interoperability challenges. Conceptual solutions to capability shortfalls and gaps can be identified through Concept Development and Experimentation (CD&E) cycles. Modelling and Simulation (M&S) is a key tool in supporting unmanned autonomous systems (UAxS) CD&E activities and addressing the associated security challenges. This paper illustrates the application of M&S to UAxS development and highlights initiatives by the North Atlantic Treaty Organization (NATO) M&S Centre of Excellence (CoE) to facilitate interoperability. The NATO M&S CoE collaborates with other NATO and national bodies to develop UAxS projects such as the Allied Command Transformation Counter Unmanned Autonomous Systems (CUAxS) project and the work of Science and Technology Organization (STO) panels. Some initiatives, such as the Simulated Interactive Robotics Initiative (SIRI), formed the baseline for further developments and for studying emerging technologies in the M&S and robotics fields. Artificial intelligence algorithm modelling, Robot Operating Systems (ROS), network operations, cyber security, interoperable languages and related data models are some of the main aspects considered in this paper. In particular, the implementation of interoperable languages like C-BML and NIEM MilOps is discussed in relation to a Command and Control - Simulation Interoperability (C2SIM) paradigm. All these technologies are used to build a conceptual architecture to support UAxS CD&E. In addition, other projects in which the NATO M&S CoE is involved, such as the NATO Urbanization Project, could provide credible future operational environments and benefit UAxS project development, given the dual application of UAxS technology in large urbanized areas. In conclusion, this paper contains a detailed overview of how applying modelling and simulation to support CD&E activities is a valid approach to developing and validating future capability requirements in general and the next generation of UAxS in particular.

  17. Interoperability of medical device information and the clinical applications: an HL7 RMIM based on the ISO/IEEE 11073 DIM.

    PubMed

    Yuksel, Mustafa; Dogac, Asuman

    2011-07-01

    Medical devices are essential to the practice of modern healthcare services. Their benefits will increase if clinical software applications can seamlessly acquire medical device data. The need to represent medical device observations in a format that can be consumed by clinical applications has already been recognized by the industry. Yet the solutions proposed involve bilateral mappings from the ISO/IEEE 11073 Domain Information Model (DIM) to specific message or document standards. Considering that there are many different types of clinical applications, such as electronic health record and personal health record systems, clinical workflows, and clinical decision support systems, each conforming to different standard interfaces, detailing a mapping mechanism for every one of them requires significant work and thus limits the potential health benefits of medical devices. In this paper, to facilitate the interoperability of clinical applications and medical device data, we use the ISO/IEEE 11073 DIM to derive an HL7 v3 Refined Message Information Model (RMIM) of the medical device domain from the HL7 v3 Reference Information Model (RIM). This makes it possible to trace medical device data back to a standard common denominator, the HL7 v3 RIM, from which all the other medical domains under HL7 v3 are derived. Hence, once medical device data are obtained in the RMIM format, they can easily be transformed into HL7-based standard interfaces through XML transformations, because these interfaces all derive their building blocks from the same RIM. To demonstrate this, we provide mappings from the developed RMIM to some of the widely used HL7 v3-based standard interfaces.
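    The core claim, that RMIM-formatted device data can reach other HL7 v3 interfaces through plain XML transformation, can be sketched with lxml and a trivial, invented stylesheet (not the paper's actual mappings):

    ```python
    # Minimal XSLT transformation sketch: map an invented device 'observation'
    # document onto an invented target structure.
    from lxml import etree

    xslt = etree.XML(b"""
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <!-- invented elements: copy the numeric reading into a target 'value' -->
      <xsl:template match="/observation">
        <result><value><xsl:value-of select="numeric"/></value></result>
      </xsl:template>
    </xsl:stylesheet>
    """)
    transform = etree.XSLT(xslt)

    device_doc = etree.XML(b"<observation><numeric>72</numeric></observation>")
    print(etree.tostring(transform(device_doc), pretty_print=True).decode())
    ```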

  18. A cryptographic key management solution for HIPAA privacy/security regulations.

    PubMed

    Lee, W-B; Lee, C-D

    2008-01-01

    The Health Insurance Portability and Accountability Act (HIPAA) privacy and security regulations are two crucial provisions in the protection of healthcare privacy. The privacy regulations create a principle assuring that patients have more control over their health information and set limits on the use and disclosure of health information. The security regulations stipulate the provisions implemented to guard data integrity, confidentiality, and availability. Undoubtedly, cryptographic mechanisms are well suited to providing such protection. In this paper, to comply with the HIPAA regulations, a flexible cryptographic key management solution is proposed to facilitate interoperation among the applied cryptographic mechanisms. In addition, cases of consent exceptions, intended to facilitate emergency applications, and other possible exceptions can also be handled easily.
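    Purely as an illustration of flexible key management (not the scheme proposed in the paper), per-record keys can be derived on demand from a master key with HKDF, so that no key-per-record storage is needed and emergency re-derivation remains possible:

    ```python
    # Derive per-record keys from a master key; deterministic derivation means
    # keys can be re-derived (or escrowed for emergency access) rather than stored.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    master_key = os.urandom(32)   # in practice: protected in an HSM or key store

    def record_key(patient_id: str, record_id: str) -> bytes:
        """Derive a 256-bit key bound to one patient record."""
        return HKDF(
            algorithm=hashes.SHA256(),
            length=32,
            salt=None,
            info=f"ehr:{patient_id}:{record_id}".encode(),
        ).derive(master_key)

    k1 = record_key("patient-42", "record-7")
    k2 = record_key("patient-42", "record-7")
    assert k1 == k2   # deterministic: re-derivable on demand
    ```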

  19. Mission Control Technologies: A New Way of Designing and Evolving Mission Systems

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Walton, Joan; Saddler, Harry

    2006-01-01

    Current mission operations systems are built as a collection of monolithic software applications. Each application serves the needs of a specific user base associated with a discipline or functional role. Built to accomplish specific tasks, each application embodies specialized functional knowledge and has its own data storage, data models, programmatic interfaces, user interfaces, and customized business logic. In effect, each application creates its own walled-off environment. While individual applications are sometimes reused across multiple missions, it is expensive and time consuming to maintain these systems, and both costly and risky to upgrade them in the light of new requirements or modify them for new purposes. It is even more expensive to achieve new integrated activities across a set of monolithic applications. These problems impact the lifecycle cost (especially design, development, testing, training, maintenance, and integration) of each new mission operations system. They also inhibit system innovation and evolution. This in turn hinders NASA's ability to adopt new operations paradigms, including increasingly automated space systems, such as autonomous rovers, autonomous onboard crew systems, and integrated control of human and robotic missions. Hence, in order to achieve NASA's vision affordably and reliably, we need to consider and mature new ways to build mission control systems that overcome the problems inherent in systems of monolithic applications. The keys to the solution are modularity and interoperability. Modularity will increase extensibility (evolution), reusability, and maintainability. Interoperability will enable composition of larger systems out of smaller parts, and enable the construction of new integrated activities that tie together, at a deep level, the capabilities of many of the components. Modularity and interoperability together contribute to flexibility. The Mission Control Technologies (MCT) Project, a collaboration of multiple NASA Centers, led by NASA Ames Research Center, is building a framework to enable software to be assembled from flexible collections of components and services.

  20. Supply Chain Interoperability Measurement

    DTIC Science & Technology

    2015-06-19

    Supply Chain Interoperability Measurement. Dissertation, June 2015. Christos E. Chalyvidis, Major, Hellenic Air Force (BS, MSc). Report ENS-DS-15-J-001, presented to the Faculty of the Department of Operational Sciences. Committee chair: Dr. A.W. Johnson.

  1. SEnviro: a sensorized platform proposal using open hardware and open standards.

    PubMed

    Trilles, Sergio; Luján, Alejandro; Belmonte, Óscar; Montoliu, Raúl; Torres-Sospedra, Joaquín; Huerta, Joaquín

    2015-03-06

    The need for constant monitoring of environmental conditions has produced an increase in the development of wireless sensor networks (WSN). The drive towards smart cities has produced the need for smart sensors to be able to monitor what is happening in our cities. This, combined with the decrease in hardware component prices and the increase in the popularity of open hardware, has favored the deployment of sensor networks based on open hardware. The new trends in Internet Protocol (IP) communication between sensor nodes allow sensor access via the Internet, turning them into smart objects (Internet of Things and Web of Things). Currently, WSNs provide data in different formats. There is a lack of communication protocol standardization, which turns into interoperability issues when connecting different sensor networks or even when connecting different sensor nodes within the same network. This work presents a sensorized platform proposal that adheres to the principles of the Internet of Things and the Web of Things. Wireless sensor nodes were built using open hardware solutions, and communications rely on the HTTP/IP Internet protocols. The Open Geospatial Consortium (OGC) SensorThings API candidate standard was used as a neutral format to avoid interoperability issues. An environmental WSN developed following the proposed architecture was built as a proof of concept. Details on how to build each node and a study regarding energy concerns are presented.
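    As a hedged sketch of the neutral-format access the platform aims for, a client could read recent observations through a SensorThings v1.0-style service as below; the endpoint URL and datastream id are placeholders:

    ```python
    # Read the latest observations of one datastream via the OGC SensorThings
    # API v1.0 REST conventions ($top/$orderby query options, JSON responses).
    import requests

    STA_URL = "https://example.org/SensorThings/v1.0"   # hypothetical endpoint

    resp = requests.get(
        f"{STA_URL}/Datastreams(1)/Observations",       # placeholder datastream
        params={"$top": 5, "$orderby": "phenomenonTime desc"},
        timeout=30,
    )
    resp.raise_for_status()
    for obs in resp.json()["value"]:
        print(obs["phenomenonTime"], obs["result"])
    ```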

  2. SEnviro: A Sensorized Platform Proposal Using Open Hardware and Open Standards

    PubMed Central

    Trilles, Sergio; Luján, Alejandro; Belmonte, Óscar; Montoliu, Raúl; Torres-Sospedra, Joaquín; Huerta, Joaquín

    2015-01-01

    The need for constant monitoring of environmental conditions has produced an increase in the development of wireless sensor networks (WSN). The drive towards smart cities has produced the need for smart sensors to be able to monitor what is happening in our cities. This, combined with the decrease in hardware component prices and the increase in the popularity of open hardware, has favored the deployment of sensor networks based on open hardware. The new trends in Internet Protocol (IP) communication between sensor nodes allow sensor access via the Internet, turning them into smart objects (Internet of Things and Web of Things). Currently, WSNs provide data in different formats. There is a lack of communication protocol standardization, which turns into interoperability issues when connecting different sensor networks or even when connecting different sensor nodes within the same network. This work presents a sensorized platform proposal that adheres to the principles of the Internet of Things and the Web of Things. Wireless sensor nodes were built using open hardware solutions, and communications rely on the HTTP/IP Internet protocols. The Open Geospatial Consortium (OGC) SensorThings API candidate standard was used as a neutral format to avoid interoperability issues. An environmental WSN developed following the proposed architecture was built as a proof of concept. Details on how to build each node and a study regarding energy concerns are presented. PMID:25756864

  3. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study.

    PubMed

    Zhou, Yuan; Ancker, Jessica S; Upadhye, Mandar; McGeorge, Nicolette M; Guarrera, Theresa K; Hegde, Sudeep; Crane, Peter W; Fairbanks, Rollin J; Bisantz, Ann M; Kaushal, Rainu; Lin, Li

    2013-01-01

    The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No work to date, however, has examined the effect of interoperability quantitatively using discrete event simulation techniques. Our objective was to estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. This work suggests that the impact of HIT on clinical and nonclinical staff work efficiency varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
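    A toy SimPy model conveys the kind of discrete-event question the study asks, comparing provider time per task under interoperable versus paper-based workflows; all rates and task times are invented for illustration:

    ```python
    # Toy discrete-event comparison of lab-review time with and without
    # interoperable data. Task times are assumptions, not the study's data.
    import random
    import simpy

    LAB_REVIEW_MIN = {"interoperable": 2.0, "paper": 6.0}   # assumed mean minutes

    def patient(env, physician, mode, log):
        with physician.request() as req:
            yield req
            service = random.expovariate(1.0 / LAB_REVIEW_MIN[mode])
            yield env.timeout(service)
            log.append(service)

    def run(mode, n_patients=100, seed=1):
        random.seed(seed)
        env = simpy.Environment()
        physician = simpy.Resource(env, capacity=1)
        log = []
        for _ in range(n_patients):
            env.process(patient(env, physician, mode, log))
        env.run()
        return sum(log) / len(log)

    for mode in ("interoperable", "paper"):
        print(mode, round(run(mode), 2), "min average review time")
    ```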

  4. A Scalable QoS-Aware VoD Resource Sharing Scheme for Next Generation Networks

    NASA Astrophysics Data System (ADS)

    Huang, Chenn-Jung; Luo, Yun-Cheng; Chen, Chun-Hua; Hu, Kai-Wen

    In the network-aware concept, applications are aware of network conditions and adapt to the varying environment to achieve acceptable and predictable performance. In this work, a solution for video-on-demand service that integrates wireless and wired networks using network-aware concepts is proposed to reduce the blocking and dropping probabilities of mobile requests. A fuzzy logic inference system is employed to select appropriate cache relay nodes to cache published video streams and distribute them to different peers through a service-oriented architecture (SOA). The SIP-based control protocol and the IMS standard are adopted to enable heterogeneous communication and to provide a framework for delivering real-time multimedia services over an IP-based network, ensuring interoperability, roaming, and end-to-end session management. The experimental results demonstrate the effectiveness and practicability of the proposed work.

  5. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    PubMed

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extending a standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.
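    The multilayered validation idea can be sketched as an XSD pass for the base standard followed by metadata-derived checks for the extension; the file names and rule set below are hypothetical:

    ```python
    # Layer 1: syntactic validity against the base (e.g. CCR) schema.
    # Layer 2: metadata-registry rules, reduced here to required extension
    # elements as a stand-in for the semantic checks described in the paper.
    from lxml import etree

    def validate(xml_path: str, xsd_path: str, required_extensions: list[str]) -> bool:
        doc = etree.parse(xml_path)
        schema = etree.XMLSchema(etree.parse(xsd_path))
        if not schema.validate(doc):
            print("schema errors:", schema.error_log)
            return False
        for tag in required_extensions:
            if doc.find(f".//{tag}") is None:
                print("missing extension element:", tag)
                return False
        return True

    # Hypothetical usage with invented file names and rule:
    # print(validate("record.xml", "ccr.xsd", ["ExtendedMedicationNote"]))
    ```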

  6. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  7. 80 FR 46010 - Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2015-08-03

    ...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...

  8. Electronic health record interoperability as realized in the Turkish health information system.

    PubMed

    Dogac, A; Yuksel, M; Avci, A; Ceyhan, B; Hülür, U; Eryilmaz, Z; Mollahaliloglu, S; Atbakan, E; Akdag, R

    2011-01-01

    The objective of this paper is to describe the techniques used in developing the National Health Information System of Turkey (NHIS-T), a nation-wide infrastructure for sharing electronic health records (EHRs). The UN/CEFACT Core Components Technical Specification (CCTS) methodology was applied to design the logical EHR structure and to increase the reuse of common information blocks in EHRs. The NHIS-T became operational on January 15, 2009. By June 2010, 99% of the public hospitals and 71% of the private and university hospitals were connected to NHIS-T with daily feeds of their patients' EHRs. Out of the 72 million citizens of Turkey, electronic healthcare records of 43 million citizens have already been created in NHIS-T. Currently, only general practitioners can access the EHRs of their patients. In the second phase of the implementation, once the legal framework is completed, proper patient consent mechanisms will be made available through the personal health record system that is under development; at that point, authorized healthcare professionals in secondary and tertiary healthcare will also be able to access patients' EHRs. A number of factors affected the successful implementation of NHIS-T. First, all stakeholders had to adopt the specified standards. Second, the UN/CEFACT CCTS approach facilitated the development and understanding of rather complex EHR schemas. Finally, comprehensive testing of vendor-based hospital information systems for their conformance to and interoperability with NHIS-T, through an automated testing platform, substantially enhanced the fast integration of vendor-based solutions with NHIS-T.

  9. Ocean Data Interoperability Platform: developing a common global framework for marine data management

    NASA Astrophysics Data System (ADS)

    Glaves, Helen; Schaap, Dick

    2017-04-01

    In recent years there has been a paradigm shift in marine research, moving from the traditional discipline-based methodology employed at the national level by one or more organizations to a multidisciplinary, ecosystem-level approach conducted on an international scale. This increasingly holistic approach to marine research is in part driven by policy and legislation. For example, the European Commission's Blue Growth strategy promotes sustainable growth in the marine environment, including the development of sea-basin strategies (European Commission 2014). Alongside this policy-driven shift to ecosystem-level marine research, there are also scientific and economic drivers for a basin-level approach. Marine monitoring is essential for assessing the health of an ecosystem and determining the impacts of specific factors and activities on it. The availability of large volumes of good-quality data is fundamental to this increasingly holistic approach to ocean research, but there are significant barriers to its re-use. These are due to the heterogeneity of the data, which have been collected by many organizations around the globe using a variety of sensors mounted on a range of different platforms. The data are then delivered and archived in a range of formats, using various spatial coordinate systems and aligned with different standards. This heterogeneity, coupled with organizational and national policies on data sharing, makes access to and re-use of marine data problematic. In response to the need for greater sharing of marine data, a number of e-infrastructures have been developed, but these have different levels of granularity, with the majority developed at the regional level to address specific requirements for data, e.g. SeaDataNet in Europe and the Australian Ocean Data Network (AODN). These data infrastructures are frequently aligned with the priorities of the local funding agencies and have been created in isolation from those developed elsewhere. To add a further layer of complexity, there are also global initiatives providing marine data infrastructures, e.g. IOC-IODE and POGO, as well as initiatives with a wider remit that includes environmental data, e.g. GEOSS and COPERNICUS. Ecosystem-level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together the organisations responsible for maintaining selected regional data infrastructures, along with other relevant experts, to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures so that interoperability can be established between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.

  10. International Convergence on Geoscience Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Allison, M. L.; Atkinson, R.; Arctur, D. K.; Cox, S.; Jackson, I.; Nativi, S.; Wyborn, L. A.

    2012-04-01

    There is growing international consensus on addressing the challenges to cyber(e)-infrastructure for the geosciences. These challenges include: Creating common standards and protocols; Engaging the vast number of distributed data resources; Establishing practices for recognition of and respect for intellectual property; Developing simple data and resource discovery and access systems; Building mechanisms to encourage development of web service tools and workflows for data analysis; Brokering the diverse disciplinary service buses; Creating sustainable business models for maintenance and evolution of information resources; Integrating the data management life-cycle into the practice of science. Efforts around the world are converging towards de facto creation of an integrated global digital data network for the geosciences based on common standards and protocols for data discovery and access, and a shared vision of distributed, web-based, open source interoperable data access and integration. Commonalities include use of Open Geospatial Consortium (OGC) and ISO specifications and standardized data interchange mechanisms. For multidisciplinarity, mediation, adaptation, and profiling services have been successfully introduced to leverage the geosciences standards which are commonly used by the different geoscience communities -introducing a brokering approach which extends the basic SOA archetype. Principal challenges are less technical than cultural, social, and organizational. Before we can make data interoperable, we must make people interoperable. These challenges are being met by increased coordination of development activities (technical, organizational, social) among leaders and practitioners in national and international efforts across the geosciences to foster commonalities across disparate networks. In doing so, we will 1) leverage and share resources, and developments, 2) facilitate and enhance emerging technical and structural advances, 3) promote interoperability across scientific domains, 4) support the promulgation and institutionalization of agreed-upon standards, protocols, and practice, and 5) enhance knowledge transfer not only across the community, but into the domain sciences, 6) lower existing entry barriers for users and data producers, 7) build on the existing disciplinary infrastructures leveraging their service buses. . All of these objectives are required for establishing a permanent and sustainable cyber(e)-infrastructure for the geosciences. The rationale for this approach is well articulated in the AuScope mission statement: "Many of these problems can only be solved on a national, if not global scale. No single researcher, research institution, discipline or jurisdiction can provide the solutions. We increasingly need to embrace e-Research techniques and use the internet not only to access nationally distributed datasets, instruments and compute infrastructure, but also to build online, 'virtual' communities of globally dispersed researchers." Multidisciplinary interoperability can be successfully pursued by adopting a "system of systems" or a "Network of Networks" philosophy. This approach aims to: (a) supplement but not supplant systems mandates and governance arrangements; (b) keep the existing capacities as autonomous as possible; (c) lower entry barriers; (d) Build incrementally on existing infrastructures (information systems); (e) incorporate heterogeneous resources by introducing distribution and mediation functionalities. 
This approach has been adopted by the European INSPIRE (Infrastructure for Spatial Information in the European Community) initiative and by the international GEOSS (Global Earth Observation System of Systems) programme.

  11. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...

  12. 81 FR 68435 - Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2016-10-04

    ...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...

  13. Towards a single seismological service infrastructure in Europe

    NASA Astrophysics Data System (ADS)

    Spinuso, A.; Trani, L.; Frobert, L.; Van Eck, T.

    2012-04-01

    In the last five years, services and data providers within the seismological community in Europe have focused their efforts on migrating their archives towards a Service Oriented Architecture (SOA). This process pragmatically follows technological trends and available solutions, aiming to effectively improve all data stewardship activities. These advancements are possible thanks to the cooperation and follow-ups of several EC infrastructural projects that, by looking at general-purpose techniques, combine their developments, envisioning a multidisciplinary platform for earth observation as the final common objective (EPOS, the European Plate Observing System). One of the first results of this effort is the Earthquake Data Portal (http://www.seismicportal.eu), which provides a collection of tools to discover, visualize and access a variety of seismological data sets, such as seismic waveforms, accelerometric data, earthquake catalogs and parameters. The Portal offers a cohesive distributed search environment, linking data search and access across multiple data providers through interactive web services, map-based tools and diverse command-line clients. Our work continues under other EU FP7 projects; here we address initiatives in two of them. The NERA project (Network of European Research Infrastructures for Earthquake Risk Assessment and Mitigation) will implement a Common Services Architecture based on OGC service APIs, in order to provide Resource-Oriented common interfaces across the data access and processing services. This will improve interoperability between tools and across projects, enabling the development of higher-level applications that can uniformly access the data and processing services of all participants. This effort will be conducted jointly with the VERCE project (Virtual Earthquake and Seismology Research Community for Europe). VERCE aims to enable seismologists to exploit the wealth of seismic data within a data-intensive computation framework tailored to the specific needs of the community. It will provide a new interoperable infrastructure as the computational backbone lying behind the publicly available interfaces. VERCE will have to face the challenges of implementing a service-oriented architecture that provides an efficient layer between the data and Grid infrastructures, coupling HPC data analysis and HPC data modeling applications through the execution of workflows and data sharing mechanisms. Online registries of interoperable workflow components, storage of intermediate results, and data provenance are the aspects currently under investigation to make the VERCE facilities usable by a wide range of users, data providers and service providers. For these purposes, the adoption of a Digital Object Architecture to create online catalogs referencing and semantically describing all these distributed resources, such as datasets, computational processes and derivative products, is seen as one of the viable solutions to monitor and steer the usage of the infrastructure, increasing its efficiency and the cooperation within the community.

  14. A Management Model for International Participation in Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    George, Patrick J.; Pease, Gary M.; Tyburski, Timothy E.

    2005-01-01

    This paper proposes an engineering management model for NASA's future space exploration missions based on past experiences working with the International Partners of the International Space Station. The authors have over 25 years of combined experience working with the European Space Agency, Japan Aerospace Exploration Agency, Canadian Space Agency, Italian Space Agency, Russian Space Agency, and their respective contractors in the design, manufacturing, verification, and integration of their elements' electric power systems into the United States on-orbit segment. The perspective presented is one from a specific subsystem integration role and is offered so that the lessons learned from solving issues of a technical and cultural nature may be taken into account during the formulation of international partnerships. Descriptions of the types of unique problems encountered in interactions among international partners are reviewed. Solutions to the problems are offered, taking into consideration the technical implications. Through the process of investigating each solution, the important and significant issues associated with working with international engineers and managers are outlined. Potential solutions are then characterized by proposing a set of specific methodologies to jointly develop spacecraft configurations that benefit all international participants and maximize mission success and vehicle interoperability while minimizing cost.

  15. Transformation of standardized clinical models based on OWL technologies: from CEM to OpenEHR archetypes

    PubMed Central

    Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui

    2015-01-01

    Introduction The semantic interoperability of electronic healthcare records (EHRs) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, which is illustrated for the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Methods Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. Results We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. Conclusions We have been able to transform CEM into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems. PMID:25670753
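
    To make the transformation idea concrete, below is a minimal sketch, assuming hypothetical ontology IRIs and a single hard-coded mapping rule, of the kind of OWL-level model-to-model mapping the paper describes. The real method uses reusable transformation templates over the published CEM and openEHR ontologies and OWL reasoning; this sketch only illustrates the common-representation step with rdflib.

```python
# Minimal sketch of an OWL/RDF-based model transformation in the spirit of the
# CEM-to-openEHR approach. All IRIs, file names and the mapping rule below are
# hypothetical placeholders, not the paper's actual templates.
from rdflib import Graph, Namespace, RDF

CEM = Namespace("http://example.org/cem#")        # hypothetical CEM ontology
EHR = Namespace("http://example.org/openehr#")    # hypothetical openEHR ontology

source = Graph().parse("cem_instances.ttl")       # CEM content expressed in RDF
target = Graph()
target.bind("ehr", EHR)

# Mapping template: every cem:Observation becomes an openehr:OBSERVATION,
# carrying its coded name and value across to the target model.
for obs in source.subjects(RDF.type, CEM.Observation):
    target.add((obs, RDF.type, EHR.OBSERVATION))
    for _, _, name in source.triples((obs, CEM.key, None)):
        target.add((obs, EHR.name, name))
    for _, _, value in source.triples((obs, CEM.data, None)):
        target.add((obs, EHR.value, value))

target.serialize("openehr_instances.ttl", format="turtle")
```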

  16. Gaps Analysis of Integrating Product Design, Manufacturing, and Quality Data in The Supply Chain Using Model-Based Definition.

    PubMed

    Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil

    2016-01-01

    Advances in information technology have triggered a digital revolution that holds the promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D) shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI, and, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The ongoing research detailed in this paper investigates three concepts. First, that using a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, it identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based data interoperability in pursuit of the MBE vision. Third, it explores the interaction between CAD and CMM processes and determines whether feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, these gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method for model-based data interoperability, which would provide maximum value and impact to industry.

  17. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    DTIC Science & Technology

    2012-07-01

    interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that...IOPs need to specify; however, the mobility- and actuation-related interoperable messages can be used to provide stair climbing; also...interoperability can enable management of different poses or modes, one of which may be stair climbing.

  18. Distributed Multi-interface Catalogue for Geospatial Data

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.; Boldrini, E.

    2007-12-01

    Several geosciences communities (e.g. atmospheric science, oceanography, hydrology) have developed tailored data and metadata models and service protocol specifications for enabling online data discovery, inventory, evaluation, access and download. These specifications are conceived either by profiling geospatial information standards or by extending well-accepted geosciences data models and protocols in order to capture more semantics. These artifacts have generated a set of related catalog and inventory services characterizing different communities, initiatives and projects. In fact, these geospatial data catalogs are discovery and access systems that use metadata as the target for queries on geospatial information. The indexed and searchable metadata provide a disciplined vocabulary against which intelligent geospatial search can be performed within or among communities. There is a clear need to conceive and achieve solutions that implement interoperability among geosciences communities, in the context of the more general geospatial information interoperability framework. Such solutions should provide search and access capabilities across catalogs, inventory lists and their registered resources. Thus, the development of catalog clearinghouse solutions is a near-term challenge in support of fully functional and useful infrastructures for spatial data (e.g. INSPIRE, GMES, NSDI, GEOSS). This implies the implementation of components for query distribution and virtual resource aggregation. These solutions must implement distributed discovery functionalities in a heterogeneous environment, requiring metadata profile harmonization as well as protocol adaptation and mediation. We present a catalog clearinghouse solution for the interoperability of several well-known cataloguing systems (e.g. OGC CSW, THREDDS catalog and data services). The solution implements consistent resource discovery and evaluation over a dynamic federation of several well-known cataloguing and inventory systems. Prominent features include: 1) support for distributed queries over a hierarchical data model, including incremental queries (i.e. queries over collections, to be subsequently refined) and opaque/translucent chaining; 2) support for several client protocols through a compound front-end interface module, which makes it possible to accommodate a (growing) number of cataloguing standards, or profiles thereof, including the OGC CSW interface, the ebRIM Application Profile (for Core ISO Metadata and other data models), and the ISO Application Profile. The presented catalog clearinghouse supports both the opaque and the translucent pattern for service chaining: it may be configured either to completely hide the underlying federated services or to provide clients with service information. In both cases, the clearinghouse presents a higher-level interface (i.e. OGC CSW) which harmonizes multiple lower-level services (e.g. OGC CSW, WMS and WCS, THREDDS, etc.) and handles all control of and interaction with them. In the translucent case, the client has the option to directly access the lower-level services (e.g. to improve performance). In the GEOSS context, the solution has been tested both as a stand-alone user application and as a service framework. The first scenario allows a user to download a multi-platform client and query a federation of cataloguing systems, which can be customized at will. The second scenario supports server-side deployment and can be flexibly adapted to several use cases, such as intranet proxy, catalog broker, etc.
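
    The core clearinghouse behavior, fanning one query out to heterogeneous catalogs and merging harmonized results, can be sketched as below. This is a minimal illustration under stated assumptions: the endpoints, the per-protocol query functions, and the record shape are all hypothetical placeholders, not the actual OGC CSW or THREDDS client code.

```python
# Sketch of the clearinghouse idea: one front-end query is distributed to a
# federation of heterogeneous catalogues and the answers are merged.
from concurrent.futures import ThreadPoolExecutor

def query_csw(endpoint, keywords):
    # placeholder for a real OGC CSW GetRecords request
    return [{"title": f"csw result for {keywords}", "source": endpoint}]

def query_thredds(endpoint, keywords):
    # placeholder for crawling a real THREDDS catalog
    return [{"title": f"thredds result for {keywords}", "source": endpoint}]

FEDERATION = [
    (query_csw, "https://catalog-a.example.org/csw"),
    (query_thredds, "https://data-b.example.org/thredds/catalog.xml"),
]

def distributed_search(keywords):
    """Fan the query out to every federated catalogue and merge the answers."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(fn, url, keywords) for fn, url in FEDERATION]
        results = []
        for future in futures:
            try:
                results.extend(future.result(timeout=30))
            except Exception:
                pass  # a translucent broker could instead report the failure
    return results

print(distributed_search("sea surface temperature"))
```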

  19. An Optimized, Data Distribution Service-Based Solution for Reliable Data Exchange Among Autonomous Underwater Vehicles

    PubMed Central

    Bilbao, Sonia; Martínez, Belén; Frasheri, Mirgita; Cürüklü, Baran

    2017-01-01

    Major challenges are presented when managing a large number of heterogeneous vehicles that have to communicate underwater in order to complete a global mission in a cooperative manner. In this kind of application domain, sending data through the environment presents issues that surpass the ones found in other overwater, distributed, cyber-physical systems (i.e., low bandwidth, an unreliable transport medium, and high heterogeneity in data representation and hardware). This manuscript presents a Publish/Subscribe-based semantic middleware solution for unreliable scenarios and vehicle interoperability across cooperative and heterogeneous autonomous vehicles. The middleware relies on different iterations of the Data Distribution Service (DDS) software standard and their combined work between autonomous maritime vehicles and a control entity. It also uses several components with different functionalities deemed mandatory for a semantic middleware architecture oriented to maritime operations (device and service registration, context awareness, access to the application layer), where other technologies are also interwoven with the middleware (wireless communications, acoustic networks). Implementation details and test results, both in a laboratory and in a deployment scenario, are provided as a way to assess the quality of the system and its satisfactory performance. PMID:28783049

  20. Behavioral Reference Model for Pervasive Healthcare Systems.

    PubMed

    Tahmasbi, Arezoo; Adabi, Sahar; Rezaee, Ali

    2016-12-01

    The emergence of mobile healthcare systems is an important outcome of applying pervasive computing concepts for medical care purposes. These systems provide the facilities and infrastructure required for automatic and ubiquitous sharing of medical information. Healthcare systems have a dynamic structure and configuration; therefore, having an architecture is essential for the future development of these systems. The need for increased response rates, the problem of limited storage, accelerated processing, and the tendency toward creating a new generation of healthcare system architectures highlight the need for further focus on cloud-based solutions to data transfer and data processing challenges. Integrity and reliability of healthcare systems are of critical importance, as even the slightest error may put patients' lives in danger; therefore, acquiring a behavioral model for these systems and developing the tools required to model their behaviors are of significant importance. High-level designs may contain flaws, so the system must be fully examined under different scenarios and conditions. This paper presents a software architecture for the development of healthcare systems based on pervasive computing concepts, and then models the behavior of the described system. A set of solutions is then proposed to improve the design's qualitative characteristics, including availability, interoperability and performance.

  1. An Optimized, Data Distribution Service-Based Solution for Reliable Data Exchange Among Autonomous Underwater Vehicles.

    PubMed

    Rodríguez-Molina, Jesús; Bilbao, Sonia; Martínez, Belén; Frasheri, Mirgita; Cürüklü, Baran

    2017-08-05

    Major challenges are presented when managing a large number of heterogeneous vehicles that have to communicate underwater in order to complete a global mission in a cooperative manner. In this kind of application domain, sending data through the environment presents issues that surpass the ones found in other overwater, distributed, cyber-physical systems (i.e., low bandwidth, an unreliable transport medium, and high heterogeneity in data representation and hardware). This manuscript presents a Publish/Subscribe-based semantic middleware solution for unreliable scenarios and vehicle interoperability across cooperative and heterogeneous autonomous vehicles. The middleware relies on different iterations of the Data Distribution Service (DDS) software standard and their combined work between autonomous maritime vehicles and a control entity. It also uses several components with different functionalities deemed mandatory for a semantic middleware architecture oriented to maritime operations (device and service registration, context awareness, access to the application layer), where other technologies are also interwoven with the middleware (wireless communications, acoustic networks). Implementation details and test results, both in a laboratory and in a deployment scenario, are provided as a way to assess the quality of the system and its satisfactory performance.
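
    The decoupling that Publish/Subscribe buys in this setting can be shown with a toy sketch. This is not DDS (which adds QoS policies, discovery, and a wire protocol via vendor libraries); it is only an in-memory illustration of the topic-based pattern, with invented topic names and payloads.

```python
# Toy, in-memory illustration of topic-based Publish/Subscribe: vehicles and
# the control entity exchange samples through topics without knowing each
# other. Real DDS adds QoS, discovery and an interoperable wire protocol.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, sample):
        for callback in self.subscribers[topic]:
            callback(sample)

broker = Broker()
# The control entity listens for position reports from any vehicle.
broker.subscribe("auv/position", lambda s: print("control entity got", s))
# An AUV publishes a data sample without knowing who consumes it.
broker.publish("auv/position", {"vehicle": "auv-1", "lat": 43.3, "lon": -8.4})
```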

  2. Systems Harmonization and Convergence - the GIGAS Approach

    NASA Astrophysics Data System (ADS)

    Marchetti, P. G.; Biancalana, A.; Coene, Y.; Uslander, T.

    2009-04-01

    0.1 Background. The GIGAS Support Action promotes the coherent and interoperable development of the GMES, INSPIRE and GEOSS initiatives through their concerted adoption of standards, protocols, and open architectures. 0.2 Preparing for Coordinated Data Access. The GMES Coordinated Data Access System is under design and implementation. This objective has motivated the definition of interoperability standards between the contributing missions. The following elements have been addressed, with associated papers submitted to OGC: EO product metadata has been based on the OGC Geography Markup Language, addressing sensor characteristics for optical, radar and atmospheric products. Collection and service discovery: an ISO extension package for CSW ebRIM has been proposed. Catalogue Service (CSW): an Earth Observation extension package of CSW ebRIM has been proposed. Feasibility analysis and order: an Order interface control document and an Earth Observation profile of the Sensor Planning Service have been proposed. Online data access: an Earth Observation profile of the Web Map Service (WMS) for visualization and evaluation purposes has been proposed. Identity (user) management: the long-term objective is to allow a single sign-on to the Coordinated Data Access system by users registered in the various Earth Observation ground segments, providing a federated identity across participating ground segments and exploiting OASIS standards. 0.3 The GIGAS proposed harmonization approach. The approach proposed by GIGAS is based on three elements: technology watch, comparative analysis, and shaping of initiatives and standards. This paper concentrates on the methodology for technology watch and comparative analysis. The complexity of the GIGAS scenario, involving huge systems (GEOSS, INSPIRE, GMES, etc.), entails interaction with different heterogeneous partners, each with specific competences, expertise and know-how. 0.3.1 Technology watch. The methodology proposed is based on an RM-ODP based study supported by interoperability use cases and scenarios used to derive requirements. GIGAS will monitor: the evolution of INSPIRE, GMES and GEOSS, analyzing the requirements, standards, services, architecture, models, processes and consensus mechanisms of each against the same elements of the other systems under analysis; activities in the field of standards development that are part of the three initiatives, providing the basis for how these initiatives will strategically support consensus and efficient standards development going forward; and the architecture, specifications, innovative concepts and software developments of past or ongoing FP6/FP7 research topics. The RM-ODP approach is selected because most of the architectural approaches to be compared are based on RM-ODP, it supports distributed processing, it aims at fostering interoperability across heterogeneous systems, and it tries to hide distribution from systems developers. However, as most of the systems to be considered have the characteristics of a loosely-coupled network of systems and services rather than a "distributed processing system based on interacting objects", the RM-ODP concepts are tailored to the GIGAS needs. The usage of RM-ODP for GIGAS requirements and technology watch is two-fold. Architectural analysis: performed for all projects and initiatives, its purpose is to identify possibilities, but also major obstacles, for interoperability; furthermore, it identifies the major use cases to be analysed in more detail. Use case implementation analysis: used to describe how selected use cases of the projects and initiatives are implemented in the different architectures; its purpose is to identify technological gaps and concrete interoperability problems, and it is performed only for selected use cases. The output of the technology watch is an RM-ODP based report containing parallel analyses of the same aspects of the three initiatives, integrated with analysis of relevant FP6/FP7 projects and standardization activities. 0.3.2 Comparative analysis. Based on the outcomes of the previous monitoring tasks, GIGAS undertakes a comparative analysis of the solutions, requirements, architectures, models, processes and consensus mechanisms used by INSPIRE, GMES and GEOSS, taking into account the inputs from the monitoring of FP6/FP7 research projects and ongoing standardization activities. Initiative contact points will ensure that the overall policy framework and schedules for each of the three initiatives are factored in. The results of the comparative analysis include: a list of recommendations to GEOSS, INSPIRE and GMES to be expanded and processed in depth in the following shaping phase; the identification of technological gaps to be explored in the shaping phase; guidelines and objectives for the architectural approach within GIGAS; and an analysis of the schedules of the three initiatives and of the FP6/FP7 programs and standardization activities, with identification of key milestones and intervention points.

  3. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  4. Warfighter IT Interoperability Standards Study

    DTIC Science & Technology

    2012-07-22

    data (e.g. messages) between systems? ii) What process did you use to validate and certify semantic interoperability between your...other systems at this time. There was no requirement to validate and certify semantic interoperability. The DLS program exchanges data with... semantics. Testing for System Compliance with Data Models. Verify and Certify Interoperability Using Data

  5. MQ-4C Triton Unmanned Aircraft System (MQ-4C Triton)

    DTIC Science & Technology

    2015-12-01

    will respond to theater level operational or national strategic taskings. [The remainder of this excerpt is residue from the December 2015 Selected Acquisition Report performance table; the recoverable entries are: Level of Interoperability 1-5 (threshold: Level 2); BLOS and LOS operation from MOB/FOB (land-based) MCS; UA mission radius objective >=3,000 nm against a threshold of >=2,000 nm; and LOS/BLOS multi-ISR payload reception.]

  6. Arc-An OAI Service Provider for Digital Library Federation; Kepler-An OAI Data/Service Provider for the Individual; Information Objects and Rights Management: A Mediation-Based Approach to DRM Interoperability; Automated Name Authority Control and Enhanced Searching in the Levy Collection; Renardus Project Developments and the Wider Digital Library Context.

    ERIC Educational Resources Information Center

    Liu, Xiaoming; Maly, Kurt; Zubair, Mohammad; Nelson, Michael L.; Erickson, John S.; DiLauro, Tim; Choudhury, G. Sayeed; Patton, Mark; Warner, James W.; Brown, Elizabeth W.; Heery, Rachel; Carpenter, Leona; Day, Michael

    2001-01-01

    Includes five articles that discuss the OAI (Open Archive Initiative), an interface between data providers and service providers; information objects and digital rights management interoperability; digitizing library collections, including automated name authority control, metadata, and text searching engines; and building digital library services…

  7. Kalman approach to accuracy management for interoperable heterogeneous model abstraction within an HLA-compliant simulation

    NASA Astrophysics Data System (ADS)

    Leskiw, Donald M.; Zhau, Junmei

    2000-06-01

    This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
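
    For reference, the textbook Kalman equations the abstract builds on are sketched below in the paper's reading: the covariance encodes model accuracy, the measurement model transforms between levels of detail, and the update fuses disparate abstractions. The symbols are the usual Kalman quantities, not notation taken from the paper itself.

```latex
% Standard Kalman filter equations (textbook form):
\begin{align}
  x_{k+1} &= F_k x_k + w_k, \quad w_k \sim \mathcal{N}(0, Q_k)
            && \text{(system model: accuracy via } Q_k\text{)}\\
  z_k     &= H_k x_k + v_k, \quad v_k \sim \mathcal{N}(0, R_k)
            && \text{(measurement model: maps levels of detail)}\\
  K_k     &= P_k^{-} H_k^{\top}\bigl(H_k P_k^{-} H_k^{\top} + R_k\bigr)^{-1}
            && \text{(gain)}\\
  \hat{x}_k &= \hat{x}_k^{-} + K_k\bigl(z_k - H_k \hat{x}_k^{-}\bigr)
            && \text{(state update: fuses abstractions)}\\
  P_k     &= (I - K_k H_k)\,P_k^{-}
            && \text{(covariance update)}
\end{align}
```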

  8. Component Technology for High-Performance Scientific Simulation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epperly, T; Kohn, S; Kumfert, G

    2000-11-09

    We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.

  9. Expanding the scope of health information systems. Challenges and developments.

    PubMed

    Kuhn, K A; Wurst, S H R; Bott, O J; Giuse, D A

    2006-01-01

    To identify current challenges and developments in health information systems, reports on HIS, eHealth and process support were analyzed, and core problems and challenges were identified. Health information systems are extending their scope towards regional networks and health IT infrastructures. Integration, interoperability and interaction design are still today's core problems. Additional problems arise through the integration of genetic information into the health care process. There are noticeable trends towards solutions for these problems.

  10. Collaborative business processes for enhancing partnerships among software services providers

    NASA Astrophysics Data System (ADS)

    Heil Cancian, Maiara; Rabelo, Ricardo; Gresse von Wangenheim, Christiane

    2015-08-01

    Software services represent a powerful way to realise the service-oriented architecture (SOA) paradigm. Using open standards and facilitating systems projects, they have increasingly been used as a corporate architectural approach to create interoperable, services-based software solutions that can more easily be reused and shared across disparate applications. Most software companies are small firms that have enormous difficulty staying competitive. One strategy to enhance their sustainability is to expand partnerships among them at a more valuable level by jointly offering (web) services-based solutions. However, their culture of collaboration is weak, and partnerships are usually formed with the same companies and only sporadically. This article presents an approach to support more intense collaboration among software companies so they can respond to business opportunities in a more agile way, joining capacities and capabilities they would not have working alone. This requires, however, some preparedness: from the perspective of business processes, they should understand how to carry out a collaboration properly. That is essentially what this article is about. It presents a comprehensive list of collaborative business processes and base practices that can also act as a guide for service providers' managers to implement and manage collaboration along its lifecycle. The processes have been validated and results are discussed.

  11. Reviewing innovative Earth observation solutions for filling science-policy gaps in hydrology

    NASA Astrophysics Data System (ADS)

    Lehmann, Anthony; Giuliani, Gregory; Ray, Nicolas; Rahman, Kazi; Abbaspour, Karim C.; Nativi, Stefano; Craglia, Massimo; Cripe, Douglas; Quevauviller, Philippe; Beniston, Martin

    2014-10-01

    Improved data sharing is needed for hydrological modeling and water management, which require better integration of data, information and models. Technological advances in Earth observation and Web technologies have allowed the development of Spatial Data Infrastructures (SDIs) for improved data sharing at various scales. International initiatives catalyze data sharing by promoting interoperability standards to maximize the use of data and by supporting easy access to and utilization of geospatial data. A series of recent European projects is contributing to the promotion of innovative Earth observation solutions and the uptake of scientific outcomes in policy. Several success stories involving different hydrologists' communities can be reported around the world. Gaps still exist in hydrological, agricultural, meteorological and climatological data access because of various issues. While many sources of data exist at all scales, it remains difficult and time-consuming to assemble hydrological information for most projects. Furthermore, data and sharing formats remain very heterogeneous. Improvements require implementing and endorsing commonly agreed standards and documenting data with adequate metadata. The brokering approach allows binding heterogeneous resources published by different data providers and adapting them to the tools and interfaces commonly used by consumers of these resources. The challenge is to provide decision-makers with reliable information, based on integrated data and tools derived from both Earth observations and scientific models. Successful SDIs therefore rely on several factors: a shared vision between all participants, the necessity of solving a common problem, adequate data policies, incentives, and sufficient resources. New data streams from remote sensing or crowdsourcing are also producing valuable information to improve our understanding of the water cycle, while field sensors are developing rapidly and becoming less costly. More recent data standards are enhancing interoperability between hydrology and other scientific disciplines, while solutions exist to communicate the uncertainty of data and models, an essential prerequisite for decision-making. Distributed computing infrastructures can handle complex and large hydrological data and models, while Web Processing Services bring the flexibility to develop and execute simple to complex workflows over the Internet. The need for capacity building at the human, infrastructure and institutional levels is also a major driver for reinforcing the commitment to SDI concepts.

  12. An efficient architecture to support digital pathology in standard medical imaging repositories.

    PubMed

    Marques Godinho, Tiago; Lebre, Rui; Silva, Luís Bastião; Costa, Carlos

    2017-07-01

    In the past decade, digital pathology and whole-slide imaging (WSI) have been gaining momentum with the proliferation of digital scanners from different manufacturers. The literature reports significant advantages associated with the adoption of digital images in pathology, namely, improvements in diagnostic accuracy and better support for telepathology. Moreover, it also offers new clinical and research applications. However, numerous barriers have been slowing the adoption of WSI, among which the most important are performance issues associated with the storage and distribution of huge volumes of data, and the lack of interoperability with other hospital information systems, most notably Picture Archiving and Communication Systems (PACS) based on the DICOM standard. This article proposes an architecture for a Web Pathology PACS fully compliant with DICOM standard communications and data formats. The solution includes a PACS Archive responsible for storing whole-slide imaging data in the DICOM WSI format, offering a communication interface based on the most recent DICOM Web services. The second component is a zero-footprint viewer that runs in any web browser and consumes data using the PACS archive's standard web services. Moreover, it features a tiling engine especially suited to dealing with WSI image pyramids. These components were designed with a special focus on efficiency and usability. The performance of our system was assessed through a comparative analysis with state-of-the-art solutions. The results demonstrate that it is possible to have a very competitive solution based on standard workflows. Copyright © 2017 Elsevier Inc. All rights reserved.
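
    A minimal sketch of the viewer-to-archive interaction follows: in DICOM WSI, each tile of a pyramid level is a frame of a multi-frame instance, retrievable with a standard DICOMweb (WADO-RS) frames request. The base URL, UIDs and frame number below are placeholders, not values from the paper.

```python
# Fetch one WSI tile from a DICOMweb archive via WADO-RS. In DICOM WSI each
# tile is a frame of a multi-frame instance. All identifiers are hypothetical.
import requests

BASE = "https://pacs.example.org/dicomweb"   # hypothetical DICOMweb root
STUDY, SERIES, INSTANCE = "1.2.3", "1.2.3.4", "1.2.3.4.5"
FRAME = 17                                   # tile index within a pyramid level

url = f"{BASE}/studies/{STUDY}/series/{SERIES}/instances/{INSTANCE}/frames/{FRAME}"
resp = requests.get(
    url,
    headers={"Accept": 'multipart/related; type="image/jpeg"'},
)
resp.raise_for_status()
# The body is a multipart/related message whose single part holds the JPEG
# tile; a production viewer would parse the MIME boundaries properly.
print(resp.headers["Content-Type"], len(resp.content), "bytes")
```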

  13. Implementation experiences of ISO/IEEE11073 standard applied to new use cases for e-health environments.

    PubMed

    Martinez, I; Escayola, J; Martinez-Espronceda, M; Serrano, L; Trigo, J D; Led, S; Garcia, J

    2009-01-01

    Recent advances in biomedical engineering and continuous technological innovation in the last decade are creating new challenges, especially in e-Health environments. In this context, medical device interoperability is one of the fields of interest in which these improvements require standards-based design in order to achieve homogeneous solutions. Furthermore, the spread of wearable devices, oriented to the paradigm of the patient environment and supported by wireless technologies such as Bluetooth or ZigBee, is bringing new medical use cases based on Ambient Assisted Living: home monitoring of the elderly and of patients with heart failure or chronic conditions, patients under palliative care or who have undergone surgery, urgencies and emergencies, or even fitness self-monitoring and health follow-up. In this paper, several implementation experiences based on the ISO/IEEE 11073 standard are detailed. These evolved e-Health services can improve the quality of patient care, increase user interaction, and ensure that these e-Health applications are fully compatible with global telemedicine systems.

  14. 76 FR 51271 - Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the 700 MHz Band

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ... Docket 07-100; FCC 11-6] Implementing a Nationwide, Broadband, Interoperable Public Safety Network in the... interoperable public safety broadband network. The establishment of a common air interface for 700 MHz public safety broadband networks will create a foundation for interoperability and provide a clear path for the...

  15. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    PubMed

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendancy of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  16. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    PubMed

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas such as banking and commerce. Yet the question remains: why has interoperability not been achieved in health care? This paper examines issues encountered and successes achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government-funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry, and to show how government can help.

  17. Establishing semantic interoperability of biomedical metadata registries using extended semantic relationships.

    PubMed

    Park, Yu Rang; Yoon, Young Jo; Kim, Hye Hyeon; Kim, Ju Han

    2013-01-01

    Achieving semantic interoperability is critical for biomedical data sharing between individuals, organizations and systems. The ISO/IEC 11179 MetaData Registry (MDR) standard has been recognized as one of the solutions for this purpose. The standard model, however, is limited: representing concepts that consist of two or more values, for instance blood pressure with systolic and diastolic values, is not allowed. We addressed the structural limitations of ISO/IEC 11179 with an integrated metadata object model in our previous research. In the present study, we introduce semantic extensions for the model by defining three new types of semantic relationships: dependency, composite and variable relationships. To evaluate our extensions in a real-world setting, we measured the efficiency of metadata reduction by means of mapping to existing metadata. We extracted metadata from the College of American Pathologists Cancer Protocols and then evaluated our extensions. With no semantic loss, one third of the extracted metadata could be successfully eliminated, suggesting a better strategy for implementing clinical MDRs with improved efficiency and utility.
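
    The three relationship types can be illustrated with a compact data-model sketch. The class names and model shape below are our own invention, chosen only to mirror the blood-pressure example from the abstract; the paper's actual model is the extended ISO/IEC 11179 metadata object model.

```python
# Compact illustration of the three semantic relationship types added on top
# of ISO/IEC 11179 data elements. Names and structure are hypothetical.
from dataclasses import dataclass, field
from enum import Enum

class RelationType(Enum):
    DEPENDENCY = "dependency"   # one element's meaning depends on another
    COMPOSITE = "composite"     # one concept composed of several values
    VARIABLE = "variable"       # repeated / parameterized elements

@dataclass
class DataElement:
    name: str
    datatype: str

@dataclass
class SemanticRelation:
    rel_type: RelationType
    parent: DataElement
    children: list = field(default_factory=list)

systolic = DataElement("SystolicBP", "integer")
diastolic = DataElement("DiastolicBP", "integer")
blood_pressure = DataElement("BloodPressure", "composite")

# Blood pressure as a composite of two values - not expressible in plain 11179.
bp = SemanticRelation(RelationType.COMPOSITE, blood_pressure, [systolic, diastolic])
print(bp.rel_type.value, [c.name for c in bp.children])
```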

  18. Request redirection paradigm in medical image archive implementation.

    PubMed

    Dragan, Dinu; Ivetić, Dragan

    2012-08-01

    It is widely recognized that JPEG2000 addresses key issues in medical imaging: storage, communication, sharing, remote access, interoperability, and presentation scalability. Therefore, JPEG2000 support was added to the DICOM standard in Supplement 61. Two approaches to supporting JPEG2000 medical images are explicitly defined by the DICOM standard: replacing the DICOM image format with the corresponding JPEG2000 codestream, or using the Pixel Data Provider service of DICOM Supplement 106. The latter involves a two-step retrieval of the medical image: a DICOM request and response from a DICOM server, followed by a JPIP request and response from a JPEG2000 server. We propose a novel strategy for the transmission of scalable JPEG2000 images extracted from a single codestream over a DICOM network using the DICOM Private Data Element, without sacrificing system interoperability. It employs the request redirection paradigm: a DICOM request and response from the JPEG2000 server through the DICOM server. The paper presents a programming solution for implementing the request redirection paradigm in a DICOM-transparent manner. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
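
    A minimal sketch of the private-element mechanism follows, using pydicom. The private group, creator string, element offset and URL are all hypothetical stand-ins; the paper's actual tag layout and redirection protocol are not reproduced here.

```python
# Sketch of the request-redirection idea: a private data element in the DICOM
# response carries the URL of the JPEG2000/JPIP server the client should
# contact next. All identifiers below are hypothetical placeholders.
from pydicom.dataset import Dataset

ds = Dataset()
ds.SOPInstanceUID = "1.2.3.4.5"   # placeholder identifier

# Reserve a private block (odd group) and store the redirection target in it.
block = ds.private_block(0x000B, "REDIRECT_JPIP", create=True)
block.add_new(0x01, "UR", "https://jpip.example.org/stream?target=1.2.3.4.5")

# A redirection-aware client reads the tag back and issues the JPIP request.
redirect_url = block[0x01].value
print("fetch pixel data via:", redirect_url)
```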

  19. A Collaboration-Oriented M2M Messaging Mechanism for the Collaborative Automation between Machines in Future Industrial Networks

    PubMed Central

    Gray, John

    2017-01-01

    Machine-to-machine (M2M) communication is a key enabling technology for industrial internet of things (IIoT)-empowered industrial networks, where machines communicate with one another for collaborative automation and intelligent optimisation. This new industrial computing paradigm features high-quality connectivity, ubiquitous messaging, and interoperable interactions between machines. However, manufacturing IIoT applications have specificities that distinguish them from many other internet of things (IoT) scenarios in machine communications. By highlighting the key requirements and the major technical gaps of M2M in industrial applications, this article describes a collaboration-oriented M2M (CoM2M) messaging mechanism focusing on flexible connectivity and discovery, ubiquitous messaging, and semantic interoperability, well suited to the production-line-scale interoperability of manufacturing applications. The designs toward machine collaboration and data interoperability at both the communication and the semantic level are presented. The application scenarios of the presented methods are then illustrated with a proof-of-concept implementation in the PicknPack food packaging line. Finally, the advantages and some potential issues are discussed based on the PicknPack practice. PMID:29165347

  20. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    PubMed

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed by engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry, individual businesses, and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability, and the reasons for basing the architecture on a peer-to-peer networked database concept rather than more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry and that the proposed architecture uses is discussed, and a strategy for implementing the architecture is presented. The six-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  1. A review on digital ECG formats and the relationships between them.

    PubMed

    Trigo, Jesús Daniel; Alesanco, Alvaro; Martínez, Ignacio; García, José

    2012-05-01

    A plethora of digital ECG formats have been proposed and implemented. This heterogeneity hinders the design and development of interoperable systems and entails critical integration issues for healthcare information systems. This paper aims to provide a comprehensive overview of the current state of affairs in the interoperable exchange of digital ECG signals. This includes 1) a review of existing digital ECG formats, 2) a collection of applications and cardiology settings using such formats, 3) a compilation of the relationships between such formats, and 4) a reflection on the current situation and foreseeable future of the interoperable exchange of digital ECG signals. The objectives have been approached by completing and updating previous reviews on the topic through appropriate database mining. 39 digital ECG formats; 56 applications, tools or implementation experiences; 47 mappings/converters; and 6 relationships between such formats have been found in the literature. The creation and generalization of a single standardized ECG format is a desirable goal. However, this unification requires political commitment and international cooperation among different standardization bodies. Ongoing ontology-based approaches covering the ECG domain have recently emerged as a promising alternative for reaching fully fledged ECG interoperability in the near future.

  2. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    PubMed

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
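
    For readers unfamiliar with how the reported adjusted odds ratios (AORs) are obtained, a sketch follows: fit a multivariable logistic regression and exponentiate the coefficients. The data frame and column names are synthetic stand-ins for the survey variables, so the printed numbers will not match the paper's results.

```python
# Sketch of computing adjusted odds ratios from a multivariable logistic
# regression, as in the study. Data below are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 324  # number of completed survey responses reported in the abstract
df = pd.DataFrame({
    "interoperable": rng.integers(0, 2, n),
    "leadership_support": rng.integers(0, 2, n),
    "it_budget_control": rng.integers(0, 2, n),
    "strategic_plan": rng.integers(0, 2, n),
})

model = smf.logit(
    "interoperable ~ leadership_support + it_budget_control + strategic_plan",
    data=df,
).fit(disp=False)

# exp(coefficient) is the adjusted odds ratio for each determinant.
print(np.exp(model.params).round(2))
```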

  3. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is known as a concept that enhances networking, intelligent data processing and the combination of patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application to the development of interoperable electronic health record systems. The following model-driven procedures were considered: provision of data schemas for data exchange, automated generation of artefacts for application development, and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed; they provide standards-based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.

  4. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard

    PubMed Central

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong

    2014-01-01

    Objectives Extension of a standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between the standard and extended models. Methods Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR XML files and one CCR+ XML file were evaluated. Results In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. Conclusions A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models. PMID:24627817
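
    One layer of such a validation pipeline, syntactic checking of an exported XML document against a schema, can be sketched as below. The file names are hypothetical; the paper's method additionally performs semantic checks against metadata registered in an MDR, which this sketch does not show.

```python
# Sketch of a syntactic validation layer: check a CCR/CCR+ XML export against
# an XML Schema. File names are hypothetical placeholders.
from lxml import etree

schema = etree.XMLSchema(etree.parse("ccr_plus.xsd"))   # hypothetical schema
doc = etree.parse("patient_record.xml")                 # hypothetical export

if schema.validate(doc):
    print("syntactically valid CCR+ document")
else:
    for error in schema.error_log:
        print(error.line, error.message)
```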

  5. National electronic health record interoperability chronology.

    PubMed

    Hufnagel, Stephen P

    2009-05-01

    The federal initiative for electronic health record (EHR) interoperability began in 2000 and set the stage for the establishment of the 2004 Executive Order for EHR interoperability by 2014. This article discusses the chronology from the 2001 e-Government Consolidated Health Informatics (CHI) initiative through the current congressional mandates for an aligned, interoperable, and agile DoD AHLTA and VA VistA.

  6. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    PubMed

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
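
    The Web-service pattern evaluated in the track can be sketched in a few lines: the curation pipeline POSTs a BioC-encoded article to a remote NER service and receives annotated BioC XML back. The endpoint URL and the tiny BioC payload below are placeholders, not the actual challenge services or corpus.

```python
# Sketch of calling a REST/BioC-compliant NER web service from a curation
# pipeline. The endpoint and document are hypothetical placeholders.
import requests

BIOC_DOC = """<?xml version="1.0"?>
<collection><document><id>PMC0000001</id>
  <passage><offset>0</offset>
    <text>Aspirin reduced TNF levels in mice.</text>
  </passage>
</document></collection>"""

resp = requests.post(
    "https://ner.example.org/ctd/chemicals",   # hypothetical REST endpoint
    data=BIOC_DOC.encode("utf-8"),
    headers={"Content-Type": "application/xml"},
    timeout=60,
)
resp.raise_for_status()
print(resp.text)   # BioC XML with <annotation> elements added by the service
```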

  7. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification

    PubMed Central

    Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions of a second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ PMID:24919658

  8. Empowering open systems through cross-platform interoperability

    NASA Astrophysics Data System (ADS)

    Lyke, James C.

    2014-06-01

    Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.

  9. Transformation of standardized clinical models based on OWL technologies: from CEM to OpenEHR archetypes.

    PubMed

    Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui

    2015-05-01

    The semantic interoperability of electronic healthcare records (EHRs) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, which is illustrated for the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. We have been able to transform CEM into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Foundations of reusable and interoperable facet models using category theory

    PubMed Central

    2016-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight ontologies, but in many regards, they are implementations of faceted browsing rather than a specification of the basic, underlying structures and interactions. We will demonstrate that category theory allows us to specify faceted objects and study the relationships and interactions within a faceted browsing system. Resulting implementations can then be constructed through a category-theoretic lens using these models, allowing abstract comparison and communication that naturally support interoperability and reuse. PMID:27942248

  11. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    PubMed

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is in some respects similar to other approaches, such as the dual model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.

  12. Developing Interoperable Air Quality Community Portals

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Husar, R. B.; Yang, C. P.; Robinson, E. M.; Fialkowski, W. E.

    2009-04-01

    Web portals are intended to provide consolidated discovery, filtering and aggregation of content from multiple, distributed web sources targeted at particular user communities. This paper presents a standards-based information architectural approach to developing portals aimed at air quality community collaboration in data access and analysis. An important characteristic of the approach is to advance beyond the present stand-alone design of most portals to achieve interoperability with other portals and information sources. We show how using metadata standards, web services, RSS feeds and other Web 2.0 technologies, such as Yahoo! Pipes and del.icio.us, helps increase interoperability among portals. The approach is illustrated within the context of the GEOSS Architecture Implementation Pilot where an air quality community portal is being developed to provide a user interface between the portals and clearinghouse of the GEOSS Common Infrastructure and the air quality community catalog of metadata and data services.
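
    As a concrete illustration of the feed-based aggregation described above, here is a small stand-alone sketch that pulls RSS items from several portals and merges them into one time-ordered list. The feed URLs are hypothetical and only the Python standard library is used.

    ```python
    # Merge RSS 2.0 items from multiple (hypothetical) air-quality portals
    # into a single list, newest first. Assumes each <item> carries a pubDate.
    import urllib.request
    import xml.etree.ElementTree as ET
    from email.utils import parsedate_to_datetime

    FEEDS = [
        "https://aq-portal-a.example.org/rss",
        "https://aq-portal-b.example.org/rss",
    ]

    def fetch_items(url):
        """Yield (title, link, published) tuples for one RSS 2.0 feed."""
        with urllib.request.urlopen(url, timeout=30) as resp:
            root = ET.parse(resp).getroot()
        for item in root.iter("item"):
            yield (
                item.findtext("title", default=""),
                item.findtext("link", default=""),
                parsedate_to_datetime(item.findtext("pubDate")),
            )

    def aggregate(feeds):
        """Flatten all feeds into one list sorted by publication time."""
        items = [entry for url in feeds for entry in fetch_items(url)]
        return sorted(items, key=lambda entry: entry[2], reverse=True)
    ```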

  13. Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.

    PubMed

    Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P

    2012-01-01

    The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange (HIE) system. In addition, a framework is built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment. However, if the system works seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be borne mainly by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest, ensuring society-wide benefits.

  14. Interoperability Matter: Levels of Data Sharing, Starting from a 3d Information Modelling

    NASA Astrophysics Data System (ADS)

    Tommasi, C.; Achille, C.

    2017-02-01

    Nowadays, adopting BIM processes in the AEC (Architecture, Engineering and Construction) industry means moving towards synergistic workflows based on informative instruments capable of realizing the virtual model of a building. The aim of this article is to address the issue of interoperability, approaching the subject through a theoretical part and a practical example, in order to show how these notions apply in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where difficulties arise - in both the modelling and sharing phases - from the complexity of shapes and elements. Focusing on interoperability between different software packages, the questions are: What kinds of information, and how much, can be shared? And given that this process also leads to a standardization of the modelled parts, is there a risk of accuracy loss?

  15. Managing Complex Interoperability Solutions using Model-Driven Architecture

    DTIC Science & Technology

    2011-06-01

    such as Oracle or MySQL. Each data model for a specific RDBMS is a distinct PSM. Or the system may want to exchange information with other C2... reduced number of transformations, e.g., from an RDBMS physical schema to the corresponding SQL script needed to instantiate the tables in a relational database... importance of models. In engineering, a model serves several purposes: 1. It presents an abstract view of a complex system or of a complex information
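
    The fragment above refers to a model-to-text transformation in the Model-Driven Architecture sense: deriving, from the platform-specific model (PSM) of a relational schema, the SQL script that instantiates its tables. The toy Python sketch below illustrates that step; the model format and table definitions are invented for the example.

    ```python
    # Toy PSM-to-SQL transformation: emit one CREATE TABLE statement per
    # entity in a (made-up) platform-specific model of a relational schema.
    PSM = {
        "Track": [("id", "INTEGER PRIMARY KEY"),
                  ("latitude", "REAL NOT NULL"),
                  ("longitude", "REAL NOT NULL")],
        "Unit": [("id", "INTEGER PRIMARY KEY"),
                 ("name", "VARCHAR(64) NOT NULL")],
    }

    def psm_to_sql(model: dict) -> str:
        """Generate the SQL script that instantiates the modeled tables."""
        statements = []
        for table, columns in model.items():
            cols = ",\n  ".join(f"{name} {decl}" for name, decl in columns)
            statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
        return "\n\n".join(statements)

    print(psm_to_sql(PSM))
    ```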

  16. Dynamic Communication Resource Negotiations

    NASA Technical Reports Server (NTRS)

    Chow, Edward; Vatan, Farrokh; Paloulian, George; Frisbie, Steve; Srostlik, Zuzana; Kalomiris, Vasilios; Apgar, Daniel

    2012-01-01

    Today's advanced network management systems can automate many aspects of tactical networking operations within a military domain. However, automation of joint and coalition tactical networking across multiple domains remains challenging. Due to potentially conflicting goals and priorities, human agreement is often required before implementation into network operations. This is further complicated by incompatible network management systems and security policies, rendering it difficult to implement automatic network management and thus requiring manual human intervention to adjust the communication protocols used at various network routers and endpoints. This process of manual intervention is tedious, error-prone, and slow. In order to facilitate a better solution, we are pursuing a technology which makes network management automated, reliable, and fast. Automating the negotiation of common network communication parameters between different parties is the subject of this paper. We present the technology that enables inter-force dynamic communication resource negotiations, allowing ad hoc interoperation in the field between force domains without pre-planning. It will also enable a dynamic response to changing conditions within the area of operations. Our solution enables the rapid blending of intra-domain policies so that the forces involved are able to interoperate effectively without overwhelming each other's networks with inappropriate or unwarranted traffic. It will evaluate the policy rules and configuration data for each of the domains, then generate a compatible inter-domain policy and configuration that will update the gateway systems between the two domains.
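
    As a simplified picture of the policy-blending step just described, the sketch below intersects two domains' traffic policies and keeps the stricter of each pair of limits; the policy schema and values are invented for illustration.

    ```python
    # Derive a compatible inter-domain policy from two domains' policies:
    # intersect the allowed traffic classes and take the stricter limits.
    def blend_policies(domain_a: dict, domain_b: dict) -> dict:
        """Return an inter-domain policy acceptable to both sides."""
        shared = set(domain_a["allowed_classes"]) & set(domain_b["allowed_classes"])
        return {
            "allowed_classes": sorted(shared),
            "max_bandwidth_kbps": min(domain_a["max_bandwidth_kbps"],
                                      domain_b["max_bandwidth_kbps"]),
            "encryption_required": (domain_a["encryption_required"]
                                    or domain_b["encryption_required"]),
        }

    force_a = {"allowed_classes": ["voice", "c2", "video"],
               "max_bandwidth_kbps": 2048, "encryption_required": True}
    force_b = {"allowed_classes": ["voice", "c2"],
               "max_bandwidth_kbps": 512, "encryption_required": False}

    print(blend_policies(force_a, force_b))
    # {'allowed_classes': ['c2', 'voice'], 'max_bandwidth_kbps': 512,
    #  'encryption_required': True}
    ```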

  17. Adaptation of interoperability standards for cross domain usage

    NASA Astrophysics Data System (ADS)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security for example, one has to look well beyond borders and understand economical, ecological, cultural as well as historical influences. Most of the time information is produced and stored digitally and one of the biggest challenges is to receive relevant readable information applicable to a specific problem out of a large data stock at the right time. These challenges to enable data sharing across national, organizational and systems borders are known to other domains (e.g., ecology or medicine) as well. Solutions like specific standards have been worked on for the specific problems. The question is: what can the different domains learn from each other and do we have solutions when we need to interlink the information produced in these domains? A known problem is to make civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapt standards from one domain to another and lines out problems that have to be overcome and limitations that may apply.

  18. Managing Interoperability for GEOSS - A Report from the SIF

    NASA Astrophysics Data System (ADS)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.

  19. Interoperability of Information Systems Managed and Used by the Local Health Departments

    PubMed Central

    Leider, Jonathon P.; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    Background: In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). Objectives: To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. Data and Methods: This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. Results: For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Conclusion: Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide. PMID:27684616

  20. A Brokering Solution for Business Process Execution

    NASA Astrophysics Data System (ADS)

    Santoro, M.; Bigagli, L.; Roncella, R.; Mazzetti, P.; Nativi, S.

    2012-12-01

    Predicting the climate change impact on biodiversity and ecosystems, advancing our knowledge of the interconnection of environmental phenomena, assessing the validity of simulations and other key challenges of the Earth Sciences require intensive use of environmental modeling. The complexity of the Earth system requires the use of more than one model (often from different disciplines) to represent complex processes. The identification of appropriate mechanisms for reuse, chaining and composition of environmental models is considered a key enabler for an effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. The Group on Earth Observation (GEO) Model Web initiative aims to increase the accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al.): 1. Open access 2. Minimal entry barriers 3. Service-driven approach 4. Scalability. In this work we propose an architectural solution aiming to contribute to the Model Web vision. This solution applies the Brokering approach for facilitating complex multidisciplinary interoperability. The Brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as was presented at the last GEO Plenary meeting in Istanbul, November 2011. According to the Brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous Web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different Workflow engines. We designed and prototyped a component called BP Broker that is able to: (i) read an abstract BP, (ii) "compile" the abstract BP into an executable one (eBP) - in this phase the BP Broker might also provide recommendations for incomplete BPs and parameter mismatch resolution - and (iii) finally execute the eBP using a Workflow engine. The present implementation makes use of the BPMN 2.0 notation for BP design and the jBPM workflow engine for eBP execution; however, the strong decoupling that characterizes the design of the BP Broker easily allows supporting other technologies. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from the technicalities of workflow definitions, (iii) support of incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. References: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative". Environmental Modelling & Software, 1-15.
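
    The three broker steps (read an abstract BP, compile it against a registry of service bindings, execute the result) can be pictured with a short sketch. The registry entries and endpoints below are hypothetical, and the real BP Broker emits BPMN 2.0 for a workflow engine such as jBPM rather than executing steps directly.

    ```python
    # Minimal sketch of the BP Broker pipeline: resolve each abstract step
    # to a concrete service binding, flag incomplete BPs, then execute.
    SERVICE_REGISTRY = {
        "habitat_similarity": {"interface": "OGC WPS",
                               "endpoint": "https://models.example.org/wps"},
        "crop_yield":         {"interface": "WSDL",
                               "endpoint": "https://models.example.org/yield?wsdl"},
    }

    def compile_bp(abstract_bp):
        """Turn an abstract BP (a list of step names) into an executable BP."""
        executable, missing = [], []
        for step in abstract_bp:
            binding = SERVICE_REGISTRY.get(step)
            if binding is None:
                missing.append(step)      # candidate for a broker recommendation
            else:
                executable.append((step, binding))
        if missing:
            raise ValueError(f"Incomplete BP, no binding for: {missing}")
        return executable

    def execute(executable_bp):
        """Stand-in for handing the eBP to a workflow engine."""
        for step, binding in executable_bp:
            print(f"invoking {step} via {binding['interface']} at {binding['endpoint']}")

    execute(compile_bp(["habitat_similarity", "crop_yield"]))
    ```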

  1. Approaching semantic interoperability in Health Level Seven

    PubMed Central

    Alschuler, Liora

    2010-01-01

    ‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995

  2. Gaps Analysis of Integrating Product Design, Manufacturing, and Quality Data in The Supply Chain Using Model-Based Definition

    PubMed Central

    Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil

    2017-01-01

    Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated – from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper seeks to investigate three concepts. First, that the ability to utilize a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the MBE vision. Finally, it also seeks to explore the interaction between CAD and CMM processes and determine if the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based-data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based-data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based-data interoperability, which would provide maximum value and impact to industry. PMID:28691120

  3. OSI for hardware/software interoperability

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.; Harvey, Donald L.; Linderman, Richard W.; Gardener, Gary A.; Capraro, Gerard T.

    1994-03-01

    There is a need in public safety for real-time data collection and transmission from one or more sensors. The Rome Laboratory and the Ballistic Missile Defense Organization are pursuing an effort to bring the benefits of Open System Architectures (OSA) to embedded systems within the Department of Defense. When developed properly, OSA provides interoperability, commonality, graceful upgradeability, survivability and hardware/software transportability, greatly reducing life cycle costs while improving integration and supportability. Architectural flexibility can be achieved, taking advantage of commercial accomplishments, by basing these developments on vendor-neutral, commercially accepted standards and protocols.

  4. Integrating hospital information systems in healthcare institutions: a mediation architecture.

    PubMed

    El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian

    2012-10-01

    Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most of the previous studies have dealt with only one or two of these factors, which makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to ensure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with 3 levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secured exchange of healthcare data. The notion of medical ontology is introduced to solve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.
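
    The Mediator/Adapter pair at the core of the proposal can be sketched in a few lines: an adapter lifts a local HIS record into a shared XML form, and the mediator routes the unified message to its subscribers. The field names and canonical schema below are invented for the example.

    ```python
    # Adapter lifts a native record to canonical XML; Mediator routes it.
    import xml.etree.ElementTree as ET

    class LabSystemAdapter:
        """Adapts one local HIS message format to the canonical XML form."""
        def to_canonical(self, local_record: dict) -> ET.Element:
            msg = ET.Element("patientRecord")
            ET.SubElement(msg, "id").text = local_record["patient_no"]
            ET.SubElement(msg, "test").text = local_record["lab_code"]
            ET.SubElement(msg, "result").text = str(local_record["value"])
            return msg

    class Mediator:
        """Receives canonical messages from adapters and routes them on."""
        def __init__(self):
            self.subscribers = []

        def publish(self, message: ET.Element):
            for deliver in self.subscribers:
                deliver(ET.tostring(message, encoding="unicode"))

    mediator = Mediator()
    mediator.subscribers.append(lambda xml: print("delivered:", xml))
    mediator.publish(LabSystemAdapter().to_canonical(
        {"patient_no": "P-001", "lab_code": "GLU", "value": 5.4}))
    ```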

  5. Research into a distributed fault diagnosis system and its application

    NASA Astrophysics Data System (ADS)

    Qian, Suxiang; Jiao, Weidong; Lou, Yongjian; Shen, Xiaomei

    2005-12-01

    CORBA (Common Object Request Broker Architecture) is a solution for distributed computing over heterogeneous systems that establishes a communication protocol between distributed objects, with strong emphasis on interoperation among those objects. However, only after suitable application approaches and practical monitoring and diagnosis technology have been developed can customers share monitoring and diagnosis information, so that remote, online multi-expert cooperative diagnosis becomes possible. This paper aims at building an open fault monitoring and diagnosis platform combining CORBA, the Web and software agents. Heterogeneous diagnosis objects interoperate in independent threads through the CORBA software bus, enabling resource sharing and online multi-expert cooperative diagnosis and overcoming shortcomings such as limited diagnostic knowledge, reliance on a single diagnostic technique and incomplete analysis functions, so that deeper and more complex diagnoses can be carried out. Taking a high-speed centrifugal air compressor set as an example, we demonstrate a distributed diagnosis system based on CORBA. The case shows that combining CORBA, Web technology and an agent framework yields more efficient approaches to problems such as real-time monitoring and diagnosis over the network and the decomposition of complicated tasks. In this system, a multi-diagnosis intelligent agent helps improve diagnosis efficiency. In addition, the system offers an open environment in which diagnosis objects are easy to upgrade and new diagnosis server objects can easily join.

  6. The Need of Nested Grids for Aerial and Satellite Images and Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Villa, G.; Mas, S.; Fernández-Villarino, X.; Martínez-Luceño, J.; Ojeda, J. C.; Pérez-Martín, B.; Tejeiro, J. A.; García-González, C.; López-Romero, E.; Soteres, C.

    2016-06-01

    Usual workflows for production, archiving, dissemination and use of Earth observation images (both aerial and from remote sensing satellites) pose big interoperability problems, for example: non-alignment of pixels at the different levels of the pyramids, which makes it impossible to overlay, compare and mosaic different orthoimages without resampling them, and the need to apply multiple resamplings and compression-decompression cycles. These problems cause great inefficiencies in production, dissemination through web services and processing in "Big Data" environments. Most of them can be avoided, or at least greatly reduced, with the use of a common "nested grid" for multiresolution production, archiving, dissemination and exploitation of orthoimagery, digital elevation models and other raster data. "Nested grids" are space allocation schemas that organize image footprints, pixel sizes and pixel positions at all pyramid levels in order to achieve coherent and consistent multiresolution coverage of a whole working area. A "nested grid" must be complemented by an appropriate "tiling schema", ideally based on the "quad-tree" concept. In recent years a "de facto standard" grid and tiling schema has emerged and has been adopted by virtually all major geospatial data providers; it has also been adopted by OGC in its "WMTS Simple Profile" standard. In this paper we explain how the adequate use of this tiling schema as a common nested grid for orthoimagery, DEMs and other types of raster data constitutes the most practical solution to most of the interoperability problems of these types of data.
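
    The arithmetic behind such a quad-tree tiling schema is compact. The sketch below uses the common Web Mercator tiling convention (the scheme adopted in the OGC WMTS Simple Profile) to map a longitude/latitude pair to a tile index and to check the nesting property between pyramid levels.

    ```python
    # Web Mercator quad-tree tiling: a point maps to a (col, row) tile index
    # at each zoom level, and each tile nests exactly inside its parent.
    import math

    def lonlat_to_tile(lon_deg: float, lat_deg: float, zoom: int):
        """Return the (col, row) index of the tile containing the point."""
        n = 2 ** zoom
        col = int((lon_deg + 180.0) / 360.0 * n)
        row = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi)
                  / 2.0 * n)
        return col, row

    # Nesting property: the parent tile index is the child index halved.
    col, row = lonlat_to_tile(-3.7, 40.4, 12)                 # Madrid, zoom 12
    assert (col // 2, row // 2) == lonlat_to_tile(-3.7, 40.4, 11)
    ```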

  7. PACS/information systems interoperability using Enterprise Communication Framework.

    PubMed

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  8. Latest developments for the IAGOS database: Interoperability and metadata

    NASA Astrophysics Data System (ADS)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure this interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS data. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks, and to allow for graphical comparison with model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
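
    As a small illustration of the metadata and format standardisation mentioned above, the sketch below writes a CF-compliant NetCDF time series with the `netCDF4` Python package; the file name and values are invented, and the real IAGOS variable catalogue is of course much richer.

    ```python
    # Write a toy aircraft ozone time series as a CF-compliant NetCDF file.
    # Requires the `netCDF4` and `numpy` packages.
    import numpy as np
    from netCDF4 import Dataset

    with Dataset("iagos_flight_sample.nc", "w") as nc:
        nc.Conventions = "CF-1.6"
        nc.title = "Sample IAGOS-style ozone time series"

        nc.createDimension("time", 4)
        time = nc.createVariable("time", "f8", ("time",))
        time.standard_name = "time"
        time.units = "seconds since 2014-01-01 00:00:00"
        time[:] = np.arange(4) * 4.0                  # 4 s sampling

        o3 = nc.createVariable("ozone", "f4", ("time",), fill_value=-9999.0)
        o3.standard_name = "mole_fraction_of_ozone_in_air"
        o3.units = "1e-9"                             # nmol/mol (ppb)
        o3[:] = [42.1, 43.0, 41.8, 44.2]
    ```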

  9. Modeling Interoperable Information Systems with 3LGM² and IHE.

    PubMed

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. To link IHE concepts to 3LGM² concepts within the 3LGM² tool. To describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models. To describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model; thus information managers and IHE developers can use or develop IHE profiles systematically. In order to improve the usability and handling of the IHE-master-model and its usage as a reference model, some further refinements have to be done. Evaluating the use of the IHE-master-model by information managers and IHE developers is subject to further research.

  10. Integrated Visualization of Multi-sensor Ocean Data across the Web

    NASA Astrophysics Data System (ADS)

    Platt, F.; Thompson, C. K.; Roberts, J. T.; Tsontos, V. M.; Hin Lam, C.; Arms, S. C.; Quach, N.

    2017-12-01

    Whether for research or operational decision support, oceanographic applications rely on the visualization of multivariate in situ and remote sensing data as an integral part of analysis workflows. However, given their inherently 3D-spatial and temporally dynamic nature, the visual representation of marine in situ data in particular poses a challenge. The Oceanographic In situ data Interoperability Project (OIIP) is a collaborative project funded under the NASA/ACCESS program that seeks to leverage and enhance higher TRL (technology readiness level) informatics technologies to address key data interoperability and integration issues associated with in situ ocean data, including the dearth of effective web-based visualization solutions. Existing web tools for the visualization of key in situ data types - point, profile, trajectory series - are limited in their support for integrated, dynamic and coordinated views of the spatiotemporal characteristics of the data. Via the extension of the JPL Common Mapping Client (CMC) software framework, OIIP seeks to provide improved visualization support for oceanographic in situ data sets. More specifically, this entails improved representation of both horizontal and vertical aspects of these data, which inherently are depth resolved and time referenced, as well as the visual synchronization with relevant remotely-sensed gridded data products, such as sea surface temperature and salinity. Electronic tagging datasets, which are a focal use case for OIIP, provide a representative, if somewhat complex, visualization challenge in this regard. Critical to the achievement of these development objectives has been compilation of a well-rounded set of visualization use cases and requirements based on a series of end-user consultations aimed at understanding their satellite-in situ visualization needs. Here we summarize progress on aspects of the technical work and our approach.

  11. Building a portable data and information interoperability infrastructure-framework for a standard Taiwan Electronic Medical Record Template.

    PubMed

    Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun

    2007-11-01

    Traditional electronic health record (EHR) data are produced by various hospital information systems and could not exist independently of those systems until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements; data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
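
    The TMT layering (form, component, section, element, with elements referencing a code set, data type and narrative block) can be pictured with a toy XML fragment; the tag and attribute names below are simplified stand-ins for the real template definitions.

    ```python
    # Build a toy TMT-like document: form > component > section > element.
    import xml.etree.ElementTree as ET

    form = ET.Element("form", id="outpatient-note")
    component = ET.SubElement(form, "component", name="diagnosis")
    section = ET.SubElement(component, "section", name="primary")
    element = ET.SubElement(section, "element",
                            code="ICD-9:250.00",      # reference into a code set
                            dataType="CD")            # coded data type
    element.text = "Diabetes mellitus type 2"         # narrative block

    print(ET.tostring(form, encoding="unicode"))
    ```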

  12. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.

  13. SCHeMA web-based observation data information system

    NASA Astrophysics Data System (ADS)

    Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise

    2016-04-01

    It is well recognized that the need to share ocean data among non-specialized users is constantly increasing. Initiatives built upon international standards will help simplify data processing and dissemination, improve user accessibility (including through web browsers), facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) Project is developing an open and modular sensing solution for autonomous in situ high resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled to master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access and download as well as interoperability with other projects through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach will ensure data accessibility in compliance with major European Directives and recommendations. Being modular, the system allows the plug-and-play of commercially available probes as well as new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as an SOS (Sensor Observation Service), provides standards-based interoperability and access to sensor observations through the O&M standard, as well as sensor descriptions encoded in the Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, describing data in an already harmonized and common standard, is a prerequisite for consistency and interoperability. Therefore, the SCHeMA SOS has adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration; development of a Core Profile interface for data access via the OGC standard; external services such as web services, WMS and WFS; and a data download and query manager) will be presented and illustrated with examples of ongoing tests in coastal and open sea.
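
    A client request against an SOS such as SCHeMA's follows the OGC SOS 2.0 key-value-pair binding. The sketch below builds a GetObservation URL with the Python standard library; the endpoint, offering and observed-property identifiers are hypothetical.

    ```python
    # Build and issue an OGC SOS 2.0 GetObservation request (KVP binding).
    import urllib.parse
    import urllib.request

    BASE_URL = "https://sos.schema-ocean.example.org/service"   # hypothetical

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "urn:schema:offering:probe-01",
        "observedProperty": "urn:schema:property:sea_water_temperature",
    }

    url = BASE_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url, timeout=30) as resp:
        observations_xml = resp.read().decode("utf-8")   # O&M-encoded result
    ```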

  14. The Need for Information on Standards on eAccessibility&eInclusion - Based on the Experience of the EU-Project IN LIFE.

    PubMed

    Galinski, Christian; Giraldo Perez, Blanca Stella

    2017-01-01

    Recent investigations in several EU projects, including IN LIFE, revealed that experts in the field of eAccessibility & eInclusion (eAcc&eIncl) - but also general ICT developers and decision makers in industry and administration - are quite unaware of the importance of standards for the interoperability and sustainability of ICT solutions. Especially where persons with disabilities (PwD) are concerned, system development and the design of services can become unnecessarily costly. For accessibility in general and eAcc&eIncl in particular, knowledge of the pertinent standards is becoming a core competency of experts and decision makers, one that can particularly benefit small enterprises. Given the complex world of standardization and the multitude of standards developing organizations (SDOs), easy access to information on standards is critical.

  15. Towards a flexible middleware for context-aware pervasive and wearable systems.

    PubMed

    Muro, Marco; Amoretti, Michele; Zanichelli, Francesco; Conte, Gianni

    2012-11-01

    Ambient intelligence and wearable computing call for innovative hardware and software technologies, including a highly capable, flexible and efficient middleware allowing for the reuse of existing pervasive applications when developing new ones. In the considered application domain, middleware should also support self-management, interoperability among different platforms, efficient communications, and context awareness. In the ongoing "everything is networked" scenario, scalability emerges as a very important issue, for which the peer-to-peer (P2P) paradigm is an appealing solution, connecting software components in an overlay network and allowing for efficient and balanced data distribution mechanisms. In this paper, we illustrate how all these concepts can be embodied in a theoretical tool called the networked autonomic machine (NAM), implemented in a NAM-based middleware, and evaluated against practical problems of pervasive computing.

  16. Endpoint Security Using Biometric Authentication for Secure Remote Mission Operations

    NASA Technical Reports Server (NTRS)

    Donohue, John T.; Critchfield, Anna R.

    2000-01-01

    We propose a flexible security authentication solution for the spacecraft end-user, which will allow the user to interact over the Internet with the spacecraft, its instruments, or the ground segment from anywhere, at any time, based on the user's pre-defined set of privileges. The package includes biometric authentication products, such as face, voice or fingerprint recognition, together with authentication services and procedures, such as user registration and verification over the Internet and user database maintenance, with a configurable schema of spacecraft users' privileges. This fast and reliable user authentication mechanism will become an integral part of end-to-end, ground-to-space secure Internet communications and of the migration from current practice to the future. All modules and services of the proposed package are commercially available and built to the NIST BioAPI standard, which facilitates "pluggability" and interoperability.

  17. A semantically-aided architecture for a web-based monitoring system for carotid atherosclerosis.

    PubMed

    Kolias, Vassileios D; Stamou, Giorgos; Golemati, Spyretta; Stoitsis, Giannis; Gkekas, Christos D; Liapis, Christos D; Nikita, Konstantina S

    2015-08-01

    Carotid atherosclerosis is a multifactorial disease and its clinical diagnosis depends on the evaluation of heterogeneous clinical data, such as imaging exams, biochemical tests and the patient's clinical history. The lack of interoperability between Health Information Systems (HIS) does not allow physicians to acquire all the data necessary for the diagnostic process. In this paper, a semantically-aided architecture is proposed for a web-based monitoring system for carotid atherosclerosis that is able to gather and unify heterogeneous data with the use of an ontology and to create a common interface for data access, enhancing the interoperability of HIS. The architecture is based on an application ontology of carotid atherosclerosis that is used to (a) integrate heterogeneous data sources on the basis of semantic representation and ontological reasoning and (b) access the critical information using SPARQL query rewriting and ontology-based data access services. The architecture was tested over a carotid atherosclerosis dataset consisting of the imaging exams and the clinical profile of 233 patients, using a set of complex queries constructed by the physicians. The proposed architecture was evaluated with respect to the complexity of the queries that the physicians could make and the retrieval speed. The proposed architecture gave promising results in terms of interoperability, ontology-based integration of heterogeneous data sources, and expanded query and retrieval capabilities in HIS.
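
    The query layer described above rests on SPARQL posed against an application ontology. The sketch below shows the idea with the `rdflib` Python package on a two-triple toy graph; the namespace, class and property names are invented, not those of the actual carotid ontology.

    ```python
    # Ask a physician-style SPARQL question of a tiny RDF graph.
    # Requires the `rdflib` package; the vocabulary is invented.
    from rdflib import Graph, Literal, Namespace, RDF

    CAROTID = Namespace("http://example.org/carotid#")

    g = Graph()
    g.add((CAROTID.patient233, RDF.type, CAROTID.Patient))
    g.add((CAROTID.patient233, CAROTID.hasStenosisPercent, Literal(72)))

    query = """
    PREFIX c: <http://example.org/carotid#>
    SELECT ?patient ?stenosis WHERE {
      ?patient a c:Patient ;
               c:hasStenosisPercent ?stenosis .
      FILTER (?stenosis > 70)
    }"""

    for patient, stenosis in g.query(query):
        print(patient, stenosis)   # patients with more than 70% stenosis
    ```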

  18. System and methods of resource usage using an interoperable management framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are deliberately left free of standards, permitting the choice and innovation needed to balance flexibility and usability in interoperable usage management systems.

  19. Information Management Challenges in Achieving Coalition Interoperability

    DTIC Science & Technology

    2001-12-01

    by J. Dyer. SESSION I: ARCHITECTURES AND STANDARDS: FUNDAMENTAL ISSUES. Chairman: Dr I. White (UK). Planning for Interoperability, by W.M. Gentleman... framework – a crucial step toward achieving coalition C4I interoperability. TOPICS TO BE COVERED: 1) Maintaining secure interoperability 2) Command system interfaces

  20. Interoperability Strategic Vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  1. Space Network Interoperability Panel (SNIP) study

    NASA Technical Reports Server (NTRS)

    Ryan, Thomas; Lenhart, Klaus; Hara, Hideo

    1991-01-01

    The Space Network Interoperability Panel (SNIP) study is a tripartite study that involves the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP involves an ongoing interoperability study of the Data Relay Satellite (DRS) Systems of the three organizations. The study is broken down into two parts: Phase One deals with S-band (2 GHz) interoperability and Phase Two deals with Ka-band (20/30 GHz) interoperability (in addition to S-band). In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since that time a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operational and economic benefits. This paper presents the history and status of the SNIP study.

  2. NASA's Geospatial Interoperability Office (GIO) Program

    NASA Technical Reports Server (NTRS)

    Weir, Patricia

    2004-01-01

    NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities, the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group, chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI), and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and the Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Office (CIO). With these agency-level requirements, the GIO leads, brokers and facilitates efforts to develop, implement, influence and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium (OGC) and ISO TC211. The OGC has made considerable progress with regard to relations with other open standards bodies, namely ISO, W3C and OASIS. ISO TC211 is the geographic information/geomatics technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, the GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.

  3. Developing sustainable software solutions for bioinformatics by the “Butterfly” paradigm

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas

    2014-01-01

    Software design and sustainable software engineering are essential for the long-term development of bioinformatics software. Typical challenges in an academic environment are short-term contracts, island solutions, pragmatic approaches and loose documentation. Upcoming new challenges are big data, complex data sets, software compatibility and rapid changes in data representation. Our approach to cope with these challenges consists of iterative intertwined cycles of development (the “Butterfly” paradigm) for key steps in scientific software engineering. User feedback is valued, as is software planning in a sustainable and interoperable way. Tool usage should be easy and intuitive. A middleware supports a user-friendly Graphical User Interface (GUI) as well as database/tool development independently. We validated the approach in our own software development and compared the different design paradigms in various software solutions. PMID:25383181

  4. Applying Sensor Web Technology to Marine Sensor Data

    NASA Astrophysics Data System (ADS)

    Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric

    2015-04-01

    In this contribution we present two activities illustrating how Sensor Web technology helps to enable flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web Architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are shown in the following overview:
    - Pull-based observation data download: achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard.
    - Push-based delivery of observation data, allowing users to subscribe to new measurements that are relevant for them: several specification activities are currently under evaluation for this purpose (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group).
    - (Web-based) visualisation of marine observation data: implemented through SOS client applications.
    - Configuration and controlling of sensor devices: ensured through the OGC Sensor Planning Service 2.0 interface.
    - Bridging between sensors/data loggers and Sensor Web components: several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are developed for this purpose, complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format).
    To further advance this architecture, there is ongoing work to develop dedicated profiles of selected OGC SWE specifications that provide stricter guidance on how these standards shall be applied to marine data (e.g. SensorML 2.0 profiles stating which metadata elements are mandatory, building upon the ESONET Sensor Registry developments). Within the NeXOS project the presented architecture is implemented as a set of open source components. These implementations can be re-used by all interested scientists and data providers needing tools for publishing or consuming oceanographic sensor data. In further projects such as the European project FixO3 (Fixed-point Open Ocean Observatories), these software development activities are complemented with additional efforts to provide guidance on how Sensor Web technology can be applied in an efficient manner. This way, not only software components are made available but also documentation and information resources that help to understand which types of Sensor Web deployments are best suited to fulfil different types of user requirements.
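
    As a concrete illustration of the pull-based access path listed above, the following Python sketch issues an SOS 2.0 GetObservation request via the standard KVP binding (the endpoint and the offering/property identifiers are placeholders, not actual NeXOS values):

      # Minimal SOS 2.0 KVP GetObservation request; endpoint and ids are fake.
      import requests

      SOS_ENDPOINT = "https://example.org/sos"  # hypothetical service URL

      params = {
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "ctd_offering",  # placeholder offering identifier
          "observedProperty": "http://example.org/sea_water_temperature",
          "responseFormat": "http://www.opengis.net/om/2.0",
      }

      response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
      response.raise_for_status()
      print(response.text[:500])  # O&M-encoded observations (XML)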

  5. A Virtual Hub Brokering Approach for Integration of Historical and Modern Maps

    NASA Astrophysics Data System (ADS)

    Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.

    2016-06-01

    Geospatial data are today more and more widespread. Many different institutions, such as Geographical Institutes, Public Administrations, collaborative communities (e.g., OSM) and web companies, nowadays make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing and web publication of historical maps have increasingly spread in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the required information, and problems of interconnection between different data sources and their restricted interoperability limit wide utilization of the available geo-data. This paper aims to describe some actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and Historical GIS implementation, for Lombardy and for the city of Parma respectively, interoperability is possible in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component (the virtual hub) based on a brokering framework, copes with the previously listed problems and allows interoperability between different data sources.

  6. A Definitive Interoperability Test Methodology for the Malicious Activity Simulation Tool (MAST)

    DTIC Science & Technology

    2013-03-01

    Information Assurance Range; DON: Department of the Navy; DON CIO: Department of the Navy Chief Information Officer; DoS: Denial of Service; EOL: End-of-Life. ...came "as part of PMW 160's solution to the risk posed by Windows™ NT End-of-Life (EOL)." Second, "[it] marked the beginning of a steady and...sometimes outdated, systems and programs [34]. Table 1 shows the basic implementation and EOL timeline, the OS version for both server and

  7. Impact of coalition interoperability on PKI

    NASA Astrophysics Data System (ADS)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  8. Telemedicine system interoperability architecture: concept description and architecture overview.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  9. Evaluation of a Framework to Implement Electronic Health Record Systems Based on the openEHR Standard

    NASA Astrophysics Data System (ADS)

    Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.

    2016-04-01

    The production of clinical information about each patient is constantly increasing, and this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality and safety of medical care delivery. However, in developing countries the utilization of health information technology is insufficient and, among other problems, lacks standards. In the present work we evaluate the EHRGen framework, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools provided for final users, that is, without the intervention of computer experts. It makes the openEHR ideas easier to adopt and provides an open source basis with a set of services, although some limitations in its current state work against interoperability and usability. Despite these limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and it should therefore be supported by academic and administrative institutions.

  10. Interoperable End-to-End Remote Patient Monitoring Platform Based on IEEE 11073 PHD and ZigBee Health Care Profile.

    PubMed

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2018-05-01

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory, and power that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that includes IHE PCD-01, HL7, and Bluetooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptations that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.
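
    The role the nomenclature plays in plug-and-play interoperability can be sketched in a few lines of Python: the receiving manager interprets a measurement purely from its standardized MDC reference identifier, with no device-specific code. The two-entry table below is illustrative only, not a reproduction of the standard's code tables:

      # Sketch: nomenclature-driven interpretation of device measurements.
      from dataclasses import dataclass

      # Illustrative subset; real systems also resolve numeric MDC codes.
      MDC_NOMENCLATURE = {
          "MDC_PULS_OXIM_SAT_O2": ("Oxygen saturation", "%"),
          "MDC_PRESS_BLD_NONINV_SYS": ("Systolic blood pressure", "mmHg"),
      }

      @dataclass
      class Measurement:
          ref_id: str   # MDC reference identifier reported by the device
          value: float

      def describe(m: Measurement) -> str:
          label, unit = MDC_NOMENCLATURE[m.ref_id]
          return f"{label}: {m.value} {unit}"

      # Any device reporting these ref ids is understood by any manager.
      print(describe(Measurement("MDC_PULS_OXIM_SAT_O2", 97.0)))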

  11. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Title 47 Telecommunication, Vol. 1 (revised 2014-10-01): FEDERAL COMMUNICATIONS COMMISSION, GENERAL, COMMISSION ORGANIZATION, Public Safety and Homeland Security Bureau, § 0.192 Emergency Response Interoperability Center. (a...

  12. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Title 47 Telecommunication, Vol. 1 (revised 2013-10-01): FEDERAL COMMUNICATIONS COMMISSION, GENERAL, COMMISSION ORGANIZATION, Public Safety and Homeland Security Bureau, § 0.192 Emergency Response Interoperability Center. (a...

  13. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 47 Telecommunication, Vol. 1 (revised 2011-10-01): FEDERAL COMMUNICATIONS COMMISSION, GENERAL, COMMISSION ORGANIZATION, Public Safety and Homeland Security Bureau, § 0.192 Emergency Response Interoperability Center. (a...

  14. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  15. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  16. Evaluation of bluetooth low power for physiological monitoring in a home based cardiac rehabilitation program.

    PubMed

    Martin, Timothy; Ding, Hang; D'Souza, Matthew; Karunanithi, Mohan

    2012-01-01

    Cardiovascular disease (CVD) is the leading cause of mortality in Australia, and places large burdens on the healthcare system. To assist patients with CVDs in recovering from cardiac events and mediating cardiac risk factors, a home based cardiac rehabilitation program, known as the Care Assessment Platform (CAP), was developed. In the CAP program, patients are required to manually enter health information into their mobile phones on a daily basis. The manual operation is often subject to human error and is inconvenient for some elderly patients. To improve this, an automated wireless solution was desired. The objectives of this paper are to investigate the feasibility of implementing the newly released Bluetooth 4.0 (BT4.0) standard in the CAP program, and to practically evaluate BT4.0 communication between a developed mobile application and emulated healthcare devices. The study demonstrated that BT4.0 addresses usability, interoperability and security for healthcare applications, reduces the power consumption of wireless communication, and improves the flexibility of interfaces for software development. This evaluation study provides an essential mobile BT4.0 framework to incorporate a large range of healthcare devices for clinical assessment and intervention in the CAP program, and hence it is useful for similar development and research work on other mobile healthcare solutions.
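
    For readers who want to experiment with the same idea today, a BT4.0 (Bluetooth LE) read can be done in a few lines with the Python bleak library; this is our sketch, not the CAP study's mobile application, and the device address is a placeholder:

      # Read the standard Battery Level characteristic (0x2A19) over BLE.
      import asyncio
      from bleak import BleakClient

      DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"  # placeholder device address
      BATTERY_LEVEL_UUID = "00002a19-0000-1000-8000-00805f9b34fb"

      async def read_battery(address: str) -> int:
          async with BleakClient(address) as client:
              data = await client.read_gatt_char(BATTERY_LEVEL_UUID)
              return data[0]  # single uint8 percentage

      if __name__ == "__main__":
          print(f"Battery: {asyncio.run(read_battery(DEVICE_ADDRESS))}%")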

  17. MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system

    NASA Astrophysics Data System (ADS)

    Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg

    2005-01-01

    We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and an access tool for using it. Based on an analysis of requirements, we have selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow using it in a broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form. Conformance in terms of semantics can thus not be evaluated automatically, and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.
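
    The flavour of such a formalization can be sketched with rdflib: a profile constraint of the form "a Segment must carry exactly one MediaTime" becomes an OWL cardinality restriction. The namespace and class/property names below are our invention, not the Detailed Audiovisual Profile's actual vocabulary:

      # Encode one semantic constraint of a profile as an OWL restriction.
      from rdflib import BNode, Graph, Literal, Namespace, RDF, RDFS
      from rdflib.namespace import OWL, XSD

      MP7 = Namespace("http://example.org/mpeg7-profile#")  # hypothetical

      g = Graph()
      g.bind("mp7", MP7)

      restriction = BNode()
      g.add((MP7.Segment, RDFS.subClassOf, restriction))
      g.add((restriction, RDF.type, OWL.Restriction))
      g.add((restriction, OWL.onProperty, MP7.hasMediaTime))
      g.add((restriction, OWL.cardinality,
             Literal(1, datatype=XSD.nonNegativeInteger)))

      # A reasoner can now check conformance against this constraint.
      print(g.serialize(format="turtle"))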

  18. Quasi-Monopoly Status of a Non-for-Profit Reference Terminology Provider - An Exemplary Approach on Transparency Improvement Regarding Licensing.

    PubMed

    Dewenter, Heike; Thun, Sylvia

    2018-01-01

    As the reference terminology SNOMED CT gains in significance and appears to be without alternative in interoperable Electronic Health Records, the holder of its intellectual property, the not-for-profit organization SNOMED International, has achieved a quasi-monopoly status as a provider. We examine the current handling of corporate transparency regarding SNOMED CT licensing, together with policy recommendations derived from the research project ASSESS CT, in the context of collaboration with Standardization Organizations. In addition, transparency improvement is proposed based on the economic Principal-Agent Theory, treating SNOMED CT licensees as principals. In this paper we introduce measures to increase transparency in the licensing process, addressed to the reference terminology users and especially the terminology provider. The aim is to present strategies towards transparency, with the intent to remove barriers for undecided organization stakeholders and users of license- and fee-based terminology solutions, as well as to overcome resentments connected to the quasi-monopoly status of the provider.

  19. Cloud Environment Automation: from infrastructure deployment to application monitoring

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Costantini, A.; Bucchi, R.; Italiano, A.; Michelotto, D.; Panella, M.; Pergolesi, M.; Saletta, M.; Traldi, S.; Vistoli, C.; Zizzi, G.; Salomoni, D.

    2017-10-01

    The potential offered by the cloud paradigm is often limited by technical issues, rules and regulations. In particular, the activities related to the design and deployment of the Infrastructure as a Service (IaaS) cloud layer can be difficult to apply and time-consuming for infrastructure maintainers. In this paper the research activity, carried out during the Open City Platform (OCP) research project [1] and aimed at designing and developing an automatic tool for cloud-based IaaS deployment, is presented. Open City Platform is an industrial research project funded by the Italian Ministry of University and Research (MIUR), started in 2014. It intends to research, develop and test new open, interoperable, on-demand technological solutions in the field of Cloud Computing, along with new sustainable organizational models that can be deployed for and adopted by Public Administrations (PA). The presented work and the related outcomes are aimed at simplifying the deployment and maintenance of a complete IaaS cloud-based infrastructure.

  20. Hearing Device Manufacturers Call for Interoperability and Standardization of Internet and Audiology.

    PubMed

    Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep

    2016-10-01

    This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.

  1. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc, following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project, which developed shared software infrastructure components in open source for RHINs, and the OpenECG network, which offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG in electrocardiography. Open source components implementing standards, and a community providing feedback from real-world use, are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long-term strategy towards comprehensive health services and clinical research.

  2. Profiling Fast Healthcare Interoperability Resources (FHIR) of Family Health History based on the Clinical Element Models.

    PubMed

    Lee, Jaehoon; Hulse, Nathan C; Wood, Grant M; Oniki, Thomas A; Huff, Stanley M

    2016-01-01

    In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging full pedigree-based family health history (FHH) information across the multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM elements and existing FHIR resources. Based on the analysis, we developed a profile that consists of 1) FHIR resources for essential FHH data elements, 2) extensions for additional elements that were not covered by the resources, and 3) a structured definition to integrate patient and family member information in a FHIR message. We implemented the profile using an open-source FHIR framework and validated it using patient-entered FHH data that was captured through a locally developed FHH tool.
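
    The kind of resource instance such a profile constrains can be sketched as plain JSON; the structure below follows the base FHIR FamilyMemberHistory resource (the paper's own profile and extension URLs are not reproduced here, and the patient reference is a placeholder):

      # A minimal FamilyMemberHistory instance with one condition.
      import json

      family_member_history = {
          "resourceType": "FamilyMemberHistory",
          "status": "completed",
          "patient": {"reference": "Patient/example"},  # placeholder
          "relationship": {"coding": [{
              "system": "http://terminology.hl7.org/CodeSystem/v3-RoleCode",
              "code": "MTH",
              "display": "mother",
          }]},
          "condition": [{"code": {"coding": [{
              "system": "http://snomed.info/sct",
              "code": "22298006",
              "display": "Myocardial infarction",
          }]}}],
      }

      print(json.dumps(family_member_history, indent=2))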

  3. 77 FR 67815 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-14

    ..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...

  4. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  5. Brokerage services for Earth Science data: the EuroGEOSS legacy (Invited)

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Craglia, M.; Pearlman, J.

    2013-12-01

    Global sustainability research requires an integrated multidisciplinary effort underpinned by a collaborative environment for discovering and accessing heterogeneous data across disciplines. Traditionally, interoperability has been achieved by implementing federations of systems. The federating approach entails the adoption of a set of common technologies and standards. This presentation argues that for complex (and uncontrolled) environments (such as global, multidisciplinary, and voluntary-based infrastructures) federated solutions must be completed and enhanced by a brokering approach, making available a set of brokerage services. In fact, brokerage services allow a cyber-infrastructure to lower entry barriers (for both data producers and users) and to better address the different domain specificities. The brokering interoperability approach was successfully tested in the EuroGEOSS project, funded by the European Commission under the FP7 framework (see http://www.eurogeoss.eu). The EuroGEOSS Brokering framework provided the EuroGEOSS Capacity with multidisciplinary interoperability functionalities. This platform was developed applying several of the principles/requirements that characterize the System of Systems (SoS) approach and the Internet of Services (IoS) philosophy. The framework consists of three main brokers (middleware components implementing intermediation and harmonization services): a basic Discovery Broker, an advanced Semantic Discovery Broker, and an Access Broker. They are empowered by a suite of tools developed by the ESSI-lab of CNR-IIA: GI-cat, GI-sem, and GI-axe. The EuroGEOSS brokering framework was considered and successfully adopted by cross-disciplinary initiatives (notably GEOSS: the Global Earth Observation System of Systems). The brokerage services have since been advanced and extended; the new brokering framework is called the GEO DAB (Discovery and Access Broker). New brokerage services have been developed in the framework of other European Commission funded projects (e.g. GeoViQua). More recently, the NSF EarthCube initiative decided to fund a project dealing with brokerage services. In the framework of the GEO AIP-6 (Architecture Implementation Pilot, phase 6), the presented brokerage platform has been used by the Water Working Group to provide improved data access for parameterization and model development.

  6. Linking User Identities Across the DataONE Federation of Data Repositories

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Mecum, B.; Leinfelder, B.; Jones, C. S.; Walker, L.

    2016-12-01

    DataONE provides services for identifying, authenticating, and authorizing researchers to access and contribute data to repositories within the DataONE federation. In the earth sciences, thousands of institutional and disciplinary repositories have created their own user identity and authentication systems, with their own user directories based on databases or web content management systems. Thus, researchers have many identities that are neither linked nor interoperable, making it difficult to reference the identity of these users across systems. Key user information is hidden, and often only a non-disambiguated name is available. From a sample of 160,000 data sets within DataONE, a super-majority of references to the data creators lack even an email address. In an attempt to disambiguate these people via the GeoLink project, we conservatively estimate that they represent at least 57,000 unique identities, but without a clear user identifier, there could be as many as 223,000. Interoperability among repositories is critical to improving the scope of scientific synthesis and capabilities for research collaboration. While many have focused on the convenience of Single Sign-On (SSO), we have found that sharing user identifiers is far more useful for interoperability. With an unambiguous user identity in incoming metadata, DataONE has built user profiles that present that user's data across repositories, that link users and their organizational affiliations, and that allow users to work collaboratively in private groups that span repository systems. DataONE's user identity solution leverages existing systems such as InCommon, CILogon, Google, and ORCID so as not to further proliferate user identities. DataONE provides a core service allowing users to link their multiple identities so that authenticating with one identity (e.g., ORCID) can authorize access to data protected via another identity (e.g., InCommon). Currently, DataONE is using ORCID identities to link and identify users, but challenges must still be overcome to support historical records for which ORCIDs cannot be used because the associated people are unavailable to confirm their identity. DataONE's identity systems facilitate crosslinking between user identities and scientific metadata to accelerate collaboration and synthesis.

  7. UAS Integration in the NAS Project: DAA-TCAS Interoperability "mini" HITL Primary Results

    NASA Technical Reports Server (NTRS)

    Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor

    2016-01-01

    At the May 2015 SC-228 meeting, requirements for TCAS II interoperability became elevated in priority. A TCAS interoperability workgroup was formed to identify and address key issues/questions. The TCAS workgroup came up with an initial list of questions and a plan to address those questions. As part of that plan, NASA proposed to run a mini HITL to address display, alerting and guidance issues. A TCAS Interoperability Workshop was held to determine potential display/alerting/guidance issues that could be explored in future NASA mini HITLs. The workshop produced consensus on the main functionality of DAA guidance when a TCAS II RA occurs, a prioritized list of independent variables for the experimental design, and a set of use cases to stress TCAS interoperability.

  8. Knowledge representation and management enabling intelligent interoperability - principles and standards.

    PubMed

    Blobel, Bernd

    2013-01-01

    Based on the paradigm changes for health, health services and underlying technologies as well as the need for at best comprehensive and increasingly automated interoperability, the paper addresses the challenge of knowledge representation and management for medical decision support. After introducing related definitions, a system-theoretical, architecture-centric approach to decision support systems (DSSs) and appropriate ways for representing them using systems of ontologies is given. Finally, existing and emerging knowledge representation and management standards are presented. The paper focuses on the knowledge representation and management part of DSSs, excluding the reasoning part from consideration.

  9. Making Interoperability Easier with NASA's Metadata Management Tool (MMT)

    NASA Technical Reports Server (NTRS)

    Shum, Dana; Reese, Mark; Pilone, Dan; Baynes, Katie

    2016-01-01

    While the ISO-19115 collection level metadata format meets many users' needs for interoperable metadata, it can be cumbersome to create it correctly. Through the MMT's simple UI experience, metadata curators can create and edit collections which are compliant with ISO-19115 without full knowledge of the NASA Best Practices implementation of ISO-19115 format. Users are guided through the metadata creation process through a forms-based editor, complete with field information, validation hints and picklists. Once a record is completed, users can download the metadata in any of the supported formats with just 2 clicks.

  10. 77 FR 37001 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...

  11. Achieving Interoperability in GEOSS - How Close Are We?

    NASA Astrophysics Data System (ADS)

    Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.

    2010-12-01

    A primary goal of the Global Earth Observing System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery of and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across them? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study and directions for further work.

  12. Personal Health Records: Is Rapid Adoption Hindering Interoperability?

    PubMed Central

    Studeny, Jana; Coustasse, Alberto

    2014-01-01

    The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relation to PHRs and to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles included. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822

  13. Using GDAL to Convert NetCDF 4 CF 1.6 to GeoTIFF: Interoperability Problems and Solutions for Data Providers and Distributors

    NASA Astrophysics Data System (ADS)

    Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.

    2015-12-01

    An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, include the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e. columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet support on-the-fly conversion of all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files from one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident, described here in detail, involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6 but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident suggests that testing a candidate data product with one or more software packages written to accept the advertised conventions is a practice which improves interoperability: differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of software, data or both. The incident also demonstrates that data providers, distributors, and users can work together to improve data product quality and interoperability.
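
    The conversion at the heart of the incident can be reproduced with GDAL's Python bindings; the file and variable names below are placeholders, and GDAL can only assign the output CRS if the grid_mapping variables required by CF 1.6 are actually present in the netCDF file:

      # Translate one variable of a CF netCDF-4 file to GeoTIFF with GDAL.
      from osgeo import gdal

      gdal.UseExceptions()

      # GDAL's netCDF subdataset syntax: NETCDF:"<file>":<variable>
      src = 'NETCDF:"sea_ice_concentration.nc":ice_conc'  # placeholder names
      gdal.Translate("sea_ice_concentration.tif", src, format="GTiff")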

  14. Evolution of System Architectures: Where Do We Need to Fail Next?

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Alameh, Nadine; Percivall, George

    2013-04-01

    Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. Common Architecture was a cross-thread theme, to ensure that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: Broker, Requestor and Provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction among the different services: data services (e.g. WMS), application services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.

  15. Measuring interoperable EHR adoption and maturity: a Canadian example.

    PubMed

    Gheorghiu, Bobby; Hagens, Simon

    2016-01-25

    An interoperable electronic health record is a secure consolidated record of an individual's health history and care, designed to facilitate authorized information sharing across the care continuum. Each Canadian province and territory has implemented such a system, and for all of them, measuring adoption is essential to understanding progress and optimizing use in order to realize intended benefits. About 250,000 health professionals, approximately half of Canada's anticipated potential physician, nurse, pharmacist, and administrative users, indicated that they electronically accessed data, such as those found in provincial/territorial lab or drug information systems, in 2015. Trends suggest further growth as maturity of use increases. There is strong interest in health information exchange through the iEHR in Canada, and continued growth in adoption is expected. Central to managing the evolution of digital health is access to robust data about who is using solutions, how they are used, where and when. Stakeholders such as government, program leads, and health system administrators must critically assess progress and achievement of benefits to inform future strategic and operational decisions.

  16. NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments

    NASA Technical Reports Server (NTRS)

    Zernic, M. J.; Beering, D. R.; Brooks, D. E.

    2000-01-01

    This paper provides synopses of the design, implementation, and results of key high data rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context to NASA missions, as well as to the communications industry as a whole. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost-effective communications for space operations.

  17. PyMOOSE: Interoperable Scripting in Python for MOOSE

    PubMed Central

    Ray, Subhasis; Bhalla, Upinder S.

    2008-01-01

    Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE). MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. We illustrate this by using Python numerical libraries to analyze MOOSE output online, and by developing a GUI in Python/Qt for a MOOSE simulation. Finally, we build and run a composite neuronal/signaling model that uses both the NEURON and MOOSE numerical engines, and Python as a bridge between the two. Thus PyMOOSE has a high degree of interoperability with analysis routines, with graphical toolkits, and with other simulators. PMID:19129924

  18. Across the Atlantic cooperation to address international challenges in eHealth and health IT: managing toward a common goal.

    PubMed

    Friedman, Charles P; Iakovidis, Ilias; Debenedetti, Laurent; Lorenzi, Nancy M

    2009-11-01

    Countries on both sides of the Atlantic Ocean have invested in health information and communication technologies. Since eHealth challenges cross borders, a European Union-United States of America conference on public policies relating to health IT and eHealth was held October 20-21, 2008 in Paris, France. The conference was organized around four themes: (1) privacy and security, (2) health IT interoperability, (3) deployment and adoption of health IT, and (4) Public Private Collaborative Governance. The four key themes framed the discussion over the two days of plenary sessions and workshops, and the key findings of the conference were organized along the same themes. (1) Privacy and security: Patients' access to their own data and key elements of a patient identification management framework were discussed. (2) Health IT interoperability: Three significant and common interoperability challenges emerged: (a) the need to establish common or compatible standards and clear guidelines for their implementation, (b) the desirability of shared certification criteria and (c) the need for greater awareness of the importance of interoperability. (3) Deployment and adoption of health IT: Three major areas of need emerged: (a) a shared knowledge base and assessment framework, (b) public-private collaboration and (c) effective organizational change strategies. (4) Public Private Collaborative Governance: Sharing and communication are central to success in this area. Nations can learn from one another about ways to develop harmonious, effective partnerships. Three areas that were identified as highest priority for collaboration included: (1) health data security, (2) developing effective strategies to ensure healthcare professionals' acceptance of health IT tools, and (3) interoperability.

  19. Cloud-Based Data Sharing Connects Emergency Managers

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Stennis Space Center, Baltimore-based StormCenter Communications Inc. developed an improved interoperable platform for sharing geospatial data over the Internet in real time; such information is critical for decision makers in emergency situations.

  20. Organisational Interoperability: Evaluation and Further Development of the OIM Model

    DTIC Science & Technology

    2003-06-01

    an Organizational Interoperability Maturity Model (OIM) to evaluate interoperability at the organizational level. The OIM considers the human ... activity aspects of military operations, which are not covered in other models. This paper describes how the model has been used to identify problems and to

  1. Data and Models as Social Objects in the HydroShare System for Collaboration in the Hydrology Community and Beyond

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.; Crawley, S.; Ramirez, M.; Sadler, J.; Xue, Z.; Bandaragoda, C.

    2016-12-01

    How do you share and publish hydrologic data and models for a large collaborative project? HydroShare is a new, web-based system for sharing hydrologic data and models with specific functionality aimed at making collaboration easier. HydroShare has been developed with U.S. National Science Foundation support under the auspices of the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) to support the collaboration and community cyberinfrastructure needs of the hydrology research community. Within HydroShare, we have developed new functionality for creating datasets, describing them with metadata, and sharing them with collaborators. We cast hydrologic datasets and models as "social objects" that can be shared, collaborated around, annotated, published and discovered. In addition to data and model sharing, HydroShare supports web application programs (apps) that can act on data stored in HydroShare, just as software programs on your PC act on your data locally. This can free you from some of the limitations of local computing capacity and challenges in installing and maintaining software on your own PC. HydroShare's web-based cyberinfrastructure can take work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This presentation will describe HydroShare's collaboration functionality that enables both public and private sharing with individual users and collaborative user groups, and makes it easier for collaborators to iterate on shared datasets and models, creating multiple versions along the way, and publishing them with a permanent landing page, metadata description, and citable Digital Object Identifier (DOI) when the work is complete. This presentation will also describe the web app architecture that supports interoperability with third party servers functioning as application engines for analysis and processing of big hydrologic datasets. While developed to support the cyberinfrastructure needs of the hydrology community, the informatics infrastructure for programmatic interoperability of web resources has a generality beyond the solution of hydrology problems that will be discussed.

  2. Accelerating Harmonization in Digital Health.

    PubMed

    Moore, Carolyn; Werner, Laurie; BenDor, Amanda Puckett; Bailey, Mike; Khan, Nighat

    2017-01-01

    Digital tools play an important role in supporting front-line health workers who deliver primary care. This paper explores the current state of efforts to move away from single-purpose applications of digital health towards integrated systems and solutions that align with national strategies. Through examples from health information systems, data and health worker training, it demonstrates how governments and stakeholders are working to integrate digital health services. We emphasize three factors as crucial for this integration: the development and implementation of national digital health strategies; technical interoperability; and collaborative approaches to ensure that digital health has an impact at the primary care level. Consolidation of technologies will enable an integrated, scalable approach to the use of digital health to support health workers. As this edition explores a paradigm shift towards harmonization in primary healthcare systems, the paper describes a shift towards integrated and interoperable systems that respond to health workers' needs in training, data and health information, and calls for the consolidation and integration of digital health tools and approaches across health areas, functions and levels of the health system. It then considers the critical factors that must be in place to support this shift. The paper aims not only to describe steps taken to move from fractured pilots to effective systems, but also to propose a new perspective focused on consolidation and collaboration guided by national digital health strategies.

  3. Comparison of a semi-automatic annotation tool and a natural language processing application for the generation of clinical statement entries.

    PubMed

    Lin, Ching-Heng; Wu, Nai-Yuan; Lai, Wei-Shao; Liou, Der-Ming

    2015-01-01

    Electronic medical records with encoded entries should enhance the semantic interoperability of document exchange. However, it remains a challenge to encode the narrative concept and to transform the coded concepts into a standard entry-level document. This study aimed to use a novel approach for the generation of entry-level interoperable clinical documents. Using HL7 clinical document architecture (CDA) as the example, we developed three pipelines to generate entry-level CDA documents. The first approach was a semi-automatic annotation pipeline (SAAP), the second was a natural language processing (NLP) pipeline, and the third merged the above two pipelines. We randomly selected 50 test documents from the i2b2 corpora to evaluate the performance of the three pipelines. The 50 randomly selected test documents contained 9365 words, including 588 Observation terms and 123 Procedure terms. For the Observation terms, the merged pipeline had a significantly higher F-measure than the NLP pipeline (0.89 vs 0.80, p<0.0001), but a similar F-measure to that of the SAAP (0.89 vs 0.87). For the Procedure terms, the F-measure was not significantly different among the three pipelines. The combination of a semi-automatic annotation approach and the NLP application seems to be a solution for generating entry-level interoperable clinical documents. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
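    For reference, the F-measure compared above is the harmonic mean of precision (P, the fraction of generated entries that are correct) and recall (R, the fraction of reference entries that are generated):

      F_1 = \frac{2PR}{P + R}

    For example, a merged-pipeline F-measure of 0.89 is consistent with precision and recall both near 0.89, since the harmonic mean of two equal values is that value.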

  4. The Development of Clinical Document Standards for Semantic Interoperability in China

    PubMed Central

    Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping

    2011-01-01

    Objectives This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals, in order to achieve their structural and semantic standardization. Methods The Diagnostics knowledge framework, related approaches taken from Health Level Seven (HL7), Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP), and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics could be constructed for the clinical documents, enabling semantic interoperability. Results Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics for the eight clinical documents were constructed from the DGs. Conclusions This approach of constructing clinical document standards using DGs is a feasible standards-driven solution for preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722

  5. 77 FR 48153 - Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...

  6. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    ERIC Educational Resources Information Center

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  7. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    ERIC Educational Resources Information Center

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  8. Interoperability of Demand Response Resources Demonstration in NY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer-sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  9. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    EPA Science Inventory

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  10. The HDF Product Designer - Interoperability in the First Mile

    NASA Astrophysics Data System (ADS)

    Lee, H.; Jelenak, A.; Habermann, T.

    2014-12-01

    Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term "interoperability in the first mile" means: enabling the generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
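    As a concrete illustration of the kind of standards-compliant file the tool targets, the short Python sketch below writes an HDF5 file whose dataset carries CF-convention metadata attributes. It uses the h5py library rather than HDF Product Designer itself, and the dataset name, shape and values are invented for the example.

      # A minimal sketch, assuming h5py and numpy are installed; the layout
      # imitates a CF-convention product but is not from any real specification.
      import h5py
      import numpy as np

      with h5py.File("example_product.h5", "w") as f:
          # File-level metadata identifying the convention being followed.
          f.attrs["Conventions"] = "CF-1.6"
          f.attrs["title"] = "Example interoperable product"

          # A gridded variable with the attributes interoperable readers expect.
          dset = f.create_dataset("sea_surface_temperature",
                                  data=np.zeros((180, 360), dtype="f4"))
          dset.attrs["units"] = "kelvin"
          dset.attrs["standard_name"] = "sea_surface_temperature"
          dset.attrs["_FillValue"] = np.float32(-9999.0)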

  11. Software-codec-based full motion video conferencing on the PC using visual pattern image sequence coding

    NASA Astrophysics Data System (ADS)

    Barnett, Barry S.; Bovik, Alan C.

    1995-04-01

    This paper presents a real time full motion video conferencing system based on the Visual Pattern Image Sequence Coding (VPISC) software codec. The prototype system hardware is comprised of two personal computers, two camcorders, two frame grabbers, and an ethernet connection. The prototype system software has a simple structure. It runs under the Disk Operating System, and includes a user interface, a video I/O interface, an event driven network interface, and a free running or frame synchronous video codec that also acts as the controller for the video and network interfaces. Two video coders have been tested in this system. Simple implementations of Visual Pattern Image Coding and VPISC have both proven to support full motion video conferencing with good visual quality. Future work will concentrate on expanding this prototype to support the motion compensated version of VPISC, as well as encompassing point-to-point modem I/O and multiple network protocols. The application will be ported to multiple hardware platforms and operating systems. The motivation for developing this prototype system is to demonstrate the practicality of software based real time video codecs. Furthermore, software video codecs are not only cheaper, but are also more flexible system solutions, because they enable different computer platforms to exchange encoded video information without requiring on-board protocol-compatible video codec hardware. Software based solutions enable true low cost video conferencing that fits the 'open systems' model of interoperability that is so important for building portable hardware and software applications.

  12. Semantic Integration for Marine Science Interoperability Using Web Technologies

    NASA Astrophysics Data System (ADS)

    Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.

    2008-12-01

    The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented by the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that support important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example according to metadata items and vocabulary terms, are also provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes of each mapping, such as comments and a confidence level, can also be included. VINE also supports registering and storing the resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
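    The following minimal Python sketch (using rdflib) illustrates the kind of cross-vocabulary mapping that a tool like VINE manages, expressed here with the SKOS exactMatch relationship. The two vocabulary URIs and terms are invented placeholders, not actual MMI vocabularies.

      # A sketch of a term mapping between two controlled vocabularies,
      # serialized as RDF; assumes the rdflib package is installed.
      from rdflib import Graph, URIRef, Namespace

      SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")

      g = Graph()
      g.bind("skos", SKOS)

      # Hypothetical terms from two projects' vocabularies.
      term_a = URIRef("http://example.org/vocabA/seaWaterTemperature")
      term_b = URIRef("http://example.org/vocabB/water_temp")

      # Record that the two terms mean the same thing.
      g.add((term_a, SKOS.exactMatch, term_b))

      print(g.serialize(format="turtle"))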

  13. NASA's Earth Science Gateway: A Platform for Interoperable Services in Support of the GEOSS Architecture

    NASA Astrophysics Data System (ADS)

    Alameh, N.; Bambacus, M.; Cole, M.

    2006-12-01

    NASA's Earth Science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data: it becomes a shared platform that can be leveraged by GEOSS via a modular and extensible architecture; consensus and community-based standards (e.g. ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; mechanisms for user involvement and collaboration; and mechanisms for supporting interdisciplinary and domain-specific applications. ESG has played a key role in recent GEOSS Service Network (GSN) demos and workshops, acting not only as a service and data catalog and discovery client, but also as a portrayal and visualization client for distributed data.

  14. Procrustes-based geometric morphometrics on MRI images: An example of inter-operator bias in 3D landmarks and its impact on big datasets.

    PubMed

    Daboul, Amro; Ivanovska, Tatyana; Bülow, Robin; Biffar, Reiner; Cardini, Andrea

    2018-01-01

    Using 3D anatomical landmarks from adult human head MRIs, we assessed the magnitude of inter-operator differences in Procrustes-based geometric morphometric analyses. An in-depth analysis of both absolute and relative error was performed in a subsample of individuals with replicated digitization by three different operators. The effect of inter-operator differences was also explored in a large sample of more than 900 individuals. Although absolute error was not unusual for MRI measurements, including bone landmarks, shape was particularly affected by differences among operators, with up to more than 30% of sample variation accounted for by this type of error. The magnitude of the bias was such that it dominated the main pattern of bone and total (all landmarks included) shape variation, largely surpassing the effect of sex differences between hundreds of men and women. In contrast, however, we found higher reproducibility in soft-tissue nasal landmarks, despite relatively larger errors in estimates of nasal size. Our study exemplifies the assessment of measurement error using geometric morphometrics on landmarks from MRIs and stresses the importance of relating it to total sample variance within the specific methodological framework being used. In summary, precise landmarks may not necessarily imply negligible errors, especially in shape data; indeed, size and shape may be differentially impacted by measurement error, and different types of landmarks may have relatively larger or smaller errors. Importantly, and consistently with other recent studies using geometric morphometrics on digital images (which, however, were not specific to MRI data), this study showed that inter-operator biases can be a major source of error in the analysis of large samples, such as those that are becoming increasingly common in the 'era of big data'.

  15. Procrustes-based geometric morphometrics on MRI images: An example of inter-operator bias in 3D landmarks and its impact on big datasets

    PubMed Central

    Ivanovska, Tatyana; Bülow, Robin; Biffar, Reiner; Cardini, Andrea

    2018-01-01

    Using 3D anatomical landmarks from adult human head MRIs, we assessed the magnitude of inter-operator differences in Procrustes-based geometric morphometric analyses. An in-depth analysis of both absolute and relative error was performed in a subsample of individuals with replicated digitization by three different operators. The effect of inter-operator differences was also explored in a large sample of more than 900 individuals. Although absolute error was not unusual for MRI measurements, including bone landmarks, shape was particularly affected by differences among operators, with up to more than 30% of sample variation accounted for by this type of error. The magnitude of the bias was such that it dominated the main pattern of bone and total (all landmarks included) shape variation, largely surpassing the effect of sex differences between hundreds of men and women. In contrast, however, we found higher reproducibility in soft-tissue nasal landmarks, despite relatively larger errors in estimates of nasal size. Our study exemplifies the assessment of measurement error using geometric morphometrics on landmarks from MRIs and stresses the importance of relating it to total sample variance within the specific methodological framework being used. In summary, precise landmarks may not necessarily imply negligible errors, especially in shape data; indeed, size and shape may be differentially impacted by measurement error, and different types of landmarks may have relatively larger or smaller errors. Importantly, and consistently with other recent studies using geometric morphometrics on digital images (which, however, were not specific to MRI data), this study showed that inter-operator biases can be a major source of error in the analysis of large samples, such as those that are becoming increasingly common in the 'era of big data'. PMID:29787586

  16. Ubiquitous information for ubiquitous computing: expressing clinical data sets with openEHR archetypes.

    PubMed

    Garde, Sebastian; Hovenga, Evelyn; Buck, Jasmin; Knaup, Petra

    2006-01-01

    Ubiquitous computing requires ubiquitous access to information and knowledge. With the release of openEHR Version 1.0, a common model is available to solve some of the problems related to accessing information and knowledge by improving semantic interoperability between clinical systems. Considerable work has been undertaken by various bodies to standardise Clinical Data Sets. Notwithstanding their value, several problems remain unsolved with Clinical Data Sets without the use of a common model underpinning them. This paper outlines these problems, such as incompatible basic data types and overlapping, incompatible definitions of clinical content. A solution based on openEHR archetypes is motivated, and an approach to transform existing Clinical Data Sets into archetypes is presented. To avoid significant overlaps and unnecessary effort during archetype development, such development needs to be coordinated nationwide and beyond, and also across the various health professions, in a formalized process.

  17. Successful integration efforts in water quality from the integrated Ocean Observing System Regional Associations and the National Water Quality Monitoring Network

    USGS Publications Warehouse

    Ragsdale, R.; Vowinkel, E.; Porter, D.; Hamilton, P.; Morrison, R.; Kohut, J.; Connell, B.; Kelsey, H.; Trowbridge, P.

    2011-01-01

    The Integrated Ocean Observing System (IOOS®) Regional Associations and Interagency Partners hosted a water quality workshop in January 2010 to discuss issues of nutrient enrichment and dissolved oxygen depletion (hypoxia), harmful algal blooms (HABs), and beach water quality. In 2007, the National Water Quality Monitoring Council piloted demonstration projects as part of the National Water Quality Monitoring Network (Network) for U.S. Coastal Waters and their Tributaries in three IOOS Regional Associations, and these projects are ongoing. Examples of integrated science-based solutions to water quality issues of major concern from the IOOS regions and Network demonstration projects are explored in this article. These examples illustrate instances where management decisions have benefited from decision-support tools that make use of interoperable data. Gaps, challenges, and outcomes are identified, and a proposal is made for future work toward a multiregional water quality project for beach water quality.

  18. [Secure e-mail between physicians--aspect of a telemedicine platform for the health care system].

    PubMed

    Goetz, C F

    2001-10-01

    Ever since the Roland Berger study in 1997, the concept of a "telematics platform" for health care has described the combination of all technical and organizational components and services for the online transmission of patient data. This platform rests on an interoperable collection of standards for addressing, security and content description. In this context, the security of application and transport data is based on data protection as well as medical non-disclosure rules. Cryptographic methods can provide security services for transmitted data, realizing addressed, direct and indirect privacy. The first German health professional card, the electronic physicians' ID, provides central tools for such applications. Initial, functionally simple pilot projects will prove the effectiveness of the chosen methods this year, even if not all of the identified open work areas in health care telematics have yet been led to a finalized solution.

  19. Specializing architectures for the type 2 diabetes mellitus care use cases with a focus on process management.

    PubMed

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Ruiz, Alonso A

    2015-01-01

    The development of software supporting inter-disciplinary systems such as type 2 diabetes mellitus care requires the deployment of methodologies designed for this type of interoperability. The GCM framework allows the architectural description of such systems and the development of software solutions based on it. A first step of the GCM methodology is the definition of a generic architecture, followed by its specialization for specific use cases. This paper describes the specialization of the generic architecture of a system supporting type 2 diabetes mellitus glycemic control for a pharmacotherapy use case. It focuses on the behavioral aspect of the system, i.e. the policy domain and the definition of the rules governing the system. The design of this architecture reflects the inter-disciplinary nature of the methodology. Finally, the resulting architecture allows building adaptive, intelligent and complete systems.

  20. Facilitating consumer access to health information.

    PubMed

    Snowdon, Anne; Schnarr, Karin; Alessi, Charles

    2014-01-01

    The lead paper from Zelmer and Hagens details the substantive evolution occurring in health information technologies that has the potential to transform the relationship between consumers, health practitioners and health systems. In this commentary, the authors suggest that Canada is experiencing a shift in consumer behaviour toward a desire to actively manage one's health and wellness, a shift that is being facilitated by the advent of health applications on mobile and online technology platforms. The result is that Canadians are now able to create personalized health solutions based on their individual health values and goals. However, before Canadians are able to derive a personal health benefit from these rapid changes in information technology, they require, and are increasingly demanding, greater real-time access to their own health information to better inform decision-making, as well as interoperability between their personal health tracking systems and those of their health practitioner team.

  1. Water quality success stories: Integrated assessments from the IOOS regional associations and national water quality monitoring network

    USGS Publications Warehouse

    Ragsdale, Rob; Vowinkel, Eric; Porter, Dwayne; Hamilton, Pixie; Morrison, Ru; Kohut, Josh; Connell, Bob; Kelsey, Heath; Trowbridge, Phil

    2011-01-01

    The Integrated Ocean Observing System (IOOS®) Regional Associations and Interagency Partners hosted a water quality workshop in January 2010 to discuss issues of nutrient enrichment and dissolved oxygen depletion (hypoxia), harmful algal blooms (HABs), and beach water quality. In 2007, the National Water Quality Monitoring Council piloted demonstration projects as part of the National Water Quality Monitoring Network (Network) for U.S. Coastal Waters and their Tributaries in three IOOS Regional Associations, and these projects are ongoing. Examples of integrated science-based solutions to water quality issues of major concern from the IOOS regions and Network demonstration projects are explored in this article. These examples illustrate instances where management decisions have benefited from decision-support tools that make use of interoperable data. Gaps, challenges, and outcomes are identified, and a proposal is made for future work toward a multiregional water quality project for beach water quality.

  2. Development of Extended Content Standards for Biodiversity Data

    NASA Astrophysics Data System (ADS)

    Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu

    2013-04-01

    Interoperability in the field of biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and their supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and on the standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring the development of domain-dependent conceptual data models and the extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been partly standardized, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind - making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of biodiversity observation data into 'families' of use cases and its supporting data schema, where gaps, if any, in the availability of content standards have been identified, and how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON. The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS, SOS, NetCDF), with a use-case dependent 'payload' embedded into the data stream. This enables the re-use of the abstract schema, and sometimes the implementation specification (for example XML, JSON, or NetCDF conventions), across services. An explicit aim will be to make the XML implementation specification re-usable as a DwC and a GML (SOS and WFS) extension. (1) Olga Lyashevska, Keith D. Farnsworth, How many dimensions of biodiversity do we need?, Ecological Indicators, Volume 18, July 2012, Pages 485-492, ISSN 1470-160X, 10.1016/j.ecolind.2011.12.016. (2) GEO BON: Workshop on Essential Biodiversity Variables (27-29 February 2012, Frascati, Italy). (http://www.earthobservations.org/geobon_docs_20120227.shtml)
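    To illustrate what a content-standard 'payload' can look like in practice, the Python sketch below writes a single species-occurrence record using standard Darwin Core term names; the values, and the choice of CSV as the carrier, are illustrative assumptions.

      # A minimal occurrence record keyed by Darwin Core terms.
      import csv

      DWC_TERMS = ["occurrenceID", "scientificName", "eventDate",
                   "decimalLatitude", "decimalLongitude", "basisOfRecord"]

      record = {
          "occurrenceID": "urn:example:occ:0001",  # invented identifier
          "scientificName": "Puma concolor",
          "eventDate": "2012-07-16",
          "decimalLatitude": -33.87,
          "decimalLongitude": 18.42,
          "basisOfRecord": "HumanObservation",
      }

      with open("occurrence.csv", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=DWC_TERMS)
          writer.writeheader()
          writer.writerow(record)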

  3. Elements of Network-Based Assessment

    ERIC Educational Resources Information Center

    Gibson, David

    2007-01-01

    Elements of network-based assessment systems are envisioned based on recent advances in knowledge and practice in learning theory, assessment design and delivery, and semantic web interoperability. The architecture takes advantage of the meditating role of technology as well as recent models of assessment systems. This overview of the elements…

  4. Supporting spatial data harmonization process with the use of ontologies and Semantic Web technologies

    NASA Astrophysics Data System (ADS)

    Strzelecki, M.; Iwaniak, A.; Łukowicz, J.; Kaczmarek, I.

    2013-10-01

    Nowadays, spatial information is used not only by professionals, but also by ordinary citizens in their daily activities. The Open Data initiative states that data should be freely and unreservedly available to all users. This also applies to spatial data. As spatial data becomes widely available, it is essential to publish it in a form that guarantees the possibility of integrating it with other, heterogeneous data sources. Interoperability is the ability to combine spatial data sets from different sources in a consistent way, as well as to provide access to them. Providing syntactic interoperability based on well-known data formats is relatively simple, unlike providing semantic interoperability, due to the multiple possible interpretations of data. One of the issues connected with the problem of achieving interoperability is data harmonization. It is a process of providing access to spatial data in a representation that allows combining it with other harmonized data in a coherent way, using a common set of data product specifications. Spatial data harmonization is performed by creating definitions of reclassification and transformation rules (mapping schemas) for source application schemas. Creation of those rules is a very demanding task which requires wide domain knowledge and a detailed look into application schemas. The paper focuses on proposing methods for supporting the data harmonization process through automated or supervised creation of mapping schemas with the use of ontologies, ontology matching methods and Semantic Web technologies.
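    One way such mapping schemas can be expressed with Semantic Web technologies is as SPARQL CONSTRUCT rules that reclassify and transform source-schema instances into a target schema. The Python sketch below shows this with rdflib; both schema namespaces and the sample feature are invented for illustration.

      # A sketch of a harmonization (mapping) rule expressed as a SPARQL
      # CONSTRUCT query; assumes the rdflib package is installed.
      from rdflib import Graph

      source_data = """
      @prefix src: <http://example.org/sourceSchema#> .
      src:road42 a src:LocalRoad ;
          src:roadName "Main Street" .
      """

      mapping_rule = """
      PREFIX src: <http://example.org/sourceSchema#>
      PREFIX tgt: <http://example.org/targetSchema#>
      CONSTRUCT {
          ?r a tgt:Road ;
             tgt:geographicalName ?name .
      }
      WHERE {
          ?r a src:LocalRoad ;
             src:roadName ?name .
      }
      """

      g = Graph()
      g.parse(data=source_data, format="turtle")
      harmonized = g.query(mapping_rule).graph  # a CONSTRUCT result is a graph
      print(harmonized.serialize(format="turtle"))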

  5. Public strategies for improving eHealth integration and long‐term sustainability in public health care systems: Findings from an Italian case study

    PubMed Central

    Nuti, Sabina

    2017-01-01

    Summary eHealth is expected to contribute to tackling challenges for health care systems. However, it also imposes challenges. Financing strategies adopted at national as well as regional levels widely affect eHealth long-term sustainability. In a public health care system, the public actor is among the main "buyers" of eHealth. However, public interventions have been increasingly focused on cost containment. How can these 2 aspects be matched? This article explores some central issues, mainly related to financial aspects, in the development of effective and valuable eHealth strategies in a public health care system: How can the public health care system (as a "buyer") improve the long-term success and sustainability of eHealth solutions? What levers are available to match, over the long period, the different interests of different stakeholders in the eHealth field? A case study was performed in the Region of Tuscany, Italy. According to our results, win-win strategies should be followed. Investments should take into account the need to finance solutions over the long term, to sustain the changes in health care organizations that are needed to obtain benefits. To solve the interoperability issues, the concept of the "platform approach" emerged, based on collaboration within and between organizations. The private sector, as well as the beneficiaries and final users of the eHealth solutions, should participate in their design, provision, and monitoring. For creating value for all, the evidence gap and the financial needs could be addressed with a pull mechanism of funding, aimed at paying according to the outcomes produced by the eHealth solution, on the basis of ongoing monitoring, measurement, and evaluation of the outcomes. PMID:28791771

  6. Standardized Solution for Management Controller for MTCA.4

    NASA Astrophysics Data System (ADS)

    Makowski, D.; Fenner, M.; Ludwig, F.; Mavrič, U.; Mielczarek, A.; Napieralski, A.; Perek, P.; Schlarb, H.

    2015-06-01

    The Micro Telecommunications Computing Architecture (MTCA) standard is a modern platform that is gaining popularity in the area of High Energy Physics (HEP) experiments. The standard provides extensive management, monitoring and diagnostics functionalities. Hardware control and monitoring are based on the Intelligent Platform Management Interface (IPMI), which was initially developed to supervise the operation of complex computer systems. The original IPMI specification was extended to support functions required by the MTCA specification. A Module Management Controller (MMC) is required on each Advanced Mezzanine Card (AMC) installed in an MTCA chassis. The Rear Transition Modules (RTMs) have to be equipped with RTM Management Controllers (RMCs), as required by the MTCA.4 subsidiary specification. The commercially available implementations of MMC and RMC are expensive and do not provide the complete functionality that is required by specific HEP applications. Therefore, many research centers and commercial companies work on their own implementations of AMC and RTM controllers. The available implementations suffer from the lack of a common approach and from interoperability problems. Since both Lodz University of Technology (TUL) and Deutsches Elektronen-Synchrotron (DESY) have long-term experience in developing ATCA and MTCA hardware, the authors decided to develop a unified management controller solution fully compliant with the AMC and MTCA.4 standards. The MMC v1.00 solution is dedicated to the management of AMC and RTM modules. The MMC v1.00 is based on Atmel ATxmega MCUs and can be fully customized by the user or used as a drop-in module without any modifications. The paper discusses the functionality of the MMC v1.00 solution. The implementation was verified with evaluation kits developed for AMC and RTM cards.

  7. Public strategies for improving eHealth integration and long-term sustainability in public health care systems: Findings from an Italian case study.

    PubMed

    De Rosis, Sabina; Nuti, Sabina

    2018-01-01

    eHealth is expected to contribute to tackling challenges for health care systems. However, it also imposes challenges. Financing strategies adopted at national as well as regional levels widely affect eHealth long-term sustainability. In a public health care system, the public actor is among the main "buyers" of eHealth. However, public interventions have been increasingly focused on cost containment. How can these 2 aspects be matched? This article explores some central issues, mainly related to financial aspects, in the development of effective and valuable eHealth strategies in a public health care system: How can the public health care system (as a "buyer") improve the long-term success and sustainability of eHealth solutions? What levers are available to match, over the long period, the different interests of different stakeholders in the eHealth field? A case study was performed in the Region of Tuscany, Italy. According to our results, win-win strategies should be followed. Investments should take into account the need to finance solutions over the long term, to sustain the changes in health care organizations that are needed to obtain benefits. To solve the interoperability issues, the concept of the "platform approach" emerged, based on collaboration within and between organizations. The private sector, as well as the beneficiaries and final users of the eHealth solutions, should participate in their design, provision, and monitoring. For creating value for all, the evidence gap and the financial needs could be addressed with a pull mechanism of funding, aimed at paying according to the outcomes produced by the eHealth solution, on the basis of ongoing monitoring, measurement, and evaluation of the outcomes. © 2017 The Authors. The International Journal of Health Planning and Management published by John Wiley & Sons Ltd.

  8. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology approaching the different aspects of systems architecture, such as the business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standards-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue for any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by the HIS-DF and supported by HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standards-based health information systems.
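    As a language-neutral illustration of the mediator model (the sketch below is generic Python, not the paper's Enterprise Java Beans implementation), systems exchange messages through a central broker that can also host transformations between the partners' information models.

      # A minimal publish/subscribe mediator; names are invented for the example.
      class Mediator:
          def __init__(self):
              self._subscribers = {}  # message type -> list of handler callables

          def subscribe(self, message_type, handler):
              self._subscribers.setdefault(message_type, []).append(handler)

          def publish(self, message_type, payload):
              # A fuller mediator would transform the payload between the
              # sender's and each receiver's information models (e.g. HL7 v3).
              for handler in self._subscribers.get(message_type, []):
                  handler(payload)

      # Usage: a laboratory system publishes a result; the surveillance
      # system, subscribed through the mediator, receives it.
      bus = Mediator()
      bus.subscribe("lab_result", lambda msg: print("surveillance received:", msg))
      bus.publish("lab_result", {"patient_id": "12345", "test": "influenza",
                                 "result": "positive"})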

  9. Porting Social Media Contributions with SIOC

    NASA Astrophysics Data System (ADS)

    Bojars, Uldis; Breslin, John G.; Decker, Stefan

    Social media sites, including social networking sites, have captured the attention of millions of users as well as billions of dollars in investment and acquisition. To better enable a user's access to multiple sites, portability between social media sites is required in terms of both (1) the personal profiles and friend networks and (2) a user's content objects expressed on each site. This requires representation mechanisms to interconnect both people and objects on the Web in an interoperable, extensible way. The Semantic Web provides the required representation mechanisms for portability between social media sites: it links people and objects to record and represent the heterogeneous ties that bind each to the other. The FOAF (Friend-of-a-Friend) initiative provides a solution to the first requirement, and this paper discusses how the SIOC (Semantically-Interlinked Online Communities) project can address the latter. By using agreed-upon Semantic Web formats like FOAF and SIOC to describe people, content objects, and the connections that bind them together, social media sites can interoperate and provide portable data by appealing to some common semantics. In this paper, we will discuss the application of Semantic Web technology to enhance current social media sites with semantics and to address issues with portability between social media sites. It has been shown that social media sites can serve as rich data sources for SIOC-based applications such as the SIOC Browser, but in the other direction, we will now show how SIOC data can be used to represent and port the diverse social media contributions (SMCs) made by users on heterogeneous sites.
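    A minimal Python sketch (using rdflib) of the representation described above: a person described with FOAF is linked to a site account, and a contribution on that site is described with SIOC. All URIs, names and content are invented for illustration.

      # Linking a person, an account, and a post with FOAF and SIOC terms;
      # assumes the rdflib package is installed.
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import FOAF, RDF

      SIOC = Namespace("http://rdfs.org/sioc/ns#")

      g = Graph()
      g.bind("foaf", FOAF)
      g.bind("sioc", SIOC)

      person = URIRef("http://example.org/people/alice#me")
      account = URIRef("http://example.org/siteA/users/alice")
      post = URIRef("http://example.org/siteA/posts/1")

      g.add((person, RDF.type, FOAF.Person))
      g.add((person, FOAF.name, Literal("Alice")))
      g.add((person, FOAF.account, account))    # ties the profile to a site account
      g.add((account, RDF.type, SIOC.UserAccount))
      g.add((post, RDF.type, SIOC.Post))
      g.add((post, SIOC.has_creator, account))  # the contribution, now portable RDF
      g.add((post, SIOC.content, Literal("Hello from site A")))

      print(g.serialize(format="turtle"))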

  10. A Systematic Review of Wearable Patient Monitoring Systems - Current Challenges and Opportunities for Clinical Adoption.

    PubMed

    Baig, Mirza Mansoor; GholamHosseini, Hamid; Moqeem, Aasia A; Mirza, Farhaan; Lindén, Maria

    2017-07-01

    The aim of this review is to investigate barriers and challenges of wearable patient monitoring (WPM) solutions adopted by clinicians in acute, as well as in community, care settings. Currently, healthcare providers are coping with ever-growing healthcare challenges, including an ageing population, chronic diseases, the cost of hospitalization, and the risk of medical errors. WPM systems are a potential solution for addressing some of these challenges by enabling advanced sensors, wearable technology, and secure and effective communication platforms between clinicians and patients. A total of 791 articles were screened and 20 were selected for this review. The most common publication venue was conference proceedings (13, 54%). This review only considered recent studies, published between 2015 and 2017. The identified studies involved chronic conditions (6, 30%), rehabilitation (7, 35%), cardiovascular diseases (4, 20%), falls (2, 10%) and mental health (1, 5%). Most studies focussed on the system aspects of WPM solutions, including advanced sensors, wireless data collection, communication platforms and clinical usability, for a specific area or disease. Current studies are progressing with localized sensor-software integration that addresses a specific use case or health area through non-scalable, 'silo' solutions. Further work is required on interoperability and clinical acceptance challenges. The advancement of wearable technology and the possibilities of using machine learning and artificial intelligence in healthcare have been investigated by many studies. We believe future patient monitoring and medical treatments will build upon efficient and affordable solutions of wearable technology.

  11. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaidon, Clement; Poplawski, Michael

    This report is the first in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs). It explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  12. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in a collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on the application of the model checker UPPAAL to verify interoperability requirements for the given collaborative process model. First, this entails translating the collaborative process model from BPMN into the UPPAAL modelling language, a 'Network of Timed Automata'. Second, it becomes necessary to formalise the interoperability requirements into properties in the dedicated UPPAAL language, i.e. the temporal logic TCTL.
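    To give a flavour of such properties, the illustrative TCTL queries below, written in UPPAAL's textual query syntax and collected here as Python strings, express typical interoperability requirements; the process and state names are invented placeholders, not taken from the paper.

      # Hypothetical verification queries for a two-partner collaborative process.
      queries = [
          "A[] not deadlock",                          # the collaboration never deadlocks
          "E<> Process.Done",                          # the process can reach completion
          "A[] (PartnerA.sent imply PartnerB.ready)",  # a partner sends only when the other can receive
          "PartnerA.sent --> PartnerB.received",       # every sent message is eventually received
      ]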

  13. Networking: an overview for leaders of academic medical centers.

    PubMed

    Panko, W B; Erhardt-Domino, K; Pletcher, T; Wilson, W

    1993-07-01

    Organizations face a unique challenge over the next decade. When technology was expensive, it was arguably necessary to use an undifferentiated, or monolithic, model for computer-based solutions to problems. This has fundamentally changed. Technology is now so inexpensive that solutions are not limited by costs, but rather by how well the implementors understand the many different problem domains. Thus, academic medical centers are faced with successive waves in information technology use. First, there will be a wave of innovation, driven by the need for specialization in problem solving. This will be followed by consolidation of the best of the approaches into the core systems of the institution. The average level of heterogeneity (cost) will be higher, but the overall quality of the solutions (benefit) will also be higher. If one can develop a strategy for managing and creatively limiting the heterogeneity, the cost-benefit ratio will be much more favorable. While there may be other strategies that will do this, we support the use of a strategy centered on enterprise networking. This strategy emphasizes not simply technology but also the cultural and organizational changes that empower innovation--within a framework that makes it possible to simply implement interoperability and data sharing within nearly all solutions. The organizations that survive the coming period of change and external pressure will be those that do the best job of managing their resources. Information will continue to be one of the most important resources.(ABSTRACT TRUNCATED AT 250 WORDS)

  14. A component-based software environment for visualizing large macromolecular assemblies.

    PubMed

    Sanner, Michel F

    2005-03-01

    The interactive visualization of large biological assemblies poses a number of challenging problems, including the development of multiresolution representations and new interaction methods for navigating and analyzing these complex systems. An additional challenge is the development of flexible software environments that will facilitate the integration and interoperation of computational models and techniques from a wide variety of scientific disciplines. In this paper, we present a component-based software development strategy centered on the high-level, object-oriented, interpretive programming language: Python. We present several software components, discuss their integration, and describe some of their features that are relevant to the visualization of large molecular assemblies. Several examples are given to illustrate the interoperation of these software components and the integration of structural data from a variety of experimental sources. These examples illustrate how combining visual programming with component-based software development facilitates the rapid prototyping of novel visualization tools.

  15. Functional requirements document for NASA/MSFC Earth Science and Applications Division: Data and information system (ESAD-DIS). Interoperability, 1992

    NASA Technical Reports Server (NTRS)

    Stephens, J. Briscoe; Grider, Gary W.

    1992-01-01

    These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications between personal and visualization workstations and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements, both for operational support and for management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and for management not familiar with these activities. It is intended as background documentation to support requests for resources and support requirements.

  16. ARGOS Policy Brief on Semantic Interoperability

    PubMed Central

    KALRA, Dipak; MUSEN, Mark; SMITH, Barry; CEUSTERS, Werner

    2016-01-01

    Semantic interoperability is one of the priority themes of the ARGOS Trans-Atlantic Observatory. This topic represents a globally recognised challenge that must be addressed if electronic health records are to be shared among heterogeneous systems, and the information in them exploited to the maximum benefit of patients, professionals, health services, research, and industry. Progress in this multi-faceted challenge has been piecemeal, and valuable lessons have been learned, and approaches discovered, in Europe and in the US that can be shared and combined. Experts from both continents have met at three ARGOS workshops during 2010 and 2011 to share understanding of these issues and how they might be tackled collectively from both sides of the Atlantic. This policy brief summarises the problems and the reasons why they are important to tackle, and also why they are so difficult. It outlines the major areas of semantic innovation that exist and that are available to help address this challenge. It proposes a series of next steps that need to be championed on both sides of the Atlantic if further progress is to be made in sharing and analysing electronic health records meaningfully. Semantic interoperability requires the use of standards, not only for EHR data to be transferred and structurally mapped into a receiving repository, but also for the clinical content of the EHR to be interpreted in conformity with the original meanings intended by its authors. Wide-scale engagement with professional bodies, globally, is needed to develop these clinical information standards. Accurate and complete clinical documentation, faithful to the patient's situation, and interoperability between systems, require widespread and dependable access to published and maintained collections of coherent and quality-assured semantic resources, including models such as archetypes and templates that would (1) provide clinical context, (2) be mapped to interoperability standards for EHR data, (3) be linked to well specified multi-lingual terminology value sets, and (4) be derived from high quality ontologies. There is a need to gain greater experience in how semantic resources should be defined, validated, and disseminated, and in how users (who increasingly will include patients) should be educated to improve the quality and consistency of EHR documentation and to make full use of it. There are urgent needs to scale up the authorship, acceptance, and adoption of clinical information standards, to leverage and harmonise the islands of standardisation optimally, to assure the quality of the artefacts produced, and to organise end-to-end governance of the development and adoption of solutions. PMID:21893897

  17. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and environmental stewardship by 2030. These efforts suggest the need for a holistic approach towards improving and implementing strategies, policies, and practices that will ensure long-term sustainability and interoperability of scientific data repositories and networks across multiple scientific domains.

  18. Promoting A-Priori Interoperability of HLA-Based Simulations in the Space Domain: The SISO Space Reference FOM Initiative

    NASA Technical Reports Server (NTRS)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2016-01-01

    Distributed and real-time simulation plays a key role in the Space domain, being exploited for mission and system analysis and engineering as well as for crew training and operational support. One of the most popular standards is the IEEE 1516-2010 Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) can interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space systems simulations.

  19. Groundwater data network interoperability

    USGS Publications Warehouse

    Brodaric, Boyan; Booth, Nathaniel; Boisvert, Eric; Lucido, Jessica M.

    2016-01-01

    Water data networks are increasingly being integrated to answer complex scientific questions that often span large geographical areas and cross political borders. Data heterogeneity is a major obstacle that impedes interoperability within and between such networks. It is resolved here for groundwater data at five levels of interoperability, within a Spatial Data Infrastructure architecture. The result is a pair of distinct national groundwater data networks for the United States and Canada, and a combined data network in which they are interoperable. This combined data network enables, for the first time, transparent public access to harmonized groundwater data from both sides of the shared international border.
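
    The interoperability described here is typically realized through OGC web services within the Spatial Data Infrastructure. As a protocol-level sketch only (the endpoint URL and feature type name below are placeholders, not the actual USGS or Geological Survey of Canada services), a client could fetch groundwater features across the border region like this:

    ```python
    # Hedged sketch: fetching groundwater features from an OGC Web Feature
    # Service, the kind of service used within Spatial Data Infrastructures.
    # The endpoint and feature type name are placeholders, not real APIs.
    import requests

    ENDPOINT = "https://groundwater.example.org/wfs"  # placeholder

    resp = requests.get(ENDPOINT, params={
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "gwml:GW_MonitoringSite",  # hypothetical feature type
        "bbox": "41.0,-83.0,45.0,-75.0,urn:ogc:def:crs:EPSG::4326",
        "count": "10",
    }, timeout=60)
    resp.raise_for_status()
    print(resp.text[:400])  # GML feature collection for the border region
    ```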

  20. Report on the Second Catalog Interoperability Workshop

    NASA Technical Reports Server (NTRS)

    Thieman, James R.; James, Mary E.

    1988-01-01

    The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.

  1. A logical approach to semantic interoperability in healthcare.

    PubMed

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  2. A SOA-Based Solution to Monitor Vaccination Coverage Among HIV-Infected Patients in Liguria.

    PubMed

    Giannini, Barbara; Gazzarata, Roberta; Sticchi, Laura; Giacomini, Mauro

    2016-01-01

    Vaccination in HIV-infected patients constitutes an essential tool in the prevention of the most common infectious diseases. The Ligurian Vaccination in HIV Program is a proposed vaccination schedule specifically dedicated to this risk group. Selective strategies are proposed within this program, employing ICT (Information and Communication Technology) tools to identify this susceptible target group, to monitor immunization coverage over time, and to manage failures and defaulting. The proposal is to connect an immunization registry system to an existing regional platform that allows clinical data re-use among several medical structures, to completely manage the vaccination process. This architecture will adopt a Service Oriented Architecture (SOA) approach and standard HSSP (Health Services Specification Program) interfaces to support interoperability. According to the presented solution, vaccination administration information retrieved from the immunization registry will be structured according to the specifications within the immunization section of the HL7 (Health Level 7) CCD (Continuity of Care Document). Immunization coverage will be evaluated through the continuous monitoring of serology and antibody titers gathered from the hospital LIS (Laboratory Information System), structured as an HL7 Version 3 (v3) Clinical Document Architecture Release 2 (CDA R2).
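
    The CCD immunization section mentioned above is ordinary CDA XML underneath. As a minimal illustration only (not a conformant CCD: the element set is heavily trimmed and the example values are invented), one vaccination entry could be assembled like this:

    ```python
    # A minimal sketch of assembling a CDA/CCD-style immunization entry with the
    # standard library. Element names follow the HL7 CDA R2 namespace, but the
    # structure is heavily simplified and NOT a conformant CCD document.
    import xml.etree.ElementTree as ET

    CDA_NS = "urn:hl7-org:v3"
    ET.register_namespace("", CDA_NS)

    def immunization_entry(vaccine_code: str, display: str, date: str) -> ET.Element:
        """Build a simplified substanceAdministration entry for one vaccination."""
        entry = ET.Element(f"{{{CDA_NS}}}substanceAdministration",
                           classCode="SBADM", moodCode="EVN")
        ET.SubElement(entry, f"{{{CDA_NS}}}effectiveTime", value=date)
        consumable = ET.SubElement(entry, f"{{{CDA_NS}}}consumable")
        material = ET.SubElement(consumable, f"{{{CDA_NS}}}manufacturedMaterial")
        # CVX is the usual vaccine code system in the CCD immunization section.
        ET.SubElement(material, f"{{{CDA_NS}}}code", code=vaccine_code,
                      codeSystem="2.16.840.1.113883.12.292", displayName=display)
        return entry

    print(ET.tostring(
        immunization_entry("33", "Pneumococcal polysaccharide", "20160115"),
        encoding="unicode"))
    ```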

  3. Process-oriented integration and coordination of healthcare services across organizational boundaries.

    PubMed

    Tello-Leal, Edgar; Chiotti, Omar; Villarreal, Pablo David

    2012-12-01

    The paper presents a methodology that follows a top-down approach based on a Model-Driven Architecture for integrating and coordinating healthcare services through cross-organizational processes, enabling organizations to provide high-quality healthcare services and continuous process improvements. The methodology provides a modeling language that enables organizations to conceptualize an integration agreement and to identify and design cross-organizational process models. These models are used for the automatic generation of: the private view of the processes each organization should perform to fulfill its role in cross-organizational processes, and Colored Petri Net specifications to implement these processes. A multi-agent system platform provides agents able to interpret Colored Petri Nets to enable communication between the Healthcare Information Systems executing the cross-organizational processes. Clinical documents are defined using the HL7 Clinical Document Architecture. This methodology guarantees that important requirements for healthcare service integration and coordination are fulfilled: interoperability between heterogeneous Healthcare Information Systems; the ability to cope with changes in cross-organizational processes; alignment between the integrated healthcare service solution defined at the organizational level and the solution defined at the technological level; and the distributed execution of cross-organizational processes while preserving the organizations' autonomy.

  4. Toward a DoD/VA longitudinal health record: politics and the policy landscape.

    PubMed

    Gimbel, Ronald W; Clyburn, Conrad A

    2009-05-01

    Policy implications of an interoperable Department of Defense (DoD) and Veterans Administration (VA) longitudinal health record (LHR) are substantial and far reaching. In this manuscript the authors explore the existing challenges and opportunities, the political landscape, and alternative solutions that have created a favorable environment for legislation and funding to support its development. The authors identify six policy themes emerging from the historic National Forum on the Future of the Defense Health Information System held recently at Georgetown University in Washington DC.

  5. Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project: Advanced Collision Avoidance System for UAS (ACAS Xu) Interoperability White Paper Presentation

    NASA Technical Reports Server (NTRS)

    Fern, Lisa

    2017-01-01

    The Phase 1 DAA Minimum Operational Performance Standards (MOPS) provided requirements for two classes of DAA equipment: Class 1 contains the basic DAA equipment required to assist a pilot in remaining well clear, while Class 2 integrates the Traffic Alert and Collision Avoidance System (TCAS) II. Thus, the Class 1 system provides RWC functionality only, while the Class 2 system is intended to provide both RWC and Collision Avoidance (CA) functionality, in compliance with the Minimum Aviation System Performance Standards (MASPS) for the Interoperability of Airborne Collision Avoidance Systems. The FAA's TCAS Program Office is currently developing the Airborne Collision Avoidance System X (ACAS X) to support the objectives of the Federal Aviation Administration's (FAA) Next Generation Air Transportation System Program (NextGen). ACAS X has a suite of variants with a common underlying design that are intended to be optimized for their intended airframes and operations. ACAS Xu is being designed for UAS and allows for new surveillance technologies and tailored logic for platforms with different performance characteristics. In addition to Collision Avoidance (CA) alerting and guidance, ACAS Xu is being tuned to provide RWC alerting and guidance in compliance with the SC 228 DAA MOPS. With a single logic performing both RWC and CA functions, ACAS Xu will provide industry with an integrated DAA solution that addresses many of the interoperability shortcomings of Phase 1 systems. While the MOPS for ACAS Xu will specify an integrated DAA system, it will need to show compliance with the RWC alerting thresholds and alerting requirements defined in the DAA Phase 2 MOPS. Further, some functional components of the ACAS Xu system, such as the remote pilot's displayed guidance, might be mostly references to the corresponding requirements in the DAA MOPS. To provide a seamless, integrated RWC-CA system that assists the pilot in remaining well clear and avoiding collisions, several issues need to be addressed within the Phase 2 SC-228 DAA efforts. Interoperability of the RWC and CA alerting and guidance, and ensuring pilot comprehension, compliance, and performance, will be a primary research area.

  6. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436

  7. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
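
    Entries 6 and 7 describe the same Framework, whose applications are composed from REST web services. As a purely illustrative sketch (the host, paths, and JSON fields below are hypothetical assumptions, not the documented OpenTox API, which negotiates URI- and RDF-based representations), a client-side call might look like this:

    ```python
    # Hypothetical sketch of invoking a REST-style predictive-toxicology
    # service. The host, paths, and JSON fields are illustrative assumptions,
    # not the documented OpenTox API.
    import requests

    BASE = "https://opentox.example.org"  # hypothetical service host

    def predict_toxicity(smiles: str, model_id: str) -> dict:
        """POST a chemical structure to a model resource; return the prediction."""
        resp = requests.post(
            f"{BASE}/model/{model_id}/predict",
            json={"smiles": smiles},
            headers={"Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        print(predict_toxicity("c1ccccc1O", "tox-endpoint-demo"))  # phenol
    ```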

  8. Towards Standardized Patient Data Exchange: Integrating a FHIR Based API for the Open Medical Record System.

    PubMed

    Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul

    2015-01-01

    Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an Open Source Medical Record System widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application specific API to a standards based API. We describe efforts to design and implement a FHIR based API for the OpenMRS platform. Lessons learned from this effort were used to define long term plans to transition from the legacy OpenMRS API to a FHIR based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
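
    For flavor, a FHIR read interaction is just an HTTP GET against a typed resource endpoint (GET [base]/Patient/{id}, per the HL7 FHIR specification). The server base URL below is a placeholder, not a specific OpenMRS deployment:

    ```python
    # Minimal sketch of a FHIR "read" interaction (GET [base]/Patient/{id}).
    # The base URL is a placeholder; any FHIR-conformant server would respond
    # with a Patient resource serialized as JSON.
    import requests

    FHIR_BASE = "https://fhir.example.org/R4"  # placeholder server

    def read_patient(patient_id: str) -> dict:
        resp = requests.get(
            f"{FHIR_BASE}/Patient/{patient_id}",
            headers={"Accept": "application/fhir+json"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    patient = read_patient("12345")
    # Human-readable name parts live in the HumanName datatype.
    for name in patient.get("name", []):
        print(name.get("family"), name.get("given"))
    ```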

  9. Architected Agile Solutions for Software-Reliant Systems

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Lane, Jo Ann; Koolmanojwong, Supannika; Turner, Richard

    Systems are becoming increasingly reliant on software due to needs for rapid fielding of “70% capabilities,” interoperability, net-centricity, and rapid adaptation to change. The latter need has led to increased interest in agile methods of software development, in which teams rely on shared tacit interpersonal knowledge rather than explicit documented knowledge. However, such systems often need to be scaled up to higher level of performance and assurance, requiring stronger architectural support. Several organizations have recently transformed themselves by developing successful combinations of agility and architecture that can scale to projects of up to 100 personnel. This chapter identifies a set of key principles for such architected agile solutions for software-reliant systems, provides guidance for how much architecting is enough, and illustrates the key principles with several case studies.

  10. Interoperability Measurement

    DTIC Science & Technology

    2008-08-01

    facilitate the use of existing architecture descriptions in performing interoperability measurement. Noting that “everything in the world can be expressed as...biological, botanical, and genetic research, it has also been used with great success in the fields of ecology, medicine, the social sciences, the...appropriate for at least three reasons. First, systems perform different interoperations in different scenarios (i.e., they are used differently); second

  11. Commanding Heterogeneous Multi-Robot Teams

    DTIC Science & Technology

    2014-06-01

    Coalition Battle Management Language (C-BML) Study Group Report. 2005 Fall Simulation Interoperability Workshop (05F-SIW-041), Orlando, FL, September...NMSG-085 CIG Land Operation Demonstration. 2013 Spring Simulation Interoperability Workshop (13S-SIW-031), San Diego, CA. April 2013. [4] K...Simulation Interoperability Workshop (10F-SIW-039), Orlando, FL, September 2010. [5] M. Langerwisch, M. Ax, S. Thamke, T. Remmersmann, A. Tiderko

  12. DeoR repression at-a-distance only weakly responds to changes in interoperator separation and DNA topology.

    PubMed Central

    Dandanell, G

    1992-01-01

    The interoperator distance between a synthetic operator Os and the deoP2O2-galK fusion was varied between 46 and 176 bp. The repression of deoP2-directed galK expression as a function of the interoperator distance (center-to-center) was measured in vivo in a single-copy system. The results show that the DeoR repressor can efficiently repress transcription at all the interoperator distances tested. The degree of repression depends very little on the spacing between the operators; however, a weak periodic dependency of 8-11 bp may exist. PMID:1437558

  13. Standardized exchange of clinical documents--towards a shared care paradigm in glaucoma treatment.

    PubMed

    Gerdsen, F; Müller, S; Jablonski, S; Prokosch, H-U

    2006-01-01

    The exchange of medical data from research and clinical routine across institutional borders is essential to establish an integrated healthcare platform. In this project we want to realize the standardized exchange of medical data between different healthcare institutions to implement an integrated and interoperable information system supporting clinical treatment and research of glaucoma. The central point of our concept is a standardized communication model based on the Clinical Document Architecture (CDA). Furthermore, a communication concept between different healthcare institutions applying the developed document model has been defined. With our project we have been able to show that standardized communication between an Electronic Medical Record (EMR), an Electronic Health Record (EHR) and the Erlanger Glaucoma Register (EGR), based on the established conceptual models relying on CDA rel. 1 level 1 and SCIPHOX, could be implemented. The HL7-tool-based deduction of a suitable CDA rel. 2 compliant schema showed significant differences when compared with the manually created schema. Finally, fundamental requirements that have to be implemented for an integrated healthcare platform have been identified. An interoperable information system can enhance both clinical treatment and research projects. By automatically transferring screening findings from a glaucoma research project to the electronic medical record of our ophthalmology clinic, clinicians could benefit from the availability of a longitudinal patient record. The CDA, as a standard for exchanging clinical documents, has demonstrated its potential to enhance interoperability within a future shared care paradigm.

  14. Standards to support information systems integration in anatomic pathology.

    PubMed

    Daniel, Christel; García Rojo, Marcial; Bourquard, Karima; Henin, Dominique; Schrader, Thomas; Della Mea, Vincenzo; Gilbertson, John; Beckwith, Bruce A

    2009-11-01

    Integrating anatomic pathology information (text and images) into electronic health care records is a key challenge for enhancing clinical information exchange between anatomic pathologists and clinicians. The aim of the Integrating the Healthcare Enterprise (IHE) international initiative is precisely to ensure interoperability of clinical information systems by using existing widespread industry standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level Seven (HL7). The objective was to define standards-based informatics transactions to integrate anatomic pathology information into the Healthcare Enterprise. We used the methodology of the IHE initiative. Working groups from IHE, HL7, and DICOM, with special interest in anatomic pathology, defined consensual technical solutions to provide end-users with improved access to consistent information across multiple information systems. The IHE anatomic pathology technical framework describes a first integration profile, "Anatomic Pathology Workflow," dedicated to the diagnostic process, including basic image acquisition and reporting solutions. This integration profile relies on 10 transactions based on HL7 or DICOM standards. A common specimen model was defined to consistently identify and describe specimens in both HL7 and DICOM transactions. The IHE anatomic pathology working group has defined standards-based informatics transactions to support the basic diagnostic workflow in anatomic pathology laboratories. In further stages, the technical framework will be completed to manage whole-slide images and semantically rich structured reports in the diagnostic workflow, and to integrate systems used for patient care with those used for research activities (such as tissue bank databases or tissue microarrayers).

  15. Prototype Interoperability Document between NASA-JSC and DLR-GSOC Describing the CCSDS SM and C Mission Operations Prototype

    NASA Technical Reports Server (NTRS)

    Lucord, Steve A.; Gully, Sylvain

    2009-01-01

    The purpose of the PROTOTYPE INTEROPERABILITY DOCUMENT is to document the design and interfaces for the service providers and consumers of a Mission Operations prototype between JSC-OTF and DLR-GSOC. The primary goal is to test the interoperability sections of the CCSDS Spacecraft Monitor & Control (SM&C) Mission Operations (MO) specifications between both control centers. An additional goal is to provide feedback to the Spacecraft Monitor and Control (SM&C) working group through the Review Item Disposition (RID) process. This Prototype is considered a proof of concept and should increase the knowledge base of the CCSDS SM&C Mission Operations standards. No operational capabilities will be provided. The CCSDS Mission Operations (MO) initiative was previously called Spacecraft Monitor and Control (SM&C). The specifications have been renamed to better reflect the scope and overall objectives. The working group retains the name Spacecraft Monitor and Control working group and is under the Mission Operations and Information Services Area (MOIMS) of CCSDS. This document will refer to the specifications as SM&C Mission Operations, Mission Operations or just MO.

  16. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    PubMed

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, in this paper a unified methodology and the supporting framework are introduced, bringing together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, where each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    PubMed

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

    Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  18. Device interoperability and authentication for telemedical appliance based on the ISO/IEEE 11073 Personal Health Device (PHD) Standards.

    PubMed

    Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G

    2012-01-01

    In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard called the ISO/IEEE 11073 Personal Health Device (X73-PHD) Standards addresses the device interoperability problem by defining common protocols for the agent (medical device) and manager (appliance) interface. The X73-PHD standard, however, has not addressed security and authentication of medical devices, which is important in establishing the integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy enables device authentication using asymmetric-key cryptography with the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirements. For standard 2048-bit RSA, the ESM calculates digital signatures in only 12% of the time taken by the direct implementation. Moreover, analysis shows that the ESM offers additional security advantages, such as secure storage of keys, compared to the direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software that implements the X73-PHD standard. Lastly, a security analysis was done; the system's response to common attacks on authentication systems was analyzed, and several measures were implemented to protect the system against them.
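
    A minimal sketch of the signature scheme the abstract describes (2048-bit RSA for device authentication) follows, using the Python cryptography package. The message content and key handling are illustrative assumptions; a real X73-PHD deployment would bind keys to device identities and, per the paper's finding, prefer an embedded security module for key storage:

    ```python
    # Minimal sketch of RSA-2048 digital signatures for device authentication,
    # using the `cryptography` package. Message content and key handling are
    # illustrative only; X73-PHD device binding is out of scope here.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Device-side: generate (or load) a 2048-bit key pair.
    device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Sign an association request so the manager can authenticate the agent.
    message = b"X73-PHD association request: device-id=GLUCOSE-METER-01"
    signature = device_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

    # Manager-side: verify with the device's public key.
    # verify() raises cryptography.exceptions.InvalidSignature on failure.
    device_key.public_key().verify(
        signature, message, padding.PKCS1v15(), hashes.SHA256()
    )
    print("signature verified")
    ```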

  19. LVC Architecture Roadmap Implementation - Results of the First Two Years

    DTIC Science & Technology

    2012-03-01

    NOTES Presented at the Simulation Interoperability Standards Organization's (SISO) Spring Simulation Interoperability Workshop (SIW), 26-30 March...presented at the semi-annual Simulation Interoperability Workshops (SIWs) and the annual Interservice/Industry Training, Simulation & Education Conference...I/ITSEC), as well as other venues. For example, a full-day workshop on the initial progress of the effort was conducted at the 2010 Spring SIW [2

  20. Towards a cross-domain interoperable framework for natural hazards and disaster risk reduction information

    NASA Astrophysics Data System (ADS)

    Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela

    2014-05-01

    The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the lack of accessibility, comparability, quality, organisation and dissemination of natural hazards spatial data. In order to mitigate these limitations, an interoperable framework has been developed in the context of the legally binding Implementing Rules of the EU INSPIRE Directive [1], aiming at the establishment of the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural risk zones - Technical Guidelines (DS) document [2], finalized and published on 10 December 2013. This framework provides means for facilitating access, integration, harmonisation and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. Firstly, the paper demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of interoperability for the various natural hazards communities. Secondly, the paper "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. Further in this paper, the five main aspects of the interoperable framework are presented. First, the issue of a common terminology for the natural hazards domain is addressed. A common data model to facilitate cross-domain data integration follows second. Thirdly, the common methodology developed to provide qualitative or quantitative assessments of natural hazards is presented. Fourthly, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, is shown. Finally, the applicability of the interoperable framework for the various stakeholder groups is also presented. The paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework, and missing aspects such as multi-hazard and multi-risk. [1] INSPIRE - Infrastructure for spatial information in Europe, http://inspire.ec.europa.eu [2] http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf

  1. Grid Application Meta-Repository System: Repository Interconnectivity and Cross-domain Application Usage in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen

    Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture, which allows any Grid Application Repository (GAR) to be connected to the system independently of its underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services and other OAI (Open Archive Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VMs), and therefore they can be run in their native environments and easily deployed on virtualized infrastructures, allowing interoperability with new-generation technologies such as cloud computing, application-on-demand, automatic service/application deployment and automatic VM generation.
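
    The OAI compliance mentioned above means GAMRS contents can be pulled with ordinary OAI-PMH harvesting, which is a simple HTTP/XML protocol. The sketch below issues a standard ListRecords request; the repository URL is a placeholder, while the verb, parameters, and namespaces come from the OAI-PMH 2.0 specification:

    ```python
    # Sketch of harvesting records from an OAI-PMH 2.0 repository endpoint.
    # The endpoint URL is a placeholder; verbs and namespaces follow the spec.
    import xml.etree.ElementTree as ET
    import requests

    ENDPOINT = "https://repository.example.org/oai"  # placeholder
    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    resp = requests.get(
        ENDPOINT,
        params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
        timeout=30,
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.content)

    # Each record carries a Dublin Core description of one Grid application.
    for record in root.iter(f"{OAI}record"):
        identifier = record.find(f"{OAI}header/{OAI}identifier")
        title = record.find(f".//{DC}title")
        print(identifier.text if identifier is not None else "?",
              "-", title.text if title is not None else "(untitled)")
    ```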

  2. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    NASA Technical Reports Server (NTRS)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

    The purpose of this paper is to describe the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open, interoperable systems software development and software reuse. It addresses what is meant by the term "object component software", gives an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerates the benefits of this approach, and gives examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.

  3. A VO-Driven Astronomical Data Grid in China

    NASA Astrophysics Data System (ADS)

    Cui, C.; He, B.; Yang, Y.; Zhao, Y.

    2010-12-01

    With the implementation of many ambitious observation projects, including LAMOST, FAST, and the Antarctic observatory at Dome A, observational astronomy in China is stepping into a brand new era with an emerging data avalanche. In the era of e-Science, both these cutting-edge projects and traditional astronomy research need much more powerful data management, sharing and interoperability. Based on the data-grid concept and taking advantage of the IVOA interoperability technologies, China-VO is developing a VO-driven astronomical data grid environment to enable multi-wavelength science and large database science. In the paper, the latest progress and data flow of LAMOST, the architecture of the data grid, and its support for the VO are discussed.

  4. Open Architecture SDR for Space

    NASA Technical Reports Server (NTRS)

    Smith, Carl; Long, Chris; Liebetreu, John; Reinhart, Richard C.

    2005-01-01

    This paper describes an open-architecture SDR (software defined radio) infrastructure that is suitable for space-based operations (Space-SDR). SDR technologies will endow space and planetary exploration systems with dramatically increased capability, reduced power consumption, and significantly less mass than conventional systems, at costs reduced by vigorous competition, hardware commonality, dense integration, reduced obsolescence, interoperability, and software re-use. Significant progress has been recorded on developments like the Joint Tactical Radio System (JTRS) Software Communication Architecture (SCA), which is oriented toward reconfigurable radios for defense forces operating in multiple theaters of engagement. The JTRS-SCA presents a consistent software interface for waveform development, and facilitates interoperability, waveform portability, software re-use, and technology evolution.

  5. Operational Plan Ontology Model for Interconnection and Interoperability

    NASA Astrophysics Data System (ADS)

    Long, F.; Sun, Y. K.; Shi, H. Q.

    2017-03-01

    Aiming at the bottleneck assistant decision-making systems face in processing operational plan data and information, this paper starts from an analysis of the problems of traditional representations and the technical advantages of ontologies. It then defines the elements of the operational plan ontology model and determines the basis for its construction, and builds a semi-knowledge-level operational plan ontology model. Finally, it examines the expression of operational plans based on the ontology model and its use in application software. The work thus has theoretical significance and application value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.

  6. .NET INTEROPERABILITY GUIDELINES

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modelling components (PMCs) developed by third parties to be used in any process modelling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compo...

  7. Standards-based sensor interoperability and networking SensorWeb: an overview

    NASA Astrophysics Data System (ADS)

    Bolling, Sam

    2012-06-01

    The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard for users to discover, access, observe, subscribe to and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial-standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off the Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.

  8. Querying clinical data in HL7 RIM based relational model with morph-RDB.

    PubMed

    Priyatna, Freddy; Alonso-Calvo, Raul; Paraiso-Medina, Sergio; Corcho, Oscar

    2017-10-05

    Semantic interoperability is essential when carrying out post-genomic clinical trials where several institutions collaborate, since researchers and developers need to have an integrated view and access to heterogeneous data sources. One possible approach to accommodate this need is to use RDB2RDF systems that provide RDF datasets as the unified view. These RDF datasets may be materialized and stored in a triple store, or transformed into RDF in real time, as virtual RDF data sources. Our previous efforts involved materialized RDF datasets, hence losing data freshness. In this paper we present a solution that uses an ontology based on the HL7 v3 Reference Information Model and a set of R2RML mappings that relate this ontology to an underlying relational database implementation, and where morph-RDB is used to expose a virtual, non-materialized SPARQL endpoint over the data. By applying a set of optimization techniques on the SPARQL-to-SQL query translation algorithm, we can now issue SPARQL queries to the underlying relational data with generally acceptable performance.
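
    From the client's point of view, the system described above is consumed simply as a SPARQL endpoint; morph-RDB rewrites each query into SQL over the relational store. As a self-contained stand-in (the tiny RIM-flavored vocabulary below is invented for the example, not the paper's actual ontology), rdflib can evaluate the same kind of query over an in-memory graph:

    ```python
    # Stand-in sketch: evaluating a SPARQL query with rdflib over a tiny
    # in-memory graph. The vocabulary is an invented, RIM-flavored example;
    # against morph-RDB the identical query would go to a SPARQL endpoint
    # and be rewritten into SQL over the relational database.
    from rdflib import Graph

    TTL = """
    @prefix ex: <http://example.org/rim#> .
    ex:obs1 a ex:Observation ;
        ex:code "8480-6" ;          # systolic blood pressure (LOINC-style)
        ex:value 128 ;
        ex:subject ex:patient42 .
    """

    g = Graph()
    g.parse(data=TTL, format="turtle")

    QUERY = """
    PREFIX ex: <http://example.org/rim#>
    SELECT ?patient ?value WHERE {
        ?obs a ex:Observation ;
             ex:code "8480-6" ;
             ex:value ?value ;
             ex:subject ?patient .
    }
    """
    for patient, value in g.query(QUERY):
        print(patient, value)
    ```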

  9. Study on resources and environmental data integration towards data warehouse construction covering trans-boundary area of China, Russia and Mongolia

    NASA Astrophysics Data System (ADS)

    Wang, J.; Song, J.; Gao, M.; Zhu, L.

    2014-02-01

    The trans-boundary area between northern China, Mongolia and eastern Siberia in Russia is a continuous geographical area located in northeastern Asia. Many common issues in this region need to be addressed based on a uniform resources and environmental data warehouse. Based on the practice of a joint scientific expedition, the paper presents a data integration solution comprising three steps: drawing up data collection standards and specifications, data reorganization and processing, and data warehouse design and development. A series of data collection standards and specifications covering more than 10 domains was drawn up first. According to the uniform standard, 20 regional-scale resources and environmental survey databases and 11 in-situ observation databases were reorganized and integrated. The North East Asia Resources and Environmental Data Warehouse was designed with four layers: a resources layer, a core business logic layer, an internet interoperation layer, and a web portal layer. A data warehouse prototype was developed and initially deployed. All the integrated data for this area can be accessed online.

  10. Using Clinical Data Standards to Measure Quality: A New Approach.

    PubMed

    D'Amore, John D; Li, Chun; McCrary, Laura; Niloff, Jonathan M; Sittig, Dean F; McCoy, Allison B; Wright, Adam

    2018-04-01

    Value-based payment for care requires the consistent, objective calculation of care quality. Previous initiatives to calculate ambulatory quality measures have relied on billing data or individual electronic health records (EHRs) to calculate and report performance. New methods for quality measure calculation promoted by federal regulations allow qualified clinical data registries to report quality outcomes based on data aggregated across facilities and EHRs using interoperability standards. This research evaluates the use of clinical document interchange standards as the basis for quality measurement. Using data on 1,100 patients from 11 ambulatory care facilities and 5 different EHRs, challenges to quality measurement are identified and addressed for 17 certified quality measures. Iterative solutions were identified for 14 measures that improved patient inclusion and measure calculation accuracy. Findings validate this approach to improving measure accuracy while maintaining measure certification. Organizations that report care quality should be aware of how identified issues affect quality measure selection and calculation. Quality measure authors should consider increasing real-world validation and the consistency of measure logic in respect to issues identified in this research. Schattauer GmbH Stuttgart.
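
    The core computation behind such measures is a numerator over a denominator evaluated against aggregated patient records. As a toy sketch only (the measure definition, field names, and threshold below are invented for illustration; real certified eCQMs are specified in far richer logic):

    ```python
    # Toy sketch of a quality-measure calculation over records aggregated from
    # several EHRs. The measure definition, field names, and threshold are
    # invented for illustration; real eCQMs use much richer specifications.
    from dataclasses import dataclass

    @dataclass
    class PatientRecord:
        patient_id: str
        age: int
        has_diabetes: bool
        last_hba1c: float | None  # most recent lab value, None if never measured

    def hba1c_control_measure(records: list[PatientRecord]) -> float:
        """Share of diabetic adults whose most recent HbA1c is under 9.0%."""
        denominator = [r for r in records if r.has_diabetes and 18 <= r.age <= 75]
        numerator = [r for r in denominator
                     if r.last_hba1c is not None and r.last_hba1c < 9.0]
        return len(numerator) / len(denominator) if denominator else 0.0

    records = [
        PatientRecord("a1", 54, True, 7.2),
        PatientRecord("b2", 61, True, None),   # never measured: fails numerator
        PatientRecord("c3", 42, False, 5.6),   # not in denominator
    ]
    print(f"measure performance: {hba1c_control_measure(records):.0%}")
    ```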

  11. Context Aware Middleware Architectures: Survey and Challenges

    PubMed Central

    Li, Xin; Eckert, Martina; Martinez, José-Fernán; Rubio, Gregorio

    2015-01-01

    Context aware applications, which can adapt their behaviors to changing environments, are attracting more and more attention. To reduce the complexity of developing such applications, context aware middleware, which introduces context awareness into traditional middleware, is highlighted to provide a homogeneous interface involving generic context management solutions. This paper provides a survey of state-of-the-art context aware middleware architectures proposed during the period from 2009 through 2015. First, a preliminary background, such as the principles of context, context awareness, context modelling, and context reasoning, is provided for a comprehensive understanding of context aware middleware. On this basis, an overview of eleven carefully selected middleware architectures is presented and their main features explained. Then, thorough comparisons and analysis of the presented middleware architectures are performed based on technical parameters including architectural style, context abstraction, context reasoning, scalability, fault tolerance, interoperability, service discovery, storage, security & privacy, context awareness level, and cloud-based big data analytics. The analysis shows that there is actually no context aware middleware architecture that complies with all requirements. Finally, challenges are pointed out as open issues for future work. PMID:26307988

  12. A Community-Based Approach to Leading the Nation in Smart Energy Use

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2013-12-31

    Project Objectives: The AEP Ohio gridSMART® Demonstration Project (Project) achieved the following objectives: • Built a secure, interoperable, and integrated smart grid infrastructure in northeast central Ohio that demonstrated the ability to maximize distribution system efficiency and reliability and consumer use of demand response programs that reduced energy consumption, peak demand, and fossil fuel emissions. • Actively attracted, educated, enlisted, and retained consumers in innovative business models that provided tools and information reducing consumption and peak demand. • Provided the U.S. Department of Energy (DOE) information to evaluate technologies and preferred smart grid business models to be extended nationally. Project Description: Ohio Power Company (the surviving company of a merger with Columbus Southern Power Company), doing business as AEP Ohio (AEP Ohio), took a community-based approach and incorporated a full suite of advanced smart grid technologies for 110,000 consumers in an area selected for its concentration and diversity of distribution infrastructure and consumers. It was organized and aligned around: • Technology, implementation, and operations • Consumer and stakeholder acceptance • Data management and benefit assessment Combined, these functional areas served as the foundation of the Project to integrate commercially available products, innovative technologies, and new consumer products and services within a secure two-way communication network between the utility and consumers. The Project included Advanced Metering Infrastructure (AMI), Distribution Management System (DMS), Distribution Automation Circuit Reconfiguration (DACR), Volt VAR Optimization (VVO), and Consumer Programs (CP). These technologies were combined with two-way consumer communication and information sharing, demand response, dynamic pricing, and consumer products, such as plug-in electric vehicles and smart appliances. In addition, the Project incorporated comprehensive cyber security capabilities, interoperability, and a data assessment that, with grid simulation capabilities, made the demonstration results an adaptable, integrated solution for AEP Ohio and the nation.

  13. A comparison of data interoperability approaches of fusion codes with application to synthetic diagnostics

    NASA Astrophysics Data System (ADS)

    Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.

    2010-11-01

    As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability in a way that is complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used by multiple codes.
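
    The attribute-based idea can be pictured with HDF5: the file advertises, via attributes, which standards its contents conform to, and readers dispatch accordingly. In this hypothetical sketch (the attribute names and standard labels are invented, not the actual SWIM or ITM conventions):

    ```python
    # Hypothetical sketch of attribute-based data interoperability in HDF5.
    # Attribute names and standard labels are invented; the point is that one
    # file can declare conformance to several data standards at once.
    import h5py
    import numpy as np

    with h5py.File("profiles.h5", "w") as f:
        # Declare every standard this file claims to satisfy.
        f.attrs["conforms_to"] = np.array([b"plasma-state-like", b"cpo-like"])
        te = f.create_dataset("electron_temperature",
                              data=np.linspace(3.2, 0.1, 64))
        te.attrs["units"] = "keV"
        te.attrs["coordinate"] = "normalized_toroidal_flux"

    with h5py.File("profiles.h5", "r") as f:
        standards = [s.decode() for s in f.attrs["conforms_to"]]
        print("file is readable as:", standards)
        print("Te units:", f["electron_temperature"].attrs["units"])
    ```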

  14. An Interoperable Electronic Medical Record-Based Platform for Personalized Predictive Analytics

    ERIC Educational Resources Information Center

    Abedtash, Hamed

    2017-01-01

    Precision medicine refers to the delivering of customized treatment to patients based on their individual characteristics, and aims to reduce adverse events, improve diagnostic methods, and enhance the efficacy of therapies. Among efforts to achieve the goals of precision medicine, researchers have used observational data for developing predictive…

  15. Development of Health Information Search Engine Based on Metadata and Ontology

    PubMed Central

    Song, Tae-Min; Jin, Dal-Lae

    2014-01-01

    Objectives: The aim of the study was to develop a metadata- and ontology-based health information search engine ensuring semantic interoperability to collect and provide health information using different application programs. Methods: Health information metadata ontology was developed using a distributed semantic Web content publishing model based on vocabularies used to index the contents generated by the information producers as well as those used to search the contents by the users. The vocabulary for the health information ontology was mapped to the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), and a list of about 1,500 terms was proposed. The metadata schema used in this study was developed by adding an element describing the target audience to the Dublin Core Metadata Element Set. Results: A metadata schema and an ontology ensuring interoperability of health information available on the internet were developed. The metadata- and ontology-based health information search engine developed in this study produced better search results compared to existing search engines. Conclusions: A health information search engine based on metadata and ontology will provide reliable health information to both information producers and information consumers. PMID:24872907

  16. Development of health information search engine based on metadata and ontology.

    PubMed

    Song, Tae-Min; Park, Hyeoun-Ae; Jin, Dal-Lae

    2014-04-01

    The aim of the study was to develop a metadata- and ontology-based health information search engine ensuring semantic interoperability to collect and provide health information using different application programs. Health information metadata ontology was developed using a distributed semantic Web content publishing model based on vocabularies used to index the contents generated by the information producers as well as those used to search the contents by the users. The vocabulary for the health information ontology was mapped to the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), and a list of about 1,500 terms was proposed. The metadata schema used in this study was developed by adding an element describing the target audience to the Dublin Core Metadata Element Set. A metadata schema and an ontology ensuring interoperability of health information available on the internet were developed. The metadata- and ontology-based health information search engine developed in this study produced better search results compared to existing search engines. A health information search engine based on metadata and ontology will provide reliable health information to both information producers and information consumers.
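
    The schema change these two records describe (Dublin Core plus a target-audience element) is easy to picture as a record. The sketch below serializes one such record as XML; the element choice follows the Dublin Core Element Set, while the audience vocabulary and field values are invented illustrations of the paper's approach rather than its exact schema:

    ```python
    # Illustrative health-information metadata record: Dublin Core elements
    # plus an added target-audience element, as the study describes. Field
    # values and the audience vocabulary are invented for the example.
    import xml.etree.ElementTree as ET

    DC = "http://purl.org/dc/elements/1.1/"
    ET.register_namespace("dc", DC)

    record = ET.Element("record")
    for name, value in [
        ("title", "Managing type 2 diabetes through diet"),
        ("creator", "Example Health Agency"),
        ("subject", "diabetes mellitus type 2"),  # term mapped to SNOMED CT
        ("language", "en"),
    ]:
        ET.SubElement(record, f"{{{DC}}}{name}").text = value

    # The element added on top of the Dublin Core set: who the content is for.
    audience = ET.SubElement(record, "audience")
    audience.text = "patient"  # e.g. patient | caregiver | clinician

    print(ET.tostring(record, encoding="unicode"))
    ```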

  17. OntoCR: A CEN/ISO-13606 clinical repository based on ontologies.

    PubMed

    Lozano-Rubí, Raimundo; Muñoz Carrero, Adolfo; Serrano Balazote, Pablo; Pastor, Xavier

    2016-04-01

    To design a new semantically interoperable clinical repository, based on ontologies, conforming to the CEN/ISO 13606 standard. The approach followed is to extend OntoCRF, a framework for the development of clinical repositories based on ontologies. The meta-model of OntoCRF has been extended by incorporating an OWL model integrating the CEN/ISO 13606, ISO 21090 and SNOMED CT structures. This approach has demonstrated a complete evaluation cycle involving the creation of the meta-model in OWL format, the creation of a simple test application, and the communication of standardized extracts to another organization. Using a CEN/ISO 13606 based system, an indefinite number of archetypes can be merged (and reused) to build new applications. Our approach, based on the use of ontologies, keeps data storage independent of content specification. With this approach, relational technology can be used for storage, maintaining extensibility capabilities. The present work demonstrates that it is possible to build a native CEN/ISO 13606 repository for the storage of clinical data. We have demonstrated semantic interoperability of clinical information using CEN/ISO 13606 extracts. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under research within the planetary geospatial community.
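
    Several of the OGC services listed above are directly scriptable. As a sketch, the OWSLib package can interrogate a Web Map Service; the endpoint URL and layer name below are placeholders standing in for any planetary WMS, not a specific institution's server:

    ```python
    # Sketch of querying an OGC Web Map Service with OWSLib. The endpoint URL
    # and layer name are placeholders; any WMS 1.3.0 server (terrestrial or
    # planetary) responds to the same GetCapabilities/GetMap operations.
    from owslib.wms import WebMapService

    wms = WebMapService("https://maps.example.org/planetary/wms", version="1.3.0")

    # Discover the mosaics and DEMs the server advertises.
    for name, layer in wms.contents.items():
        print(name, "-", layer.title)

    # Request a small PNG excerpt of one advertised layer (name assumed).
    img = wms.getmap(
        layers=["mars_mola_shaded_relief"],   # hypothetical layer name
        srs="EPSG:4326",
        bbox=(-10.0, -10.0, 10.0, 10.0),      # lon/lat degrees
        size=(512, 512),
        format="image/png",
    )
    with open("excerpt.png", "wb") as f:
        f.write(img.read())
    ```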

  19. IT Labs Proof-of-Concept Project: Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards

    NASA Technical Reports Server (NTRS)

    Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell

    2014-01-01

    No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increases safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking legacy ways is a challenge.

  20. Interacting with Multi-Robot Systems Using BML

    DTIC Science & Technology

    2013-06-01

    Pullen, U. Schade, J. Simonsen & R. Gomez-Veiga, NATO MSG-048 C-BML Final Report Summary. 2010 Fall Simulation Interoperability Workshop (10F-SIW-039)...NATO MSG-085. 2012 Spring Simulation Interoperability Workshop (12S-SIW-045), Orlando, FL, March 2012. [3] T. Remmersmann, U. Schade, L. Khimeche...B. Grautreau & R. El Abdouni Khayari, Lessons Recognized: How to Combine BML and MSDL. 2012 Spring Simulation Interoperability Workshop (12S-SIW-012

  1. A Linguistic Foundation for Communicating Geo-Information in the context of BML and geoBML

    DTIC Science & Technology

    2010-03-23

    BML Standard. 2009 Spring Simulation Interoperability Workshop (09S-SIW-046). San Diego, CA. Rein, K., Schade, U. & Hieb, M.R. (2009). Battle...Formalizing Battle Management Language: A Grammar for Specifying Orders. 2006 Spring Simulation Interoperability Workshop (06S-SIW-068). Huntsville...Hieb, M.R. (2007). Battle Management Language: A Grammar for Specifying Reports. 2007 Spring Simulation Interoperability Workshop (07S-SIW-036

  2. Semantic and syntactic interoperability in online processing of big Earth observation data.

    PubMed

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability, using a standardised interface that can be used by all types of clients, and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data, because data management, low-level interactions and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example, describe three use cases for extracting changes from EO images, and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
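    To give a feel for the syntactic-interoperability side, the sketch below composes an OGC WPS 1.0.0 Execute request in its key-value encoding; the endpoint, process identifier and inputs are hypothetical, not the system described in the abstract.

      # Illustrative only: endpoint, identifier and inputs are hypothetical.
      from urllib.parse import urlencode

      endpoint = "https://example.org/wps"
      params = {
          "service": "WPS", "version": "1.0.0", "request": "Execute",
          "identifier": "detect_change",  # a world model wrapped as a process
          "DataInputs": "cube=eo_cube_1;start=2017-01;end=2017-12",
      }
      print(endpoint + "?" + urlencode(params))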

  4. Operable Data Management for Ocean Observing Systems

    NASA Astrophysics Data System (ADS)

    Chavez, F. P.; Graybeal, J. B.; Godin, M. A.

    2004-12-01

    As oceanographic observing systems become more numerous and complex, data management solutions must keep pace. Most existing oceanographic data management systems fall into one of three categories: they have been developed as dedicated solutions, with limited application to other observing systems; they expect that data will be pre-processed into well-defined formats, such as netCDF; or they are conceived as robust, generic data management solutions, whose high complexity is matched by low maturity and adoption rates. Each approach has strengths and weaknesses; no approach yet fully addresses, or takes advantage of, the sophistication of ocean observing systems as they are now conceived. In this presentation we describe critical data management requirements for advanced ocean observing systems of the type envisioned by ORION and IOOS. By defining common requirements -- functional, qualitative, and programmatic -- for all such ocean observing systems, the performance and nature of the general data management solution can be characterized. Issues such as scalability, maintaining metadata relationships, data access security, visualization, and operational flexibility suggest baseline architectural characteristics, which may in turn lead to reusable components and approaches. Interoperability with other data management systems, with standards-based solutions for metadata specification and data transport protocols, and with the data management infrastructure envisioned by IOOS and ORION can also be used to define necessary capabilities. Finally, some requirements for the software infrastructure of ocean observing systems can be inferred. Early operational results and lessons learned from the development and operation of MBARI ocean observing systems are used to illustrate key requirements, choices, and challenges. Reference systems include the Monterey Ocean Observing System (MOOS), its component software systems (Software Infrastructure and Applications for MOOS, and the Shore Side Data System), and the Autonomous Ocean Sampling Network (AOSN).

  5. Improved operator agreement and efficiency using the minimum area contour change method for delineation of hyperintense multiple sclerosis lesions on FLAIR MRI

    PubMed Central

    2013-01-01

    Background Activity of disease in patients with multiple sclerosis (MS) is monitored by detecting and delineating hyper-intense lesions on MRI scans. The Minimum Area Contour Change (MACC) algorithm was created with two main goals: a) to improve inter-operator agreement on outlining regions of interest (ROIs) and b) to automatically propagate longitudinal ROIs from the baseline scan to a follow-up scan. Methods The MACC algorithm first identifies an outer bound for the solution path, forms a large number of iso-contour curves based on equally spaced contour values, and then selects the best contour value to outline the lesion. The MACC software was tested on a set of 17 T2-weighted Fluid-Attenuated Inversion Recovery (FLAIR) MRI images evaluated by a pair of human experts and on a longitudinal dataset of 12 pairs of FLAIR images with lesion analysis ROIs drawn by a single expert operator. Results In the tests where two human experts evaluated the same MRI images, the MACC program markedly reduced inter-operator outline error. In the longitudinal part of the study, the MACC program created ROIs on follow-up scans that were in close agreement with the original expert's ROIs. Finally, in a post-hoc analysis of 424 follow-up scans, 91% of propagated MACC ROIs were accepted by an expert, and only 9% of the final accepted ROIs had to be created or edited by the expert. Conclusion When used with an expert operator's verification of automatically created ROIs, MACC can improve inter-operator agreement and decrease analysis time, which should improve the data collected and analyzed in multicenter clinical trials. PMID:24004511
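    The contour-selection step lends itself to a compact illustration. Below is a hedged sketch of the minimum-area-contour-change idea (not the published MACC implementation): sample equally spaced intensity levels, approximate the area enclosed at each level by a pixel count, and pick the level where the area changes least between neighbouring levels.

      # Illustrative only: a simplified stand-in for the MACC selection step.
      import numpy as np

      def select_contour_level(patch, n_levels=50):
          """patch: 2-D intensity array around a candidate lesion."""
          levels = np.linspace(patch.min(), patch.max(), n_levels + 2)[1:-1]
          areas = np.array([(patch >= lv).sum() for lv in levels], dtype=float)
          change = np.abs(np.diff(areas))           # area change between adjacent levels
          return float(levels[np.argmin(change)])   # most stable contour value

      # Usage on a synthetic blob:
      yy, xx = np.mgrid[-32:32, -32:32]
      blob = np.exp(-(xx**2 + yy**2) / 200.0)
      print(select_contour_level(blob))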

  6. Distributed Energy Systems Integration and Demand Optimization for Autonomous Operations and Electric Grid Transactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghatikar, Girish; Mashayekh, Salman; Stadler, Michael

    Distributed power systems in the U.S. and globally are evolving to provide reliable and clean energy to consumers. In California, existing regulations require significant increases in renewable generation, as well as identification of customer-side distributed energy resources (DER) controls, communication technologies, and standards for interconnection with the electric grid systems. As DER deployment expands, customer-side DER control and optimization will be critical for system flexibility and demand response (DR) participation, which improves the economic viability of DER systems. Current DER systems integration and communication challenges include leveraging the existing DER and DR technology and systems infrastructure, and enabling optimized cost, energy and carbon choices for customers to deploy interoperable grid transactions and renewable energy systems at scale. Our paper presents a cost-effective solution to these challenges by exploring communication technologies and information models for DER system integration and interoperability. This system uses open standards and optimization models for resource planning based on dynamic-pricing notifications and autonomous operations within various domains of the smart grid energy system. It identifies architectures and customer engagement strategies in dynamic DR pricing transactions to generate feedback information models for load flexibility, load profiles, and participation schedules. The models are tested at a real site in California, Fort Hunter Liggett (FHL). Furthermore, our results for FHL show that the model fits within the existing and new DR business models and networked systems for transactive energy concepts. Integrated energy systems, communication networks, and modeling tools that coordinate supply-side networks and DER will enable electric grid system operators to use DER for grid transactions in an integrated system.
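    As a hedged sketch of the kind of price-responsive optimization described above (not the paper's actual model), the snippet below schedules grid imports against 24 hourly prices with a simple battery flexibility band, using scipy's linear programming; all numbers are synthetic.

      # Illustrative only: synthetic prices, load and battery limits.
      import numpy as np
      from scipy.optimize import linprog

      prices = 0.10 + 0.08 * np.sin(np.linspace(0, 2 * np.pi, 24))  # $/kWh
      load = np.full(24, 5.0)                                       # kW site load
      # Grid import g_t may deviate from load by at most +/- 3 kW (battery),
      # and the battery must end the day balanced: sum(g) == sum(load).
      res = linprog(c=prices,
                    bounds=[(l - 3.0, l + 3.0) for l in load],
                    A_eq=np.ones((1, 24)), b_eq=[load.sum()],
                    method="highs")
      print("hourly grid import (kW):", np.round(res.x, 2))

    The optimizer shifts imports toward cheap hours within the flexibility band, which is the basic mechanism behind the DR participation the abstract describes.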

  7. An open and reconfigurable wireless sensor network for pervasive health monitoring.

    PubMed

    Triantafyllidis, A; Koutkias, V; Chouvarda, I; Maglaveras, N

    2008-01-01

    Sensor networks constitute the backbone for the construction of personalized monitoring systems. Up to now, several sensor networks have been proposed for diverse pervasive healthcare applications, which are however characterized by a significant lack of open architectures, resulting in closed, non-interoperable and difficult-to-extend solutions. In this context, we propose an open and reconfigurable wireless sensor network (WSN) for pervasive health monitoring, with particular emphasis on its easy extension with additional sensors and functionality by incorporating embedded intelligence mechanisms. We consider a generic WSN architecture comprised of diverse sensor nodes (with communication and processing capabilities) and a mobile base unit (MBU) operating as the gateway between the sensors and the medical personnel, thus forming a body area network (BAN). The primary focus of this work is on the intra-BAN data communication issues, adopting SensorML as the data representation means, including the encoding of the monitoring patterns and the functionality of the sensor network. In our prototype implementation, two sensor nodes are emulated: one for heart rate monitoring and the other for blood glucose observations, while the MBU corresponds to a personal digital assistant (PDA) device. Java 2 Micro Edition (J2ME) is used to implement both the sensor nodes and the MBU components. Intra-BAN wireless communication relies on the Bluetooth protocol. Via an adaptive user interface on the MBU, health professionals may specify the monitoring parameters of the WSN and define the monitoring patterns of interest in terms of rules. This work constitutes an essential step towards the construction of open, extensible, interoperable and intelligent WSNs for pervasive health monitoring.
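    To illustrate rule encoding of the kind described, here is a minimal sketch that serializes a monitoring rule as XML; the element names are illustrative stand-ins, not the normative SensorML schema.

      # Illustrative only: element names are invented, not SensorML.
      import xml.etree.ElementTree as ET

      rule = ET.Element("MonitoringRule", id="hr-alert")
      cond = ET.SubElement(rule, "Condition")
      ET.SubElement(cond, "Parameter").text = "heart_rate"
      ET.SubElement(cond, "Operator").text = "greaterThan"
      ET.SubElement(cond, "Threshold", unit="bpm").text = "120"
      ET.SubElement(rule, "Action").text = "notify_medical_personnel"
      print(ET.tostring(rule, encoding="unicode"))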

  8. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to, and easily usable by, the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.

  9. Publication, discovery and interoperability of Clinical Decision Support Systems: A Linked Data approach.

    PubMed

    Marco-Ruiz, Luis; Pedrinaci, Carlos; Maldonado, J A; Panziera, Luca; Chen, Rong; Bellika, J Gustav

    2016-08-01

    The high costs involved in the development of Clinical Decision Support Systems (CDSS) make it necessary to share their functionality across different systems and organizations. Service Oriented Architectures (SOA) have been proposed to allow reusing CDSS by encapsulating them in a Web service. However, strong barriers to sharing CDS functionality are still present as a consequence of the lack of expressiveness of services' interfaces. Linked Services are the evolution of the Semantic Web Services paradigm to process Linked Data. They aim to provide semantic descriptions over SOA implementations to overcome the limitations derived from the syntactic nature of Web services technologies. To facilitate the publication, discovery and interoperability of CDS services by evolving them into Linked Services that expose their interfaces as Linked Data. We developed methods and models to enhance CDS SOA as Linked Services that define a rich semantic layer based on machine-interpretable ontologies that power their interoperability and reuse. These ontologies provide unambiguous descriptions of CDS service properties to expose them to the Web of Data. We developed models compliant with Linked Data principles to create a semantic representation of the components that compose CDS services. To evaluate our approach, we implemented a set of CDS Linked Services using a Web service definition ontology. The definitions of Web services were linked to the models developed in order to attach unambiguous semantics to the service components. All models were bound to SNOMED CT and public ontologies (e.g. Dublin Core) in order to provide a lingua franca for exploring them. Discovery and analysis of CDS services based on machine-interpretable models were performed by reasoning over the ontologies built. Linked Services can be used effectively to expose CDS services to the Web of Data by building on current CDS standards. This allows building shared Linked Knowledge Bases that provide machine-interpretable semantics for CDS service descriptions, alleviating the challenges of interoperability and reuse. Linked Services allow for building 'digital libraries' of distributed CDS services that can be hosted and maintained in different organizations. Copyright © 2016 Elsevier Inc. All rights reserved.
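    As a hedged sketch of such a service description (assuming rdflib; the service IRI and vocabulary namespace are hypothetical), the snippet below publishes a CDS service as Linked Data bound to Dublin Core and a SNOMED CT concept.

      # Illustrative only: service IRI and vocabulary are hypothetical.
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import DCTERMS, RDF

      SCT = Namespace("http://snomed.info/id/")
      SVC = Namespace("http://example.org/cds-vocab#")
      service = URIRef("http://example.org/cds/warfarin-dosing")

      g = Graph()
      g.add((service, RDF.type, SVC.ClinicalDecisionSupportService))
      g.add((service, DCTERMS.title, Literal("Warfarin dosing advisor")))
      g.add((service, DCTERMS.subject, SCT["372756006"]))  # 372756006 |Warfarin|
      print(g.serialize(format="turtle"))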

  10. Current Trends in Wireless Mesh Sensor Networks: A Review of Competing Approaches

    PubMed Central

    Rodenas-Herraiz, David; Garcia-Sanchez, Antonio-Javier; Garcia-Sanchez, Felipe; Garcia-Haro, Joan

    2013-01-01

    Finding a complete mesh-based solution for low-rate wireless personal area networks (LR-WPANs) is still an open issue. To cope with this concern, different competing approaches have emerged in the Wireless Mesh Sensor Networks (WMSNs) field in the last few years. They are usually supported by the IEEE 802.15.4 standard, the most commonly adopted LR-WPAN recommendation for point-to-point topologies. In this work, we review the most relevant and up-to-date WMSN solutions that extend the IEEE 802.15.4 standard to multi-hop mesh networks. To conduct this review, we start by identifying the most significant WMSN requirements (i.e., interoperability, robustness, scalability, mobility and energy efficiency), which reveal the benefits and shortcomings of each proposal. Then, we thoroughly re-examine the group of proposals following different design guidelines that are usually considered by end-users and developers. Among all of the approaches reviewed, we highlight the IEEE 802.15.5 standard, a recent recommendation that, in its LR-WPAN version, fully satisfies the greatest number of WMSN requirements. As a result, IEEE 802.15.5 can be an appropriate solution for a wide range of applications, unlike the majority of the remaining solutions reviewed, which are usually designed to solve particular problems, for instance in the home, building and industrial sectors. In this sense, a description of IEEE 802.15.5 is also included, paying special attention to its efficient energy-saving mechanisms. Finally, possible improvements of this recommendation are pointed out in order to offer hints for future research. PMID:23666128

  11. A Virtual Observatory Approach to Planetary Data for Vesta and Ceres

    NASA Astrophysics Data System (ADS)

    Giardino, M.; Fonte, S.; Politi, R.; Ivanovski, S.; Longobardo, A.; Capria, M. T.; Erard, S.; De Sanctis, M. C.

    2018-04-01

    A virtual observatory service for DAWN/VIR spectral dataset is presented, based upon the IVOA standards adapted to the planetary field. Advantages of such an approach will be discussed, especially concerning interoperability and availability.

  12. A new WiMAX-based simulation model to investigate QoS with OPNET Modeler in a scheduling environment

    NASA Astrophysics Data System (ADS)

    Saini, Sanju; Saini, K. K.

    2012-11-01

    WiMAX stands for Worldwide Interoperability for Microwave Access. It is considered a major part of broadband wireless networking and is based on the IEEE 802.16 standard. WiMAX provides innovative fixed as well as mobile platforms for broadband internet access anywhere, anytime, with different transmission modes. This paper presents a WiMAX simulation model designed with OPNET Modeler 14 to measure the delay, load and throughput performance factors. Several scheduling algorithms, such as FIFO, PQ and WFQ, are introduced to compare four types of scheduling service, each with its own QoS needs, using OPNET Modeler's support for WiMAX networks. The results show approximately equal load and throughput, while the delay values vary among the different base stations. The simulation results indicate the correctness and effectiveness of this approach.
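    Since the abstract compares FIFO, PQ and WFQ, a compact sketch of the least obvious of the three may help; this is an illustrative weighted-fair-queuing approximation (not OPNET's implementation), where each packet receives a virtual finish time of max(current virtual time, the flow's last finish time) + size/weight, and packets are served in finish-time order.

      # Illustrative only: flows, weights and packet sizes are made up.
      import heapq

      weights = {"voice": 4.0, "video": 2.0, "data": 1.0}
      last_finish = {f: 0.0 for f in weights}
      queue = []  # (virtual finish time, seq, flow, size)

      def enqueue(flow, size, vtime, seq):
          finish = max(vtime, last_finish[flow]) + size / weights[flow]
          last_finish[flow] = finish
          heapq.heappush(queue, (finish, seq, flow, size))

      for seq, (flow, size) in enumerate([("data", 1500), ("voice", 200),
                                          ("video", 1000), ("voice", 200)]):
          enqueue(flow, size, vtime=0.0, seq=seq)

      while queue:
          finish, _, flow, size = heapq.heappop(queue)
          print(f"serve {flow:5s} {size:4d} B (virtual finish {finish:7.1f})")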

  13. SDI-based business processes: A territorial analysis web information system in Spain

    NASA Astrophysics Data System (ADS)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.

  14. CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability

    NASA Technical Reports Server (NTRS)

    Claus, Russell; Weitzer, Ilan

    2002-01-01

    Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meet these demands is to use 'Geometry Centric' design. In this approach, design engineers team their efforts through one united representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link enabling 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and its proposed application.

  15. Sustainable ubiquitous home health care--architectural considerations and first practical experiences.

    PubMed

    Marschollek, Michael; Wolf, Klaus-H; Bott, Oliver-J; Geisler, Mirko; Plischke, Maik; Ludwig, Wolfram; Hornberger, Andreas; Haux, Reinhold

    2007-01-01

    Despite the abundance of past home care projects and the maturity of the technologies used, there is no widespread dissemination as yet. The absence of accepted standards, and thus of interoperability, together with inadequate integration into transinstitutional health information systems (tHIS), are perceived as key factors. Based on the respective literature and previous experiences in home care projects, we propose an architectural model for home care as part of a transinstitutional health information system, using the HL7 Clinical Document Architecture (CDA) as well as the HL7 Arden Syntax for Medical Logic Systems. In two short case studies we describe the practical realization of the architecture as well as first experiences. Our work can be regarded as a first step towards an interoperable, and in our view sustainable, home care architecture based on a prominent document standard from the health information system domain.

  16. Interoperability between phenotype and anatomy ontologies.

    PubMed

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-12-15

    Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
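    As a hedged sketch of the kind of cross-ontology integration described (assuming rdflib; all IRIs below are illustrative placeholders, not the paper's actual formalization), the snippet links a phenotype class to the anatomical entity it inheres in.

      # Illustrative only: namespaces, classes and property are hypothetical.
      from rdflib import Graph, Namespace
      from rdflib.namespace import OWL, RDF

      PHENO = Namespace("http://example.org/phenotype#")
      ANAT = Namespace("http://example.org/anatomy#")

      g = Graph()
      g.add((PHENO.EnlargedHeart, RDF.type, OWL.Class))
      g.add((ANAT.Heart, RDF.type, OWL.Class))
      g.add((PHENO.inheresIn, RDF.type, OWL.ObjectProperty))
      # Simplified class-level link from the trait to its anatomical bearer
      g.add((PHENO.EnlargedHeart, PHENO.inheresIn, ANAT.Heart))
      print(g.serialize(format="turtle"))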

  17. Data Management challenges in Astronomy and Astroparticle Physics

    NASA Astrophysics Data System (ADS)

    Lamanna, Giovanni

    2015-12-01

    The astronomy and astroparticle physics domains are experiencing a deluge of data, with the next generation of facilities prioritised by the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA and KM3NeT, and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the scientific communities concerned in Europe to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken, also in cooperation with e-infrastructures in Europe.

  18. Deserts in the Deluge: TerraPopulus and Big Human-Environment Data.

    PubMed

    Manson, S M; Kugler, T A; Haynes, D

    2016-01-01

    Terra Populus, or TerraPop, is a cyberinfrastructure project that integrates, preserves, and disseminates massive data collections describing characteristics of the human population and environment over the last six decades. TerraPop has made a number of GIScience advances in the handling of big spatial data to make information interoperable between formats and across scientific communities. In this paper, we describe challenges of these data, or 'deserts in the deluge' of data, that are common to spatial big data more broadly, and explore computational solutions specific to microdata, raster, and vector data models.
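    One recurring raster/vector integration step of the kind TerraPop automates can be sketched compactly: aggregate a raster variable over zone IDs rasterized from vector boundaries. The hedged example below (not TerraPop code; data synthetic) uses a bincount-based zonal mean.

      # Illustrative only: synthetic zones and raster values.
      import numpy as np

      rng = np.random.default_rng(0)
      zones = rng.integers(0, 4, size=(100, 100))   # rasterized unit IDs 0..3
      values = rng.random((100, 100))               # raster variable

      sums = np.bincount(zones.ravel(), weights=values.ravel())
      counts = np.bincount(zones.ravel())
      print("zonal means:", sums / counts)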

  19. Health information technology: strategic initiatives, real progress.

    PubMed

    Kolodner, Robert M; Cohn, Simon P; Friedman, Charles P

    2008-01-01

    We fully agree with Carol Diamond and Clay Shirky that deployment of health information technology (IT) is necessary but not sufficient for transforming U.S. health care. However, the recent work to advance health IT is far from an exercise in "magical thinking." It has been strategic thinking. To illustrate this, we highlight recent initiatives and progress under four focus areas: adoption, governance, privacy and security, and interoperability. In addition, solutions exist for health IT to advance rapidly without adversely affecting future policy choices. A broad national consensus is emerging in support of advancing health IT to enable the transformation of health and care.

  20. Uganda's National Transmission Backbone Infrastructure Project: Technical Challenges and the Way Forward

    NASA Astrophysics Data System (ADS)

    Bulega, T.; Kyeyune, A.; Onek, P.; Sseguya, R.; Mbabazi, D.; Katwiremu, E.

    2011-10-01

    Several publications have identified technical challenges facing Uganda's National Transmission Backbone Infrastructure project. This research addresses the technical limitations of the National Transmission Backbone Infrastructure project, evaluates the goals of the project, and compares the results against the technical capability of the backbone. The findings of the study indicate a bandwidth deficit, which will be addressed by using dense wavelength-division multiplexing repeaters and by leasing bandwidth from private companies. Microwave links for redundancy, a Network Operation Center for operation and maintenance, and deployment of Worldwide Interoperability for Microwave Access (WiMAX) as a last-mile solution are also suggested.
