75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...
75 FR 81605 - Smart Grid Interoperability Standards; Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Technical Conference December 21, 2010. Take notice that the Federal Energy... National Institute of Standards and Technology and included in this proceeding are ready for Commission...
75 FR 66752 - Smart Grid Interoperability Standards; Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-29
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... adoption of Smart Grid Interoperability Standards (Standards) in their States. On October 6, 2010, the....m. Eastern time in conjunction with the NARUC/FERC Collaborative on Smart Response (Collaborative...
76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid... Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability... National Institute of Standards and Technology are ready for Commission consideration in a rulemaking...
76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-04
...--Intersection of Cloud Computing and Mobility Forum and Workshop AGENCY: National Institute of Standards and.../intersection-of-cloud-and-mobility.cfm . SUPPLEMENTARY INFORMATION: NIST hosted six prior Cloud Computing Forum... interoperability, portability, and security, discuss the Federal Government's experience with cloud computing...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public..., Reliability, and Interoperability Council (CSRIC) will hold its fifth meeting. The CSRIC will vote on... to the FCC regarding best practices and actions the FCC can take to ensure the security, reliability...
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2017-10-01
Award Number: W81XWH-09-1-0705. TITLE: “Medical Device Plug-and-Play Interoperability Standards and Technology Leadership.” PRINCIPAL INVESTIGATOR... Sept 2016 – 20 Sept 2017... efficiency through interoperable medical technologies. We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-01
... FEDERAL COMMUNICATIONS COMMISSION Federal Advisory Committee Act; Communications Security, Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC or Commission) Communications Security...
2011-12-01
Task Based Approach to Planning.” Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability... Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organi... (MSDL).” Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization
Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep
2016-10-01
This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced from various hospital information systems. Such data could not exist independently of an information system until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
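A sketch may help picture the TMT hierarchy described above: the form > component > section > element nesting can be expressed with ordinary XML tooling. The tag names, codes, and attributes below are illustrative assumptions, not the actual TMT schema.

```python
# Illustrative sketch of a TMT-style nested record (form > component >
# section > element); tag and attribute names are hypothetical, not the
# real TMT schema.
import xml.etree.ElementTree as ET

form = ET.Element("form", id="outpatient-note")
component = ET.SubElement(form, "component", name="vital-signs")
section = ET.SubElement(component, "section", name="measurements")

# Each element carries a code-set reference and a data type, mirroring the
# abstract's description of elements referenced by code set and data type.
element = ET.SubElement(section, "element",
                        code="LOINC:8480-6", dataType="PQ")
element.text = "120"  # systolic blood pressure, mmHg

print(ET.tostring(form, encoding="unicode"))
```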
Interoperable and standard e-Health solution over Bluetooth.
Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J
2010-01-01
The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards appears to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end, standards-based e-Health solution, comprising the ISO/IEEE 11073 standard for interoperability of medical devices in the patient environment and the EN13606 standard for interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.
System and methods of resource usage using an interoperable management framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.
A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are deliberately left free of standards, permitting the choice and innovation needed to balance flexibility and usability in usage management systems.
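As a rough illustration of the kind of operational semantics such a framework standardizes, consider evaluating a usage rule before granting access to a resource. The rule structure and vocabulary here are invented for illustration; they are not the patent's actual rights expression language.

```python
# Toy sketch of evaluating a usage rule expressed in a generic rights
# vocabulary; the rule fields are invented, not the framework's actual
# operational semantics.
from datetime import datetime, timezone

license_rule = {
    "action": "play",
    "max_uses": 3,
    "not_after": datetime(2030, 1, 1, tzinfo=timezone.utc),
}

def usage_permitted(rule: dict, action: str, uses_so_far: int) -> bool:
    """Return True if the requested action is allowed under the rule."""
    return (action == rule["action"]
            and uses_so_far < rule["max_uses"]
            and datetime.now(timezone.utc) < rule["not_after"])

print(usage_permitted(license_rule, "play", uses_so_far=1))  # True
```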
Reuse and Interoperability of Avionics for Space Systems
NASA Technical Reports Server (NTRS)
Hodson, Robert F.
2007-01-01
The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved towards a standard product-line approach to avionics, the space industry still lacks common standards and practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular, stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable, interoperable avionics are discussed.
Study and validation of tools interoperability in JPSEC
NASA Astrophysics Data System (ADS)
Conan, V.; Sadourny, Y.; Jean-Marie, K.; Chan, C.; Wee, S.; Apostolopoulos, J.
2005-08-01
Digital imagery is important in many applications today, and its security is likely to gain further importance in the near future. The emerging international standard ISO/IEC JPEG-2000 Security (JPSEC) is designed to provide security for digital imagery, in particular digital imagery coded with the JPEG-2000 image coding standard. One of the primary goals of a standard is to ensure interoperability between creators and consumers implemented by different manufacturers. The JPSEC standard, like the popular JPEG and MPEG families of standards, specifies only the bitstream syntax and the receiver's processing, not how the bitstream is created or the details of how it is consumed. This paper examines interoperability for the JPSEC standard and presents an example JPSEC consumption process, which can provide insights for the design of JPSEC consumers. Initial interoperability tests between different groups with independently created implementations of JPSEC creators and consumers have been successful in providing the JPSEC security services of confidentiality (via encryption) and authentication (via message authentication codes, or MACs). Further interoperability work is ongoing.
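The authentication service mentioned above is built on message authentication codes. The following is a minimal, generic sketch of MAC creation and verification over a codestream using Python's standard hmac module; the key handling and byte layout are assumptions, not the JPSEC bitstream syntax.

```python
# Minimal sketch of MAC-based authentication of an image bitstream, in the
# spirit of the JPSEC authentication service; this is generic HMAC usage,
# not the actual JPSEC syntax.
import hmac
import hashlib

key = b"shared-secret-key"                    # assumed out-of-band key exchange
bitstream = b"...JPEG-2000 codestream bytes..."  # placeholder payload

# Creator side: compute a MAC over the protected bytes.
tag = hmac.new(key, bitstream, hashlib.sha256).digest()

# Consumer side: recompute and compare in constant time.
ok = hmac.compare_digest(tag, hmac.new(key, bitstream, hashlib.sha256).digest())
print("authenticated:", ok)
```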
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison
2018-01-12
Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the use of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its four associated principles. These XOD principles call for reusing existing terms and semantic relations from reliable ontologies, developing and applying well-established ontology design patterns (ODPs), and involving community effort in new ontology development, thereby promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications that allow data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).
An open repositories network development for medical teaching resources.
Soula, Gérard; Darmoni, Stefan; Le Beux, Pierre; Renard, Jean-Marie; Dahamna, Badisse; Fieschi, Marius
2010-01-01
The lack of interoperability between repositories of heterogeneous and geographically widespread data is an obstacle to the diffusion, sharing, and reuse of those data. We present the development of an open repositories network that addresses both the syntactic and semantic interoperability of the different repositories and is based on international standards in this field. The network is used by the medical community in France for the diffusion and sharing of digital teaching resources. The syntactic interoperability of the repositories is managed using the OAI-PMH protocol for the exchange of metadata describing the resources. Semantic interoperability is based, on the one hand, on the LOM standard for describing resources and on MeSH for indexing them and, on the other hand, on semantic interoperability management designed to optimize compliance with standards and the quality of the metadata.
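Since the repositories' syntactic interoperability rests on OAI-PMH, a short sketch of a metadata harvest may be useful. The endpoint URL is a placeholder, and oai_dc is used as the metadata prefix for simplicity, although the network described above exchanges LOM descriptions.

```python
# Sketch of an OAI-PMH harvest; the repository URL is a placeholder.
# The verb and parameter names (ListRecords, metadataPrefix) are defined
# by the OAI-PMH protocol itself.
from urllib.parse import urlencode
from urllib.request import urlopen

base_url = "https://repository.example.org/oai"  # hypothetical endpoint
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}

with urlopen(base_url + "?" + urlencode(params)) as response:
    xml_payload = response.read().decode("utf-8")

# Raw XML; each <record> element holds one resource's metadata.
print(xml_payload[:200])
```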
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from HISs' compliance with different healthcare standards. A solution demands a mediation system that accurately interprets data in different heterogeneous formats to achieve data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system, called ARIEN, that arbitrates between HISs compliant with different healthcare standards to provide accurate and seamless information exchange. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and a Personalized-Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO is demonstrated by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in conversion between the CDA and vMR standards using a pattern-oriented approach drawing on the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
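At its core, a mediation engine of this kind resolves each source concept to a target concept and then copies instance data accordingly. A toy sketch of such mapping-driven transformation follows; the concept paths are invented for illustration and are not drawn from the actual MBO.

```python
# Toy sketch of mapping-driven transformation between two standard formats,
# in the spirit of a mediation bridge ontology; concept paths are invented.
cda_to_vmr = {
    "ClinicalDocument.patientRole.id": "vmrRecord.patient.id",
    "ClinicalDocument.observation.value": "vmrRecord.observationResult.value",
}

def transform(record: dict, mapping: dict) -> dict:
    """Copy each mapped source field into its target field."""
    target = {}
    for src_path, dst_path in mapping.items():
        if src_path in record:
            target[dst_path] = record[src_path]
    return target

cda_record = {"ClinicalDocument.patientRole.id": "12345",
              "ClinicalDocument.observation.value": "7.2 mmol/L"}
print(transform(cda_record, cda_to_vmr))
```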
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc, following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project, which developed shared software infrastructure components in open source for RHINs, and the OpenECG network, which offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long-term strategy towards comprehensive health services and clinical research.
A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World
NASA Astrophysics Data System (ADS)
Wright, D. J.; Sankaran, S.
2015-12-01
In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO/TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. Organizations face many specific challenges when adopting interoperability patterns. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used, and to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition and can therefore fall short of user expectations. Organizations are thus left with a series of perceived orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms built on fundamentally "open" principles and aligned with popular mainstream patterns. We seek to explore data-, metadata-, and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we take for granted as part of our everyday use of technology.
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications through standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications covered: Web Coverage Service (WCS) for the data themselves; Web Map Service (WMS) for pictures of the data; Web Map Tile Service (WMTS) for pictures of data tiles; and Styled Layer Descriptors (SLD) for rendered styles.
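As a concrete illustration of the OGC services listed above, the sketch below assembles a WMS 1.3.0 GetMap request. The endpoint and layer name are placeholders; the query parameters themselves are standard WMS.

```python
# Sketch of a WMS 1.3.0 GetMap request; the endpoint and layer name are
# placeholders, but the parameter names are standard WMS.
from urllib.parse import urlencode

endpoint = "https://gesdisc.example.nasa.gov/wms"  # hypothetical URL
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "AIRS_Surface_Temperature",  # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",  # WMS 1.3.0 uses lat,lon axis order here
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
print(endpoint + "?" + urlencode(params))  # URL returning a picture of the data
```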
Standard-compliant real-time transmission of ECGs: harmonization of ISO/IEEE 11073-PHD and SCP-ECG.
Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José
2009-01-01
Ambient assisted living and integrated care in an aging society are based on the vision of a lifelong Electronic Health Record, calling for interoperability of healthcare information systems and medical devices. For medical devices this aim can be achieved by consistent implementation of harmonized international interoperability standards. The ISO/IEEE 11073 (x73) family of standards is a reference standard for medical device interoperability. Its Personal Health Device (PHD) version covers several devices, but an ECG device specialization is not yet available. On the other hand, the SCP-ECG standard for short-term diagnostic ECGs (EN1064) has recently been approved as international standard ISO/IEEE 11073-91064:2009. In this paper, the relationships between a proposed x73-PHD model for an ECG device and the fields of the SCP-ECG standard are investigated. A proof-of-concept implementation of the proposed x73-PHD ECG model is also presented, identifying open issues to be addressed in standards development for wider adoption of the x73-PHD standards.
Ryan, Amanda; Eklund, Peter
2008-01-01
Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
Warfighter IT Interoperability Standards Study
2012-07-22
data (e.g., messages) between systems? ii) What process did you use to validate and certify semantic interoperability between your... other systems at this time. There was no requirement to validate and certify semantic interoperability. The DLS program exchanges data with... semantics. Testing for System Compliance with Data Models. Verify and Certify Interoperability Using Data...
Report on the Second Catalog Interoperability Workshop
NASA Technical Reports Server (NTRS)
Thieman, James R.; James, Mary E.
1988-01-01
The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell
2014-01-01
No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increase safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking away from legacy practices is a challenge.
Weininger, Sandy; Jaffe, Michael B; Goldman, Julian M
2017-01-01
Medical device and health information technology systems are increasingly interdependent, with users demanding increased interoperability. Related safety standards must be developed taking into account these systems' perspective. In this article, we describe the current development of medical device standards and the need for these standards to address medical device informatics. Medical device information should be gathered from a broad range of clinical scenarios to lay the foundation for safe medical device interoperability. Five clinical examples show how medical device informatics principles, if applied in the development of medical device standards, could help facilitate the development of safe interoperable medical device systems. These examples illustrate the clinical implications of the failure to capture important signals and device attributes. We provide recommendations relating to the coordination between historically separate standards development groups, some of which focus on safety and effectiveness and others on health informatics. We identify the need for a shared understanding among stakeholders and describe organizational structures to promote cooperation such that device-to-device interactions and related safety information are considered during standards development.
2011-07-01
Orlando, Florida, September 2009, 09F-SIW-090. [HLA (2000)-1] Modeling and Simulation Standard - High Level Architecture (HLA) – Framework and... Simulation Interoperability Workshop, Orlando, FL, USA, September 2009, 09F-SIW-023. [MaK] www.mak.com. [MIL-STD-3011] MIL-STD-3011... Spring Simulation Interoperability Workshop, Norfolk, VA, USA, March 2007, 07S-SIW-072. [Ross] Ross, P. and Clark, P. (2005), “Recommended
Enabling interoperability in planetary sciences and heliophysics: The case for an information model
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.
2018-01-01
The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability through a common set of terms for communication between machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type most directly enabled by an information model.
A SOA-Based Platform to Support Clinical Data Sharing.
Gazzarata, R; Giannini, B; Giacomini, M
2017-01-01
The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the "Interoperable" tier, the authors adopted the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with the LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is a set of standardized cloud services connected through standardized interfaces to client applications. The system was approved by medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the "Interoperable" tier, the current solution actually covers the "Connected" tier, due to local hospital policy restrictions.
Moving Beyond the 10,000 Ways That Don't Work
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Arctur, D. K.; Rueda, C.
2009-12-01
From his research in developing light bulb filaments, Thomas Edison provides us with a good lesson for advancing any venture. He said, "I have not failed, I've just found 10,000 ways that won't work." Advancing data and access interoperability is one of those ventures that is difficult to achieve because of the differences among the participating communities. Even within the marine domain, different communities exist, and with them different technologies (formats and protocols) to publish data and its descriptions, and different vocabularies to name things (e.g., parameters, sensor types). Simplifying this heterogeneity of technologies is accomplished not only by adopting standards, but also by creating profiles and advancing tools that use those standards. In some cases, standards are advanced by building from existing tools. But what is the best strategy? Edison gives us a hint. Prototypes and test beds are essential to achieving interoperability among geospatial communities. The Open Geospatial Consortium (OGC) calls them interoperability experiments. The World Wide Web Consortium (W3C) calls them incubator projects. Prototypes help test and refine specifications. The Marine Metadata Interoperability (MMI) Initiative, which is advancing marine data integration and re-use by promoting community solutions, understood this strategy and started an interoperability demonstration with the SURA Coastal Ocean Observing and Prediction (SCOOP) program. This interoperability demonstration grew into the OGC Ocean Science Interoperability Experiment (Oceans IE). The Oceans IE brings together the ocean-observing community to advance interoperability of ocean observing systems using OGC standards. Oceans IE Phase I investigated the use of the OGC Web Feature Service (WFS) and OGC Sensor Observation Service (SOS) standards for representing and exchanging point data records from fixed in-situ marine platforms. Phase I produced an engineering best practices report, advanced reference implementations, and submitted various change requests that are now being considered by the OGC SOS working group. Building on Phase I, and with a focus on semantically enabled services, Oceans IE Phase II will continue the use and improvement of OGC specifications in the marine community. We will present the lessons learned, in particular the strategy of experimenting with technologies to advance standards for publishing data in marine communities, which could also help advance interoperability in other geospatial communities. We will also discuss the growing collaborations among ocean-observing standards organizations that will bring about the institutional acceptance needed for these technologies and practices to gain traction globally.
Personal Health Records: Is Rapid Adoption Hindering Interoperability?
Studeny, Jana; Coustasse, Alberto
2014-01-01
The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relationship to PHRs to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822
Examining the Relationship between Electronic Health Record Interoperability and Quality Management
ERIC Educational Resources Information Center
Purcell, Bernice M.
2013-01-01
A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem addressed is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in the implementation of EHR systems. The purpose of the nonexperimental quantitative research…
Capurro, Daniel; Echeverry, Aisen; Figueroa, Rosa; Guiñez, Sergio; Taramasco, Carla; Galindo, César; Avendaño, Angélica; García, Alejandra; Härtel, Steffen
2017-01-01
Despite the continuous technical advancements around health information standards, a critical component to their widespread adoption involves political agreement between a diverse set of stakeholders. Countries that have addressed this issue have used diverse strategies. In this vision paper we present the path that Chile is taking to establish a national program to implement health information standards and achieve interoperability. The Chilean government established an inter-agency program to define the current interoperability situation, existing gaps, barriers, and facilitators for interoperable health information systems. As an answer to the identified issues, the government decided to fund a consortium of Chilean universities to create the National Center for Health Information Systems. This consortium should encourage the interaction between all health care stakeholders, both public and private, to advance the selection of national standards and define certification procedures for software and human resources in health information technologies.
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
Proceedings of the ITS Standards Program Review and Interoperability Workshop
DOT National Transportation Integrated Search
1997-12-17
An ITS Standards Program Review and Interoperability Workshop was held on Dec. 17-18, 1997 in Arlington, Va. It was sponsored by the U.S. DOT, ITS America, George Mason University (GMU) and the University of Michigan. The purpose was to review the US...
Personalized-detailed clinical model for data interoperability among clinical standards.
Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung
2013-08-01
Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners seeking to provide telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among the different healthcare standards with which they comply. Linking healthcare organization data to these different heterogeneous standards is necessary to achieve data-level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that create the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider transformation of instances of the electronic health record (EHR) standards openEHR and HL7 CDA using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transforming instances between these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for the HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect the generation of P-DCM-based customized mappings with the openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.
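A schematic sketch of what P-DCM-based derivation might look like in practice appears below: one clinical-model instance rendered as both a CDA-flavored XML fragment and an openEHR-flavored flat structure. All element names and paths are illustrative assumptions, not the standards' actual identifiers or the authors' implementation.

```python
# Sketch of deriving two standard-specific instances from one clinical
# model, in the spirit of P-DCM-based transformation; element names and
# paths are illustrative only.
import xml.etree.ElementTree as ET

dcm_instance = {"concept": "blood_glucose", "value": 7.2, "units": "mmol/L"}

def to_cda(dcm: dict) -> str:
    """Render the model as a CDA-flavored observation element."""
    obs = ET.Element("observation")
    ET.SubElement(obs, "code", displayName=dcm["concept"])
    ET.SubElement(obs, "value", value=str(dcm["value"]), unit=dcm["units"])
    return ET.tostring(obs, encoding="unicode")

def to_openehr(dcm: dict) -> dict:
    """Render the model as an openEHR-flavored flat structure."""
    return {f"observation/{dcm['concept']}|magnitude": dcm["value"],
            f"observation/{dcm['concept']}|units": dcm["units"]}

print(to_cda(dcm_instance))
print(to_openehr(dcm_instance))
```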
Information Management Challenges in Achieving Coalition Interoperability
2001-12-01
by J. Dyer. SESSION I: ARCHITECTURES AND STANDARDS: FUNDAMENTAL ISSUES. Chairman: Dr I. White (UK). Planning for Interoperability, by W.M. Gentleman... framework – a crucial step toward achieving coalition C4I interoperability. TOPICS TO BE COVERED: 1) Maintaining secure interoperability; 2) Command system interfaces: 2a...
Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han
2014-01-01
Extension of a standard model while retaining compliance with it is a challenging issue, because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate both the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.
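The multilayered validation idea can be pictured as checks that go beyond schema syntax to consult registered metadata. The sketch below assumes a toy registry of element definitions; the entries and checks are invented, not the actual CCR+ validation file.

```python
# Sketch of a metadata-registry-driven validation layer, in the spirit of
# the multilayered validation described; registry entries are invented.
metadata_registry = {
    "BloodPressure.systolic": {"type": float, "units": "mmHg"},
    "BloodPressure.diastolic": {"type": float, "units": "mmHg"},
}

def validate(instance: dict) -> list:
    """Return a list of violations against the registered metadata."""
    errors = []
    for field, value in instance.items():
        entry = metadata_registry.get(field)
        if entry is None:
            errors.append(f"{field}: not registered (semantic check fails)")
        elif not isinstance(value, entry["type"]):
            errors.append(f"{field}: expected {entry['type'].__name__}")
    return errors

# One valid field, one unregistered field -> one violation reported.
print(validate({"BloodPressure.systolic": 120.0, "HeartRate.bpm": 72}))
```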
Electronic Health Records Data and Metadata: Challenges for Big Data in the United States.
Sweet, Lauren E; Moulaison, Heather Lea
2013-12-01
This article, written by researchers studying metadata and standards, represents a fresh perspective on the challenges of electronic health records (EHRs) and serves as a primer for big data researchers new to health-related issues. Primarily, we argue for the importance of the systematic adoption of standards in EHR data and metadata as a way of promoting big data research and benefiting patients. EHRs have the potential to include a vast amount of longitudinal health data, and metadata provides the formal structures to govern that data. In the United States, electronic medical records (EMRs) are part of the larger EHR. EHR data is submitted by a variety of clinical data providers and potentially by the patients themselves. Because data input practices are not necessarily standardized, and because of the multiplicity of current standards, basic interoperability in EHRs is hindered. Some of the issues with EHR interoperability stem from the complexities of the data they include, which can be both structured and unstructured. A number of controlled vocabularies are available to data providers. The Continuity of Care Document standard will provide interoperability in the United States between the EMR and the larger EHR, potentially making data input by providers directly available to other providers. The data involved is nonetheless messy. In particular, the use of competing vocabularies such as the Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT), MEDCIN, and locally created vocabularies inhibits large-scale interoperability for structured portions of the records, and unstructured portions, although potentially not machine readable, remain essential. Once EMRs for patients are brought together as EHRs, the EHRs must be managed and stored. Adequate documentation should be created and maintained to assure the secure and accurate use of EHR data. There are currently a few notable international standards initiatives for EHRs. Organizations such as Health Level Seven International and the Clinical Data Interchange Standards Consortium are developing and overseeing implementation of interoperability standards. Denmark and Singapore are two countries that have successfully implemented national EHR systems. Future work in electronic health information initiatives should underscore the importance of standards and reinforce interoperability of EHRs for big data research and for the sake of patients.
Hankin, Steven C.; Blower, Jon D.; Carval, Thierry; Casey, Kenneth S.; Donlon, Craig; Lauret, Olivier; Loubrieu, Thomas; Srinivasan, Ashwanth; Trinanes, Joaquin; Godøy, Øystein; Mendelssohn, Roy; Signell, Richard P.; de La Beaujardiere, Jeff; Cornillon, Peter; Blanc, Frederique; Rew, Russ; Harlan, Jack; Hall, Julie; Harrison, D.E.; Stammer, Detlef
2010-01-01
It is generally recognized that meeting society's emerging environmental science and management needs will require the marine data community to provide simpler, more effective, and more interoperable access to its data. There is broad agreement, as well, that data standards are the bedrock upon which interoperability will be built. The path that would bring the marine data community to agree upon and utilize such standards, however, is often elusive. In this paper we examine the trio of standards: 1) netCDF files; 2) the Climate and Forecast (CF) metadata convention; and 3) the OPeNDAP data access protocol. These standards taken together have brought our community a high level of interoperability for "gridded" data such as model outputs, satellite products, and climatological analyses, and they are gaining rapid acceptance for ocean observations. We provide an overview of the scope of the contribution that has been made. We then step back from the information technology considerations to examine the community or "social" process by which the successes were achieved. We contrast this path with the one by which the World Meteorological Organization (WMO) has advanced the Global Telecommunications System (GTS): netCDF/CF/OPeNDAP exemplifies a "bottom up" standards process, whereas GTS is "top down". Both of these standards are tales of success at achieving specific purposes, yet each is hampered by technical limitations. These limitations sometimes lead to controversy over whether alternative technological directions should be pursued. Finally, we draw general conclusions regarding the factors that affect the success of a standards development effort: the likelihood that an IT standard will meet its design goals and will achieve community-wide acceptance. We believe that a higher level of thoughtful awareness by the scientists, program managers, and technology experts of the vital role of standards and the merits of alternative standards processes can help us as a community to reach our interoperability goals faster.
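A brief sketch of the trio in action: with the netCDF4 package (built with OPeNDAP support), a remote dataset is opened by URL exactly like a local file, and CF attributes describe the variable. The URL and variable name are placeholders.

```python
# Sketch of the netCDF/CF/OPeNDAP trio in action: open a remote dataset by
# URL and read a CF-described variable; the URL and variable name are
# placeholders. Requires the netCDF4 package with OPeNDAP support.
from netCDF4 import Dataset

url = "https://data.example.org/thredds/dodsC/sst_analysis"  # hypothetical
ds = Dataset(url)  # OPeNDAP: the same call used for a local file

sst = ds.variables["analysed_sst"]     # hypothetical variable name
print(sst.getncattr("standard_name"))  # CF convention: standard_name
print(sst.getncattr("units"))          # CF convention: units
print(sst[0, :2, :2])                  # subset fetched over the network
ds.close()
```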
Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh;
2014-01-01
The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a suite of related industry standards, assessing both individual capabilities and interoperability. These standards not only enable Information Technology (IT) efficiencies, but also provide efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.
Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems
ERIC Educational Resources Information Center
Mason, Robert T.
2011-01-01
An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…
Relevance of eHealth standards for big data interoperability in radiology and beyond.
Marcheschi, Paolo
2017-06-01
The aim of this paper is to report on the implementation of radiology and related information technology standards to feed big data repositories, and thus to create a solid substrate on which analysis software can operate. Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) are the major standards for radiology and medical information technology. They define formats and protocols to transmit medical images, signals, and patient data inside and outside hospital facilities. These standards are already implementable, but big data expectations are stimulating a new approach that simplifies data collection and interoperability and seeks to reduce the time to full implementation inside health organizations. Virtual Medical Record, DICOM Structured Reporting, and HL7 Fast Healthcare Interoperability Resources (FHIR) are changing the way medical data are shared among organizations, and they will be the keys to big data interoperability. Until simple and comprehensive methods are found to store and disseminate detailed information on patients' health, we will not be able to get optimal results from the analysis of those data.
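To make the FHIR direction concrete, here is a minimal Observation resource in FHIR's JSON form; the patient reference and values are invented, but the field names follow the published FHIR resource structure.

```python
# Minimal FHIR (R4-style) Observation resource as JSON; identifiers and
# values are invented, the field names follow the FHIR specification.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8480-6",
            "display": "Systolic blood pressure",
        }]
    },
    "subject": {"reference": "Patient/example-123"},  # hypothetical patient
    "valueQuantity": {
        "value": 120,
        "unit": "mmHg",
        "system": "http://unitsofmeasure.org",
        "code": "mm[Hg]",
    },
}

print(json.dumps(observation, indent=2))  # ready to POST to a FHIR server
```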
The HDF Product Designer - Interoperability in the First Mile
NASA Astrophysics Data System (ADS)
Lee, H.; Jelenak, A.; Habermann, T.
2014-12-01
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and the plans for future development will be presented. Constructive input from interested parties is always welcome.
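The kind of convention-carrying HDF5 layout such a tool generates can be approximated by hand with h5py, as in the sketch below; the group, dataset, and attribute names are assumptions standing in for a community convention, not actual HDF Product Designer output.

```python
# Sketch of a convention-following HDF5 file written with h5py.
# Group and attribute names are illustrative, not HDF Product Designer output.
import numpy as np
import h5py

with h5py.File("product_example.h5", "w") as f:
    f.attrs["Conventions"] = "CF-1.8"        # assumed community convention
    f.attrs["institution"] = "Example Lab"

    grid = f.create_group("geophysical_data")
    temp = grid.create_dataset("temperature", data=np.zeros((4, 4), dtype="f4"))
    temp.attrs["units"] = "K"
    temp.attrs["long_name"] = "surface temperature"
```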
EUnetHTA information management system: development and lessons learned.
Chalon, Patrice X; Kraemer, Peter
2014-11-01
The aim of this study was to describe the techniques used in achieving consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards were agreed on by consensus, and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database) or a formal framework to prepare cooperation on concrete projects (INAHTA projects database). While entitled a project on IT infrastructure, the work program was also about people. When having to agree on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings bring added value and enhance understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use and, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
Operations and Plans: International Military Rationalization, Standardization, and Interoperability
1989-02-15
Army Regulation 34-1, Operations and Plans: International Military Rationalization, Standardization, and Interoperability. Headquarters, Department of the Army.
Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong
2014-01-01
Objectives Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Methods Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR XML files and one CCR+ XML file were evaluated. Results In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. Conclusions A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models. PMID:24627817
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that is complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
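A hedged sketch of the attribute-based approach described here: one HDF5 file carries attribute sets for two conventions side by side, and a reader selects whichever set it understands. Both attribute vocabularies below are invented for illustration; they are not the actual Plasma State or CPO schemas.

```python
# Sketch: one file tagged for two data-interoperability conventions at once.
# Both attribute vocabularies below are invented for illustration.
import h5py

with h5py.File("plasma_example.h5", "w") as f:
    ne = f.create_dataset("electron_density", data=[1.0e19, 1.2e19])
    # Attributes aimed at a flat, Plasma-State-like convention (assumed names)
    ne.attrs["ps_label"] = "ne"
    ne.attrs["ps_units"] = "m^-3"
    # Attributes aimed at a deep, CPO-like convention (assumed names)
    ne.attrs["cpo_path"] = "coreprofiles/ne"

def read_density(path):
    """Prefer one convention's attributes, fall back if absent."""
    with h5py.File(path, "r") as f:
        dset = f["electron_density"]
        if "ps_label" in dset.attrs:
            return dset[...], dset.attrs["ps_units"]
        return dset[...], "unknown"

values, units = read_density("plasma_example.h5")
```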
ERIC Educational Resources Information Center
Arms, William Y.; Hillmann, Diane; Lagoze, Carl; Krafft, Dean; Marisa, Richard; Saylor, John; Terizzi, Carol; Van de Sompel, Herbert; Gill, Tony; Miller, Paul; Kenney, Anne R.; McGovern, Nancy Y.; Botticelli, Peter; Entlich, Richard; Payette, Sandra; Berthon, Hilary; Thomas, Susan; Webb, Colin; Nelson, Michael L.; Allen, B. Danette; Bennett, Nuala A.; Sandore, Beth; Pianfetti, Evangeline S.
2002-01-01
Discusses digital libraries, including interoperability, metadata, and international standards; Web resource preservation efforts at Cornell University; digital preservation at the National Library of Australia; object persistence and availability; collaboration among libraries, museums and elementary schools; Asian digital libraries; and a Web…
Maturity model for enterprise interoperability
NASA Astrophysics Data System (ADS)
Guédria, Wided; Naudet, Yannick; Chen, David
2015-01-01
Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Patten, K.
2014-12-01
The geosciences are leading development of free, interoperable open access to data. US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG), in compliance with international standards and protocols to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language 'GeoSciML' (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of the National Geothermal Data System (NGDS) serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software packet, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as NGDS, officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-10
... New Hampshire InterOperability Laboratory, Durham, NH; Verimatrix, Inc., San Diego, CA; Vobile, Inc... Inc., San Diego, CA; Vidiator, Bellevue, WA; Virtual Logix, Montigny-le Bretonneux, France; Vishwak...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basso, T.; DeBlasio, R.
The IEEE American National Standards smart grid publications and standards development projects IEEE 2030™, which addresses smart grid interoperability, and IEEE 1547™, which addresses distributed resources interconnection with the grid, have made substantial progress since 2009. The IEEE 2030™ and 1547™ standards series focus on systems-level aspects and cover many of the technical integration issues involved in a mature smart grid. The status and highlights of these two IEEE series of standards, which are sponsored by IEEE Standards Coordinating Committee 21 (SCC21), are provided in this paper.
LVC Architecture Roadmap Implementation - Results of the First Two Years
2012-03-01
Presented at the Simulation Interoperability Standards Organization's (SISO) Spring Simulation Interoperability Workshop (SIW), 26-30 March...presented at the semi-annual Simulation Interoperability Workshops (SIWs) and the annual Interservice/Industry Training, Simulation & Education Conference...(I/ITSEC), as well as other venues. For example, a full-day workshop on the initial progress of the effort was conducted at the 2010 Spring SIW [2
Fundamental Data Standards for Science Data System Interoperability and Data Correlation
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Gopala Krishna, Barla; Rye, Elizabeth; Crichton, Daniel
The advent of the Web and languages such as XML have brought an explosion of online science data repositories and the promises of correlated data and interoperable systems. However there have been relatively few successes in meeting the expectations of science users in the internet age. For example a Google-like search for images of Mars will return many highly-derived and appropriately tagged images but largely ignore the majority of images in most online image repositories. Once retrieved, users are further frustrated by poor data descriptions, arcane formats, and badly organized ancillary information. A wealth of research indicates that shared information models are needed to enable system interoperability and data correlation. However, at a more fundamental level, data correlation and system interoperability are dependent on relatively few shared data standards. A common data dictionary standard, for example, allows the controlled vocabulary used in a science repository to be shared with potential collaborators. Common data registry and product identification standards enable systems to efficiently find, locate, and retrieve data products and their metadata from remote repositories. Information content standards define categories of descriptive data that help make the data products scientifically useful to users who were not part of the original team that produced the data. The Planetary Data System (PDS) has a plan to move the PDS to a fully online, federated system. This plan addresses new demands on the system including increasing data volume, numbers of missions, and complexity of missions. A key component of this plan is the upgrade of the PDS Data Standards. The adoption of the core PDS data standards by the International Planetary Data Alliance (IPDA) adds the element of international cooperation to the plan. This presentation will provide an overview of the fundamental data standards being adopted by the PDS that transcend science domains and that will help to meet the PDS's and IPDA's system interoperability and data correlation requirements.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-04
... notice of investigation named as respondents Sony Corporation of Tokyo, Japan; Sony Corporation of America of New York, New York; Sony Electronics Corporation of San Diego, California; and Sony Computer...
The BACnet Campus Challenge - Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masica, Ken; Tom, Steve
2015-12-01
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and evolve over time to include new functionality as well as support new communication technologies such as the Ethernet and IP protocols as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
Clinical data interoperability based on archetype transformation.
Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2011-10-01
The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation.
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and authenticated by the Ministry of Information Industry, Ministry of Science and Technology and Ministry of Construction, P.R. China, in 2002. After five years of construction, the informatization level of Dongying has reached an advanced degree. In order to advance the building of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards were drawn up and put into practice. Secondly, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed; it is a highly integrated WebGIS platform combining 3S (GIS, GPS, RS) technologies, object-oriented RDBMS, Internet, DCOM, etc. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of "Digital Dongying" geospatial data have come into practice and good results have been obtained. However, a strong leadership group is necessary for data sharing and interoperation.
Development of Extended Content Standards for Biodiversity Data
NASA Astrophysics Data System (ADS)
Hugo, Wim; Schmidt, Jochen; Saarenmaa, Hannu
2013-04-01
Interoperability in the field of Biodiversity observation has been strongly driven by the development of a number of global initiatives (GEO, GBIF, OGC, TDWG, GenBank, …) and its supporting standards (OGC-WxS, OGC-SOS, Darwin Core (DwC), NetCDF, …). To a large extent, these initiatives have focused on discoverability and standardization of syntactic and schematic interoperability. Semantic interoperability is more complex, requiring development of domain-dependent conceptual data models, and extension of these models with appropriate ontologies (typically manifested as controlled vocabularies). Biodiversity content has been standardized partly, for example through Darwin Core for occurrence data and associated taxonomy, and through GenBank for genetic data, but other contexts of biodiversity observation have lagged behind - making it difficult to achieve semantic interoperability between distributed data sources. With this in mind, WG8 of GEO BON (charged with data and systems interoperability) has started a work programme to address a number of concerns, one of which is the gap in content standards required to make Biodiversity data truly interoperable. The paper reports on the framework developed by WG8 for the classification of Biodiversity observation data into 'families' of use cases and its supporting data schema, where gaps, if any, in the availability of content standards have been identified, and how these are to be addressed by way of an abstract data model and the development of associated content standards. It is proposed that a minimum set of standards (1) will be required to address the scope of Biodiversity content, aligned with levels and dimensions of observation, and based on the 'Essential Biodiversity Variables' (2) being developed by GEO BON. The content standards are envisaged as loosely separated from the syntactic and schematic standards used for the base data exchange: typically, services would offer an existing data standard (DwC, WFS, SOS, NetCDF), with a use-case dependent 'payload' embedded into the data stream. This enables the re-use of the abstract schema, and sometimes the implementation specification (for example XML, JSON, or NetCDF conventions) across services. An explicit aim will be to make the XML implementation specification re-usable as a DwC and a GML (SOS and WFS) extension. (1) Olga Lyashevska, Keith D. Farnsworth, How many dimensions of biodiversity do we need?, Ecological Indicators, Volume 18, July 2012, Pages 485-492, ISSN 1470-160X, 10.1016/j.ecolind.2011.12.016. (2) GEO BON: Workshop on Essential Biodiversity Variables (27-29 February 2012, Frascati, Italy). (http://www.earthobservations.org/geobon_docs_20120227.shtml)
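As a small illustration of the kind of content standard discussed here, the sketch below writes a single Darwin Core occurrence record to CSV; the term names follow Darwin Core, but the record values are invented and only a handful of terms are shown.

```python
# Sketch: one Darwin Core occurrence record serialized to CSV.
# Term names follow Darwin Core; the values are invented.
import csv

dwc_terms = ["occurrenceID", "scientificName", "eventDate",
             "decimalLatitude", "decimalLongitude", "basisOfRecord"]

record = {
    "occurrenceID": "urn:example:occ:0001",
    "scientificName": "Puma concolor",
    "eventDate": "2012-07-16",
    "decimalLatitude": "-33.87",
    "decimalLongitude": "18.42",
    "basisOfRecord": "HumanObservation",
}

with open("occurrences.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=dwc_terms)
    writer.writeheader()
    writer.writerow(record)
```

Because every column is a controlled DwC term, any GBIF-style aggregator can ingest the file without a bespoke mapping, which is the semantic-interoperability point the abstract makes.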
Galarraga, M; Serrano, L; Martinez, I; de Toledo, P; Reynolds, Melvin
2007-01-01
Advances in Information and Communication Technologies (ICT) are bringing new opportunities and use cases in the field of systems and Personal Health Devices used for the telemonitoring of citizens in home or mobile scenarios. At a time of such challenges, this review arises from the need to identify robust technical telemonitoring solutions that are both open and interoperable. These systems demand standardized solutions to be cost effective and to take advantage of standardized operation and interoperability. Thus, the fundamental challenge is to design plug-&-play devices that, either as individual elements or as components, can be incorporated in a simple way into different Telecare systems, perhaps configuring a personal user network. Moreover, there is increasing market pressure from companies not traditionally involved in medical markets, asking for a standard for Personal Health Devices and foreseeing a vast demand for telemonitoring, wellness, Ambient Assisted Living (AAL) and e-health applications. However, the newly emerging situations imply very strict requirements for the protocols involved in the communication. The ISO/IEEE 11073 family of standards is adapting and moving in order to face the challenge, and appears to be the best-positioned set of international standards to reach this goal. This work presents an updated survey of these standards, trying to track the changes that are being fulfilled, and tries to serve as a starting-point for those who want to familiarize themselves with them.
Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G
2012-01-01
In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard called the ISO/IEEE 11073 Personal Health Device (X73-PHD) Standards addresses the device interoperability problem by defining common protocols for the agent (medical device) and manager (appliance) interface. The X73-PHD standard, however, has not addressed security and authentication of medical devices, which is important in establishing the integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy enables device authentication using asymmetric-key cryptography, with the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirement. For standard 2048-bit RSA, the ESM computes digital signatures in only 12% of the time required by the direct implementation. Moreover, analysis shows that the ESM offers further security advantages, such as secure storage of keys, compared to the direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software that implements the X73-PHD standard. Lastly, a security analysis was done; the system's response to common attacks on authentication systems was analyzed and several measures were implemented to protect the system against them.
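The direct-software path described here can be sketched with the Python cryptography library: generate a 2048-bit RSA key pair, sign a device message on the agent side, and verify it on the manager side. This is a generic illustration of RSA signing, not the authors' X73-PHD integration; the message contents and PKCS#1 v1.5 padding choice are assumptions.

```python
# Generic sketch of RSA device authentication (sign on agent, verify on manager).
# Not the paper's X73-PHD implementation; PKCS#1 v1.5 padding is assumed.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Agent (medical device) side: generate keys and sign a message
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
message = b"agent-id:42|measurement:120/80"
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Manager (appliance) side: verify with the agent's public key
# (raises InvalidSignature if the message or signature was tampered with)
public_key = private_key.public_key()
public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
print("signature verified")
```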
Approaching semantic interoperability in Health Level Seven
Alschuler, Liora
2010-01-01
‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995
A Linguistic Foundation for Communicating Geo-Information in the context of BML and geoBML
2010-03-23
BML Standard. 2009 Spring Simulation Interoperability Workshop (09S-SIW-046). San Diego, CA. Rein, K., Schade, U. & Hieb, M.R. (2009). Battle...Formalizing Battle Management Language: A Grammar for Specifying Orders. 2006 Spring Simulation Interoperability Workshop (06S-SIW-068). Huntsville...Hieb, M.R. (2007). Battle Management Language: A Grammar for Specifying Reports. 2007 Spring Simulation Interoperability Workshop (07S-SIW-036
Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites
NASA Technical Reports Server (NTRS)
Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng
2007-01-01
This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.
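A hedged sketch of the SWE access pattern such a scenario relies on: an OGC Sensor Observation Service (SOS) GetObservation request issued as a simple key-value-pair HTTP query. The endpoint, offering, and observed property below are placeholder assumptions, not the testbed's actual services.

```python
# Sketch: OGC Sensor Observation Service (SOS) GetObservation via KVP binding.
# Endpoint, offering, and observed property are placeholder assumptions.
import requests

SOS_ENDPOINT = "https://example.org/sos"  # hypothetical SOS server

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "ground_station_1",
    "observedProperty": "air_temperature",
}

resp = requests.get(SOS_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:200])  # start of the O&M XML payload
```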
NASA Technical Reports Server (NTRS)
Kearney, Mike
2013-01-01
The primary goal of the Consultative Committee for Space Data Systems (CCSDS) is interoperability between the communications and data systems of space agencies' vehicles, facilities, missions and programs. Of all of the technologies used in spaceflight, standardization of communications and data systems brings the most benefit to multi-agency interoperability. CCSDS started in 1982, developing standards at the lower layers of the protocol stack. The CCSDS scope has grown to cover standards throughout the entire ISO communications stack, plus other data systems areas (architecture, archive, security, XML exchange formats, etc.).
The Benefits and Future of Standards: Metadata and Beyond
NASA Astrophysics Data System (ADS)
Stracke, Christian M.
This article discusses the benefits and future of standards and presents the generic multi-dimensional Reference Model. First the importance and the tasks of interoperability as well as quality development and their relationship are analyzed. Especially in e-Learning their connection and interdependence are evident: interoperability is one basic requirement for quality development. In this paper, it is shown how standards and specifications are supporting these crucial issues. The upcoming ISO metadata standard MLR (Metadata for Learning Resources) will be introduced and used as an example for identifying the requirements and needs for future standardization. In conclusion a vision of the challenges and potentials for e-Learning standardization is outlined.
NASA Astrophysics Data System (ADS)
Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup
When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing standards efforts on interoperability frameworks by helping to develop functional and non-functional requirements, and it illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
A logical approach to semantic interoperability in healthcare.
Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni
2011-01-01
Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
Design and Implementation of a REST API for the Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
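A minimal sketch of what a REST endpoint serving an index value like the HWBI might look like, using Flask; the route, payload fields, and scores below are hypothetical, not the actual HWBI API.

```python
# Hypothetical sketch of a REST endpoint serving a well-being index value.
# Route names, fields, and data are invented; this is not the actual HWBI API.
from flask import Flask, jsonify

app = Flask(__name__)

# Invented example data: index scores keyed by county FIPS code
SCORES = {"01001": 52.3, "01003": 55.1}

@app.route("/hwbi/<fips>", methods=["GET"])
def get_score(fips):
    if fips not in SCORES:
        return jsonify({"error": "unknown FIPS code"}), 404
    return jsonify({"fips": fips, "hwbi_score": SCORES[fips]})

if __name__ == "__main__":
    app.run(port=5000)
```

Exposing the index behind a stable, documented HTTP route is one concrete form of the component reuse and flexible data transfer the abstract describes.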
Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M
2014-01-01
Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the semiautonomous region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.
Interoperability and information discovery
Christian, E.
2001-01-01
In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.
Employing Semantic Technologies for the Orchestration of Government Services
NASA Astrophysics Data System (ADS)
Sabol, Tomáš; Furdík, Karol; Mach, Marián
The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, to reduce the administrative burden on citizens and enterprises - these are only a few reasons why the eGovernment paradigm has shifted from a supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, including in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.
Planetary Sciences Interoperability at VO Paris Data Centre
NASA Astrophysics Data System (ADS)
Le Sidaner, P.; Aboudarham, J.; Birlan, M.; Briot, D.; Bonnin, X.; Cecconi, B.; Chauvin, C.; Erard, S.; Henry, F.; Lamy, L.; Mancini, M.; Normand, J.; Popescu, F.; Roques, F.; Savalle, R.; Schneider, J.; Shih, A.; Thuillot, W.; Vinatier, S.
2015-10-01
The Astronomy community has been developing interoperability for more than 10 years, by standardizing data access, data formats, and metadata. This international action is led by the International Virtual Observatory Alliance (IVOA). Observatoire de Paris is an active participant in this project. All actions on interoperability, data and service provision are centralized in and managed by VOParis Data Centre (VOPDC). VOPDC is a coordinated project of all scientific departments of Observatoire de Paris.
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.
2006-01-01
To date very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering to the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation Initiative is imperative. The number of corporations, developer personnel, system interfaces and people interfaces will require standardization and registries on a scale not currently envisioned. It would only take one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation Initiative and other non-related space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those that are currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and birds-of-a-feather activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another
Best Practices for Preparing Interoperable Geospatial Data
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.
2010-12-01
Geospatial data is critically important for a wide scope of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods, from remote sensing observations, field surveys, model simulations, etc., and stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a huge barrier to the sharing and use of geospatial data, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery; OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that standard mechanisms for data discovery and access alone are not enough. The geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input data and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the experiences learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived at the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how the spatial and temporal information is properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling community; how those standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
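One way the OGC-service side of such an SDI is consumed in practice: the hedged sketch below uses the OWSLib library to issue a WMS GetMap request against a hypothetical endpoint and save the rendered layer; the service URL and layer name are assumptions, not ORNL DAAC services.

```python
# Hedged sketch: fetching a rendered layer from an OGC WMS with OWSLib.
# The service URL and layer name are placeholder assumptions.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/wms", version="1.3.0")

img = wms.getmap(
    layers=["land_cover"],          # assumed layer name
    srs="EPSG:4326",
    bbox=(-180, -90, 180, 90),
    size=(600, 300),
    format="image/png",
)

with open("land_cover.png", "wb") as out:
    out.write(img.read())
```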
NASA Astrophysics Data System (ADS)
Lucido, J. M.; Booth, N.
2014-12-01
Interoperable sharing of groundwater data across international borders is essential for the proper management of global water resources. However, storage and management of groundwater data are often distributed across many agencies or organizations. Furthermore, these data may be represented in disparate proprietary formats, posing a significant challenge for integration. For this reason, standard data models are required to achieve interoperability across geographical and political boundaries. The GroundWater Markup Language 1.0 (GWML1) was developed in 2010 as an extension of the Geography Markup Language (GML) in order to support groundwater data exchange within Spatial Data Infrastructures (SDI). In 2013, development of GWML2 was initiated under the sponsorship of the Open Geospatial Consortium (OGC) for intended adoption by the international community as the authoritative standard for the transfer of groundwater feature data, including data about water wells, aquifers, and related entities. GWML2 harmonizes GWML1 and the EU's INSPIRE models related to geology and hydrogeology. Additionally, an interoperability experiment was initiated to test the model for commercial, technical, scientific, and policy use cases. The scientific use case focuses on the delivery of data required as input to computational flow modeling software used to determine the flow of groundwater within a particular aquifer system. It involves the delivery of properties associated with hydrogeologic units, observations related to those units, and information about the related aquifers. To test this use case, web services are being implemented using GWML2 and WaterML2, the authoritative standard for water time series observations, in order to serve USGS water well and hydrogeologic data via standard OGC protocols. Furthermore, integration of these data into a computational groundwater flow model will be tested. This submission will present the GWML2 information model and the results of the interoperability experiment, with a particular emphasis on the scientific use case.
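To illustrate the kind of standard protocol such web services build on, the sketch below issues an OGC WFS 2.0 GetFeature request for a GWML2-style water-well feature type; the endpoint and the feature type name are assumptions, not the experiment's actual services.

```python
# Sketch: requesting GWML2-style well features through an OGC WFS 2.0 KVP call.
# The endpoint and feature type name are placeholder assumptions.
import requests

WFS_ENDPOINT = "https://example.org/wfs"  # hypothetical groundwater WFS

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "gwml2:GW_Well",   # assumed GWML2 feature type
    "count": 10,
}

resp = requests.get(WFS_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:200])  # start of the GML-encoded well features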
NASA Astrophysics Data System (ADS)
2018-01-01
The large amount of data generated by modern space missions calls for a change in the organization of data distribution and access procedures. Although long-term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them interoperable altogether is starting, including automated workflows to process related data from different sources.
Interoperability of Neuroscience Modeling Software
Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik
2009-01-01
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
Data Modeling Challenges of Advanced Interoperability.
Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka
2018-01-01
Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.
Interoperability Assets for Patient Summary Components: A Gap Analysis.
Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine
2018-01-01
The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive Patient Summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillium-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote alignment and a joint understanding that will improve the quality of standards and lower the costs of interoperability.
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the used information and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is a close collaboration between the teams involved, guaranteeing a convergence between competing approaches.
Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat
2015-01-01
Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is in some ways similar to other approaches, such as the dual model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.
Interconnecting Multidiscilinary Data Infrastructures: From Federation to Brokering Framework
NASA Astrophysics Data System (ADS)
Nativi, S.
2014-12-01
Standardization and federation activities have played an essential role in pushing interoperability at the disciplinary and cross-disciplinary level. However, they have proved insufficient to resolve important interoperability challenges, including disciplinary heterogeneity, cross-organization diversity, and cultural differences. Significant international initiatives like GEOSS, IODE, and CEOS demonstrated that a federation system dealing with a global and multi-disciplinary domain turns out to be rather complex, further raising the already high entry barriers for both providers and users. In particular, GEOSS demonstrated that standardization and federation actions must be accompanied and complemented by a brokering approach. Brokering architectures and their implementing technologies are able to implement an effective interoperability level among multi-disciplinary systems, lowering the entry barriers for both data providers and users. This presentation will discuss the brokering philosophy as an approach complementary to standardization and federation for interconnecting existing, heterogeneous infrastructures and systems. The GEOSS experience will be analyzed in particular.
NASA's Geospatial Interoperability Office(GIO)Program
NASA Technical Reports Server (NTRS)
Weir, Patricia
2004-01-01
NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth, and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group and chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI) and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Offices (CIO). With these agency level requirements GIO leads, brokers and facilitates efforts to, develop, implement, influence and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium and ISO TC211. The OGC has made considerable progress in regards to relations with other open standards bodies; namely ISO, W3C and OASIS. ISO TC211 is the Geographic and Geomatics Information technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.
NASA Astrophysics Data System (ADS)
Alameh, N.; Bambacus, M.; Cole, M.
2006-12-01
NASA's Earth Science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data to become a shared platform that can be leveraged by GEOSS via: a modular and extensible architecture; consensus and community-based standards (e.g., ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; and mechanisms for user involvement and collaboration, including support for interdisciplinary and domain-specific applications. ESG has played a key role in recent GEOSS Service Network (GSN) demos and workshops, acting not only as a service and data catalog and discovery client, but also as a portrayal and visualization client to distributed data.
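To make the OGC layer concrete, here is a minimal sketch of a WMS 1.3.0 GetMap request of the kind a portal such as ESG issues against distributed map servers; the endpoint URL and layer name are placeholders, not values from the abstract, while the query parameters are the ones the WMS specification defines.

```python
# Minimal sketch of an OGC WMS 1.3.0 GetMap request (hypothetical endpoint).
import requests

WMS_URL = "https://example.org/wms"  # assumed endpoint, for illustration only

params = {
    "SERVICE": "WMS",            # the OGC service being addressed
    "VERSION": "1.3.0",          # WMS protocol version
    "REQUEST": "GetMap",         # operation: render a map image
    "LAYERS": "global_mosaic",   # hypothetical layer name
    "STYLES": "",                # default styling
    "CRS": "EPSG:4326",          # coordinate reference system
    "BBOX": "-90,-180,90,180",   # lat/lon bounds (axis order per WMS 1.3.0)
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()
with open("map.png", "wb") as f:
    f.write(response.content)  # the rendered map image
```

Because the parameters are standardized, the same request shape works against any conformant server, which is precisely what lets a gateway aggregate many distributed sources.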
Interoperability in Personalized Adaptive Learning
ERIC Educational Resources Information Center
Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin
2006-01-01
Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…
Archive interoperability in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise
2003-02-01
The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers with interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.
Achieving interoperability for metadata registries using comparative object modeling.
Park, Yu Rang; Kim, Ju Han
2010-01-01
Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on international standard for MDR framework, ISO/IEC 11179. Following this trend, two pubic MDRs in biomedical domain have been created, United States Health Information Knowledgebase (USHIK) and cancer Data Standards Registry and Repository (caDSR), from U.S. Department of Health & Human Services and National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extending for satisfying organization-specific needs and solving semantic and structural limitation of ISO/IEC 11179. As a result it is difficult to address interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter, supporting both the integrated metadata object model and organization-specific MDR formats.
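To illustrate what an XSD-based exchange format buys, the sketch below validates a toy metadata record against a toy schema with lxml; both the schema and the record are invented for illustration and are not the paper's actual exchange format.

```python
# Minimal sketch of XSD-based validation of a metadata record with lxml.
# Schema and record are hypothetical illustrations only.
from lxml import etree

XSD = b"""<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="DataElement">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="name" type="xs:string"/>
        <xs:element name="definition" type="xs:string"/>
      </xs:sequence>
      <xs:attribute name="id" type="xs:string" use="required"/>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

RECORD = b"""<DataElement id="DE-0001">
  <name>Systolic blood pressure</name>
  <definition>Peak arterial pressure during a heartbeat.</definition>
</DataElement>"""

schema = etree.XMLSchema(etree.fromstring(XSD))
doc = etree.fromstring(RECORD)
schema.assertValid(doc)  # raises DocumentInvalid if the record does not conform
print("record conforms to the exchange schema")
```

An exporter that emits records conforming to one agreed schema is what allows two differently extended MDRs to exchange metadata without pairwise custom mappings.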
NASA Technical Reports Server (NTRS)
Jones, Michael K.
1998-01-01
Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.
The Next Stage: Moving from Isolated Digital Collections to Interoperable Digital Libraries.
ERIC Educational Resources Information Center
Besser, Howard
2002-01-01
Presents a conceptual framework for digital library development and discusses how to move from isolated digital collections to interoperable digital libraries. Topics include a history of digital libraries; user-centered architecture; stages of technological development; standards, including metadata; and best practices. (Author/LRW)
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.; Witherspoon, Keith R.
2006-01-01
To date very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering for the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation initiative is imperative. The number of corporations, developer personnel, system interfaces, and people interfaces will require standardization and registries on a scale not currently envisioned. It would only take one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators the ability to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation initiative and other non-related space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those that are currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and Birds of a Feather (BoF) activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another standards body or are currently not standardized. For activities one through three, we will provide the analysis by either discipline or technology, with rationale, identification and a brief description of requirements and precedence. For activity four, we will provide a list of current standards bodies (e.g. the IETF) and a list of potential candidates.
76 FR 30202 - National Space-Based Positioning, Navigation, and Timing (PNT) Advisory Board; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
.... ACTION: Notice of meeting. SUMMARY: In accordance with the Federal Advisory Committee Act (Pub. L. 92-463... Positioning System (GPS) modernization. Explore opportunities for enhancing the interoperability of GPS with.... Prioritize current and planned GPS capabilities and services while assessing future PNT architecture options...
IEEE 1547 Standards Advancing Grid Modernization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basso, Thomas; Chakraborty, Sudipta; Hoke, Andy
Technology advances including development of advanced distributed energy resources (DER) and grid-integrated operations and controls functionalities have surpassed the requirements in current standards and codes for DER interconnection with the distribution grid. The full revisions of IEEE Standards 1547 (requirements for DER-grid interconnection and interoperability) and 1547.1 (test procedures for conformance to 1547) are establishing requirements and best practices for state-of-the-art DER, including variable renewable energy sources. The revised standards will also address challenges associated with interoperability and transmission-level effects, in addition to strictly addressing the distribution grid needs. This paper provides the status and future direction of the ongoing development focus for the 1547 standards.
Achieving Interoperability in GEOSS - How Close Are We?
NASA Astrophysics Data System (ADS)
Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.
2010-12-01
A primary goal of the Global Earth Observing System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across such communities? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study, and directions for further work.
Generic Educational Knowledge Representation for Adaptive and Cognitive Systems
ERIC Educational Resources Information Center
Caravantes, Arturo; Galan, Ramon
2011-01-01
The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…
Exploring Interoperability as a Multidimensional Challenge for Effective Emergency Response
ERIC Educational Resources Information Center
Santisteban, Hiram
2010-01-01
Purpose. The purpose of this research was to further an understanding of how the federal government is addressing the challenges of interoperability for emergency response or crisis management (FEMA, 2009) by informing the development of standards through the review of current congressional law, commissions, studies, executive orders, and…
Interoperability Is the Foundation for Successful Internet Telephony.
ERIC Educational Resources Information Center
Fromm, Larry
1997-01-01
More than 40 leading computer and telephony companies have united to lead the charge toward open standards and universal interoperability for Internet telephony products. The Voice over IP (VoIP) Forum is working to define technical guidelines for two-party, real-time communications over IP networks, including provisions for compatibility with…
OpenICE medical device interoperability platform overview and requirement analysis.
Arney, David; Plourde, Jeffrey; Goldman, Julian M
2018-02-23
We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.
Towards semantic interoperability for electronic health records.
Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam
2007-01-01
In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to briefly describe this approach and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. The method comprised an analysis of current approaches to EHR systems, terminology and standards developments; in addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
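The division of labor between a generic reference model and clinical archetypes can be sketched roughly as follows; all class and field names are invented for illustration and are not CHISTAR's actual design.

```python
# Rough sketch of the reference-model / archetype separation described above.
# All names here are illustrative assumptions, not CHISTAR's actual design.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class DataPoint:
    """Generic reference-model structure: knows nothing about clinical meaning."""
    name: str
    value: Any
    units: str

@dataclass
class Archetype:
    """Clinical constraint layered on top of the reference model."""
    concept: str
    required_units: str
    valid: Callable[[float], bool]

    def check(self, dp: DataPoint) -> bool:
        return dp.units == self.required_units and self.valid(dp.value)

# A hypothetical archetype for systolic blood pressure.
systolic_bp = Archetype(
    concept="systolic blood pressure",
    required_units="mmHg",
    valid=lambda v: 0 < v < 300,
)

reading = DataPoint(name="systolic", value=128, units="mmHg")
assert systolic_bp.check(reading)  # generic data validated by a clinical archetype
```

Keeping clinical semantics in the archetype layer is what lets the generic data structures stay stable while clinical definitions evolve.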
NASA Astrophysics Data System (ADS)
Schaap, D.
2015-12-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long-term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently the project has been continued as ODIP 2 for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. Therefore it facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; and establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE).
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Glaves, Helen
2016-04-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long-term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in the development of standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently the project has been continued as ODIP II for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. Therefore it facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; 3. establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE).
Interoperable Archetypes With a Three Folded Terminology Governance.
Pederson, Rune; Ellingsen, Gunnar
2015-01-01
The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-folded benefit for interoperability.
CCP interoperability and system stability
NASA Astrophysics Data System (ADS)
Feng, Xiaobing; Hu, Haibo
2016-09-01
To control counterparty risk, financial regulations such as the Dodd-Frank Act increasingly require standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a networked network of CCPs linked through interoperability arrangements. The major finding is that different configurations of the networked network of CCPs give rise to different cascading-failure properties.
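A toy version of shock transmission over interoperability links can be sketched as below; the topology, loss-absorbing buffers, and spillover rule are invented assumptions, not the paper's model.

```python
# Toy cascading-failure sketch on a network of interoperable CCPs.
# The graph, buffers, and loss rule are illustrative assumptions only.
import networkx as nx

# Interoperability links between hypothetical CCPs.
g = nx.Graph()
g.add_edges_from([("CCP-A", "CCP-B"), ("CCP-B", "CCP-C"), ("CCP-A", "CCP-C"),
                  ("CCP-C", "CCP-D")])

buffers = {"CCP-A": 1.0, "CCP-B": 0.6, "CCP-C": 0.4, "CCP-D": 0.3}  # loss-absorbing capital
SHOCK = {"CCP-B": 1.5}       # initial loss hitting one CCP
SPILL = 0.5                  # fraction of excess loss passed to each neighbour

losses = dict(SHOCK)
failed, frontier = set(), [c for c, l in SHOCK.items() if l > buffers[c]]
while frontier:
    ccp = frontier.pop()
    if ccp in failed:
        continue
    failed.add(ccp)
    excess = losses[ccp] - buffers[ccp]
    for nbr in g.neighbors(ccp):       # shock propagates over interoperability links
        losses[nbr] = losses.get(nbr, 0.0) + SPILL * excess
        if nbr not in failed and losses[nbr] > buffers[nbr]:
            frontier.append(nbr)

print("failed CCPs:", sorted(failed))  # here: CCP-B, then CCP-C by contagion
```

Changing the edge set while holding the shock fixed shows how different interoperability configurations yield different cascade sizes, which is the qualitative point of the abstract.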
Maturity Model for Advancing Smart Grid Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Mark; Widergren, Steven E.; Mater, J.
2013-10-28
Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
Look who's talking. A guide to interoperability groups and resources.
2011-06-01
There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.
Stevenson, Timothy H; Chevalier, Nicole A; Scher, Gregory R; Burke, Ronald L
2016-01-01
Effective multilateral military operations such as those conducted by the North Atlantic Treaty Organization (NATO) require close cooperation and standardization between member nations to ensure interoperability. Failure to standardize policies, procedures, and doctrine prior to the commencement of military operations will result in critical interoperability gaps, which jeopardize the health of NATO forces and mission success. To prevent these gaps from occurring, US forces must be actively involved with NATO standardization efforts such as the Committee of the Chiefs of Medical Services to ensure US interests are properly represented when NATO standards are developed and US doctrine and procedures will meet the established NATO requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
Code of Federal Regulations, 2011 CFR
2011-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
Code of Federal Regulations, 2013 CFR
2013-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
Code of Federal Regulations, 2014 CFR
2014-10-01
... physical relationships in the subsystem or system; (2) A description of the railroad operation or... requirements; (5) A preliminary human factors analysis, including a complete description of all human-machine interfaces and the impact of interoperability requirements on the same; (6) An analysis of the applicability...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... and products interoperable with the same. The complaint names as respondents Sony Corporation of Tokyo, Japan; Sony Corporation of America of New York, NY; Sony Electronics Corporation of San Diego, CA; and... INFORMATION: The Commission has received a complaint filed on behalf of Chimei-Innolux Corporation, Chimei...
Challenges and Approaches to Make Multidisciplinary Team Meetings Interoperable - The KIMBo Project.
Krauss, Oliver; Holzer, Karl; Schuler, Andreas; Egelkraut, Reinhard; Franz, Barbara
2017-01-01
Multidisciplinary team meetings (MDTMs) are already in use for certain areas in healthcare (e.g. treatment of cancer). Due to the lack of common standards and accessibility for the applied IT systems, their potential is not yet completely exploited. Common requirements for MDTMs shall be identified and aggregated into a process definition to be automated by an application architecture utilizing modern standards in electronic healthcare, e.g. HL7 FHIR. To identify requirements, an extensive literature review as well as semi-structured expert interviews were conducted. Results showed that interoperability and flexibility in terms of the process are the key requirements to be addressed. An architecture blueprint as well as an aggregated process definition were derived from the insights gained. To evaluate the feasibility of the identified requirements, methods of explorative prototyping in software engineering were used. MDTMs will become an important part of modern and future healthcare, but the need for standardization in terms of interoperability is imminent.
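As a flavor of the HL7 FHIR-based automation the abstract points to, the sketch below registers an MDTM as a FHIR R4 Appointment through the standard RESTful create interaction; the server URL and all clinical details are placeholders invented for illustration.

```python
# Hedged sketch: registering a multidisciplinary team meeting as an HL7 FHIR
# Appointment via the standard RESTful create interaction. The server URL and
# all clinical details are placeholders; the resource structure follows FHIR R4.
import requests

FHIR_BASE = "https://fhir.example.org/r4"  # hypothetical FHIR endpoint

appointment = {
    "resourceType": "Appointment",
    "status": "proposed",
    "description": "Tumor board review (illustrative example)",
    "start": "2018-03-01T14:00:00+01:00",
    "end": "2018-03-01T15:00:00+01:00",
    "participant": [
        {"actor": {"display": "Oncologist"}, "status": "needs-action"},
        {"actor": {"display": "Radiologist"}, "status": "needs-action"},
    ],
}

resp = requests.post(
    f"{FHIR_BASE}/Appointment",
    json=appointment,
    headers={"Content-Type": "application/fhir+json"},
    timeout=30,
)
resp.raise_for_status()
print("created:", resp.headers.get("Location"))  # server-assigned resource id
```

Because any FHIR-conformant system understands this resource, the meeting can be scheduled, discovered, and updated across institutional boundaries without bespoke integration.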
Enhancing security and improving interoperability in healthcare information systems.
Gritzalis, D A
1998-01-01
Security is a key issue in healthcare information systems, since most aspects of security become of considerable or even critical importance when handling healthcare information. In addition, the intense need for information exchange has revealed the interoperability of systems and applications as another key issue. Standardization can play an important role in addressing both these issues. In this paper, relevant standardization activities are briefly presented, and existing and emerging healthcare information security standards are identified and critically analysed. The analysis is based on a framework which has been developed for this purpose. The identification of gaps and inconsistencies in current standardization, the description of conflicts between standards and legislation, and the analysis of the implications of these standards for user organizations are the main results of this paper.
Increasing Interoperability of E-Learning Content in Moodle within a Franco-Arabo Educative Context
ERIC Educational Resources Information Center
El Harrassi, Souad; Labour, Michel
2010-01-01
This article examines how Moodle, as an open-source Learning Management System, can be made more interoperable. The authors tested two software standards, LAMS and RELOAD, compatible with socio-constructivism norms. The analysis showed that pedagogic activities created with the LAMS-IMS Learning Design Level A Format are useable with Moodle but…
An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems
ERIC Educational Resources Information Center
Moschoglou, Georgios
2013-01-01
This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…
2011-01-01
Background: The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education (i.e. PPME communities), are facing considerable operational and technical obstacles in sharing data and models. Findings: We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions: RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109
Modelling and approaching pragmatic interoperability of distributed geoscience data
NASA Astrophysics Data System (ADS)
Ma, Xiaogang
2010-05-01
Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have identified four levels of data interoperability issues: system, syntax, schematics and semantics, which relate respectively to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches addressing the schematic and semantic interoperability of geodata have been studied extensively in the last decade. Ontologies exist at different levels (e.g. top-level ontologies, domain ontologies and application ontologies) and in different display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even though they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NADM, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies, thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared to the context-independence of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location, intention, procedure, consequence, etc.) of local pragmatic contexts and are thus context-dependent. Eliminating these elements will inevitably lead to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from that in its original context. Another problem is the dilemma of how to find a balance between flexibility and standardization of local ontologies, because ontologies are not fixed but continuously evolving. It is commonly realized that we cannot use a unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without the coordination of standards, freely developed local ontologies and databases will require enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies, in a practical sense, requires negotiations (i.e. conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e. geodata sources), and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between local pragmatic contexts. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts.
The author uses them to develop a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata. The proposed conceptual model and semantic negotiation procedure were encoded in Description Logic, and then applied to analyze and manage semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.
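A minimal sketch of the kind of ontology mediation discussed above, using rdflib to declare two local classes equivalent to a common-ontology class; all namespaces and class names are invented for illustration.

```python
# Hedged sketch of semantic mediation with rdflib: two local ontology classes
# are mapped onto one common-ontology class. All names are invented.
from rdflib import Graph, Namespace, RDF
from rdflib.namespace import OWL

LOCAL_A = Namespace("http://surveyA.example.org/onto#")
LOCAL_B = Namespace("http://surveyB.example.org/onto#")
COMMON = Namespace("http://common.example.org/geology#")

g = Graph()

# Each local ontology declares its own class for the same concept.
g.add((LOCAL_A.GraniteBody, RDF.type, OWL.Class))
g.add((LOCAL_B.GranitePluton, RDF.type, OWL.Class))
g.add((COMMON.Granite, RDF.type, OWL.Class))

# Mediation step: both local classes are declared equivalent to the common one.
g.add((LOCAL_A.GraniteBody, OWL.equivalentClass, COMMON.Granite))
g.add((LOCAL_B.GranitePluton, OWL.equivalentClass, COMMON.Granite))

# A data consumer can now discover which local terms realize the common concept.
for local_cls in g.subjects(OWL.equivalentClass, COMMON.Granite):
    print("maps to COMMON:Granite ->", local_cls)
```

Note what such a mapping deliberately drops: the pragmatic context (who created each local term, when, and for what purpose), which is exactly the information loss the abstract argues semantic negotiation must address.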
Advancement of the Artificial Pancreas through the Development of Interoperability Standards
Picton, Peter E.; Yeung, Melanie; Hamming, Nathaniel; Desborough, Lane; Dassau, Eyal; Cafazzo, Joseph A.
2013-01-01
Despite advancements in the development of the artificial pancreas, barriers in the form of proprietary data and communication protocols of diabetes devices have made the integration of these components challenging. The Artificial Pancreas Standards and Technical Platform Project is an initiative funded by the JDRF Canadian Clinical Trial Network with the goal of developing device communication standards for the interoperability of diabetes devices. Stakeholders from academia, industry, regulatory agencies, and medical and patient communities have been engaged in advancing this effort. In this article, we describe this initiative along with the process involved in working with the standards organizations and stakeholders that are key to ensuring effective standards are developed and adopted. Discussion from a special session of the 12th Annual Diabetes Technology Meeting is also provided. PMID:23911190
Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
A future-proof architecture for telemedicine using loose-coupled modules and HL7 FHIR.
Gøeg, Kirstine Rosenbeck; Rasmussen, Rune Kongsgaard; Jensen, Lasse; Wollesen, Christian Møller; Larsen, Søren; Pape-Haugaard, Louise Bilenberg
2018-07-01
Most telemedicine solutions are proprietary and disease-specific, which causes a heterogeneous and silo-oriented system landscape with limited interoperability. Solving the interoperability problem would require a strong focus on data integration and standardization in telemedicine infrastructures. Our objective was to suggest a future-proof architecture consisting of small loosely coupled modules, to allow flexible integration with new and existing services, and using international standards, to allow high re-usability of modules and interoperability in the health IT landscape. We identified the core features of our future-proof architecture as the following: (1) To provide extended functionality, the system should be designed as a core with modules; database handling and the implementation of security protocols are modules, to improve flexibility compared to other frameworks. (2) To ensure loosely coupled modules, the system should implement an inversion-of-control mechanism. (3) A focus on ease of implementation requires that the system use HL7 FHIR (Fast Healthcare Interoperability Resources) as the primary standard, because it is based on web technologies. We evaluated the feasibility of our architecture by developing an open source implementation of the system called ORDS. ORDS is written in TypeScript, and makes use of the Express framework and HL7 FHIR DSTU2. The code is distributed on GitHub. All modules have been tested unit-wise, but end-to-end testing awaits our first clinical example implementations. Our study showed that highly adaptable and yet interoperable core frameworks for telemedicine can be designed and implemented. Future work includes implementation of a clinical use case and evaluation. Copyright © 2018 Elsevier B.V. All rights reserved.
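ORDS itself is written in TypeScript; purely to illustrate the core-with-modules, inversion-of-control idea the abstract describes, here is a short sketch in Python with invented module interfaces, not the ORDS code.

```python
# Illustrative sketch (not the ORDS code): a small core that receives its
# capabilities through injected modules, so database and security handling
# stay loosely coupled and swappable.
from typing import Protocol

class StorageModule(Protocol):
    def save(self, resource: dict) -> str: ...

class SecurityModule(Protocol):
    def authorize(self, token: str) -> bool: ...

class InMemoryStorage:
    def __init__(self):
        self._db, self._next = {}, 1
    def save(self, resource: dict) -> str:
        rid = str(self._next); self._next += 1
        self._db[rid] = resource
        return rid

class StaticTokenSecurity:
    def __init__(self, token: str):
        self._token = token
    def authorize(self, token: str) -> bool:
        return token == self._token

class Core:
    """The core never instantiates its modules; they are injected (IoC)."""
    def __init__(self, storage: StorageModule, security: SecurityModule):
        self.storage, self.security = storage, security
    def create(self, token: str, resource: dict) -> str:
        if not self.security.authorize(token):
            raise PermissionError("rejected by security module")
        return self.storage.save(resource)

core = Core(InMemoryStorage(), StaticTokenSecurity("secret"))
print("stored as id", core.create("secret", {"resourceType": "Observation"}))
```

Swapping `InMemoryStorage` for a real database module, or the token check for an OAuth module, requires no change to the core, which is the flexibility claim the architecture makes.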
Blazona, Bojan; Koncar, Miroslav
2006-01-01
Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. Interoperability involves interchange at the data and communication layers. In this context we identified the HL7 standard as the world's leading medical information and communication technology (ICT) standard for the business layer in healthcare information systems, and we explored the ability to exchange clinical documents with minimal change to integrated healthcare information systems (IHCIS). We explored the ability of the HL7 Clinical Document Architecture (CDA) to integrate radiology information systems (DICOM) with IHCIS (HL7). We introduced the use of WADO service interconnection to IHCIS and, finally, CDA rendering in widely used web browsers.
Progress of Interoperability in Planetary Research for Geospatial Data Analysis
NASA Astrophysics Data System (ADS)
Hare, T. M.; Gaddis, L. R.
2015-12-01
For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTIFF, GeoJPEG2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Service (simple image maps), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many light-weight browsers, which facilitates both large and small focused science applications and public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya
2018-05-01
This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory, and power that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that includes IHE PCD-01, HL7, and Bluetooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptations that have been made in order to implement the standard on the ZigBee Health Care Profile, and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.
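A much-simplified sketch of the association behaviour implied by the IEEE 11073 state model follows; the real standard defines more states, events, and APDU exchanges than shown here, so treat this only as the shape of the idea.

```python
# Much-simplified sketch of an IEEE 11073-style association state machine for
# a personal health device manager. The real standard defines more states,
# events, and APDU types; this only illustrates the plug-and-play handshake.
from enum import Enum, auto

class State(Enum):
    DISCONNECTED = auto()
    UNASSOCIATED = auto()
    ASSOCIATING = auto()
    OPERATING = auto()

TRANSITIONS = {
    (State.DISCONNECTED, "transport_connected"): State.UNASSOCIATED,
    (State.UNASSOCIATED, "association_request"): State.ASSOCIATING,
    (State.ASSOCIATING, "association_accepted"): State.OPERATING,
    (State.OPERATING, "association_released"): State.UNASSOCIATED,
    (State.UNASSOCIATED, "transport_lost"): State.DISCONNECTED,
}

class Manager:
    def __init__(self):
        self.state = State.DISCONNECTED

    def handle(self, event: str) -> State:
        try:
            self.state = TRANSITIONS[(self.state, event)]
        except KeyError:
            raise ValueError(f"event {event!r} not allowed in {self.state}")
        return self.state

m = Manager()
for ev in ("transport_connected", "association_request", "association_accepted"):
    print(ev, "->", m.handle(ev))
# Once in State.OPERATING, measurements (e.g. blood glucose) would flow.
```

Encoding the allowed transitions explicitly is what makes plug-and-play behaviour testable: a device that sends measurements before association is accepted is rejected by construction.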
A SOA-Based Platform to Support Clinical Data Sharing
Gazzarata, R; Giannini, B; Giacomini, M
2017-01-01
The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the “Interoperable” Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the “Interoperable” Tier, the current solution actually covers the “Connected” Tier, due to local hospital policy restrictions. © 2017 R. Gazzarata et al.
Developing Interoperable Air Quality Community Portals
NASA Astrophysics Data System (ADS)
Falke, S. R.; Husar, R. B.; Yang, C. P.; Robinson, E. M.; Fialkowski, W. E.
2009-04-01
Web portals are intended to provide consolidated discovery, filtering and aggregation of content from multiple, distributed web sources targeted at particular user communities. This paper presents a standards-based information architectural approach to developing portals aimed at air quality community collaboration in data access and analysis. An important characteristic of the approach is to advance beyond the present stand-alone design of most portals to achieve interoperability with other portals and information sources. We show how using metadata standards, web services, RSS feeds and other Web 2.0 technologies, such as Yahoo! Pipes and del.icio.us, helps increase interoperability among portals. The approach is illustrated within the context of the GEOSS Architecture Implementation Pilot where an air quality community portal is being developed to provide a user interface between the portals and clearinghouse of the GEOSS Common Infrastructure and the air quality community catalog of metadata and data services.
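As a small illustration of the feed-level aggregation such portals perform, the sketch below merges entries from several RSS feeds into one chronological list; the feed URLs are placeholders invented for illustration.

```python
# Hedged sketch of portal-side feed aggregation: merge entries from several
# community RSS feeds into one list, newest first. Feed URLs are placeholders.
import feedparser  # pip install feedparser
from time import mktime

FEEDS = [
    "https://airquality.example.org/news.rss",
    "https://geoss.example.org/updates.rss",
]

entries = []
for url in FEEDS:
    feed = feedparser.parse(url)
    for e in feed.entries:
        when = mktime(e.published_parsed) if getattr(e, "published_parsed", None) else 0.0
        entries.append((when, e.get("title", "(untitled)"), e.get("link", "")))

# One consolidated, chronologically sorted view across distributed sources.
for when, title, link in sorted(entries, reverse=True)[:10]:
    print(title, "->", link)
```

Because RSS is a shared standard, adding another community source is a one-line change, which is the interoperability argument the abstract makes for standards-based portals.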
The Next Generation of Interoperability Agents in Healthcare
Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José
2014-01-01
Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351
The role of markup for enabling interoperability in health informatics.
McKeever, Steve; Johnson, David
2015-01-01
Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized in a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion of future possibilities that big data and further standardization will enable.
CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steve; Martinez, Lindolfo
2009-01-01
We are entering a new era in space exploration. Reduced operating budgets require innovative solutions that leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service-oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service-oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitoring and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components. The prototype will leverage the capabilities of existing legacy systems. Various custom applications and middleware solutions will be combined into one system, providing the illusion of a set of homogeneous services. This paper will document our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results and potential cost-saving benefits.
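The transport insulation the abstract describes can be illustrated with a minimal publish/subscribe abstraction; this generic sketch is not the SM&C or DDS API, only the shape of the idea.

```python
# Generic sketch of transport insulation (not the SM&C or DDS API): the
# monitor-and-control application codes against an abstract bus, so an
# in-process queue, DDS, or any other middleware can be swapped in below it.
from collections import defaultdict
from typing import Callable, Protocol

class MessageBus(Protocol):
    def publish(self, topic: str, payload: dict) -> None: ...
    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None: ...

class InProcessBus:
    """Trivial transport; a DDS-backed class could implement the same interface."""
    def __init__(self):
        self._handlers = defaultdict(list)
    def publish(self, topic, payload):
        for handler in self._handlers[topic]:
            handler(payload)
    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

def telemetry_display(sample: dict) -> None:
    print(f"TM {sample['parameter']} = {sample['value']}")

bus: MessageBus = InProcessBus()                 # swap transports here only
bus.subscribe("spacecraft/telemetry", telemetry_display)
bus.publish("spacecraft/telemetry", {"parameter": "battery_v", "value": 28.4})
```

A transport bridge between two middleware products then only needs to implement the same two-method interface on each side, which is why the prototype can mix legacy systems with DDS.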
Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano
2012-08-01
Information technologies (ITs) have now entered the everyday workflow of a variety of healthcare providers with a certain degree of independence. This independence may be the cause of difficulties in interoperability between information systems, and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, which has been able, in the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region and providing full access to clinical and health-related documents independently of the healthcare organization that generated the document itself. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, the political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of such regional interoperability specifications enabled communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE Patient Administration Management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, although work is still in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards. Copyright © 2012 Elsevier Inc. All rights reserved.
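For a flavor of the HL7 v2 messages that flow in such IHE PAM-style exchanges, the sketch below splits an invented ADT^A01 admission message into segments and fields with plain string handling; all message content is fabricated for illustration.

```python
# Hedged sketch: splitting an HL7 v2 ADT^A01 admission message (as used in
# IHE PAM flows) into segments and fields with plain string handling.
# The message content is invented for illustration.
HL7_MSG = "\r".join([
    "MSH|^~\\&|HIS|HOSP_A|REGIONAL_HUB|LOMBARDY|202001011200||ADT^A01|MSG0001|P|2.5",
    "PID|1||123456^^^HOSP_A||ROSSI^MARIO||19700101|M",
    "PV1|1|I|WARD1^101^A",
])

segments = {}
for raw in HL7_MSG.split("\r"):
    fields = raw.split("|")
    segments[fields[0]] = fields          # index segments by their ID

# MSH-9 is the message type; PID-5 the patient name. In MSH the field
# separator itself counts as MSH-1, so MSH-9 lands at list index 8.
print("message type:", segments["MSH"][8])        # ADT^A01
family, given = segments["PID"][5].split("^")[:2]
print("patient:", given, family)
```

Agreeing region-wide on which fields carry which content is exactly what the regional interoperability specifications pin down, so that every hospital's system parses the same message the same way.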
Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.
Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert
2017-08-01
Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry (fresh produce) have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.
Turning Interoperability Operational with GST
NASA Astrophysics Data System (ADS)
Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha
2013-04-01
GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), cf. http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially and maybe additionally temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards and how it applies and extends services towards interoperability in the Earth sciences.
NASA Astrophysics Data System (ADS)
Glaves, H. M.
2015-12-01
In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high-quality interoperable data as a result. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives: projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia have implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures, an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.
EVA safety: Space suit system interoperability
NASA Technical Reports Server (NTRS)
Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.
1995-01-01
The results and recommendations of the International Academy of Astronautics extravehicular activities (IAA EVA) Committee work are presented. The IAA EVA protocols and operations were analyzed with a view to harmonization procedures and to the standardization of safety-critical and operationally important interfaces. The key role of EVA and ways to improve the situation, based on the identified EVA space suit system interoperability deficiencies, were considered.
ERIC Educational Resources Information Center
Data Research Associates, Inc., St. Louis, MO.
The topic of open systems as it relates to the needs of libraries to establish interoperability between dissimilar computer systems can be clarified by an understanding of the background and evolution of the issue. The International Standards Organization developed a model to link dissimilar computers, and this model has evolved into consensus…
Saleh, Kutaiba; Stucke, Stephan; Uciteli, Alexandr; Faulbrück-Röhr, Sebastian; Neumann, Juliane; Tahar, Kais; Ammon, Danny; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Portheine, Frank; Herre, Heinrich; Kaeding, André; Specht, Martin
2017-01-01
With the growing strain on medical staff and the increasing complexity of patient care, the risk of medical errors increases. In this work we present the use of Fast Healthcare Interoperability Resources (FHIR) as the communication standard for the integration of an ontology- and agent-based system to identify risks across medical processes in a clinical environment.
A review on digital ECG formats and the relationships between them.
Trigo, Jesús Daniel; Alesanco, Alvaro; Martínez, Ignacio; García, José
2012-05-01
A plethora of digital ECG formats have been proposed and implemented. This heterogeneity hinders the design and development of interoperable systems and entails critical integration issues for healthcare information systems. This paper provides a comprehensive overview of the current state of affairs in the interoperable exchange of digital ECG signals. This includes 1) a review of existing digital ECG formats, 2) a collection of applications and cardiology settings using such formats, 3) a compilation of the relationships between such formats, and 4) a reflection on the current situation and foreseeable future of the interoperable exchange of digital ECG signals. The objectives have been approached by completing and updating previous reviews on the topic through appropriate database mining. 39 digital ECG formats; 56 applications, tools, or implementation experiences; 47 mappings/converters; and 6 relationships between such formats have been found in the literature. The creation and generalization of a single standardized ECG format is a desirable goal. However, this unification requires political commitment and international cooperation among different standardization bodies. Ongoing ontology-based approaches covering the ECG domain have recently emerged as a promising alternative for reaching fully fledged ECG interoperability in the near future.
Feasibility of Representing a Danish Microbiology Model Using FHIR.
Andersen, Mie Vestergaard; Kristensen, Ida Hvass; Larsen, Malene Møller; Pedersen, Claus Hougaard; Gøeg, Kirstine Rosenbeck; Pape-Haugaard, Louise B
2017-01-01
Achieving interoperability in health is a challenge and requires standardization. The newly developed HL7 standard, Fast Healthcare Interoperability Resources (FHIR), promises both flexibility and interoperability. This study investigates the feasibility of expressing the content of a Danish microbiology message model in FHIR, to explore whether complex in-use legacy models can be migrated and what challenges this may pose. The Danish microbiology message model (the DMM) is used as a case to illustrate the challenges and opportunities associated with applying the FHIR standard. Mapping of content from the DMM to FHIR was done as closely as possible to the DMM to minimize migration costs, except when the structure of the content did not fit into FHIR. From the DMM a total of 183 elements were mapped to FHIR: 75 (40.9%) elements were modeled as existing FHIR elements, 96 (52.5%) were modeled as extensions, and 12 (6.6%) were deemed unnecessary because of built-in FHIR characteristics. In this study, it was possible to represent the content of a Danish message model using HL7 FHIR.
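The mapping pattern reported above (existing element versus extension) can be illustrated with a minimal FHIR resource in JSON; the codes and the extension URL below are hypothetical stand-ins, not the Danish model's actual definitions.

```python
# Sketch of the FHIR mapping pattern: content that fits an existing element
# goes into the resource directly; content that does not is carried as an
# extension. All codes and URLs are invented for illustration.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {  # maps directly onto an existing FHIR element
        "coding": [{"system": "http://example.org/microbiology-codes",
                    "code": "URINE-CULT", "display": "Urine culture"}]
    },
    "valueString": "Growth of E. coli > 10^5 CFU/ml",
    "extension": [{  # legacy-model element with no FHIR counterpart
        "url": "http://example.org/fhir/StructureDefinition/sample-transport",
        "valueString": "Transported refrigerated"
    }],
}

print(json.dumps(observation, indent=2))
```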
On the Execution Control of HLA Federations using the SISO Space Reference FOM
NASA Technical Reports Server (NTRS)
Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.
2017-01-01
In the Space domain the High Level Architecture (HLA) is one of the reference standards for Distributed Simulation. However, for the different organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA) and their industrial partners, it is difficult to implement HLA simulators (called Federates) able to interact and interoperate in the context of a distributed HLA simulation (called a Federation). The lack of a common FOM (Federation Object Model) for the Space domain is one of the main reasons that precludes a priori interoperability between heterogeneous federates. To fill this gap, a Product Development Group (PDG) has recently been activated in the Simulation Interoperability Standards Organization (SISO) with the aim of providing a Space Reference FOM (SRFOM) for international collaboration on Space systems simulations. Members of the PDG come from several countries and contribute experiences from projects within NASA, ESA and other organizations. Participants represent government, academia and industry. The paper presents an overview of the ongoing Space Reference FOM standardization initiative, focusing on the solution provided for managing the execution of an SRFOM-based Federation.
47 CFR 90.548 - Interoperability Technical Standards.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Specification—New Technology Standards Project—Digital Radio Technical Standards, approved March 2005. (v) ANSI/TIA-102.BAEE-B-2010, Project 25 Radio Management Protocols—New Technology Standards Project—Digital... 2003. (iii) ANSI/TIA-102.BAEA-B-2012, Project 25 Data Overview—New Technology Standards Project—Digital...
Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine
King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas
2014-01-01
Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators. PMID:24748993
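To make the idea of a common network data specification concrete, here is a minimal sketch of a fixed-layout teleoperation state packet; the field layout is invented for illustration and is not the actual Interoperable Telerobotics Protocol format.

```python
# Sketch of a fixed-layout teleoperation state packet of the kind a shared
# network data specification enables. The layout is hypothetical.
import struct

# little-endian: uint32 sequence number, double timestamp,
# three float32 position coordinates, float32 gripper angle
PACKET = struct.Struct("<Id3ff")

def pack_state(seq, t, xyz, grip):
    return PACKET.pack(seq, t, *xyz, grip)

def unpack_state(data):
    seq, t, x, y, z, grip = PACKET.unpack(data)
    return {"seq": seq, "t": t, "pos": (x, y, z), "grip": grip}

msg = pack_state(7, 1234.5, (0.10, -0.02, 0.35), 0.5)
print(unpack_state(msg))
```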
The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States
Vreeman, Daniel J.; Grannis, Shaun J.
2014-01-01
Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985
A generative tool for building health applications driven by ISO 13606 archetypes.
Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás
2012-10-01
The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of information and knowledge. However, the impact of such standards is limited by the restricted availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been designed generically, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. These properties are based on the combination of standards for the representation of generic user interfaces with model-driven engineering techniques.
Leveraging standards to support patient-centric interdisciplinary plans of care.
Dykes, Patricia C; DaDamio, Rebecca R; Goldsmith, Denise; Kim, Hyeon-eui; Ohashi, Kumiko; Saba, Virginia K
2011-01-01
As health care systems and providers move towards meaningful use of electronic health records, the once distant vision of collaborative patient-centric, interdisciplinary plans of care, generated and updated across organizations and levels of care, may soon become a reality. Effective care planning is included in the proposed Stages 2-3 Meaningful Use quality measures. To facilitate interoperability, standardization of plan of care messaging, content, information and terminology models are needed. This degree of standardization requires local and national coordination. The purpose of this paper is to review some existing standards that may be leveraged to support development of interdisciplinary patient-centric plans of care. Standards are then applied to a use case to demonstrate one method for achieving patient-centric and interoperable interdisciplinary plan of care documentation. Our pilot work suggests that existing standards provide a foundation for adoption and implementation of patient-centric plans of care that are consistent with federal requirements.
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Glover, Daniel R.; vonDeak, Thomas C.; Bhasin, Kul B.
1998-01-01
The realization of the full potential of the National Information Infrastructure (NII) and Global Information Infrastructure (GII) requires seamless interoperability of emerging satellite networks with terrestrial networks. This requires a cooperative effort between industry, academia and government agencies to develop and advocate new, satellite-friendly communication protocols and modifications to existing communication protocol standards. These groups have recently come together to participate actively in a number of standards-making bodies, including the Internet Engineering Task Force (IETF), the Asynchronous Transfer Mode (ATM) Forum, the International Telecommunication Union (ITU) and the Telecommunications Industry Association (TIA), to ensure that issues regarding efficient use of these protocols over satellite links are not overlooked. This paper summarizes the progress made toward standards development to achieve seamless integration and accelerate the deployment of multimedia applications.
Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji
2013-12-01
The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also presents the challenges of building quality-labeled clinical archetypes and of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.
Blobel, Bernd
2013-01-01
Given the paradigm changes affecting health, health services and the underlying technologies, as well as the need for comprehensive and increasingly automated interoperability, the paper addresses the challenge of knowledge representation and management for medical decision support. After introducing the related definitions, a system-theoretical, architecture-centric approach to decision support systems (DSSs) and appropriate ways of representing them using systems of ontologies are given. Finally, existing and emerging knowledge representation and management standards are presented. The paper focuses on the knowledge representation and management part of DSSs, excluding the reasoning part from consideration.
Medical Device Plug-and-Play Interoperability Standards & Technology Leadership
2011-10-01
...biomedical engineering students completed their senior design project on the X-Ray / Ventilator Use Case. We worked closely with the students to... "Supporting Medical Device Adverse Event Analysis in an Interoperable Clinical Environment: Design of a Data Logging and Playback System," Publication in...
47 CFR 90.548 - Interoperability Technical Standards.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Industry Association, ANSI/TIA/EIA-102.BABA-1998. (2) Transmitters designed for data transmission shall... (see § 90.531) shall conform to the following technical standards: (1) Transmitters designed for voice...
RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service
NASA Astrophysics Data System (ADS)
Yang, Chao; Chen, Nengcheng; Di, Liping
2012-10-01
Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it, separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous geoprocessing workflows.
An Interoperability Framework and Capability Profiling for Manufacturing Software
NASA Astrophysics Data System (ADS)
Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.
ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of manufacturing applications, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
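A capability profile of the kind described can be sketched as a small XML document; the element names below are illustrative only and do not reproduce the normative ISO 16100 schema.

```python
# Sketch of a capability profile in the spirit of ISO 16100: a software unit
# described by name, the manufacturing functions it performs, and extra
# properties. Element names are hypothetical.
import xml.etree.ElementTree as ET

profile = ET.Element("CapabilityProfile")
ET.SubElement(profile, "UnitName").text = "CellController-A"
functions = ET.SubElement(profile, "ManufacturingFunctions")
for fn in ("Scheduling", "DispatchControl"):
    ET.SubElement(functions, "Function").text = fn
props = ET.SubElement(profile, "Properties")
ET.SubElement(props, "Property", name="vendor").text = "ExampleCorp"

print(ET.tostring(profile, encoding="unicode"))
```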
Dynamic Business Networks: A Headache for Sustainable Systems Interoperability
NASA Astrophysics Data System (ADS)
Agostinho, Carlos; Jardim-Goncalves, Ricardo
Collaborative networked environments emerged with the spread of the internet, helping to overcome past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency is increased across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to meet new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
An Architecture for Semantically Interoperable Electronic Health Records.
Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa
2017-01-01
Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; therefore, it cannot be assumed or treated as a requirement. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. The proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.
Interoperability at ESA Heliophysics Science Archives: IVOA, HAPI and other implementations
NASA Astrophysics Data System (ADS)
Martinez-Garcia, B.; Cook, J. P.; Perez, H.; Fernandez, M.; De Teodoro, P.; Osuna, P.; Arnaud, M.; Arviset, C.
2017-12-01
The data of ESA heliophysics science missions are preserved at the ESAC Science Data Centre (ESDC). The ESDC aims for the long-term preservation of those data, which include missions such as Ulysses, Soho, Proba-2, Cluster, Double Star and, in the future, Solar Orbiter. Scientists have access to these data through web services, command-line and graphical user interfaces for each of the corresponding science mission archives. The International Virtual Observatory Alliance (IVOA) provides technical standards that allow interoperability among the different systems that implement them. By adopting some IVOA standards, the ESA heliophysics archives are able to share their data with those tools and services that are VO-compatible. Implementations of those standards can be found in the existing archives: the Ulysses Final Archive (UFA) and the Soho Science Archive (SSA). They already make use of the VOTable format definition and the Simple Application Messaging Protocol (SAMP). For re-engineered or new archives, the implementation of services through the Table Access Protocol (TAP) or the Universal Worker Service (UWS) will leverage this interoperability. This will be the case for the Proba-2 Science Archive (P2SA) and the Solar Orbiter Archive (SOAR). We present here the IVOA standards already used by the ESA heliophysics archives and the work ongoing.
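As a hedged sketch of how a TAP-enabled archive such as P2SA or SOAR could be queried, the snippet below issues a synchronous ADQL query using the request parameters defined by the TAP standard; the base URL and table name are placeholders.

```python
# Sketch of a synchronous IVOA TAP query. The endpoint and table are
# hypothetical; REQUEST/LANG/QUERY/FORMAT are standard TAP parameters.
import requests

TAP_SYNC = "https://archives.example.esa.int/tap/sync"  # placeholder endpoint

params = {
    "REQUEST": "doQuery",
    "LANG": "ADQL",
    "FORMAT": "votable",
    "QUERY": "SELECT TOP 10 * FROM observations WHERE instrument = 'SWAP'",
}

response = requests.get(TAP_SYNC, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])  # VOTable XML describing the result set
```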
The Development of Clinical Document Standards for Semantic Interoperability in China
Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping
2011-01-01
Objectives This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. Methods The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Results Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. Conclusions This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722
Experience with abstract notation one
NASA Technical Reports Server (NTRS)
Harvey, James D.; Weaver, Alfred C.
1990-01-01
The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
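The round trip that such a compiler automates can be sketched with the third-party pyasn1 package: define an abstract type once, encode it with BER into machine-independent bytes, and decode it back. The record type here is invented for illustration.

```python
# Sketch of an ASN.1/BER round trip: the abstract record type is defined once,
# encoded to transferable bytes, and decoded back, independent of machine
# architecture. Uses the third-party pyasn1 package; the type is hypothetical.
from pyasn1.type import univ, namedtype
from pyasn1.codec.ber import encoder, decoder


class SensorReading(univ.Sequence):
    componentType = namedtype.NamedTypes(
        namedtype.NamedType("sensorId", univ.Integer()),
        namedtype.NamedType("value", univ.OctetString()),
    )


reading = SensorReading()
reading["sensorId"] = 42
reading["value"] = b"23.5C"

wire_bytes = encoder.encode(reading)              # BER transfer syntax
decoded, _ = decoder.decode(wire_bytes, asn1Spec=SensorReading())
print(int(decoded["sensorId"]), bytes(decoded["value"]))
```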
Smart Grid Interoperability Maturity Model Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Drummond, R.; Giroti, Tony
The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.
Types of Weapon Programs and Their Implications for NATO Standardization or Interoperability
1978-07-01
mines, etc.). 3. Enemy Counter-measures. Lack of standardisation or interoperability complicates the Warsaw Pact problem by forcing them to counter... Licensed Production: U.K. Sikorsky S-58 helicopter (U.S.) (Westland "Wessex"); Canada Canadair CC-106 transport (derivative of the Bristol Britannia, U.K.); Canadair CP-107 Argus maritime reconnaissance aircraft (modification of the Bristol Britannia, U.K.); Federal Republic of Germany Air-Fouga
ERIC Educational Resources Information Center
Bourdon, Francoise
2005-01-01
The translation into French of the Encoded Archival Context (EAC) DTD tag library has been in progress for a few months. It is being carried out by a group of experts gathered by AFNOR, the French national standards agency. The main goal of this group is to foster the interoperability of authority data between archives, libraries and museums, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.
This article presents the development and application of a real-time testbed for multiagent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for the decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
Environmental Models as a Service: Enabling Interoperability ...
Approaches to achieving interoperability in environmental modeling have evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
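A minimal sketch of the verb-noun pattern described above, using Flask; the routes and the one-line stand-in "model" are illustrative, not an actual deployed service.

```python
# Sketch of the verb-noun RESTful pattern: POST creates a model-run resource,
# GET retrieves it by its URL reference. Routes and the "model" are invented.
from flask import Flask, jsonify, request

app = Flask(__name__)
runs = {}  # in-memory store of model runs, keyed by id


@app.route("/runs", methods=["POST"])
def create_run():
    """POST consumes JSON inputs and stores a new model run."""
    inputs = request.get_json()
    run_id = str(len(runs) + 1)
    # stand-in "environmental model": first-order decay of a concentration
    result = inputs["c0"] * (1 - inputs["k"]) ** inputs["days"]
    runs[run_id] = {"inputs": inputs, "result": result}
    return jsonify(id=run_id), 201


@app.route("/runs/<run_id>", methods=["GET"])
def get_run(run_id):
    """GET acts on the run resource named in the URL."""
    return jsonify(runs[run_id])


if __name__ == "__main__":
    app.run()
```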
Bridging Hydroinformatics Services Between HydroShare and SWATShare
NASA Astrophysics Data System (ADS)
Merwade, V.; Zhao, L.; Song, C. X.; Tarboton, D. G.; Goodall, J. L.; Stealey, M.; Rajib, A.; Morsy, M. M.; Dash, P. K.; Miles, B.; Kim, I. L.
2016-12-01
Many cyberinfrastructure systems in the hydrologic and related domains have emerged in the past decade, with more being developed to address various data management and modeling needs. Although clearly beneficial to the broad user community, building interoperability across these systems is a challenging task due to various obstacles, including technological, organizational, semantic, and social issues. This work presents our experience in developing interoperability between two hydrologic cyberinfrastructure systems: SWATShare and HydroShare. HydroShare is a large-scale online system aimed at enabling the hydrologic user community to share their data, models, and analyses online for solving complex hydrologic research questions. SWATShare, on the other hand, is a focused effort to allow SWAT (Soil and Water Assessment Tool) modelers to share, execute and analyze SWAT models using high-performance computing resources. Making these two systems interoperable required common sign-in through OAuth, sharing of models through common metadata standards, and use of standard web services for implementing key import/export functionalities. As a result, users from either community can leverage the resources and services across these systems without having to manually import, export, or process their models. Overall, this use case can serve as a model for interoperability among other systems, as no one system can provide all the functionality needed to address large interdisciplinary problems.
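The cross-system pattern summarized above (OAuth sign-in plus standard REST calls) might look roughly like the following; the token endpoint, resource path, and grant details are placeholders rather than either system's actual API.

```python
# Sketch of the cross-system pattern: authenticate once via OAuth, then pull a
# shared model resource through a standard REST interface. URLs are invented.
import requests

TOKEN_URL = "https://hydroshare.example.org/o/token/"          # placeholder
RESOURCE_URL = "https://hydroshare.example.org/hsapi/resource/{rid}/"

def get_token(client_id, client_secret, username, password):
    """Exchange user credentials for an OAuth2 bearer token."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "password", "client_id": client_id,
        "client_secret": client_secret,
        "username": username, "password": password,
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_model_resource(token, resource_id):
    """Retrieve a shared model resource with the bearer token."""
    resp = requests.get(RESOURCE_URL.format(rid=resource_id),
                        headers={"Authorization": f"Bearer {token}"},
                        timeout=30)
    resp.raise_for_status()
    return resp.content
```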
Assessing Quality of Data Standards: Framework and Illustration Using XBRL GAAP Taxonomy
NASA Astrophysics Data System (ADS)
Zhu, Hongwei; Wu, Harris
The primary purpose of data standards or metadata schemas is to improve the interoperability of data created by multiple standard users. Given the high cost of developing data standards, it is desirable to assess their quality. We develop a set of metrics and a framework for assessing data standard quality. The metrics include completeness and relevancy. Standard quality can also be measured indirectly by assessing the interoperability of data instances. We evaluate the framework using data from the financial sector: the XBRL (eXtensible Business Reporting Language) GAAP (Generally Accepted Accounting Principles) taxonomy and US Securities and Exchange Commission (SEC) filings produced using the taxonomy by approximately 500 companies. The results show that the framework is useful and effective. Our analysis also reveals quality issues in the GAAP taxonomy and provides useful feedback to taxonomy users. The SEC has mandated that all publicly listed companies must submit their filings using XBRL. Our findings are timely and have practical implications that will ultimately help improve the quality of financial data.
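One plausible formalization of the two metrics (our reading for illustration, not necessarily the paper's exact definitions) is sketched below.

```python
# Illustrative metrics: completeness = fraction of concepts reported in
# filings that the taxonomy defines; relevancy = fraction of taxonomy
# elements actually used by at least one filing. Sample sets are invented.
def completeness(taxonomy: set, reported: set) -> float:
    return len(reported & taxonomy) / len(reported)

def relevancy(taxonomy: set, reported: set) -> float:
    return len(reported & taxonomy) / len(taxonomy)

taxonomy = {"Assets", "Liabilities", "Revenues", "NetIncome", "Goodwill"}
reported = {"Assets", "Liabilities", "Revenues", "CustomMetricX"}

print(f"completeness = {completeness(taxonomy, reported):.2f}")  # 0.75
print(f"relevancy    = {relevancy(taxonomy, reported):.2f}")     # 0.60
```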
Phillips, Joshua; Chilukuri, Ram; Fragoso, Gilberto; Warzel, Denise; Covitz, Peter A
2006-01-06
Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has emerged as a key enabling technology for caBIG. The caCORE SDK substantially lowers the barrier to implementing systems that are syntactically and semantically interoperable by providing workflow and automation tools that standardize and expedite modeling, development, and deployment. It has gained acceptance among developers in the caBIG program, and is expected to provide a common mechanism for creating data service nodes on the data grid that is under development.
The value of health care information exchange and interoperability.
Walker, Jan; Pan, Eric; Johnston, Douglas; Adler-Milstein, Julia; Bates, David W; Middleton, Blackford
2005-01-01
In this paper we assess the value of electronic health care information exchange and interoperability (HIEI) between providers (hospitals and medical group practices) and independent laboratories, radiology centers, pharmacies, payers, public health departments, and other providers. We have created an HIEI taxonomy and combined published evidence with expert opinion in a cost-benefit model. Fully standardized HIEI could yield a net value of $77.8 billion per year once fully implemented. Nonstandardized HIEI offers smaller positive financial returns. The clinical impact of HIEI, for which quantitative estimates cannot yet be made, would likely add further value. A compelling business case exists for national implementation of fully standardized HIEI.
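A toy roll-up in the spirit of such a cost-benefit model is sketched below; all figures are invented, and the paper's actual inputs and taxonomy are far more detailed.

```python
# Toy cost-benefit roll-up: annual benefits per interface type minus
# amortized implementation and operating costs. All numbers are invented.
annual_benefits = {              # $B/year once fully implemented (invented)
    "provider-lab": 30.0,
    "provider-pharmacy": 25.0,
    "provider-payer": 20.0,
    "provider-public-health": 10.0,
}
annual_costs = 7.2               # $B/year (invented)

net_value = sum(annual_benefits.values()) - annual_costs
print(f"net value: ${net_value:.1f}B per year")
```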
Toward a North American Standard for Mobile Data Services
NASA Technical Reports Server (NTRS)
Dean, Richard A.; Levesque, Allen H.
1991-01-01
The rapid introduction of digital mobile communications systems is an important part of the emerging digital communications scene. These developments pose both a potential problem and a challenge. On one hand, these separate market-driven developments can result in an uncontrolled mixture of analog and digital links which inhibit data modem services across the mobile/Public Switched Telephone Network (PSTN). On the other hand, the near coincidence of schedules for development of some of these systems, i.e., Digital Cellular, Mobile Satellite, Land Mobile Radio, and ISDN, provides an opportunity to address interoperability problems by defining interfaces, control, and service standards that are compatible among these new services. In this paper we address the problem of providing data services interoperation between mobile terminals and data devices on the PSTN. The expected data services include G3 fax, asynchronous data, and the government's STU-III secure voice system, as well as future data services such as ISDN. We address a common architecture and a limited set of issues that are key to interoperable mobile data services. We believe that common mobile data standards will both improve the quality of data service and simplify the systems for manufacturers, data users, and service providers.
NASA Astrophysics Data System (ADS)
Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.
2016-04-01
The production of clinical information about each patient is constantly increasing, and it is noteworthy that the information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality and safety of medical care delivery. However, in developing countries the utilization of health information technology is insufficient and lacking in standards, among other problems. In the present work we evaluate the framework EHRGen, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools it provides for final users, that is, without the intervention of computer experts. It makes it easier to adopt the openEHR ideas and provides an open-source basis with a set of services, although some limitations in its current state work against interoperability and usability. Nevertheless, despite the described limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and it should be supported by academic and administrative institutions.
Reusable Models of Pedagogical Concepts--A Framework for Pedagogical and Content Design.
ERIC Educational Resources Information Center
Pawlowski, Jan M.
Standardization initiatives in the field of learning technologies have produced standards for the interoperability of learning environments and learning management systems. Learning resources based on these standards can be reused, recombined, and adapted to the user. However, these standards follow a content-oriented approach; the process of…
Infrastructure for Planetary Sciences: Universal planetary database development project
NASA Astrophysics Data System (ADS)
Kasaba, Yasumasa; Capria, M. T.; Crichton, D.; Zender, J.; Beebe, R.
The International Planetary Data Alliance (IPDA), formally formed under COSPAR (formal start: the COSPAR 2008 assembly at Montreal), is a joint international effort to enable global access to and exchange of high quality planetary science data, and to establish archive standards that make it easier to share the data across international boundaries. In 2008-2009, thanks to the many players from several agencies and institutions, we obtained fruitful results in 6 projects: (1) Interoperable Planetary Data Access Protocol (PDAP) implementations [led by J. Salgado@ESA], (2) Small bodies interoperability [led by I. Shinohara@JAXA and N. Hirata@U. Aizu], (3) PDAP assessment [led by Y. Yamamoto@JAXA], (4) Architecture and standards definition [led by D. Crichton@NASA], (5) Information model and data dictionary [led by S. Hughes@NASA], and (6) Venus Express interoperability [led by N. Chanover@NMSU]. 'IPDA 2009-2010' is important, especially because the NASA/PDS system reformation is now being reviewed as it develops for application at the international level. IPDA is the gate for the establishment of the future infrastructure. We are running 8 projects: (1) IPDA Assessment of PDS4 Data Standards [led by S. Hughes (NASA/JPL)], (2) IPDA Archive Guide [led by M.T. Capria (IASF/INAF) and D. Heather (ESA/PSA)], (3) IPDA Standards Identification [led by E. Rye (NASA/PDS) and G. Krishna (ISRO)], (4) Ancillary Data Standards [led by C. Acton (NASA/JPL)], (5) IPDA Registries Definition [led by D. Crichton (NASA/JPL)], (6) PDAP Specification [led by J. Salgado (ESA/PSA) and Y. Yamamoto (JAXA)], (7) Interoperability Assessment [led by R. Beebe (NMSU) and D. Heather (ESA/PSA)], and (8) PDAP Geographic Information System (GIS) extension [led by N. Hirata (Univ. Aizu) and T. Hare (USGS: thare@usgs.gov)]. This paper presents our achievements and plans as summarized at the IPDA 5th Steering Committee meeting at DLR in July 2010. We are now at the gate of establishing that infrastructure.
Sinaci, A Anil; Laleci Erturkmen, Gokce B
2013-10-01
In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, in this paper a unified methodology and the supporting framework are introduced, which bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
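The LOD treatment of a CDE can be sketched with rdflib: each element gets a dereferenceable URI plus typed links to a terminology concept and a related element. The registry namespace and link properties below are illustrative, not the project's actual vocabulary.

```python
# Sketch of a Common Data Element as a Linked Open Data resource: a
# dereferenceable URI with semantic links to a terminology system and to a
# related CDE. Namespaces and properties are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

MDR = Namespace("http://mdr.example.org/cde/")        # hypothetical registry
SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")

g = Graph()
cde = MDR["systolic-blood-pressure"]
g.add((cde, RDF.type, MDR.CommonDataElement))
g.add((cde, RDFS.label, Literal("Systolic blood pressure")))
# semantic link into a terminology system (a SNOMED CT concept URI)
g.add((cde, SKOS.exactMatch, Namespace("http://snomed.info/id/")["271649006"]))
# link to a related data element in the same (or another) registry
g.add((cde, SKOS.related, MDR["diastolic-blood-pressure"]))

print(g.serialize(format="turtle"))
```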
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, Dave; Stephan, Eric G.; Wang, Weimin
Through its Building Technologies Office (BTO), the United States Department of Energy's Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO's mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter
2018-01-01
MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. The system’s limited clinical information, initially, allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and technical level in South Africa. PMID:29713506
.NET INTEROPERABILITY GUIDELINES
The CAPE-OPEN middleware standards were created to allow process modelling components (PMCs) developed by third parties to be used in any process modelling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compo...
Achieving Interoperability Through Base Registries for Governmental Services and Document Management
NASA Astrophysics Data System (ADS)
Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris
As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.
Intelligent Mortality Reporting with FHIR
Hoffman, Ryan A.; Wu, Hang; Venugopalan, Janani; Braun, Paula; Wang, May D.
2017-01-01
One pressing need in the area of public health is timely, accurate, and complete reporting of deaths and the conditions leading up to them. Fast Healthcare Interoperability Resources (FHIR) is a new HL7 interoperability standard for electronic health records (EHRs), while Substitutable Medical Applications and Reusable Technologies (SMART)-on-FHIR enables third-party app development that can work “out of the box”. This research demonstrates the feasibility of developing SMART-on-FHIR applications to enable medical professionals to perform timely and accurate death reporting within multiple jurisdictions of the US. We explored how the information on a standard certificate of death can be mapped to resources defined in the FHIR standard (DSTU2). We also demonstrated analytics for potentially improving the accuracy and completeness of mortality reporting data. PMID:28804791
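As background to how a SMART-on-FHIR app reads clinical data, the sketch below shows the standard FHIR REST pattern (read by ID, then search by parameter). The base URL points at a public HAPI test server (assumed reachable) and the patient ID is a placeholder, not data from this study:

```python
# A minimal sketch of FHIR RESTful reads and searches, the access pattern a
# SMART-on-FHIR app uses after authorization. Server URL and IDs are placeholders.
import requests

BASE = "http://hapi.fhir.org/baseR4"            # public test server (assumed)
HEADERS = {"Accept": "application/fhir+json"}

# Read a single resource: GET [base]/[type]/[id]
patient = requests.get(f"{BASE}/Patient/example", headers=HEADERS).json()

# Search: GET [base]/[type]?param=value -- roughly how condition entries
# could be pulled together for a death-certificate workflow
conditions = requests.get(f"{BASE}/Condition",
                          params={"patient": "example"},
                          headers=HEADERS).json()

print(patient.get("resourceType"), len(conditions.get("entry", [])))
```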
NASA Astrophysics Data System (ADS)
Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.
2018-04-01
This paper presents a new standardized data format named Fire Markup Language (FireML), extending the OGC Geography Markup Language (GML), to describe fire hazard models. The proposed FireML standardizes the input and output documents of a fire model so that it can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the use of FireML and verify its feasibility, a forest fire spread model compatible with FireML is described, and a 3D GIS disaster management system is developed to simulate the dynamic process of forest fire spread with the defined FireML documents. The proposed approach may inform standardization work on other disaster models.
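For illustration only: FireML's actual schema is not given in the abstract, so the element names below are invented. The sketch shows how a GML-based fire-scene document of the kind described might be assembled with Python's standard library:

```python
# A sketch of a hypothetical GML-based fire-scene input document.
# Element names (FireScene, ignitionPoint, windSpeed, fuelType) are
# assumptions; only the gml:Point/gml:pos structure is standard GML.
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML)

fire = ET.Element("FireScene")                       # hypothetical root element
ignition = ET.SubElement(fire, "ignitionPoint")
point = ET.SubElement(ignition, f"{{{GML}}}Point")
ET.SubElement(point, f"{{{GML}}}pos").text = "30.25 114.31"
ET.SubElement(fire, "windSpeed", unit="m/s").text = "4.2"
ET.SubElement(fire, "fuelType").text = "coniferous-forest"

print(ET.tostring(fire, encoding="unicode"))
```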
SMART on FHIR: a standards-based, interoperable apps platform for electronic health records
Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B
2016-01-01
Objective In early 2010, Harvard Medical School and Boston Children’s Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). Methods We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application-programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Health Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. Results We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Health Information Management Systems Society conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. Conclusion In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. PMID:26911829
Standardized exchange of clinical documents--towards a shared care paradigm in glaucoma treatment.
Gerdsen, F; Müller, S; Jablonski, S; Prokosch, H-U
2006-01-01
The exchange of medical data from research and clinical routine across institutional borders is essential to establish an integrated healthcare platform. In this project we want to realize the standardized exchange of medical data between different healthcare institutions to implement an integrated and interoperable information system supporting clinical treatment and research of glaucoma. The central point of our concept is a standardized communication model based on the Clinical Document Architecture (CDA). Further, a communication concept between different health care institutions applying the developed document model has been defined. With our project we have been able to prove that standardized communication between an Electronic Medical Record (EMR), an Electronic Health Record (EHR) and the Erlanger Glaucoma Register (EGR) based on the established conceptual models, which rely on CDA rel.1 level 1 and SCIPHOX, could be implemented. The HL7-tool-based deduction of a suitable CDA rel.2 compliant schema showed significant differences when compared with the manually created schema. Finally fundamental requirements, which have to be implemented for an integrated health care platform, have been identified. An interoperable information system can enhance both clinical treatment and research projects. By automatically transferring screening findings from a glaucoma research project to the electronic medical record of our ophthalmology clinic, clinicians could benefit from the availability of a longitudinal patient record. The CDA as a standard for exchanging clinical documents has demonstrated its potential to enhance interoperability within a future shared care paradigm.
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Test Protocols for Advanced Inverter Interoperability Functions – Main Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this ongoing effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
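As an example of the kind of grid-support function these protocols exercise, the sketch below implements a generic volt-var characteristic in the spirit of IEC TR 61850-90-7: reactive power commanded as a piecewise-linear function of measured voltage. The curve points are illustrative, not values from the Sandia protocols:

```python
# A minimal sketch of a volt-var grid-support function: the inverter injects
# VARs at low voltage and absorbs VARs at high voltage, interpolating a
# piecewise-linear curve. Curve points below are illustrative assumptions.
def volt_var(v_pu, curve=((0.95, 1.0), (0.98, 0.0), (1.02, 0.0), (1.05, -1.0))):
    """Return commanded reactive power (per unit of available VARs)
    for a measured voltage v_pu (per unit), interpolating the curve."""
    if v_pu <= curve[0][0]:
        return curve[0][1]
    for (v1, q1), (v2, q2) in zip(curve, curve[1:]):
        if v_pu <= v2:
            return q1 + (q2 - q1) * (v_pu - v1) / (v2 - v1)
    return curve[-1][1]

# Low voltage -> inject VARs; within the deadband -> zero; high -> absorb
for v in (0.94, 0.97, 1.00, 1.04, 1.06):
    print(v, round(volt_var(v), 3))
```

A test protocol would sweep the input voltage and check that the measured reactive power tracks the commanded curve within tolerance.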
Test Protocols for Advanced Inverter Interoperability Functions - Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed at large scale, are capable of significantly influencing the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this ongoing effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by the US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.
Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk
2018-02-23
Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. Particularly using computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2016-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI, and with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The ongoing research detailed in this paper seeks to investigate three concepts. First, that utilizing a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based data interoperability in the pursuit of the MBE vision. Finally, it also seeks to explore the interaction between CAD and CMM processes and determine if the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based data interoperability, which would provide maximum value and impact to industry.
47 CFR 90.548 - Interoperability Technical Standards.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Section 90.548 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Regulations Governing the Licensing and Use of Frequencies in...—Digital Radio Technical Standards, approved April 15, 1998, Telecommunications Industry Association, ANSI...
47 CFR 90.548 - Interoperability Technical Standards.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Section 90.548 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES PRIVATE LAND MOBILE RADIO SERVICES Regulations Governing the Licensing and Use of Frequencies in...—Digital Radio Technical Standards, approved April 15, 1998, Telecommunications Industry Association, ANSI...
Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.
2014-01-01
Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports accorded with immunization guidelines, and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IISs. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452
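To illustrate the forecasting side of such a CDS, here is a minimal sketch of a next-visit calculation. The schedule intervals are invented for illustration and do not reflect the national immunization guideline used in the study:

```python
# A minimal sketch of vaccine forecasting logic: compute the due date of
# the next dose in a series from the child's birth date and doses already
# given. The schedule below is an illustrative assumption, not a guideline.
from datetime import date, timedelta

SCHEDULE_WEEKS = {"DTP": [6, 10, 14]}    # assumed doses at 6, 10, 14 weeks

def next_dose_due(birth_date, vaccine, doses_given):
    """Return the due date of the next dose, or None if the series is complete."""
    weeks = SCHEDULE_WEEKS[vaccine]
    if doses_given >= len(weeks):
        return None
    return birth_date + timedelta(weeks=weeks[doses_given])

print(next_dose_due(date(2014, 1, 1), "DTP", 1))   # when the second dose is due
```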
NASA Technical Reports Server (NTRS)
Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville
2005-01-01
Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). JTARS' charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established comprising recognized leaders in the field of space robotics including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.
Myneni, Sahiti; Patel, Vimla L.
2009-01-01
Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900
Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; McGilchrist, Mark; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak
2016-01-01
With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross-border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform accessing semantically equivalent data elements across 11 European participating EHR systems from 5 countries demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of developing cross-border semantic integration of data. PMID:27570649
Economic Perspective on Cloud Computing: Three Essays
ERIC Educational Resources Information Center
Dutt, Abhijit
2013-01-01
Improvements in Information Technology (IT) infrastructure and standardization of interoperability standards among heterogeneous Information System (IS) applications have brought a paradigm shift in the way an IS application can be used and delivered. Not only can an IS application be built using standardized components, but also parts of it can…
Richesson, Rachel L.; Fung, Kin Wah; Krischer, Jeffrey P.
2008-01-01
Monitoring adverse events (AEs) is an important part of clinical research and a crucial target for data standards. The representation of adverse events themselves requires the use of controlled vocabularies with thousands of needed clinical concepts. Several data standards for adverse events currently exist, each with a strong user base. The structure and features of these current adverse event data standards (including terminologies and classifications) are different, so comparisons and evaluations are not straightforward, nor are strategies for their harmonization. Three different data standards - the Medical Dictionary for Regulatory Activities (MedDRA) and the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) terminologies, and Common Terminology Criteria for Adverse Events (CTCAE) classification - are explored as candidate representations for AEs. This paper describes the structural features of each coding system, their content and relationship to the Unified Medical Language System (UMLS), and unsettled issues for future interoperability of these standards. PMID:18406213
Bouhaddou, Omar; Warnekar, Pradnya; Parrish, Fola; Do, Nhan; Mandel, Jack; Kilbourne, John; Lincoln, Michael J.
2008-01-01
Complete patient health information that is available where and when it is needed is essential to providers and patients and improves healthcare quality and patient safety. VA and DoD have built on their previous experience in patient data exchange to establish data standards and terminology services to enable real-time bi-directional computable (i.e., encoded) data exchange and achieve semantic interoperability in compliance with recommended national standards and the eGov initiative. The project uses RxNorm, UMLS, and SNOMED CT terminology standards to mediate codified pharmacy and allergy data with greater than 92 and 60 percent success rates respectively. Implementation of the project has been well received by users and is being expanded to multiple joint care sites. Stable and mature standards, mediation strategies, and a close relationship between healthcare institutions and Standards Development Organizations are recommended to achieve and maintain semantic interoperability in a clinical setting. PMID:18096911
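As a concrete illustration of terminology mediation with RxNorm, the sketch below resolves a drug name to a standard RxNorm identifier using NLM's public RxNav REST service. The endpoint and response shape are stated to the best of our knowledge and are not taken from the VA/DoD project:

```python
# A minimal sketch of mediating a free-text drug name to an RxNorm code
# (RxCUI), the kind of terminology standardization that makes exchanged
# pharmacy data computable on both sides. Verify the RxNav endpoint and
# response structure against NLM's current documentation before relying on it.
import requests

def to_rxcui(drug_name):
    """Look up the RxNorm concept identifier (RxCUI) for a drug name."""
    r = requests.get("https://rxnav.nlm.nih.gov/REST/rxcui.json",
                     params={"name": drug_name})
    ids = r.json().get("idGroup", {}).get("rxnormId", [])
    return ids[0] if ids else None

print(to_rxcui("lisinopril"))   # a shared code both systems can interpret
```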
Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul
2015-01-01
Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next-generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an open source medical record system widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application-specific API to a standards-based API. We describe efforts to design and implement a FHIR-based API for the OpenMRS platform. Lessons learned from this effort were used to define long-term plans to transition from the legacy OpenMRS API to a FHIR-based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
Blazona, Bojan; Koncar, Miroslav
2007-12-01
Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. However, this requirement represents one of the major challenges for Information and Communication Technology (ICT) solutions, as systems today use diverse technologies, proprietary protocols and communication standards which are often not interoperable. Radiology Information Systems (RIS) are among the main producers of clinical information in healthcare settings; they communicate using the widely adopted DICOM (Digital Imaging and COmmunications in Medicine) standard, but in very few cases can efficiently integrate information of interest with other systems. In this context we identified the HL7 standard as the world's leading medical ICT standard, envisioned to provide the umbrella for medical data semantic interoperability, which amongst other things represents the cornerstone of Croatia's National Integrated Healthcare Information System (IHCIS). The aim was to explore the ability to integrate and exchange RIS-originated data with Hospital Information Systems based on HL7's CDA (Clinical Document Architecture) standard. We explored the ability of HL7 CDA specifications and methodology to address the need to integrate RIS data into HL7-based healthcare information systems. We introduced the use of WADO services interconnected to the IHCIS and finally CDA rendering in widely used Internet browsers. The outcome of our pilot work supports our original assumption that the HL7 standard is able to bring radiology data into integrated healthcare systems. Uniform DICOM-to-CDA translation scripts and business processes within the IHCIS are desirable and cost-effective given the use of supporting IHCIS services aligned with SOA.
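To make the WADO interconnection concrete, here is a minimal sketch of composing a WADO-URI retrieval request for a single DICOM object, the kind of link a rendered CDA document could embed. The host and UIDs are placeholders:

```python
# A minimal sketch of a DICOM WADO-URI request: a plain HTTP GET whose query
# parameters name the study, series, and object to retrieve. Host and UIDs
# below are placeholders, not values from the Croatian IHCIS deployment.
from urllib.parse import urlencode

def wado_url(host, study_uid, series_uid, object_uid):
    """Compose a WADO-URI retrieval URL for a single DICOM object."""
    query = urlencode({
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": "image/jpeg",   # ask the server to render for display
    })
    return f"https://{host}/wado?{query}"

print(wado_url("pacs.example.org", "1.2.840.1", "1.2.840.1.1", "1.2.840.1.1.1"))
```

Because the result is just a URL, a CDA document rendered in a browser can reference the image without the viewer needing any DICOM-specific client code.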
NASA Technical Reports Server (NTRS)
Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.
2016-01-01
Distributed and Real-Time Simulation plays a key role in the Space domain, being exploited for mission and system analysis and engineering as well as for crew training and operational support. One of the most popular standards is the IEEE 1516-2010 Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space systems simulations.
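To make the publish/subscribe flow concrete: there is no standard Python binding for IEEE 1516, so the RTIAmbassador class below is a hypothetical stand-in for a real RTI API, and the object class and attribute names are placeholders rather than entries from the Space Reference FOM:

```python
# A schematic sketch of the HLA federate declaration flow. RTIAmbassador is
# hypothetical; a real federate would use a vendor RTI's IEEE 1516 API.
class RTIAmbassador:
    def join(self, federation, federate):
        print(f"{federate} joined {federation}")

    def publish(self, object_class, attributes):
        print("publishing", object_class, attributes)

    def subscribe(self, object_class, attributes):
        print("subscribing to", object_class, attributes)

    def update_attributes(self, object_class, values):
        print("updating", object_class, values)

rti = RTIAmbassador()
rti.join("SpaceFederation", "OrbiterSim")
# Class and attribute names below are illustrative placeholders; in practice
# they must match the FOM shared by all federates in the federation.
rti.publish("Spacecraft", ["position", "velocity"])
rti.subscribe("GroundStation", ["antennaState"])
rti.update_attributes("Spacecraft", {"position": (7000.0, 0.0, 0.0)})
```

The point of a shared reference FOM is exactly that these class and attribute names, and their structures, are agreed in advance, so federates from different agencies can interoperate without bilateral adaptation.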
77 FR 38768 - Smart Grid Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-29
... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Smart Grid Advisory... Smart Grid Interoperability, National Institute of Standards and Technology, 100 Bureau Drive, Mail Stop... open meeting. SUMMARY: The Smart Grid Advisory Committee (SGAC or Committee) will hold a meeting via...
Phillips, Joshua; Chilukuri, Ram; Fragoso, Gilberto; Warzel, Denise; Covitz, Peter A
2006-01-01
Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has emerged as a key enabling technology for caBIG. Conclusion The caCORE SDK substantially lowers the barrier to implementing systems that are syntactically and semantically interoperable by providing workflow and automation tools that standardize and expedite modeling, development, and deployment. It has gained acceptance among developers in the caBIG program, and is expected to provide a common mechanism for creating data service nodes on the data grid that is under development. PMID:16398930
Interoperability in healthcare: major challenges in the creation of the enterprise environment
NASA Astrophysics Data System (ADS)
Lindsköld, L.; Wintell, M.; Lundberg, N.
2009-02-01
There is today a lack of interoperability in healthcare although the need for it is obvious. A new healthcare enterprise environment has been deployed for secure healthcare interoperability in the Western Region in Sweden (WRS). This paper is an empirical overview of the new enterprise environment supporting regional shared and transparent radiology domain information in the WRS. The enterprise environment comprises 17 radiology departments serving 1.5 million inhabitants, using different RIS and PACS in a joint work-oriented network, plus additional cardiology, dentistry and clinical physiology departments. More than 160 terabytes of information are stored in the enterprise repository. Interoperability is developed according to the IHE mission, i.e. applying standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level 7 (HL7) to address specific clinical communication needs and support optimal patient care. The entire enterprise environment is implemented and used daily in WRS. The central prerequisites in the development of the enterprise environment in the western region of Sweden were: 1) information harmonization, 2) reuse of standardized messages e.g. HL7 v2.x and v3.x, 3) development of a holistic information domain including both text and images, and 4) creation of a continuous and dynamic update functionality. The central challenges in this project were: 1) the many different vendors acting in the region and the negotiations with them to apply communication roles/profiles such as HL7 (CDA, CCR), DICOM, and XML, 2) the question of who owns the data, and 3) incomplete technical standards. This study concludes that to create a workflow that runs within an enterprise environment, a number of central prerequisites and challenges need to be in place. This calls for negotiations on an international, national and regional level with standardization organizations, vendors, health management and health personnel.
2012-08-01
Performance of an Embedded Platform Aggregating and Executing... [recoverable fragment of a report form; presented at the Vehicle Electronics and Architecture (VEA) Mini-Symposium, August 14-16, 2012, Troy, Michigan; performing organization: UBT Technologies, 3250 W. Big Beaver Rd., Ste. 329, Troy, MI]. From the abstract: The Vehicular Integration for C4ISR/EW Interoperability (VICTORY) Standard adopts many…
NASA Astrophysics Data System (ADS)
López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo
In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.
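A minimal sketch of what interacting with an OCCI endpoint such as ooi can look like over the OCCI HTTP rendering; the endpoint URL is a placeholder, authentication is omitted, and exact renderings vary by deployment:

```python
# A minimal sketch of OCCI over HTTP: discover the server's capabilities via
# the query interface, then create a compute resource by POSTing its Category.
# Endpoint URL is a placeholder; details should be checked against the OCCI
# HTTP rendering spec and the ooi deployment in question.
import requests

BASE = "https://cloud.example.org:8787/occi1.1"   # placeholder ooi endpoint

# Discovery: the query interface lists the kinds/mixins the server supports
caps = requests.get(f"{BASE}/-/", headers={"Accept": "text/plain"})
print(caps.text)

# Create a compute resource by declaring its OCCI Category in the headers
headers = {
    "Content-Type": "text/occi",
    "Category": 'compute; scheme="http://schemas.ogf.org/occi/infrastructure#"; class="kind"',
}
resp = requests.post(f"{BASE}/compute/", headers=headers)
print(resp.status_code, resp.headers.get("Location"))
```

Because the interface is the same against any OCCI-enabled framework, the same client code can manage resources on OpenStack via ooi or on another OCCI-compliant cloud, which is the interoperability the paper aims for.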
2010-10-01
[Recoverable only as reference-list fragments: citations to Simulation Interoperability Workshop papers (04F-SIW-090, 09S-SIW-084, 07F-SIW-111), the IEEE standard Protocols for Distributed Interactive Simulation Application (1995), and international mission training research.]
NASA Astrophysics Data System (ADS)
Cole, M.; Alameh, N.; Bambacus, M.
2006-05-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
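As an example of the open, standard web protocols mentioned, the sketch below issues an OGC WMS GetCapabilities request, the usual first step in discovering what layers a geospatial server offers. The server URL is a placeholder, not the actual ESG endpoint:

```python
# A minimal sketch of OGC WMS discovery: any compliant server answers a
# GetCapabilities request with an XML document describing its map layers.
# The server URL below is a placeholder for illustration.
import requests

def get_capabilities(server):
    """Fetch a WMS capabilities document describing available map layers."""
    return requests.get(server, params={
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetCapabilities",
    }).text

xml = get_capabilities("https://wms.example.gov/wms")
print(xml[:200])   # inspect the start of the layer catalog
```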
Expanding the scope of health information systems. Challenges and developments.
Kuhn, K A; Wurst, S H R; Bott, O J; Giuse, D A
2006-01-01
To identify current challenges and developments in health information systems. Reports on HIS, eHealth and process support were analyzed, and core problems and challenges were identified. Health information systems are extending their scope towards regional networks and health IT infrastructures. Integration, interoperability and interaction design are still today's core problems. Additional problems arise through the integration of genetic information into the health care process. There are noticeable trends towards solutions for these problems.
An EarthCube Roadmap for Cross-Domain Interoperability in the Geosciences: Governance Aspects
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Couch, A.; Richard, S. M.; Valentine, D. W.; Stocks, K.; Murphy, P.; Lehnert, K. A.
2012-12-01
The goal of cross-domain interoperability is to enable reuse of data and models outside the original context in which these data and models are collected and used, and to facilitate analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries. A new research initiative of the U.S. National Science Foundation, called EarthCube, is developing a roadmap to address challenges of interoperability in the earth sciences and create a blueprint for community-guided cyberinfrastructure accessible to a broad range of geoscience researchers and students. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place for such secondary or derivative use of information to be both scientifically sound and technically feasible. In this initial assessment we consider the following four basic infrastructure components that need to be present to enable cross-domain interoperability in the geosciences: metadata catalogs (at the appropriate community-defined granularity) that provide standard discovery services over datasets, data access services, models and other resources of the domain; vocabularies that support unambiguous interpretation of domain resources and metadata; services used to access data repositories and other resources including models, visualizations and workflows; and formal information models that define the structure and semantics of the information returned on service requests. General standards for these components have been proposed; they form the backbone of large-scale integration activities in the geosciences. By utilizing these standards, EarthCube research designs can take advantage of data discovery across disciplines using the commonality in key data characteristics related to shared models of spatial features, time measurements, and observations. Data can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data services can be used to transparently compile composite data products. Key questions addressed in this presentation are: (1) How to define and assess readiness of existing domain information systems for cross-domain re-use? (2) How to determine EarthCube development priorities given a multitude of use cases that involve cross-domain data flows? and (3) How to involve a wider community of geoscientists in the development and curation of cross-domain resources and incorporate community feedback in the CI design? Answering them involves consideration of governance mechanisms for cross-domain interoperability: while individual domain information systems and projects have developed governance mechanisms, managing cross-domain CI resources and supporting cross-domain information re-use has not been a development focus at the scale of the geosciences. We present a cross-domain readiness model as a means of enabling effective communication among scientists, governance bodies, and information providers. We also present an initial readiness assessment and a cross-domain connectivity map for the geosciences, and outline processes for eliciting user requirements, setting priorities, and obtaining community consensus.
The Development of the Learning Object Standard Using a Pedagogic Approach: A Comparative Study.
ERIC Educational Resources Information Center
Yahya, Yazrina; Jenkins, John; Yusoff, Mohammed
Education is moving towards revenue generation from such channels as electronic learning, distance learning and virtual education. Hence learning technology standards are critical to the sector's success. Existing learning technology standards have focused on various topics such as metadata, question and test interoperability and others. However,…
47 CFR 90.548 - Interoperability Technical Standards.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Standards, approved March 3, 2000, Telecommunications Industry Association, ANSI/TIA/EIA-102.BAEA-2000..., approved March 3, 2000, Telecommunications Industry Association, ANSI/TIA/EIA-102.BAEB-2000; Project 25..., approved March 3, 2000, Telecommunications Industry Association, ANSI/TIA/EIA-102.BAEE-2000; Project 25...
NASA Astrophysics Data System (ADS)
Oggioni, A.; Tagliolato, P.; Schleidt, K.; Carrara, P.; Grellet, S.; Sarretta, A.
2016-02-01
The state of the art in biodiversity data management unfortunately encompasses a plethora of diverse data formats. Compared to other research fields, there is a lack of harmonization and standardization of these data. While data from traditional biodiversity collections (e.g. from museums) can be easily represented by existing standards as provided by TDWG, the growing number of field observations stemming both from VGI activities (e.g. iNaturalist) and from automated systems (e.g. animal biotelemetry) would at the very least require upgrades of current formats. Moreover, from an eco-informatics perspective, the integration and use of data from different scientific fields is the norm (abiotic data, geographic information, etc.); the possibility to represent this information and biodiversity data in a homogeneous way would be an advantage for interoperability, allowing for easy integration across environmental media. We will discuss the possibility of exploiting the Open Geospatial Consortium/ISO standard, Observations and Measurements (O&M) [1], a generic conceptual model developed for observation data but with strong analogies to the biodiversity-oriented OBOE ontology [2]. The applicability of OGC O&M for the provision of biodiversity occurrence data has been suggested by the INSPIRE Cross Thematic Working Group on Observations & Measurements [3], the INSPIRE Environmental Monitoring Facilities Thematic Working Group [4] and the New Zealand Environmental Information Interoperability Framework [5]. This approach, in our opinion, could be an advantage for the biodiversity community. We will provide some examples of encoding biodiversity occurrence data using the O&M standard, in addition to highlighting the advantages offered by O&M in comparison to other representation formats. [1] Cox, S. (2013). Geographic information - Observations and measurements - OGC and ISO 19156. [2] Madin, J., Bowers, S., Schildhauer, M., Krivov, S., Pennington, D., & Villa, F. (2007). An ontology for describing and synthesizing ecological observation data. Ecological Informatics, 2(3), 279-296. [3] INSPIRE_D2.9_O&M_Guidelines_v2.0rc3.pdf [4] INSPIRE_DataSpecification_EF_v3.0.pdf [5] Watkins, A. (2012) Biodiversity Interoperability through Open Geospatial Standards
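To show what the proposed mapping could look like in practice, here is a simplified, JSON-like rendering of a species occurrence in O&M's core slots (phenomenon time, procedure, observed property, feature of interest, result). It illustrates the model's structure only and is not an official OGC encoding:

```python
# A minimal sketch of a species occurrence expressed with the core slots of
# the ISO 19156 / OGC O&M observation model. Field names and values are a
# simplified illustration, not a normative encoding.
observation = {
    "type": "Observation",
    "phenomenonTime": "2015-06-21T09:30:00Z",   # when the occurrence was observed
    "procedure": "visual-survey",               # how it was observed
    "observedProperty": "species-occurrence",   # what property was observed
    "featureOfInterest": {                      # where: the sampled site
        "type": "SamplingPoint",
        "coordinates": [45.80, 8.63],
    },
    "result": {                                 # the occurrence record itself
        "scientificName": "Procambarus clarkii",
        "individualCount": 3,
    },
}
print(observation["result"]["scientificName"])
```

The appeal of this shape is that an abiotic measurement (say, water temperature at the same sampling point) fits the same five slots, which is exactly the cross-media homogeneity the abstract argues for.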
Seven recommendations to make your invasive alien species data more useful
Groom, Quentin J.; Adriaens, Tim; Desmet, Peter; Simpson, Annie; De Wever, Aaike; Bazos, Ioannis; Cardoso, Ana Cristina; Charles, Lucinda; Christopoulou, Anastasia; Gazda, Anna; Helmisaari, Harry; Hobern, Donald; Josefsson, Melanie; Lucy, Frances; Marisavljevic, Dragana; Oszako, Tomasz; Pergl, Jan; Petrovic-Obradovic, Olivera; Prévot, Céline; Ravn, Hans Peter; Richards, Gareth; Roques, Alain; Roy, Helen; Rozenberg, Marie-Anne A.; Scalera, Riccardo; Tricarico, Elena; Trichkova, Teodora; Vercayie, Diemer; Zenetos, Argyro; Vanderhoeven, Sonia
2017-01-01
Science-based strategies to tackle biological invasions depend on recent, accurate, well-documented, standardized and openly accessible information on alien species. Currently and historically, biodiversity data are scattered in numerous disconnected data silos that lack interoperability. The situation is no different for alien species data, and this obstructs efficient retrieval, combination, and use of these kinds of information for research and policy-making. Standardization and interoperability are particularly important as many alien species related research and policy activities require pooling data. We describe seven ways that data on alien species can be made more accessible and useful, based on the results of a European Cooperation in Science and Technology (COST) workshop: (1) Create data management plans; (2) Increase interoperability of information sources; (3) Document data through metadata; (4) Format data using existing standards; (5) Adopt controlled vocabularies; (6) Increase data availability; and (7) Ensure long-term data preservation. We identify four properties specific and integral to alien species data (species status, introduction pathway, degree of establishment, and impact mechanism) that are either missing from existing data standards or lack a recommended controlled vocabulary. Improved access to accurate, real-time and historical data will repay the long-term investment in data management infrastructure, by providing more accurate, timely and realistic assessments and analyses. If we improve core biodiversity data standards by developing their relevance to alien species, it will allow the automation of common activities regarding data processing in support of environmental policy. Furthermore, we call for considerable effort to maintain, update, standardize, archive, and aggregate datasets, to ensure proper valorization of alien species data and information before they become obsolete or lost.
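As a sketch of recommendations (4) and (5), the snippet below writes an occurrence record carrying the alien-species properties as a Darwin Core-style CSV. The term names follow Darwin Core to the best of our knowledge and should be checked against the current standard; note the paper's point that some properties (e.g., impact mechanism) still lack a recommended term or controlled vocabulary, so no column is shown for it:

```python
# A minimal sketch of formatting an alien-species occurrence with existing
# standard terms plus controlled-vocabulary values. Term names are assumed
# Darwin Core terms; verify against the published standard before use.
import csv, sys

fields = ["scientificName", "occurrenceStatus", "establishmentMeans",
          "degreeOfEstablishment", "pathway"]
rows = [{
    "scientificName": "Vespa velutina",
    "occurrenceStatus": "present",            # species status
    "establishmentMeans": "introduced",
    "degreeOfEstablishment": "invasive",      # degree of establishment
    "pathway": "transportStowaway",           # introduction pathway
}]

writer = csv.DictWriter(sys.stdout, fieldnames=fields)
writer.writeheader()
writer.writerows(rows)
```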
Current state of the mass storage system reference model
NASA Technical Reports Server (NTRS)
Coyne, Robert
1993-01-01
IEEE SSSWG was chartered in May 1990 to abstract the hardware and software components of existing and emerging storage systems and to define the software interfaces between these components. The immediate goal is the decomposition of a storage system into interoperable functional modules which vendors can offer as separate commercial products. The ultimate goal is to develop interoperable standards which define the software interfaces and, in the distributed case, the associated protocols for each of the architectural modules in the model. The topics are presented in viewgraph form and include the following: IEEE SSSWG organization; IEEE SSSWG subcommittees & chairs; IEEE standards activity board; layered view of the reference model; layered access to storage services; IEEE SSSWG emphasis; and features for MSSRM version 5.
Security and Privacy in a DACS.
Delgado, Jaime; Llorente, Silvia; Pàmies, Martí; Vilalta, Josep
2016-01-01
The management of electronic health records (EHR) in general, and clinical documents in particular, is becoming a key issue in the daily work of Healthcare Organizations (HO). The need to provide secure and private access to, and storage for, clinical documents, together with the need for HOs to interoperate, raises a number of issues that are difficult to solve. Many systems are in place to manage EHRs and documents. Some of these Healthcare Information Systems (HIS) follow standards in their document structure and communication protocols, but many do not. In fact, they are mostly proprietary and do not interoperate. Our proposal for resolving the current situation is the use of a DACS (Document Archiving and Communication System) to provide security, privacy and standardized access to clinical documents.
106-17 Telemetry Standards Front Matter
2017-07-01
Front matter for Telemetry Standards, IRIG Standard 106-17 (distribution unlimited), issued by the Range Commanders Council, US Army White Sands Missile Range, New Mexico 88002-5110, and naming the member test ranges: Aberdeen Test Center, Dugway Proving Ground, Reagan Test Site, Redstone Test Center, White Sands Missile Range, and Yuma Proving Ground. Noted work items include TM receiver commands for interoperability and Task TG-141: Update IRIG 106 with Standards for Data Quality Metrics (DQM) and Data Quality.
Enabling Interoperability in Heliophysical Domains
NASA Astrophysics Data System (ADS)
Bentley, Robert
2013-04-01
There are many overlapping aspects of science in the Solar System - phenomena observed in one domain can have effects in other domains. However, there are many problems related to exploiting the data in cross-disciplinary studies because of the lack of interoperability of the data and services. The CASSIS project is a Coordination Action funded under FP7 with the objective of improving the interoperability of data and services related to Solar System science. CASSIS has been investigating how the data could be made more accessible with some relatively minor changes to the observational metadata. The project has been looking at the services that are used within the domain and determining whether they are interoperable with each other and, if not, what would be required to make them so. It has also been examining all types of metadata that are used when identifying and using observations, and trying to make them more compliant with techniques and standards developed by bodies such as the International Virtual Observatory Alliance (IVOA). Many of the lessons being learnt in the study are applicable to domains beyond those directly involved in heliophysics. Adopting some simple standards related to the design of service interfaces and metadata would make it much easier to investigate interdisciplinary science topics. We will report on our findings and describe a roadmap for the future. For more information about CASSIS, please visit the project Web site at cassis-vo.eu.
Interoperability challenges in river discharge modelling: A cross domain application scenario
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin
2018-06-01
River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes definition and standardization of domain specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations data from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of the GEOSS.
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is a concept that combines networking, intelligent data processing, and the fusion of patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application to the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development, and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standards-based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
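As a toy illustration of the model-driven mechanism described above - deriving implementation artefacts such as data-entry forms from a clinical model - the following sketch stands in a plain Python dictionary for a real openEHR archetype (which would be authored in ADL); the element names, types, and generated HTML are invented, and only the general generation mechanism mirrors the article.

    # Toy model-driven form generation: a simplified stand-in for an openEHR
    # archetype is turned into data-entry form fields. All names are invented.
    archetype = {
        "concept": "blood_pressure",
        "elements": [
            {"name": "systolic", "type": "QUANTITY", "units": "mm[Hg]"},
            {"name": "diastolic", "type": "QUANTITY", "units": "mm[Hg]"},
            {"name": "position", "type": "CODED_TEXT",
             "options": ["sitting", "standing", "lying"]},
        ],
    }

    def generate_form(model):
        """Derive one form-field description per archetype element."""
        for el in model["elements"]:
            if el["type"] == "QUANTITY":
                yield f'<input name="{el["name"]}" type="number"> {el["units"]}'
            elif el["type"] == "CODED_TEXT":
                opts = "".join(f"<option>{o}</option>" for o in el["options"])
                yield f'<select name="{el["name"]}">{opts}</select>'

    for field in generate_form(archetype):
        print(field)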
An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2010-10-01
The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.
Standardized Representation of Clinical Study Data Dictionaries with CIMI Archetypes
Sharma, Deepak K.; Solbrig, Harold R.; Prud’hommeaux, Eric; Pathak, Jyotishman; Jiang, Guoqian
2016-01-01
Researchers commonly use a tabular format to describe and represent clinical study data. The lack of standardization of data dictionary’s metadata elements presents challenges for their harmonization for similar studies and impedes interoperability outside the local context. We propose that representing data dictionaries in the form of standardized archetypes can help to overcome this problem. The Archetype Modeling Language (AML) as developed by the Clinical Information Modeling Initiative (CIMI) can serve as a common format for the representation of data dictionary models. We mapped three different data dictionaries (identified from dbGAP, PheKB and TCGA) onto AML archetypes by aligning dictionary variable definitions with the AML archetype elements. The near complete alignment of data dictionaries helped map them into valid AML models that captured all data dictionary model metadata. The outcome of the work would help subject matter experts harmonize data models for quality, semantic interoperability and better downstream data integration. PMID:28269909
Flexible solution for interoperable cloud healthcare systems.
Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena
2012-01-01
It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, ultimately, the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of medical care. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have a higher quality and will provide knowledge that supports better patient management and diagnosis.
GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research
NASA Astrophysics Data System (ADS)
Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.
2015-05-01
To date, the most common way to deal with geographical information and processes still appears to be consuming local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, reinforced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in the environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, which is the SDI set up for researchers' needs in our department. It is based on the existing open source, modular and interoperable Spatial Data Architecture geOrchestra.
CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability
NASA Technical Reports Server (NTRS)
Claus, Russell; Weitzer, Ilan
2002-01-01
Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meeting these demands is 'Geometry Centric' design, in which design engineers team their efforts through one unified representation of the design, usually captured in a CAD system. Standards-based interfaces are critical to provide the uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link enabling 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and proposed applications.
Extending the GI Brokering Suite to Support New Interoperability Specifications
NASA Astrophysics Data System (ADS)
Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.
2014-12-01
The GI brokering suite provides the discovery, access, and semantic brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include: DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs; CKAN, a data management system for data distribution, particularly used by public administrations; CERIF, used by CRIS (Current Research Information System) instances; and the HYRAX Server, a scientific dataset publishing component. This presentation will discuss these and other recent GI suite extensions implemented to support new interoperability protocols in use by the Earth Science communities.
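Since DCAT is mentioned, a minimal sketch of what DCAT support means in practice - describing a dataset with the W3C DCAT RDF vocabulary - may help; it uses the rdflib Python library, and the dataset URI and metadata values are invented.

    # Describe one dataset with the W3C DCAT vocabulary and print it as Turtle.
    # The dataset URI and metadata values are invented for illustration.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, DCTERMS

    DCAT = Namespace("http://www.w3.org/ns/dcat#")

    g = Graph()
    g.bind("dcat", DCAT)
    g.bind("dct", DCTERMS)

    ds = URIRef("http://example.org/datasets/sea-surface-temperature")
    g.add((ds, RDF.type, DCAT.Dataset))
    g.add((ds, DCTERMS.title, Literal("Sea surface temperature, daily means")))
    g.add((ds, DCAT.keyword, Literal("ocean")))

    print(g.serialize(format="turtle"))

A catalog client that understands DCAT can harvest such descriptions regardless of which system - CKAN, a broker, or a plain web server - publishes them.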
Regulatory barriers blocking standardization of interoperability.
Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian
2013-07-12
Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. The standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and the Continua Health Alliance, are striving for this purpose. However, factors like medical device regulation, health policy, and market reality have placed non-technical barriers over the adoption of technical standards throughout the industry. These barriers have significantly impaired the motivation of consumer device vendors who desire to enter the personal health market, and the overall success of the personal health industry ecosystem. In this paper, we present the effect that these barriers have had on the health ecosystem. This requires immediate action from policy makers and other stakeholders. The current regulatory policy needs to be updated to reflect the reality and demands of the consumer health industry. Our hope is that this paper will draw wide consensus amongst its readers, policy makers, and other stakeholders.
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2017-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated – from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper investigates three concepts. First, that using a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based data interoperability in pursuit of the MBE vision. Finally, it explores the interaction between CAD and CMM processes and determines whether the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method for model-based data interoperability, which would provide maximum value and impact to industry. PMID:28691120
Digital Mapping, Charting and Geodesy Data Standardization
1994-12-19
The primary objective of the audit was to evaluate DMA's implementation of the Defense Standardization Program. Specifically, the audit determined...interoperability of digital MC&G data. The audit also evaluated DMA's implementation of the DoD Internal Management Control Program as it pertains to DMA's implementation of the Defense Standardization Program.
Distributed Learning Metadata Standards
ERIC Educational Resources Information Center
McClelland, Marilyn
2004-01-01
Significant economies can be achieved in distributed learning systems architected with a focus on interoperability and reuse. The key building blocks of an efficient distributed learning architecture are the use of standards and XML technologies. The goal of plug and play capability among various components of a distributed learning system…
Jaulent, Marie-Christine; Leprovost, Damien; Charlet, Jean; Choquet, Remy
2018-07-01
This article is a position paper dealing with semantic interoperability challenges. It addresses the Variety and Veracity dimensions involved when integrating, sharing and reusing large amounts of heterogeneous data for data analysis and decision-making applications in the healthcare domain. Many issues are raised by the necessity of conforming Big Data to interoperability standards. We discuss how semantics can contribute to the improvement of information sharing, and address the problem of data mediation with domain ontologies. We then introduce the main steps for building domain ontologies as they could be implemented in the context of Forensic and Legal Medicine. We conclude with a particular emphasis on the current limitations in standardisation and the importance of knowledge formalization.
Botts, Nathan; Bouhaddou, Omar; Bennett, Jamie; Pan, Eric; Byrne, Colene; Mercincavage, Lauren; Olinger, Lois; Hunolt, Elaine; Cullen, Theresa
2014-01-01
The authors studied the United States (U.S.) Department of Veterans Affairs' (VA) Virtual Lifetime Electronic Record (VLER) Health pilot phase relative to two attributes of data quality: the adoption of eHealth Exchange data standards, and the clinical content exchanged. The VLER Health pilot was an early effort in testing the implementation of eHealth Exchange standards and technology. Testing included evaluation of exchange data from the VLER Health pilot site partners: VA, the U.S. Department of Defense (DoD), and private sector health care organizations. The assessment addressed data quality and interoperability as they relate to: 1) conformance with data standards related to the underlying structure of C32 Summary Documents (C32) produced by eHealth Exchange partners; and 2) the types of C32 clinical content exchanged. This analysis identified several standards non-conformance issues in sample C32 files and informed further discourse on the methods needed to effectively monitor Health Information Exchange (HIE) data content and standards conformance.
OSI for hardware/software interoperability
NASA Astrophysics Data System (ADS)
Wood, Richard J.; Harvey, Donald L.; Linderman, Richard W.; Gardener, Gary A.; Capraro, Gerard T.
1994-03-01
There is a need in public safety for real-time data collection and transmission from one or more sensors. The Rome Laboratory and the Ballistic Missile Defense Organization are pursuing an effort to bring the benefits of Open System Architectures (OSA) to embedded systems within the Department of Defense. When developed properly, OSA provides interoperability, commonality, graceful upgradeability, survivability and hardware/software transportability, greatly reducing life cycle, integration and support costs. Architectural flexibility can be achieved that takes advantage of commercial accomplishments by basing these developments on vendor-neutral, commercially accepted standards and protocols.
Biometric identification: a holistic perspective
NASA Astrophysics Data System (ADS)
Nadel, Lawrence D.
2007-04-01
Significant advances continue to be made in biometric technology. However, the global war on terrorism and our increasingly electronic society have created a societal need for large-scale, interoperable biometric capabilities that challenge current off-the-shelf technology. At the same time, there are concerns that large-scale implementation of biometrics will infringe on our civil liberties and offer increased opportunities for identity theft. This paper looks beyond the basic science and engineering of biometric sensors and fundamental matching algorithms and offers approaches for achieving greater performance and acceptability of applications enabled with currently available biometric technologies. The discussion focuses on three primary biometric system aspects: performance and scalability, interoperability, and cost benefit. Significant improvements in system performance and scalability can be achieved through careful consideration of the following elements: biometric data quality, human factors, operational environment, workflow, multibiometric fusion, and integrated performance modeling. Application interoperability hinges upon some of the factors noted above as well as adherence to interface, data, and performance standards. However, there are times when the price of conforming to such standards is decreased local system performance. The development of biometric performance-based cost-benefit models can help determine realistic requirements and acceptable designs.
Challenges of interoperability using HL7 v3 in Czech healthcare.
Nagy, Miroslav; Preckova, Petra; Seidl, Libor; Zvarova, Jana
2010-01-01
The paper describes several classification systems that could improve patient safety through semantic interoperability among contemporary electronic health record systems (EHR-Ss) with support of the HL7 v3 standard. We describe a proposal and a pilot implementation of a semantic interoperability platform (SIP) interconnecting current EHR-Ss by using HL7 v3 messages and mappings of concepts to the most widely used classification systems. The increasing number of classification systems and nomenclatures requires the design of various conversion tools for transfer between the main classification systems. We present the so-called LIM filler module and the HL7 broker, which are parts of the SIP, playing the role of such conversion tools. The analysis of the suitability and usability of individual terminological thesauri began with the mapping of the clinical content of the Minimal Data Model for Cardiology (MDMC) to various terminological classification systems. A nationwide implementation of the SIP would include adopting and translating international coding systems and nomenclatures, and developing implementation guidelines facilitating the migration from national standards to international ones. Our research showed that the creation of such a platform is feasible; however, it will require a huge effort to fully adapt the Czech healthcare system to the European environment.
A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.
Huang, Chih-Yuan; Wu, Cheng-Hung
2016-08-31
The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and undermines the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
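As a hedged illustration of what tasking through a uniform web interface can look like, the sketch below POSTs a task in the general SensorThings style (JSON entities linked by "@iot.id"); the endpoint URL, identifiers, and parameter names are invented and do not necessarily match the exact protocol described in the paper.

    # Sketch of tasking an IoT device through a uniform JSON/HTTP interface in
    # the SensorThings style. URL, ids, and parameter names are invented.
    import requests

    task = {
        "taskingParameters": {"switch": "on"},  # desired device action
        "TaskingCapability": {"@iot.id": 1},    # link to the device's capability
    }

    resp = requests.post("http://example.org/v1.0/Tasks", json=task, timeout=10)
    resp.raise_for_status()
    print("Task accepted, created entity at:", resp.headers.get("Location"))

The appeal of such a design is that the same few lines of client code can switch a lamp, start a pump, or reposition a camera, with the device-specific details hidden behind the tasking-parameter definitions.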
Aeronautical Mobile Airport Communications System (AeroMACS)
NASA Technical Reports Server (NTRS)
Budinger, James M.; Hall, Edward
2011-01-01
To help increase the capacity and efficiency of the nation's airports, a secure wideband wireless communications system is proposed for use on the airport surface. This paper provides an overview of the research and development process for the Aeronautical Mobile Airport Communications System (AeroMACS). AeroMACS is based on a specific commercial profile of the Institute of Electrical and Electronics Engineers (IEEE) 802.16 standard known as Wireless Worldwide Interoperability for Microwave Access or WiMAX (WiMAX Forum). The paper includes background on the need for global interoperability in air/ground data communications, describes potential AeroMACS applications, addresses allocated frequency spectrum constraints, summarizes the international standardization process, and provides findings and recommendations from the world's first AeroMACS prototype implemented in Cleveland, Ohio, USA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basso, T.
Public-private partnerships have been a mainstay of the U.S. Department of Energy and the National Renewable Energy Laboratory (DOE/NREL) approach to research and development. These partnerships also include technology development that enables grid modernization and distributed energy resources (DER) advancement, especially renewable energy systems integration with the grid. Through DOE/NREL and industry support of Institute of Electrical and Electronics Engineers (IEEE) standards development, the IEEE 1547 series of standards has helped shape the way utilities and other businesses have worked together to realize increasing amounts of DER interconnected with the distribution grid. And more recently, the IEEE 2030 series of standards is helping to further realize greater implementation of communications and information technologies that provide interoperability solutions for enhanced integration of DER and loads with the grid. For these standards development partnerships, for approximately $1 of federal funding, industry partnering has contributed $5. In this report, the status update is presented for the American National Standards IEEE 1547 and IEEE 2030 series of standards. A short synopsis of the history of the 1547 standards is first presented, then the current status and future direction of the ongoing standards development activities are discussed.
Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.
Eichelberg, Marco; Chronaki, Catherine
2016-01-01
Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. There is no single standard that covers all needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to answer the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve the users, embracing sustainability and technical innovation.
Bonaretti, Serena; Vilayphiou, Nicolas; Chan, Caroline Mai; Yu, Andrew; Nishiyama, Kyle; Liu, Danmei; Boutroy, Stephanie; Ghasem-Zadeh, Ali; Boyd, Steven K.; Chapurlat, Roland; McKay, Heather; Shane, Elizabeth; Bouxsein, Mary L.; Black, Dennis M.; Majumdar, Sharmila; Orwoll, Eric S.; Lang, Thomas F.; Khosla, Sundeep; Burghardt, Andrew J.
2017-01-01
Introduction HR-pQCT is increasingly used to assess bone quality, fracture risk and anti-fracture interventions. The contribution of the operator has not been adequately accounted for in measurement precision. Operators acquire a 2D projection (“scout view image”) and define the region to be scanned by positioning a “reference line” on a standard anatomical landmark. In this study, we (i) evaluated the contribution of positioning variability to in vivo measurement precision, (ii) measured intra- and inter-operator positioning variability, and (iii) tested whether custom training software led to superior reproducibility in new operators compared to experienced operators. Methods To evaluate the operator's contribution to in vivo measurement precision, we compared precision errors calculated in 64 co-registered and non-co-registered scan-rescan images. To quantify operator variability, we developed software that simulates the positioning process of the scanner's software. Eight experienced operators positioned reference lines on scout view images designed to test intra- and inter-operator reproducibility. Finally, we developed modules for training and evaluation of reference line positioning. We enrolled 6 new operators to participate in a common training, followed by the same reproducibility experiments performed by the experienced group. Results In vivo precision errors were up to three-fold greater (Tt.BMD and Ct.Th) when variability in scan positioning was included. Inter-operator precision errors were significantly greater than short-term intra-operator precision (p<0.001). Newly trained operators achieved intra-operator reproducibility comparable to experienced operators, and lower inter-operator reproducibility (p<0.001). Precision errors were significantly greater for the radius than for the tibia. Conclusion Operator reference line positioning contributes significantly to in vivo measurement precision, and its contribution is significantly greater for multi-operator datasets. Inter-operator variability can be significantly reduced using a systematic training platform, now available online (http://webapps.radiology.ucsf.edu/refline/). PMID:27475931
NASA Astrophysics Data System (ADS)
Bambacus, M.; Alameh, N.; Cole, M.
2006-12-01
The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.
ARGOS Policy Brief on Semantic Interoperability
KALRA, Dipak; MUSEN, Mark; SMITH, Barry; CEUSTERS, Werner
2016-01-01
Semantic interoperability is one of the priority themes of the ARGOS Trans-Atlantic Observatory. This topic represents a globally recognised challenge that must be addressed if electronic health records are to be shared among heterogeneous systems, and the information in them exploited to the maximum benefit of patients, professionals, health services, research, and industry. Progress in this multi-faceted challenge has been piecemeal, and valuable lessons have been learned, and approaches discovered, in Europe and in the US that can be shared and combined. Experts from both continents have met at three ARGOS workshops during 2010 and 2011 to share understanding of these issues and how they might be tackled collectively from both sides of the Atlantic. This policy brief summarises the problems and the reasons why they are important to tackle, and also why they are so difficult. It outlines the major areas of semantic innovation that exist and that are available to help address this challenge. It proposes a series of next steps that need to be championed on both sides of the Atlantic if further progress is to be made in sharing and analysing electronic health records meaningfully. Semantic interoperability requires the use of standards, not only for EHR data to be transferred and structurally mapped into a receiving repository, but also for the clinical content of the EHR to be interpreted in conformity with the original meanings intended by its authors. Wide-scale engagement with professional bodies, globally, is needed to develop these clinical information standards. Accurate and complete clinical documentation, faithful to the patient’s situation, and interoperability between systems, require widespread and dependable access to published and maintained collections of coherent and quality-assured semantic resources, including models such as archetypes and templates that would (1) provide clinical context, (2) be mapped to interoperability standards for EHR data, (3) be linked to well specified multi-lingual terminology value sets, and (4) be derived from high quality ontologies. There is need to gain greater experience in how semantic resources should be defined, validated, and disseminated, how users (who increasingly will include patients) should be educated to improve the quality and consistency of EHR documentation and to make full use of it. There are urgent needs to scale up the authorship, acceptance, and adoption of clinical information standards, to leverage and harmonise the islands of standardisation optimally, to assure the quality of the artefacts produced, and to organise end-to-end governance of the development and adoption of solutions. PMID:21893897
Interoperable web applications for sharing data and products of the International DORIS Service
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.
2017-12-01
The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth Rotation and Reference Systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with users. Metadata and interoperable web applications are proposed to explore, visualize and download key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their websites. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables and to get information about the tracking ground stations and the tracked satellites. We discuss future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the need for the IAG Services to adopt common metadata thesauri to describe data and products, and interoperability standards to share them.
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology has to be considered, approaching different aspects of systems architecture such as the business, information, computational, engineering and technology viewpoints. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standards-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise JavaBeans technology. Selecting the appropriate integration architecture is a fundamental issue for any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by HIS-DF and supported by HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standards-based health information systems.
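To picture the mediator model named above, the toy sketch below shows a mediator that accepts a record in one (fictional) system's shape and re-expresses it in another's; no real HL7 syntax is modeled and all field names are invented - only the architectural role of the mediator reflects the abstract.

    # Toy illustration of the mediator integration model: the mediator owns all
    # mapping knowledge between two systems. Field names are invented; no real
    # HL7 message syntax is modeled.
    def mediate(source_record: dict) -> dict:
        """Translate a fictional lab-system record into a fictional
        surveillance-system record."""
        return {
            "patient_id": source_record["pid"],
            "event": source_record["test_name"],
            "value": source_record["result"],
        }

    lab_msg = {"pid": "12345", "test_name": "dengue IgM", "result": "positive"}
    print(mediate(lab_msg))

Compared with point-to-point interfaces, adding an n-th system to such an architecture means writing one mapping to the mediator's model rather than n-1 pairwise translations.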
Ronquillo, Jay G; Weng, Chunhua; Lester, William T
2017-11-17
Precision medicine involves three major innovations currently taking place in healthcare: electronic health records, genomics, and big data. A major challenge for healthcare providers, however, is understanding the readiness for practical application of initiatives like precision medicine. Our objective was to better understand the current state and challenges of precision medicine interoperability, using a national genetic testing registry as a starting point, placed in the context of established interoperability formats. We performed an exploratory analysis of the National Institutes of Health Genetic Testing Registry. Relevant standards included the Health Level Seven International Version 3 Implementation Guide for Family History, the Human Genome Organization Gene Nomenclature Committee (HGNC) database, and the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT). We analyzed the distribution of genetic testing laboratories, genetic test characteristics, and standardized genome/clinical code mappings, stratified by laboratory setting. There were a total of 25472 genetic tests from 240 laboratories testing for approximately 3632 distinct genes. Most tests focused on diagnosis, mutation confirmation, and/or risk assessment of germline mutations that could be passed to offspring. Genes were successfully mapped to all HGNC identifiers, but less than half of the tests mapped to SNOMED CT codes, highlighting significant gaps when linking genetic tests to standardized clinical codes that explain the medical motivations behind test ordering. While precision medicine could potentially transform healthcare, successful practical and clinical application will first require the comprehensive and responsible adoption of interoperable standards, terminologies, and formats across all aspects of the precision medicine pipeline.
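The registry analysis above boils down to a terminology-coverage computation: for each test record, check whether its gene maps to an HGNC identifier and whether a SNOMED CT code is attached, then report the mapped fraction per terminology. A minimal sketch with made-up records:

    # Terminology-coverage sketch in the spirit of the analysis above, with
    # made-up test records. A record "maps" if the corresponding code is set;
    # all identifiers and codes below are placeholders, not real mappings.
    tests = [
        {"gene_hgnc": "HGNC:1100", "snomed": "123456789"},
        {"gene_hgnc": "HGNC:1101", "snomed": None},
        {"gene_hgnc": "HGNC:26144", "snomed": None},
    ]

    for field in ("gene_hgnc", "snomed"):
        mapped = sum(1 for t in tests if t[field] is not None)
        pct = 100.0 * mapped / len(tests)
        print(f"{field}: {mapped}/{len(tests)} mapped ({pct:.0f}%)")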
NASA Technical Reports Server (NTRS)
Figueroa, Jorge Fernando
2008-01-01
In February 2008, NASA Stennis Space Center (SSC), NASA Kennedy Space Center (KSC), and the Applied Research Laboratory at Penn State University demonstrated a pilot implementation of an Integrated System Health Management (ISHM) capability at Launch Complex 20 of KSC. The following significant accomplishments are associated with this development: (1) implementation of an architecture for ground operations ISHM based on networked intelligent elements; (2) use of standards for management of data, information, and knowledge (DIaK), leading to a modular ISHM implementation with interoperable elements communicating according to standards (three standards were used: the IEEE 1451 family of standards for smart sensors and actuators, the Open Systems Architecture for Condition-Based Maintenance (OSA-CBM) standard for communicating DIaK describing the condition of elements of a system, and the OPC standard for communicating data); (3) ISHM implementation using interoperable modules addressing health management of subsystems; and (4) use of a physical intelligent sensor node (a smart network element, or SNE, capable of providing data and health information) along with classic sensors originally installed in the facility. An operational demonstration included detection of anomalies (sensor failures, leaks, etc.), determination of causes and effects, communication among health nodes, and user interfaces.
Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.
Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan
2015-09-22
Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards to resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
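The latency and message-size metrics used for benchmarking are straightforward to reproduce; the sketch below (ours, against an invented endpoint) times a single HTTP request and reports the response size, whereas a real benchmark would repeat the request many times and aggregate.

    # Measure response latency and response message size for one request to a
    # hypothetical sensor-web endpoint; real benchmarking would repeat and
    # aggregate over many requests.
    import time
    import requests

    url = "http://example.org/v1.0/Observations"  # invented endpoint
    t0 = time.perf_counter()
    resp = requests.get(url, timeout=10)
    latency_ms = (time.perf_counter() - t0) * 1000.0

    print(f"status={resp.status_code} latency={latency_ms:.1f} ms "
          f"response_size={len(resp.content)} bytes")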
DOT National Transportation Integrated Search
2016-01-01
Connected vehicles have the potential to transform the way Americans travel by allowing cars, buses, trucks, trains, traffic signals, smart phones, and other devices to communicate through a safe, interoperable wireless network. A connected vehic...
The NTCIP guide : updated version 3
DOT National Transportation Integrated Search
2002-10-01
The transportation community has long needed transportation systems that could be built using devices and components that were interchangeable and interoperable. The National Transportation Communications for ITS Protocol (NTCIP) family of standards ...
NASA Astrophysics Data System (ADS)
Tsontos, V. M.; Arms, S. C.; Thompson, C. K.; Quach, N.; Lam, T.
2016-12-01
Earth science applications increasingly rely on the integration of multivariate data from diverse observational platforms. Whether for satellite mission cal/val, science, or decision support, the coupling of remote sensing and in-situ field data is integral to oceanographic workflows. This has prompted archives such as PO.DAAC, NASA's physical oceanography data archive, which historically has had a remote sensing focus, to adapt to better accommodate complex field campaign datasets. However, the inherent heterogeneity of in-situ datasets and their variable adherence to meta/data standards pose a significant impediment to interoperability, a problem originating early in the data lifecycle and significantly impacting the stewardship and usability of these data long-term. Here we introduce a new initiative underway at PO.DAAC that seeks to catalyze efforts to address these challenges. It involves the enhancement and integration of available high-TRL (Technology Readiness Level) components for improved interoperability and support of in-situ data, with a focus on a novel yet representative class of oceanographic field data: data from electronic tags deployed on a variety of marine species as biological sampling platforms in support of fisheries management and ocean observation efforts. This project seeks to demonstrate, deliver, and ultimately sustain operationally a reusable and accessible set of tools to: 1) mediate reconciliation of heterogeneous source data into a tractable number of standardized formats consistent with earth science data standards; 2) harmonize existing metadata models for satellite and field datasets; and 3) demonstrate the value added by integrated data access via a range of available tools and services hosted at PO.DAAC, including a web-based visualization tool for comprehensive mapping of satellite and in-situ data. An innovative part of our project plan involves partnering with the leading electronic tag manufacturer to promote the adoption of appropriate data standards in their processing software. The proposed project thus adopts a model lifecycle approach, complemented by broadly applicable technologies, to address key data management and interoperability issues for in-situ data.
An interoperability experiment for sharing hydrological rating tables
NASA Astrophysics Data System (ADS)
Lemon, D.; Taylor, P.; Sheahan, P.
2013-12-01
The increasing demand on freshwater resources is requiring authorities to produce more accurate and timely estimates of their available water. Calculation of continuous time series of river discharge and storage volumes generally requires rating tables. These approximate the relationship between two phenomena, such as river level and discharge, and allow us to produce continuous estimates of a phenomenon that may be impractical or impossible to measure directly. Standardized information models and access mechanisms for rating tables are required to support the sharing and exchange of water flow data. An Interoperability Experiment (IE) is underway to test an information model that describes rating tables, the observations made to build these ratings, and river cross-section data. The IE is an initiative of the joint World Meteorological Organization/Open Geospatial Consortium Hydrology Domain Working Group (HydroDWG), and the model will be published as WaterML2.0 part 2. Interoperability Experiments are low-overhead, multiple-member projects run under the OGC's interoperability program to test existing and emerging standards. The HydroDWG has previously run IEs to test early versions of OGC WaterML2.0 part 1 (time series). This IE focuses on two key exchange scenarios. First, sharing rating tables and gauging observations between water agencies: through the use of standard OGC web services, rating tables and associated data will be made available from water agencies, and the (Australian) Bureau of Meteorology will retrieve rating tables on demand from water authorities, allowing the Bureau to run conversions of data within its own systems. Second, exposing rating tables and gaugings for online analysis and educational purposes: a web client will be developed to enable exploration and visualization of rating tables, gaugings and related metadata for monitoring points. The client gives a quick view into available rating tables, their periods of applicability and the standard deviation of observations against the relationship. An example of this client running can be seen at the link provided. The result of the IE will form the basis for the standardisation of WaterML2.0 part 2. The use of the standard will lead to increased transparency and accessibility of rating tables, while also improving general understanding of this important hydrological concept.
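Rating tables of the kind discussed here are commonly summarized by a power-law rating curve of the form Q = C * (h - h0)^b, relating stage h to discharge Q. The sketch below (ours, with invented coefficients) shows how a stage series is converted to discharge once a rating is available; real ratings are fitted to gauging observations and carry periods of applicability, as the abstract notes.

    # Convert stage (water level) to discharge with a power-law rating curve
    # Q = C * (h - h0) ** b. Coefficients are invented for illustration; real
    # ratings are fitted to gaugings and have validity periods.
    C, H0, B = 35.2, 0.20, 1.8  # hypothetical fitted rating parameters

    def discharge(stage_m: float) -> float:
        """Discharge in m^3/s for a stage in metres; zero below the curve offset."""
        return C * max(stage_m - H0, 0.0) ** B

    for h in [0.5, 1.0, 2.0]:
        print(f"stage {h:.1f} m -> discharge {discharge(h):.1f} m^3/s")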
Interoperability Matter: Levels of Data Sharing, Starting from a 3d Information Modelling
NASA Astrophysics Data System (ADS)
Tommasi, C.; Achille, C.
2017-02-01
Nowadays, the adoption of BIM processes in the AEC (Architecture, Engineering and Construction) industry means moving towards synergistic workflows based on informative instruments capable of realizing the virtual model of the building. The aim of this article is to address the question of interoperability, approaching the subject through a theoretical part and a practical example, in order to show how these notions apply in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where some difficulties arise, in both the modelling and sharing phases, due to the complexity of shapes and elements. Focusing on the interoperability between different software packages, the questions are: What kinds of information can be shared, and how much? Given that this process also leads to a standardization of the modelled parts, is there a risk of accuracy loss?
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web, and Web Imagery Enablement. Common Architecture was a cross-cutting theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: broker, requestor, and provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction among the different service types: data services (e.g. WMS), application services (e.g. coordinate transformation), and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.
Interoperability Outlook in the Big Data Future
NASA Astrophysics Data System (ADS)
Kuo, K. S.; Ramachandran, R.
2015-12-01
The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data and Information System (EOSDIS) doubtlessly propelled the interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still feel wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has their own preferences in data management practices as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center" interoperability is almost guaranteed because data, analysis, and results can all be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.
Seeking the Path to Metadata Nirvana
NASA Astrophysics Data System (ADS)
Graybeal, J.
2008-12-01
Scientists have always found reusing other scientists' data challenging. Computers did not fundamentally change the problem, but enabled more and larger instances of it. In fact, by removing human mediation and time delays from the data sharing process, computers emphasize the contextual information that must be exchanged in order to exchange and reuse data. This requirement for contextual information has two faces: "interoperability" when talking about systems, and "the metadata problem" when talking about data. As much as any single organization, the Marine Metadata Interoperability (MMI) project has been tagged with the mission "Solve the metadata problem." Of course, if that goal is achieved, then sustained, interoperable data systems for interdisciplinary observing networks can be easily built; pesky metadata differences, like which protocol to use for data exchange, or what the data actually measure, will be a thing of the past. Alas, as you might imagine, there will always be complexities and incompatibilities that are not addressed, and data systems that are not interoperable, even within a science discipline. So should we throw up our hands and surrender to the inevitable? Not at all. Rather, we try to minimize metadata problems as much as we can. In this we are making steady progress, despite natural forces that pull in the other direction. Computer systems let us work with more complexity, build community knowledge and collaborations, and preserve and publish our progress and (dis-)agreements. Funding organizations, science communities, and technologists see the importance of interoperable systems and metadata, and direct resources toward them. With the new approaches and resources, projects like IPY and MMI can simultaneously define, display, and promote effective strategies for sustainable, interoperable data systems. This presentation will outline the role metadata plays in durable interoperable data systems, for better or worse. It will describe times when "just choosing a standard" can work, and when it probably won't. And it will point out signs that suggest a metadata storm is coming to your community project, and how you might avoid it. From these lessons we will seek a path to producing interoperable, interdisciplinary, metadata-enlightened environmental observing systems.
DOT National Transportation Integrated Search
2016-01-01
Connected vehicles have the potential to transform the way Americans travel by allowing cars, buses, trucks, trains, traffic signals, smart phones, and other devices to communicate through a safe, interoperable wireless network. A connected vehicle s...
76 FR 3089 - Roundtable on Federal Government Engagement in Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
... of a Smart Grid, secure and interoperable electronic health records, cybersecurity, cloud computing... government engage in sectors where there is a compelling national interest? How are existing public- private...
Tyndall, Timothy; Tyndall, Ayami
2018-01-01
Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide data provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a blockchain or directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the inability to map complex datasets and to enable interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored, we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
An overview of the model integration process: From pre ...
Integration of models requires linking models which may have been developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process and presenting better strategies for building integrated modeling systems. We identified five phases that characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare the integration practices of different groups. We highlight key strategies, features, standards, and practices that can be employed by developers to increase reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect this variation has on inter-framework reuse and interoperability. A series of recommendation
Contextualizing Learning Scenarios According to Different Learning Management Systems
ERIC Educational Resources Information Center
Drira, R.; Laroussi, M.; Le Pallec, X.; Warin, B.
2012-01-01
In this paper, we first demonstrate that an instructional design process of Technology Enhanced Learning (TEL) systems based on a Model Driven Approach (MDA) addresses the limits of Learning Technology Standards (LTS), such as SCORM and IMS-LD. Although these standards ensure the interoperability of TEL systems across different Learning Management…
DOT National Transportation Integrated Search
2000-01-01
These draft standards are intended to work together to provide a high level of interoperability among regional and local traffic control centers. They provide consistent names, definitions and concepts similar to spelling and parts of speech to world...
Combining the CIDOC CRM and MPEG-7 to Describe Multimedia in Museums.
ERIC Educational Resources Information Center
Hunter, Jane
This paper describes a proposal for an interoperable metadata model, based on international standards, that has been designed to enable the description, exchange and sharing of multimedia resources both within and between cultural institutions. Domain-specific ontologies have been developed by two different ISO Working Groups to standardize the…
Competition in Defense Acquisitions
2008-05-14
NASA's Outsourcing Desktop Initiative (ODIN) transferred the responsibility for providing and managing desktop assets to the private sector. Previously, NASA employees maintained desktop assets with no way to track costs, no standardization, and no tracking of service quality. ODIN goals: cut desktop computing costs, increase service quality, achieve interoperability and standardization.
Measures of Effectiveness for Rationalization, Standardization, and Interoperability
1988-09-01
and Standardization Group, Bonn: COL Weichel, LTC Corn; D-2 U.S. Mission NATO: COL Osborne, MAJ Brand, COL Smith, MAJ Brew, LTC Sendak; Supreme Headquarters Allied Powers Europe (SHAPE): COL Hanley, CDR Lachlan, COL Ross, MAJ Heidema, COL Solli, MAJ Seay, COL Tudor; NATO Maintenance and Supply Agency (NAMSA): Mr
2017-02-22
manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards
Digital Motion Imagery, Interoperability Challenges for Space Operations
NASA Technical Reports Server (NTRS)
Grubbs, Rodney
2012-01-01
With advances in available bandwidth from spacecraft and between terrestrial control centers, digital motion imagery and video is becoming more practical as a data gathering tool for science and engineering, as well as for sharing missions with the public. The digital motion imagery and video industry has done a good job of creating standards for compression, distribution, and physical interfaces. Compressed data streams can easily be transmitted or distributed over radio frequency, internet protocol, and other data networks. All of these standards, however, can make sharing video between spacecraft and terrestrial control centers a frustrating and complicated task when different standards and protocols are used by different agencies. This paper will explore the challenges presented by the abundance of motion imagery and video standards, interfaces and protocols, with suggestions for common formats that could simplify interoperability between spacecraft and ground support systems. Real-world examples from the International Space Station will be examined. The paper will also discuss recent trends in the development of new video compression algorithms, as well as likely expanded use of Delay (or Disruption) Tolerant Networking nodes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clements, Samuel L.; Edgar, Thomas W.; Manz, David O.
The purpose of this workshop was to identify and discuss concerns with the use and adoption of IEC 62351 security standard for IEC 61850 compliant control system products. The industry participants discussed performance, interoperability, adoption, challenges, business cases, and future issues.
Evaluation results of xTCA equipment for HEP experiments at CERN
NASA Astrophysics Data System (ADS)
Di Cosmo, M.; Bobillier, V.; Haas, S.; Joos, M.; Mico, S.; Vasey, F.; Vichoudis, P.
2013-12-01
The MicroTCA and AdvancedTCA industry standards are candidate modular electronic platforms for the upgrade of the current generation of high energy physics experiments. The PH-ESE group at CERN launched the xTCA evaluation project in 2011 with the aim of performing technical evaluations and eventually providing support for commercially available components. Devices from different vendors have been acquired and evaluated, and interoperability tests have been performed. This paper presents the test procedures and facilities that have been developed, and focuses on the evaluation results, including electrical, thermal and interoperability aspects.
Software design and implementation concepts for an interoperable medical communication framework.
Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank
2018-02-23
The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their vendor independence. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.
The Gaia Archive at ESAC: a VO-inside archive
NASA Astrophysics Data System (ADS)
Gonzalez-Nunez, J.
2015-12-01
The ESDC (ESAC Science Data Center) is one of the active members of the IVOA (International Virtual Observatory Alliance), which has defined a set of standards, libraries and concepts that allow the creation of flexible, scalable and interoperable architectures in data archive development. For astronomy projects that involve the use of big catalogues, as in Gaia or Euclid, the TAP, UWS and VOSpace standards can be used to create an architecture that allows the exploitation of these valuable data by the community. New challenges also arise, like the implementation of the new paradigm "move code close to the data", which can be partially achieved by extending the protocols (TAP+, UWS+, etc.) or the languages (ADQL). We explain how we have used VO standards and libraries for the Gaia Archive, not only producing an open and interoperable archive but also minimizing development in certain areas. We also explain how we have extended these protocols, and describe the future plans.
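The TAP and ADQL standards mentioned above amount to plain HTTP plus an SQL-like query language, so a catalogue query needs nothing beyond an HTTP client. A minimal sketch; the endpoint URL and table name are assumptions based on the public Gaia archive, purely for illustration:

    import requests

    # Synchronous TAP query; TAP also defines asynchronous jobs via UWS.
    TAP_SYNC = "https://gea.esac.esa.int/tap-server/tap/sync"  # assumed endpoint

    resp = requests.post(TAP_SYNC, data={
        "REQUEST": "doQuery",
        "LANG": "ADQL",      # the IVOA Astronomical Data Query Language
        "FORMAT": "csv",
        "QUERY": "SELECT TOP 5 source_id, ra, dec FROM gaiadr3.gaia_source",
    })
    resp.raise_for_status()
    print(resp.text)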
Inter-organizational future proof EHR systems. A review of the security and privacy related issues.
van der Linden, Helma; Kalra, Dipak; Hasman, Arie; Talmon, Jan
2009-03-01
Identification and analysis of privacy and security related issues that occur when health information is exchanged between health care organizations. Based on a generic scenario, questions were formulated to reveal the issues that occur. Possible answers were verified in the literature. Ensuring secure health information exchange across organizations requires a standardization of security measures that goes beyond organizational boundaries, such as global definitions of professional roles, global standards for patient consent, and semantically interoperable audit logs. To be able to fully address the privacy and security issues in interoperable EHRs and the long-life virtual EHR, it is necessary to realize a paradigm shift from storing all incoming information in a local system to retrieving information from external systems whenever that information is deemed necessary for the care of the patient.
Bucur, Anca; van Leeuwen, Jasper; Chen, Njin-Zu; Claerhout, Brecht; de Schepper, Kristof; Perez-Rey, David; Paraiso-Medina, Sergio; Alonso-Calvo, Raul; Mehta, Keyur; Krykwinski, Cyril
2016-01-01
This paper describes a new Cohort Selection application implemented to support streamlining the definition phase of multi-centric clinical research in oncology. Our approach aims at both ease of use and precision in defining the selection filters expressing the characteristics of the desired population. The application leverages our standards-based Semantic Interoperability Solution and a Groovy DSL to provide high expressiveness in the definition of filters and flexibility in their composition into complex selection graphs including splits and merges. Widely-adopted ontologies such as SNOMED-CT are used to represent the semantics of the data and to express concepts in the application filters, facilitating data sharing and collaboration on joint research questions in large communities of clinical users. The application supports patient data exploration and efficient collaboration in multi-site, heterogeneous and distributed data environments. PMID:27570644
Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa
2010-08-21
Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that avoid transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers of emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies.
Standard Information Models for Representing Adverse Sensitivity Information in Clinical Documents.
Topaz, M; Seger, D L; Goss, F; Lai, K; Slight, S P; Lau, J J; Nandigam, H; Zhou, L
2016-01-01
Adverse sensitivity (e.g., allergy and intolerance) information is a critical component of any electronic health record system. While several standards exist for structured entry of adverse sensitivity information, many clinicians record this data as free text. This study aimed to 1) identify and compare the existing common adverse sensitivity information models, and 2) evaluate the coverage of these information models for representing allergy information in a subset of inpatient and outpatient adverse sensitivity clinical notes. We compared four common adverse sensitivity information models: the Health Level 7 Allergy and Intolerance Domain Analysis Model (HL7-DAM); the Fast Healthcare Interoperability Resources (FHIR); the Consolidated Continuity of Care Document (C-CDA); and openEHR, and evaluated their coverage on a corpus of inpatient and outpatient notes (n = 120). We found that allergy specialists' notes had the highest frequency of adverse sensitivity attributes per note, whereas emergency department notes had the fewest attributes. Overall, the models had many similarities in the central attributes, which covered between 75% and 95% of the adverse sensitivity information contained within the notes. However, representations of some attributes (especially the value sets) were not well aligned between the models, which is likely to present an obstacle for achieving data interoperability. Also, adverse sensitivity exceptions were not well represented among the information models. Although we found that common adverse sensitivity models cover a significant portion of relevant information in the clinical notes, our results highlight areas that need to be reconciled between the standards for data interoperability.
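As a concrete point of reference, the FHIR model (one of the four compared above) expresses an adverse sensitivity as an AllergyIntolerance resource. A minimal sketch as a Python dict; the attribute names follow FHIR, while the specific codes and references are illustrative assumptions only:

    # Minimal FHIR AllergyIntolerance resource (illustrative values).
    allergy_intolerance = {
        "resourceType": "AllergyIntolerance",
        "type": "allergy",                       # vs. "intolerance"
        "category": ["medication"],
        "criticality": "high",
        "code": {"coding": [{
            "system": "http://snomed.info/sct",  # terminology binding
            "code": "91936005",                  # assumed: allergy to penicillin
            "display": "Allergy to penicillin",
        }]},
        "patient": {"reference": "Patient/example"},
        "reaction": [{
            "manifestation": [{"text": "anaphylaxis"}],
            "severity": "severe",
        }],
    }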
Latest developments for the IAGOS database: Interoperability and metadata
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume
2014-05-01
In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format, including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database, and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks, and to allow for graphical comparison to model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
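The NetCDF-CF convention mentioned above standardizes variable and attribute names so that files are self-describing. A minimal sketch of writing a CF-style in situ series with the netCDF4 library; the file name, values, and convention version are illustrative, not actual IAGOS output:

    import numpy as np
    from netCDF4 import Dataset

    # A tiny CF-style file for an in situ trace-gas series.
    with Dataset("flight_sample.nc", "w") as nc:
        nc.Conventions = "CF-1.6"              # assumed convention version
        nc.title = "Illustrative in situ ozone series"
        nc.createDimension("obs", 3)

        time = nc.createVariable("time", "f8", ("obs",))
        time.standard_name = "time"
        time.units = "seconds since 2014-01-01 00:00:00"

        o3 = nc.createVariable("ozone", "f4", ("obs",))
        o3.standard_name = "mole_fraction_of_ozone_in_air"  # CF standard name
        o3.units = "1e-9"                      # i.e. nmol/mol (ppb)

        time[:] = np.arange(3) * 4.0
        o3[:] = [42.0, 43.5, 41.8]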
Transforming Dermatologic Imaging for the Digital Era: Metadata and Standards.
Caffery, Liam J; Clunie, David; Curiel-Lewandrowski, Clara; Malvehy, Josep; Soyer, H Peter; Halpern, Allan C
2018-01-17
Imaging is increasingly being used in dermatology for documentation, diagnosis, and management of cutaneous disease. The lack of standards for dermatologic imaging is an impediment to clinical uptake. Standardization can occur in image acquisition, terminology, interoperability, and metadata. This paper presents the International Skin Imaging Collaboration position on the standardization of metadata for dermatologic imaging. Metadata is essential to ensure that dermatologic images are properly managed and interpreted. There are two standards-based approaches to recording and storing metadata in dermatologic imaging. The first uses standard consumer image file formats; the second is the file format and metadata model developed for the Digital Imaging and Communications in Medicine (DICOM) standard. DICOM would appear to provide an advantage over consumer image file formats for metadata, as it includes all the patient, study, and technical metadata necessary to use images clinically, whereas consumer image file formats include only technical metadata and need to be used in conjunction with another actor (for example, an electronic medical record) to supply the patient and study metadata. The use of DICOM may have some ancillary benefits in dermatologic imaging, including leveraging DICOM network and workflow services, interoperability of images and metadata, leveraging existing enterprise imaging infrastructure, greater patient safety, and better compliance with legislative requirements for image retention.
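The contrast drawn above is easy to see in code: a DICOM file carries the patient and study context inside the file itself. A minimal sketch with the pydicom library; the file name is illustrative:

    import pydicom

    # Read a DICOM file; patient, study, and technical metadata travel
    # together with the pixel data in a single object.
    ds = pydicom.dcmread("lesion_photo.dcm")

    # Attributes a consumer format such as JPEG would have to obtain
    # from a separate actor, e.g. an electronic medical record.
    print(ds.PatientName, ds.StudyDate, ds.Modality)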
NASA Astrophysics Data System (ADS)
Huang, C. Y.; Wu, C. H.
2016-06-01
The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked into multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain many customized connectors. Hence, we believe the ultimate solution to the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices could also be controlled via the Internet, which is the tasking capability. As the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining a tasking capability profile and integrating it with the SensorThings API standard, which we call the extended SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended SensorThings API, users and applications can follow a coherent protocol to control IoT devices that use different communication protocols, which could consequently achieve an interoperable Internet of Things infrastructure.
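In a SensorThings-style service, tasking reduces to POSTing a JSON task against a capability the device has declared. A hedged sketch of that pattern; the service root, entity names, and parameter names below are assumptions for illustration, not definitions from the paper:

    import requests

    SERVICE_ROOT = "http://example.org/SensorThings/v1.0"  # assumed service root

    # The parameter names ("state") would come from the device's
    # Tasking Capability Description, as discussed above.
    task = {
        "TaskingCapability": {"@iot.id": 1},   # capability declared by the device
        "taskingParameters": {"state": "ON"},
    }

    resp = requests.post(f"{SERVICE_ROOT}/Tasks", json=task)
    print(resp.status_code)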
NASA Technical Reports Server (NTRS)
Smit, Christine; Hegde, Mahabaleshwara; Strub, Richard; Bryant, Keith; Li, Angela; Petrenko, Maksym
2017-01-01
Giovanni is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use. Finding and working with dimensions can be convoluted under the CF Conventions. Furthermore, the CF Conventions are silent on machine-friendly descriptive metadata such as a parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be easily found. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about parameters, both during parameter selection and after visualization.
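The dimension-finding problem mentioned above stems from CF allowing coordinate variables to carry arbitrary names, so generic code must hunt for them by attribute. A minimal sketch with xarray; the file and variable names are illustrative:

    import xarray as xr

    def find_coord(ds, standard_name):
        """Return the name of the first variable whose CF standard_name
        attribute matches, or None if absent."""
        for name, var in ds.variables.items():
            if var.attrs.get("standard_name") == standard_name:
                return name
        return None

    ds = xr.open_dataset("parameter.nc")   # illustrative file name
    print(find_coord(ds, "latitude"), find_coord(ds, "longitude"))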
Roadmap for Testing and Validation of Electric Vehicle Communication Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratt, Richard M.; Tuffner, Francis K.; Gowri, Krishnan
Vehicle-to-grid communication standards are critical to charge management and interoperability among plug-in electric vehicles (PEVs), charging stations and utility providers. The Society of Automotive Engineers (SAE), International Organization for Standardization (ISO), International Electrotechnical Commission (IEC) and the ZigBee Alliance are developing requirements for communication messages and protocols. While interoperability standards development has been in progress for more than two years, no definitive guidelines are available for automobile manufacturers, charging station manufacturers or utility backhaul network systems. At present, there is a wide range of proprietary communication options developed and supported in the industry. Recent work by the Electric Power Research Institute (EPRI), in collaboration with SAE and automobile manufacturers, has identified performance requirements and developed a test plan based on possible communication pathways using power line communication (PLC). Though the communication pathways and power line communication technology options are identified, much work needs to be done in developing application software and testing communication modules before these can be deployed in production vehicles. This paper presents a roadmap and results from testing power line communication modules developed to meet the requirements of the SAE J2847/1 standard.
Local, regional and national interoperability in hospital-level systems architecture.
Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J
2007-01-01
Interoperability of applications in health care must meet various needs of patients, health professionals, organizations and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but need approaches which combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications at the local hospital level. We discuss the systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities is identified, with a medium-term focus which acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot rely entirely on common standards and frameworks; it must be locally adapted and complemented.
Connecting the clinical IT infrastructure to a service-oriented architecture of medical devices.
Andersen, Björn; Kasparick, Martin; Ulrich, Hannes; Franke, Stefan; Schlamelcher, Jan; Rockstroh, Max; Ingenerf, Josef
2018-02-23
The new medical device communication protocol known as IEEE 11073 SDC is well-suited for the integration of (surgical) point-of-care devices, as are the established Health Level Seven (HL7) V2 and Digital Imaging and Communications in Medicine (DICOM) standards for the communication of systems in the clinical IT infrastructure (CITI). An integrated operating room (OR) and other integrated clinical environments, however, need interoperability between both domains to fully unfold their potential for improving the quality of care as well as clinical workflows. This work thus presents concepts for the propagation of clinical and administrative data to medical devices, of physiologic measurements and device parameters to clinical IT systems, and of image and multimedia content in both directions. Prototypical implementations of the derived components have proven to integrate well with systems of networked medical devices and with the CITI, effectively connecting these heterogeneous domains. Our qualitative evaluation indicates that the interoperability concepts are suitable for integration into clinical workflows and are expected to benefit patients and clinicians alike. The upcoming HL7 Fast Healthcare Interoperability Resources (FHIR) communication standard will likely change the domain of clinical IT significantly. A straightforward mapping to its resource model thus ensures the tenability of these concepts despite a foreseeable change in demand and requirements.
A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability
Huang, Chih-Yuan; Wu, Cheng-Hung
2016-01-01
The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759
23 CFR 950.7 - Interoperability requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...
23 CFR 950.7 - Interoperability requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...
23 CFR 950.7 - Interoperability requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...
23 CFR 950.7 - Interoperability requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...
23 CFR 950.7 - Interoperability requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...
Performance analysis of a proposed tightly-coupled medical instrument network based on CAN protocol.
Mujumdar, Shantanu; Thongpithoonrat, Pongnarin; Gurkan, D; McKneely, Paul K; Chapman, Frank M; Merchant, Fatima
2010-01-01
Advances in medical devices and health care have been phenomenal during recent years. Although medical device manufacturers have been improving their instruments, network connection of these instruments still relies on proprietary technologies. Even if an interface has been provided by the manufacturer (e.g., RS-232, USB, or Ethernet coupled with a proprietary API), there is no widely-accepted uniform data model to access data of various bedside instruments. There is a need for a common standard which allows for internetworking with medical devices from different manufacturers. ISO/IEEE 11073 (X73) is a standard attempting to unify the interfaces of all medical devices. X73 defines a client access mechanism that would be implemented into the communication controllers (residing between an instrument and the network) in order to access and network patient data. On the other hand, the MediCAN™ technology suite has been demonstrated with various medical instruments to achieve interfacing and networking with a similar goal in its open standardization approach. However, it provides a more generic definition for medical data to achieve flexibility for networking and client access mechanisms. The instruments are in turn becoming more sophisticated; however, operation of an instrument is still expected to be done locally by authorized medical personnel. Unfortunately, each medical instrument has its unique proprietary API (application programming interface, if any) to provide automated and electronic access to monitoring data. Integration of these APIs requires an agreement with the manufacturers towards the realization of interoperable health care networking. As long as interoperability of instruments with a network is not possible, ubiquitous access to patient status is limited to manual-entry-based systems. This paper demonstrates an attempt to realize an interoperable medical instrument interface for networking using the MediCAN technology suite as an open standard.
Bridging data models and terminologies to support adverse drug event reporting using EHR data.
Declerck, G; Hussain, S; Daniel, C; Yuksel, M; Laleci, G B; Twagirumukiza, M; Jaulent, M-C
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The SALUS project aims at building an interoperability platform and a dedicated toolkit to enable secondary use of electronic health record (EHR) data for post-marketing drug surveillance. An important component of this toolkit is a drug-related adverse events (AE) reporting system designed to facilitate and accelerate the reporting process using automatic prepopulation mechanisms. Our objective is to demonstrate the SALUS approach for establishing syntactic and semantic interoperability for AE reporting. Standard (e.g. HL7 CDA-CCD) and proprietary EHR data models are mapped to the E2B(R2) data model via the SALUS Common Information Model. Terminology mapping and terminology reasoning services are designed to ensure the automatic conversion of source EHR terminologies (e.g. ICD-9-CM, ICD-10, LOINC or SNOMED-CT) to the target terminology MedDRA, which is expected in AE reporting forms. A validated set of terminology mappings is used to ensure the reliability of the reasoning mechanisms. The percentage of data elements of a standard E2B report that can be completed automatically has been estimated for two pilot sites. In the best scenario (i.e. when the available fields in the EHR have actually been filled), only 36% (pilot site 1) and 38% (pilot site 2) of E2B data elements remain to be filled manually. In addition, most of these data elements need not be filled in every report. The SALUS platform's interoperability solutions enable partial automation of the AE reporting process, which could contribute to improving current spontaneous reporting practices and reducing under-reporting, currently one major obstacle in the acquisition of pharmacovigilance data.
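At its core, the terminology mapping step is a validated lookup from a source code to a MedDRA code that prepopulates an E2B field. A toy sketch of that idea; the mapping pair below is a placeholder for illustration, not SALUS content:

    # Validated source-to-MedDRA mappings would be curated, not hard-coded;
    # the ICD-10 and MedDRA codes here are placeholders.
    ICD10_TO_MEDDRA = {
        "T78.4": ("10000000", "Allergy, unspecified"),  # placeholder pair
    }

    def prepopulate_reaction(icd10_code):
        """Return a (MedDRA code, term) pair for an EHR diagnosis code,
        or None to signal that the reporter must fill the field manually."""
        return ICD10_TO_MEDDRA.get(icd10_code)

    print(prepopulate_reaction("T78.4"))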
Capturing Essential Information to Achieve Safe Interoperability
Weininger, Sandy; Jaffe, Michael B.; Rausch, Tracy; Goldman, Julian M.
2016-01-01
In this article we describe the role of “clinical scenario” information to assure the safety of interoperable systems, as well as the system’s ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, thereby improving the communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system using a documentation tool called medical device interface data sheets (MDIDSs) and mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration to achieve safe interoperability. Portions of the analysis of a clinical scenario for a “Patient-controlled analgesia safety interlock” are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a Learning Health System to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions to address vexing patient safety problems and enable clinical innovation. A web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™, is described.
Capturing Essential Information to Achieve Safe Interoperability.
Weininger, Sandy; Jaffe, Michael B; Rausch, Tracy; Goldman, Julian M
2017-01-01
In this article, we describe the role of "clinical scenario" information to assure the safety of interoperable systems, as well as the system's ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, therefore, improving the communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system using a documentation tool called medical device interface data sheets and mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration to achieve safe interoperability. Portions of the analysis of a clinical scenario for a "patient-controlled analgesia safety interlock" are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a learning health system to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions to address vexing patient safety problems and enable clinical innovation. A Web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™ (MD PnP Program), is described.
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions: A novel computational toxicity assessment platform was generated from the integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
Today, increasing numbers of intermittent generation sources (e.g., wind and photovoltaic) and new mobile intermittent loads (e.g., electric vehicles) can significantly affect traditional utility business practices and operations. At the same time, a growing number of technologies and devices, from appliances to lighting systems, are being deployed at consumer premises that have more sophisticated controls and information that remain underused for anything beyond basic building equipment operations. The intersection of these two drivers is an untapped opportunity and underused resource that, if appropriately configured and realized in open standards, can provide significant energy efficiency and commensurate savings on utility bills, enhanced and lower cost reliability to utilities, and national economic benefits in the creation of new markets, sectors, and businesses being fueled by the seamless coordination of energy and information through device and technology interoperability. Or, as the Quadrennial Energy Review puts it, “A plethora of both consumer-level and grid-level devices are either in the market, under development, or at the conceptual stage. When tied together through the information technology that is increasingly being deployed on electric utilities’ distribution grids, they can be an important enabling part of the emerging grid of the future. However, what is missing is the ability for all of these devices to coordinate and communicate their operations with the grid, and among themselves, in a common language — an open standard.” In this paper, we define interoperability as the ability to exchange actionable information between two or more systems within a home or building, or across and within organizational boundaries. Interoperability relies on the shared meaning of the exchanged information, with agreed-upon expectations and consequences, for the response to the information exchange.
Interoperability in the Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios Diaz, C.
2017-09-01
The protocols and standards supported by the recently released new version of the Planetary Science Archive (PSA) are the Planetary Data Access Protocol (PDAP), the EuroPlanet Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. We explore these protocols in more detail, providing scientifically useful examples of their usage within the PSA.
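A minimal sketch of how a client might query an EPN-TAP service through the synchronous interface defined by the IVOA TAP standard (Python with the requests library; the endpoint URL is a hypothetical placeholder, while epn_core is the table name mandated by EPN-TAP):

    import requests

    # Hypothetical PSA EPN-TAP endpoint; consult the archive documentation
    TAP_SYNC = "https://psa.example.org/epn-tap/tap/sync"

    resp = requests.get(TAP_SYNC, params={
        "REQUEST": "doQuery",  # TAP operation name
        "LANG": "ADQL",        # query language fixed by the TAP standard
        "QUERY": "SELECT TOP 10 granule_uid, dataproduct_type FROM epn_core",
        "FORMAT": "votable",   # results returned as an IVOA VOTable
    })
    resp.raise_for_status()
    print(resp.text[:500])  # beginning of the VOTable response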
Information Systems Security Management: A Review and a Classification of the ISO Standards
NASA Astrophysics Data System (ADS)
Tsohou, Aggeliki; Kokolakis, Spyros; Lambrinoudakis, Costas; Gritzalis, Stefanos
The need for common understanding and agreement of functional and non-functional requirements is well known and understood by information system designers. This is necessary both for designing the "correct" system and for achieving interoperability with other systems. Security is perhaps the best example of this need. If the understanding of the security requirements is not the same for all involved parties, and the security mechanisms that will be implemented do not comply with globally accepted rules and practices, then the system that is designed will not necessarily achieve the desired security level, and it will be very difficult for it to interoperate securely with other systems. It is therefore clear that the role and contribution of international standards in the design and implementation of security mechanisms is dominant. In this paper we provide a state-of-the-art review of information security management standards published by the International Organization for Standardization and the International Electrotechnical Commission. Such an analysis is meaningful to security practitioners for efficient management of information security. Moreover, the classification of the standards into the clauses of ISO/IEC 27001:2005 that results from our analysis is expected to provide assistance in dealing with the plethora of security standards.
Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things
Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan
2015-01-01
Researchers are increasingly focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
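A hedged sketch of how a client could read data from an OGC SensorThings API service such as those evaluated above (Python; the service root is a placeholder, while the entity names and query options are part of the published SensorThings standard):

    import requests

    ROOT = "https://example.org/SensorThingsService/v1.0"  # hypothetical service root

    # List Things and expand each one's Datastreams, per the SensorThings data model
    things = requests.get(f"{ROOT}/Things", params={"$expand": "Datastreams"}).json()
    for thing in things.get("value", []):
        print(thing["name"])
        for ds in thing.get("Datastreams", []):
            # Fetch the most recent Observation of each Datastream
            obs = requests.get(
                f"{ROOT}/Datastreams({ds['@iot.id']})/Observations",
                params={"$top": 1, "$orderby": "phenomenonTime desc"},
            ).json()
            for o in obs.get("value", []):
                print(" ", ds["name"], o["result"])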
Metadata to Describe Genomic Information.
Delgado, Jaime; Naro, Daniel; Llorente, Silvia; Gelpí, Josep Lluís; Royo, Romina
2018-01-01
Interoperable metadata is key for the management of genomic information. We propose a flexible approach that we are contributing to the ISO/IEC standardization of a new format for efficient and secure compressed storage and transmission of genomic information.
A Virtual Observatory Approach to Planetary Data for Vesta and Ceres
NASA Astrophysics Data System (ADS)
Giardino, M.; Fonte, S.; Politi, R.; Ivanovski, S.; Longobardo, A.; Capria, M. T.; Erard, S.; De Sanctis, M. C.
2018-04-01
A virtual observatory service for DAWN/VIR spectral dataset is presented, based upon the IVOA standards adapted to the planetary field. Advantages of such an approach will be discussed, especially concerning interoperability and availability.
Advances in Using Opensearch for Earth Science Data Discovery and Interoperability
NASA Astrophysics Data System (ADS)
Newman, D. J.; Mitchell, A. E.
2014-12-01
As per www.opensearch.org, "OpenSearch is a collection of simple formats for the sharing of search results." A number of organizations (NASA, ESA, CEOS) have begun to adopt this standard as a means of allowing both the discovery of earth science data and the aggregation of results from disparate data archives. OpenSearch has proven to be simpler and more effective at achieving these goals than previous efforts (Catalogue Service for the Web, for example). This talk will outline: the basic ideas behind OpenSearch; the ways in which we have extended the basic specification to accommodate the Earth Science use case (two-step searching, relevancy ranking, facets); a case study of the above in action (CWICSmart + IDN OpenSearch + CWIC OpenSearch); the potential for interoperability this simple standard affords; and a discussion of where we can go in the future.
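A hedged sketch of what such an OpenSearch request can look like with the Geo and Time extensions (Python; the endpoint and the exact parameter names are placeholders, since each deployment declares its own URL template in its OpenSearch Description Document):

    import requests

    ENDPOINT = "https://example.org/opensearch/granules.atom"  # hypothetical endpoint

    resp = requests.get(ENDPOINT, params={
        "keyword": "sea surface temperature",  # os:searchTerms slot
        "boundingBox": "-180,-90,180,90",      # geo:box (west,south,east,north)
        "startTime": "2014-01-01T00:00:00Z",   # time:start
        "endTime": "2014-12-31T23:59:59Z",     # time:end
        "numberOfResults": 10,
    })
    print(resp.status_code, len(resp.text))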
Kobayashi, Shinji; Kume, Naoto; Yoshihara, Hiroyuki
2015-01-01
In 2001, we developed an EHR system for regional healthcare information exchange and for providing individual patient data to patients. This system was adopted in three regions in Japan. We also developed a Medical Markup Language (MML) standard for inter- and intra-hospital communications. The system was built on a legacy platform, however, and had not been appropriately maintained or updated to meet clinical requirements. To reduce future maintenance costs, we reconstructed the EHR system using archetype technology on the Ruby on Rails platform, and generated MML-equivalent forms from archetypes. The system was deployed as a cloud-based system for preliminary use as a regional EHR. The system can now keep pace with new requirements, maintaining semantic interoperability through archetype technology. It is also more flexible than the legacy EHR system.
Marschollek, Michael; Wolf, Klaus-H; Bott, Oliver-J; Geisler, Mirko; Plischke, Maik; Ludwig, Wolfram; Hornberger, Andreas; Haux, Reinhold
2007-01-01
Despite the abundance of past home care projects and the maturity of the technologies used, there has been no widespread dissemination as yet. The absence of accepted standards, and thus of interoperability, and the inadequate integration into transinstitutional health information systems (tHIS) are perceived as key factors. Based on the respective literature and previous experiences in home care projects, we propose an architectural model for home care as part of a transinstitutional health information system, using the HL7 clinical document architecture (CDA) as well as the HL7 Arden Syntax for Medical Logic Systems. In two short case studies we describe the practical realization of the architecture as well as first experiences. Our work can be regarded as a first step towards an interoperable - and in our view sustainable - home care architecture based on a prominent document standard from the health information system domain.
2015-01-01
Background: A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple’s smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user together with official health records collected by health professionals. Mobile apps are well positioned to play an important role in the aggregation since they can tap into these official and informal health and data silos.

Objective: The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place.

Methods: For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems.

Results: The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users use the app worldwide to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to be able to exchange data with servers, and lack of standards usage, such as Health Level Seven, for medical information exchange.

Conclusions: By limiting the negative effects of health data silos, mobile apps can offer a better holistic view of health and fitness data. Data can then be analyzed to offer better and more personalized advice and care. PMID:26581920
Gay, Valerie; Leijdekkers, Peter
2015-11-18
A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple's smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user together with official health records collected by health professionals. Mobile apps are well positioned to play an important role in the aggregation since they can tap into these official and informal health and data silos. The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place. For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems. The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users use the app worldwide to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to be able to exchange data with servers, and lack of standards usage, such as Health Level Seven, for medical information exchange. By limiting the negative effects of health data silos, mobile apps can offer a better holistic view of health and fitness data. Data can then be analyzed to offer better and more personalized advice and care.
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Access and Interoperability
NASA Astrophysics Data System (ADS)
Fan, D.; He, B.; Xiao, J.; Li, S.; Li, C.; Cui, C.; Yu, C.; Hong, Z.; Yin, S.; Wang, C.; Cao, Z.; Fan, Y.; Mi, L.; Wan, W.; Wang, J.
2015-09-01
The data access and interoperability module connects the observation proposals, data, virtual machines and software. Using a PI's (principal investigator) unique identifier, an email address or an internal ID, data can be retrieved by the PI's proposals or through the search interfaces, e.g. cone search. Files associated with the search results can easily be transferred to cloud storage, including the storage attached to virtual machines or commercial platforms such as Dropbox. Thanks to the standards of the IVOA (International Virtual Observatory Alliance), VOTable-formatted search results can be sent to various kinds of VO software. Future work will integrate more data and connect archives and other astronomical resources.
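A minimal sketch of the kind of call such a cone-search interface supports, following the IVOA Simple Cone Search convention of RA, DEC and SR parameters with a VOTable reply (Python; the service URL is a hypothetical placeholder):

    import requests

    SERVICE = "https://example.org/astrocloud/conesearch"  # hypothetical URL

    # RA/DEC/SR in decimal degrees, as fixed by the cone-search standard
    resp = requests.get(SERVICE, params={"RA": 180.0, "DEC": -30.0, "SR": 0.5})
    resp.raise_for_status()
    votable_xml = resp.text  # VOTable XML, consumable by VO tools
    print(votable_xml[:300])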
Search for supporting methodologies - Or how to support SEI for 35 years
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Masline, Richard C.
1991-01-01
Concepts relevant to the development of an evolvable information management system are examined in terms of support for the Space Exploration Initiative. The issues of interoperability within NASA and industry initiatives are studied, including the Open Systems Interconnection standard and the operating system of the Open Software Foundation. The requirements for partitioning functionality into separate areas are determined, with attention given to the infrastructure required to ensure system-wide compliance. The need for a decision-making context is key to the distributed implementation of the program, and this environment is concluded to be the next step in developing an evolvable, interoperable, and securable support network.
The GEOSS solution for enabling data interoperability and integrative research.
Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola
2014-03-01
Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have focused on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standard models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.
A common type system for clinical natural language processing
2013-01-01
Background: One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings.

Results: We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later.

Conclusions: We have created a type system that targets deep semantics, thereby allowing for NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than the surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types. PMID:23286462
Oluoch, Tom; Muturi, David; Kiriinya, Rose; Waruru, Anthony; Lanyo, Kevin; Nguni, Robert; Ojwang, James; Waters, Keith P; Richards, Janise
2015-01-01
Sub-Saharan Africa (SSA) bears the heaviest burden of the HIV epidemic. Health workers play a critical role in the scale-up of HIV programs. SSA also has the weakest information and communication technology (ICT) infrastructure globally. Implementing interoperable national health information systems (HIS) is a challenge, even in developed countries. Countries in resource-limited settings have yet to demonstrate that interoperable systems can be achieved and can improve quality of healthcare through enhanced data availability and use in the deployment of the health workforce. We established interoperable HIS integrating a Master Facility List (MFL), the District Health Information Software (DHIS2), and Human Resources Information Systems (HRIS) through application programming interfaces (APIs). We abstracted data on HIV care, health worker deployment, and health facility geo-coordinates. Over 95% of data elements were exchanged between the MFL-DHIS and HRIS-DHIS links. The correlation between the number of HIV-positive clients and nurses and clinical officers in 2013 was R²=0.251 and R²=0.261, respectively. Wrong MFL codes, data-type mismatches and hyphens in legacy data were key causes of data transmission errors. The lack of information exchange standards for aggregate data made programming time-consuming.
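A hedged sketch of the style of exchange such API links involve, here pushing one aggregate value into DHIS2 through its Web API (Python; the host, credentials and all UID values are hypothetical placeholders, not the Kenyan systems described above):

    import requests

    BASE = "https://dhis2.example.org/api"  # hypothetical DHIS2 instance
    AUTH = ("apiuser", "apipassword")       # placeholder credentials

    payload = {
        "dataSet": "aLpVgfXiz0f",           # placeholder data set UID
        "period": "201301",
        "orgUnit": "DiszpKrYNg8",           # placeholder facility UID (e.g., from the MFL)
        "dataValues": [
            {"dataElement": "f7n9E0hX8qk", "value": "12"},  # placeholder element UID
        ],
    }
    resp = requests.post(f"{BASE}/dataValueSets", json=payload, auth=AUTH)
    print(resp.json().get("status"))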
Lemnos interoperable security project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halbgewachs, Ronald D.
2010-03-01
With the Lemnos framework, interoperability of control security equipment is straightforward. To obtain interoperability between proprietary security appliance units today, one or both vendors must write cumbersome 'translation code.' If one party changes something, the translation code 'breaks.' The Lemnos project is developing and testing a framework that uses widely available security functions and protocols like IPsec - to form a secure communications channel - and Syslog, to exchange security log messages. Using this model, security appliances from two or more different vendors can clearly and securely exchange information, helping to better protect the total system. Leveraging the Lemnos framework also simplifies regulatory compliance in a complicated security environment. As an electric utility, are you struggling to implement the NERC CIP standards and other regulations? Are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs. The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols like IPsec and Syslog to create a secure communications channel between appliances in order to exchange security data.
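A minimal sketch of the Syslog half of that pattern, emitting a security event toward a peer appliance's collector (Python standard library; in a Lemnos-style deployment this traffic would travel inside the IPsec tunnel, and the collector address is a placeholder):

    import logging
    import logging.handlers

    logger = logging.getLogger("lemnos-demo")
    logger.setLevel(logging.INFO)

    # Point the handler at the peer appliance's syslog collector (UDP port 514)
    handler = logging.handlers.SysLogHandler(address=("192.0.2.10", 514))
    logger.addHandler(handler)

    logger.info("firewall: blocked inbound connection from 198.51.100.7")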
A common type system for clinical natural language processing.
Wu, Stephen T; Kaggal, Vinod C; Dligach, Dmitriy; Masanz, James J; Chen, Pei; Becker, Lee; Chapman, Wendy W; Savova, Guergana K; Liu, Hongfang; Chute, Christopher G
2013-01-03
One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. We have created a type system that targets deep semantics, thereby allowing for NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than the surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types.
JAXA-NASA Interoperability Demonstration for Application of DTN Under Simulated Rain Attenuation
NASA Technical Reports Server (NTRS)
Suzuki, Kiyoshisa; Inagawa, Shinichi; Lippincott, Jeff; Cecil, Andrew J.
2014-01-01
As is well known, K-band and higher-band communications in the space link segment often experience intermittent disruptions caused by heavy rainfall. With a view to keeping data integrity and establishing autonomous operations under such situations, it is important to consider introducing a tolerance mechanism such as Delay/Disruption Tolerant Networking (DTN). The Consultative Committee for Space Data Systems (CCSDS) is studying DTN as part of its standardization activities for space data systems. As a contribution to CCSDS and a feasibility study for future utilization of DTN, the Japan Aerospace Exploration Agency (JAXA) and the National Aeronautics and Space Administration (NASA) conducted an interoperability demonstration to confirm DTN's tolerance mechanism and automatic-operation capability, using the Data Relay Test Satellite (DRTS) space link and its ground terminals. Both parties used the Interplanetary Overlay Network (ION) open source software, including the Bundle Protocol, the Licklider Transmission Protocol, and Contact Graph Routing. This paper introduces the contents of the interoperability demonstration and its results.
To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure
NASA Astrophysics Data System (ADS)
Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne
2012-08-01
A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.
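A small illustration of the general idea of machine-readable semantic descriptions of resources, sketched with RDF and SPARQL (Python with rdflib; the vocabulary here is generic Dublin Core and the resource URI is invented, not the paper's own ontologies):

    from rdflib import Graph, Literal, URIRef, Namespace

    DCT = Namespace("http://purl.org/dc/terms/")
    g = Graph()

    # Describe a (hypothetical) scientific resource semantically
    dataset = URIRef("https://example.org/resource/coastal-survey")
    g.add((dataset, DCT.title, Literal("Coastal erosion survey")))
    g.add((dataset, DCT.subject, Literal("geomorphology")))

    # Discovery then becomes a query over the semantic descriptions
    results = g.query("""
        PREFIX dct: <http://purl.org/dc/terms/>
        SELECT ?r WHERE { ?r dct:subject "geomorphology" }
    """)
    for (r,) in results:
        print(r)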
Platform links clinical data with electronic health records
To make data gathered from patients in clinical trials available for use in standard care, NCI has created a new computer tool to support interoperability between clinical research and electronic health record systems. This new software represents an inno
2017-01-01
Background: Electronic health (eHealth) interventions may improve the quality of care by providing timely, accessible information about one patient or an entire population. Electronic patient care information forms the nucleus of computerized health information systems. However, interoperability among systems depends on the adoption of information standards. Additionally, investing in technology systems requires cost-effectiveness studies to ensure the sustainability of processes for stakeholders.

Objective: The objective of this study was to assess cost-effectiveness of the use of electronically available inpatient data systems, health information exchange, or standards to support interoperability among systems.

Methods: An overview of systematic reviews was conducted, assessing the MEDLINE, Cochrane Library, LILACS, and IEEE Library databases to identify relevant studies published through February 2016. The search was supplemented by citations from the selected papers. The primary outcome sought the cost-effectiveness, and the secondary outcome was the impact on quality of care. Independent reviewers selected studies, and disagreement was resolved by consensus. The quality of the included studies was evaluated using a measurement tool to assess systematic reviews (AMSTAR).

Results: The primary search identified 286 papers, and two papers were manually included. A total of 211 were systematic reviews. From the 20 studies that were selected after screening the title and abstract, 14 were deemed ineligible, and six met the inclusion criteria. The interventions did not show a measurable effect on cost-effectiveness. Despite the limited number of studies, the heterogeneity of electronic systems reported, and the types of intervention in hospital routines, it was possible to identify some preliminary benefits in quality of care. Hospital information systems, along with information sharing, had the potential to improve clinical practice by reducing staff errors or incidents, improving automated harm detection, monitoring infections more effectively, and enhancing the continuity of care during physician handoffs.

Conclusions: This review identified some benefits in the quality of care but did not provide evidence that the implementation of eHealth interventions had a measurable impact on cost-effectiveness in hospital settings. However, further evidence is needed to infer the impact of standards adoption or interoperability in cost benefits of health care; this in turn requires further research. PMID:28851681
Reis, Zilma Silveira Nogueira; Maia, Thais Abreu; Marcolino, Milena Soriano; Becerra-Posada, Francisco; Novillo-Ortiz, David; Ribeiro, Antonio Luiz Pinho
2017-08-29
Electronic health (eHealth) interventions may improve the quality of care by providing timely, accessible information about one patient or an entire population. Electronic patient care information forms the nucleus of computerized health information systems. However, interoperability among systems depends on the adoption of information standards. Additionally, investing in technology systems requires cost-effectiveness studies to ensure the sustainability of processes for stakeholders. The objective of this study was to assess cost-effectiveness of the use of electronically available inpatient data systems, health information exchange, or standards to support interoperability among systems. An overview of systematic reviews was conducted, assessing the MEDLINE, Cochrane Library, LILACS, and IEEE Library databases to identify relevant studies published through February 2016. The search was supplemented by citations from the selected papers. The primary outcome sought the cost-effectiveness, and the secondary outcome was the impact on quality of care. Independent reviewers selected studies, and disagreement was resolved by consensus. The quality of the included studies was evaluated using a measurement tool to assess systematic reviews (AMSTAR). The primary search identified 286 papers, and two papers were manually included. A total of 211 were systematic reviews. From the 20 studies that were selected after screening the title and abstract, 14 were deemed ineligible, and six met the inclusion criteria. The interventions did not show a measurable effect on cost-effectiveness. Despite the limited number of studies, the heterogeneity of electronic systems reported, and the types of intervention in hospital routines, it was possible to identify some preliminary benefits in quality of care. Hospital information systems, along with information sharing, had the potential to improve clinical practice by reducing staff errors or incidents, improving automated harm detection, monitoring infections more effectively, and enhancing the continuity of care during physician handoffs. This review identified some benefits in the quality of care but did not provide evidence that the implementation of eHealth interventions had a measurable impact on cost-effectiveness in hospital settings. However, further evidence is needed to infer the impact of standards adoption or interoperability in cost benefits of health care; this in turn requires further research.
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues that require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, employing Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time; interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is examined for emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure to present an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users by warning e-mails in emergency cases. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in timely sensor observations.
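For reference, the AQI is a piecewise-linear interpolation over pollutant breakpoints. A sketch using the US EPA 8-hour CO breakpoints follows (Python); the abstract does not state which national index definition the Tehran deployment used, so these numbers are illustrative:

    CO_BREAKPOINTS = [  # (conc_low, conc_high, index_low, index_high), ppm
        (0.0, 4.4, 0, 50), (4.5, 9.4, 51, 100), (9.5, 12.4, 101, 150),
        (12.5, 15.4, 151, 200), (15.5, 30.4, 201, 300),
        (30.5, 40.4, 301, 400), (40.5, 50.4, 401, 500),
    ]

    def co_aqi(conc_ppm: float) -> int:
        """Interpolate linearly within the breakpoint segment containing conc_ppm."""
        for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
            if c_lo <= conc_ppm <= c_hi:
                return round((i_hi - i_lo) / (c_hi - c_lo) * (conc_ppm - c_lo) + i_lo)
        raise ValueError("concentration outside the AQI scale")

    print(co_aqi(6.3))  # -> 69, a 'moderate' reading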
GMSEC Interface Specification Document 2016 March
NASA Technical Reports Server (NTRS)
Handy, Matthew
2016-01-01
The GMSEC Interface Specification Document contains the standard set of defined messages. Each GMSEC standard message contains a GMSEC Information Bus Header section and a Message Contents section. Each message section identifies required fields, optional fields, data type and recommended use of the fields. Additionally, this document includes the message subjects associated with the standard messages. The system design of the operations center should ensure the components that are selected use both the API and the defined standard messages in order to achieve full interoperability from component to component.
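A rough sketch of the message shape the ISD describes - a subject plus header and contents sections (plain Python; the field and subject names below are illustrative only, as the authoritative required/optional field lists live in the ISD itself):

    # Illustrative only; not a verbatim ISD message definition
    message = {
        "subject": "GMSEC.MISSION1.SAT1.MSG.LOG.COMP1",  # example subject pattern
        "header": {                  # GMSEC Information Bus Header section
            "HEADER-VERSION": 2016.0,
            "MESSAGE-TYPE": "MSG",   # e.g., message vs. request vs. response
            "MESSAGE-SUBTYPE": "LOG",
        },
        "contents": {                # Message Contents section
            "EVENT-TIME": "2016-03-01T12:00:00Z",
            "MSG-TEXT": "Component started",
        },
    }
    print(message["subject"])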
Evolution of the Standard Simulation Architecture
2004-06-01
...carefully followed for software to be successfully reused in other programs. The Standard Simulation Architecture (SSA) promotes these principles. The capabilities described in this paper have been developed and successfully used on various government programs. This paper attempts to bring these...
Sass, Julian; Becker, Kim; Ludmann, Dominik; Pantazoglou, Elisabeth; Dewenter, Heike; Thun, Sylvia
2018-01-01
A nationally uniform medication plan has recently become part of German legislation. The specification for the German medication plan was developed in cooperation between various stakeholders of the healthcare system. Its goal is to enhance usability and interoperability while also providing patients and physicians with the information they require for safe, high-quality therapy. Within the research and development project named Medication Plan PLUS, the specification of the medication plan was tested and reviewed, in particular for semantic interoperability. In this study, the list of pharmaceutical dose forms provided in the specification was mapped to the standard terms of the European Directorate for the Quality of Medicines & HealthCare (EDQM) by different coders. The level of agreement between coders was calculated using Cohen's Kappa (κ). Results show that less than half of the dose forms could be coded with EDQM standard terms. In addition, Kappa was found to be only moderate, indicating rather unconvincing agreement among coders. In conclusion, there is still vast room for improvement in the utilization of standardized international vocabularies, and unused potential for future cross-border eHealth implementations.
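For reference, Cohen's kappa corrects raw coder agreement for chance agreement; the standard definition (not specific to this study) is

    \kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement between coders and p_e the proportion expected by chance; values of roughly 0.41-0.60 are conventionally described as 'moderate'.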
Implementing PAT with Standards
NASA Astrophysics Data System (ADS)
Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.
2016-02-01
Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent in inconsistent representations of business processes, and the interoperability issues that arise in cap-and-trade mechanisms like PAT, especially when they are scaled up. Studies by various agencies have highlighted that implementation will become more challenging as the mechanism evolves to include more industrial sectors and industries in its ambit. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforesaid challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation, for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, and 62325 combined), for information exchange. A detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.
Lin, M.C.; Vreeman, D.J.; Huff, S.M.
2012-01-01
Objectives: We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) determine whether any contradictory knowledge exists in LOINC; 2) determine how many LOINC codes were used in a truly interoperable fashion between systems; and 3) provide suggestions for improving the semantic interoperability of LOINC.

Methods: We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness.

Results: The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693 as obtained from ARUP, Intermountain and Regenstrief, respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief, respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge, and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I - no loss of meaning; complete information was exchanged by identical codes. 2) Level II - no loss of meaning, but processing of data was needed to make the data completely comparable. 3) Level III - some loss of meaning; for example, tests with a specific 'method' could be rolled up with tests that were 'methodless'.

Conclusions: There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises. To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add computable relationships that can be used for making reliable inferences about the data. The LOINC committee should also provide detailed guidance on best practices for mapping from local codes to LOINC codes and for using LOINC codes in data exchange. PMID:22306382
Secure and interoperable communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David
2016-05-01
The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution the design of a next-generation communication infrastructure for PPDR organisations, which fulfils the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks, is presented. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.
Multi-model-based interactive authoring environment for creating shareable medical knowledge.
Ali, Taqdir; Hussain, Maqbool; Ali Khan, Wajahat; Afzal, Muhammad; Hussain, Jamil; Ali, Rahman; Hassan, Waseem; Jamshed, Arif; Kang, Byeong Ho; Lee, Sungyoung
2017-10-01
Technologically integrated healthcare environments can be realized if physicians are encouraged to use smart systems for the creation and sharing of knowledge used in clinical decision support systems (CDSS). While CDSSs are heading toward smart environments, they lack support for abstraction of technology-oriented knowledge from physicians. Therefore, abstraction in the form of a user-friendly and flexible authoring environment is required in order for physicians to create shareable and interoperable knowledge for CDSS workflows. Our proposed system provides a user-friendly authoring environment to create Arden Syntax MLM (Medical Logic Module) as shareable knowledge rules for intelligent decision-making by CDSS. Existing systems are not physician friendly and lack interoperability and shareability of knowledge. In this paper, we proposed Intelligent-Knowledge Authoring Tool (I-KAT), a knowledge authoring environment that overcomes the above mentioned limitations. Shareability is achieved by creating a knowledge base from MLMs using Arden Syntax. Interoperability is enhanced using standard data models and terminologies. However, creation of shareable and interoperable knowledge using Arden Syntax without abstraction increases complexity, which ultimately makes it difficult for physicians to use the authoring environment. Therefore, physician friendliness is provided by abstraction at the application layer to reduce complexity. This abstraction is regulated by mappings created between legacy system concepts, which are modeled as domain clinical model (DCM) and decision support standards such as virtual medical record (vMR) and Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT). We represent these mappings with a semantic reconciliation model (SRM). The objective of the study is the creation of shareable and interoperable knowledge using a user-friendly and flexible I-KAT. Therefore we evaluated our system using completeness and user satisfaction criteria, which we assessed through the system- and user-centric evaluation processes. For system-centric evaluation, we compared the implementation of clinical information modelling system requirements in our proposed system and in existing systems. The results suggested that 82.05% of the requirements were fully supported, 7.69% were partially supported, and 10.25% were not supported by our system. In the existing systems, 35.89% of requirements were fully supported, 28.20% were partially supported, and 35.89% were not supported. For user-centric evaluation, the assessment criterion was 'ease of use'. Our proposed system showed 15 times better results with respect to MLM creation time than the existing systems. Moreover, on average, the participants made only one error in MLM creation using our proposed system, but 13 errors per MLM using the existing systems. We provide a user-friendly authoring environment for creation of shareable and interoperable knowledge for CDSS to overcome knowledge acquisition complexity. The authoring environment uses state-of-the-art decision support-related clinical standards with increased ease of use.
Documenting Models for Interoperability and Reusability ...
Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod
Brooks, Larry M; Kuhlman, Benjamin J; McKesson, Doug W; McCloskey, Leo
2013-01-01
The poor interoperability of anthocyanin glycoside measurements by two pH differential methods is documented. Adams-Harbertson, which was proposed for commercial winemaking, was compared to AOAC Official Method 2005.02 for wine. California bottled wines (Pinot Noir, Merlot, and Cabernet Sauvignon) were assayed in a collaborative study (n=105), which found the mean precision of Adams-Harbertson winery versus reference measurements to be 77 +/- 20%. From the reproducibility RSD, maximum error is expected to be 48% for Pinot Noir, 42% for Merlot, and 34% for Cabernet Sauvignon. The actual range of measurements was 30 to 91% for Pinot Noir. An interoperability study (n=30) found that Adams-Harbertson produces measurements that are nominally 150% of the AOAC pH differential method. The large analytical chemistry differences are: the AOAC method uses the Beer-Lambert equation and measures absorbance at pH 1.0 and 4.5, as proposed a priori by Fuleki and Francis; whereas Adams-Harbertson uses a "universal" standard curve and measures absorbance ad hoc at pH 1.8 and 4.9 to reduce the effects of so-called co-pigmentation. Errors relative to AOAC are produced by the Adams-Harbertson standard curve over Beer-Lambert and pH 1.8 over pH 1.0. The study recommends using AOAC Official Method 2005.02 for analysis of wine anthocyanin glycosides.
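For context, the pH differential calculation underlying AOAC 2005.02 (stated here from the method's public description; monomeric anthocyanins are expressed as cyanidin-3-glucoside equivalents) is

    A = (A_{520} - A_{700})_{\mathrm{pH}\,1.0} - (A_{520} - A_{700})_{\mathrm{pH}\,4.5}

    \text{anthocyanins (mg/L)} = \frac{A \times MW \times DF \times 10^{3}}{\varepsilon \times \ell}

with MW = 449.2 g/mol and ε = 26,900 L·mol⁻¹·cm⁻¹ for cyanidin-3-glucoside, DF the dilution factor, and ℓ the cuvette path length in cm. Adams-Harbertson instead reads absorbance at pH 1.8 and 4.9 against a standard curve, which is where the systematic offset reported above originates.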
CytometryML: a data standard which has been designed to interface with other standards
NASA Astrophysics Data System (ADS)
Leif, Robert C.
2007-02-01
Because the creating organizations differ in their requirements, needs, and past histories, including existing standards, a single encompassing cytology-pathology standard will not, in the near future, replace the multiple standards that exist or are under development. Except for DICOM and FCS, these standardization efforts are all based on XML. CytometryML is a collection of XML schemas, which are based on the Digital Imaging and Communications in Medicine (DICOM) and Flow Cytometry Standard (FCS) datatypes. The CytometryML schemas contain attributes that link them to the DICOM standard and FCS. Interoperability with DICOM has been facilitated by, wherever reasonable, limiting the difference between CytometryML and the previous standards to syntax. In order to permit the Resource Description Framework (RDF) to reference the CytometryML datatypes, id attributes have been added to many CytometryML elements. The Laboratory Digital Imaging Project (LDIP) Data Exchange Specification and the Flowcyt standards development effort employ RDF syntax. Documentation from DICOM has been reused in CytometryML. The unity of analytical cytology was demonstrated by deriving a microscope type and a flow cytometer type from a generic cytometry instrument type. The feasibility of incorporating the Flowcyt gating schemas into CytometryML has been demonstrated. CytometryML is being extended to include many of the new DICOM Working Group 26 datatypes, which describe patients, specimens, and analytes. In situations where multiple standards are being created, interoperability can be facilitated by employing datatypes based on a common set of semantics and building in links to standards that employ different syntax.
NASA Astrophysics Data System (ADS)
Simonis, I.; Alameh, N.; Percivall, G.
2012-04-01
The GEOSS Architecture Implementation Pilots (AIP) develop and pilot new process and infrastructure components for the GEOSS Common Infrastructure (GCI) and the broader GEOSS architecture through an evolutionary development process consisting of a set of phases. Each phase addresses a set of Societal Benefit Areas (SBA) and geoinformatic topics. The first three phases consisted of architecture refinements based on interactions with users; component interoperability testing; and SBA-driven demonstrations. The fourth phase (AIP-4) documented here focused on fostering interoperability arrangements and common practices for GEOSS by facilitating access to priority earth observation data sources and by developing and testing specific clients and mediation components to enable such access. Additionally, AIP-4 supported the development of a thesaurus for earth observation parameters and tutorials to guide data providers in making their data available through GEOSS. The results of AIP-4 are documented in two engineering reports and captured in a series of videos posted online. Led by the Open Geospatial Consortium (OGC), AIP-4 built on contributions from over 60 organizations. This wide portfolio helped test interoperability arrangements in a highly heterogeneous environment. AIP-4 participants cooperated closely to test available data sets, access services, and client applications in multiple workflows and set-ups. Ultimately, AIP-4 improved the accessibility of GEOSS datasets identified as supporting Critical Earth Observation Priorities by the GEO User Interface Committee (UIC), and increased the use of the data by promoting the availability of new data services, clients, and applications. During AIP-4, a number of key earth observation data sources were made available online at standard service interfaces, discovered using brokered search approaches, and processed and visualized in generalized client applications. AIP-4 demonstrated the level of interoperability that can be achieved using currently available standards and corresponding products and implementations. The AIP-4 integration testing process proved that the integration of heterogeneous data resources available via interoperability arrangements such as WMS, WFS, WCS and WPS indeed works. However, the integration often required various levels of customization on the client side to accommodate variations in the service implementations. Those variations seem to stem both from malfunctioning service implementations and from varying interpretations of, or inconsistencies in, existing standards. Other interoperability issues identified revolve around missing metadata or the use of unrecognized identifiers in the description of GEOSS resources. Once such issues are resolved, continuous compliance testing is necessary to minimize variability of implementations. Once data providers can choose from a set of enhanced implementations for offering their data using consistent interoperability arrangements, the barrier for client and decision-support developers will be lowered, leading to true leveraging of earth observation data through GEOSS. AIP-4 results, lessons learnt from the previous AIPs 1-3, and close coordination with the Infrastructure Implementation Board (IIB), the successor of the Architecture and Data Committee (ADC), form the basis, in the current preparation phase, for the next Architecture Implementation Pilot, AIP-5.
The Call for Participation will be launched in February, and the pilot will be conducted from May to November 2012. The current planning foresees a scenario-oriented approach, with possible scenarios coming from the domains of disaster management, health (including air quality and waterborne diseases), water resource observations, energy, biodiversity and climate change, and agriculture.
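A minimal sketch of the kind of standards-based request the AIP-4 clients exercised, here an OGC WMS 1.3.0 GetMap call (Python; the endpoint and layer name are placeholders, while the parameter names are fixed by the WMS standard):

    import requests

    WMS = "https://example.org/wms"  # hypothetical service endpoint

    resp = requests.get(WMS, params={
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "earth_observation_layer",  # placeholder layer name
        "CRS": "EPSG:4326",
        "BBOX": "-90,-180,90,180",  # lat/lon axis order in WMS 1.3.0 for EPSG:4326
        "WIDTH": 800,
        "HEIGHT": 400,
        "FORMAT": "image/png",
    })
    with open("map.png", "wb") as f:
        f.write(resp.content)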
A Linked Dataset of Medical Educational Resources
ERIC Educational Resources Information Center
Dietze, Stefan; Taibi, Davide; Yu, Hong Qing; Dovrolis, Nikolas
2015-01-01
Reusable educational resources have become increasingly important for enhancing learning and teaching experiences, particularly in the medical domain where resources are particularly expensive to produce. However, interoperability across educational resource metadata repositories is as yet limited due to the heterogeneity of metadata standards and interface…
Interoperability Is Key to Smart Grid Success - Continuum Magazine | NREL
Ever wonder what makes it possible to withdraw money securely from another bank's ATM, or call a phone on another carrier's network? Just as standardized communication allows access to money and phone calls nationwide, the Smart Grid - an automated electric power grid - depends on open interoperability standards.
Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond
NASA Astrophysics Data System (ADS)
Tomas, Robert; Lutz, Michael
2013-04-01
The well-known heterogeneity and fragmentation of data models, formats and controlled vocabularies of environmental data limit potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications:

Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonisation, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model) and managed in a common UML repository.

Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonised vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonised terms, utilising either the extensibility options or additional terminological attributes.

Encoding. Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard encoding. However, since the conceptual models are independent of concrete encodings, it is also possible to derive other encodings (e.g. based on RDF).

Registers provide unique and persistent identifiers for a number of different types of information items (e.g. terms from a controlled vocabulary or units of measure) and allow their consistent management and versioning. By using these identifiers in data, references to specific information items can be made unique and unambiguous.

It is important that these interoperability solutions are not developed in isolation - for Europe only. This has been recognised from the beginning, and therefore international standards have been taken into account and are widely referred to in INSPIRE. This mutual cooperation with international standardisation activities needs to be maintained or even extended. For example, where INSPIRE has gone beyond existing standards, the INSPIRE interoperability solutions should be introduced to the international standardisation initiatives. However, in some cases it is difficult to choose the appropriate international organisation or standardisation body (e.g. where several organisations overlap in scope) or to achieve international agreements that accept European specifics. Furthermore, the development of the INSPIRE specifications (to be legally adopted in 2013) is only the beginning of the effort to make environmental data interoperable.
Their actual implementation by data providers across Europe, as well as rapid developments in the earth sciences (e.g. new simulation models and scientific advances) and in ICT, will lead to requests for changes. It is therefore crucial to ensure the long-term, sustainable maintenance and further development of the proposed infrastructure. This task cannot be achieved by the INSPIRE coordination team of the European Commission alone; it requires the close involvement of relevant (where possible, umbrella) organisations in the earth sciences, which can provide the necessary domain knowledge and expert networks.
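The vocabularies and registers pillars meet in the data itself: a field carries a persistent register identifier for the harmonised term and may keep a more specific local term alongside it. A minimal Python sketch of that pattern (the URI and terms are illustrative, not taken from the actual registry):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CodeListValue:
    """A harmonised code-list value plus an optional, more specific local term."""
    href: str                         # persistent identifier from a register
    local_term: Optional[str] = None  # narrower term from a local vocabulary

# Illustrative only: the URI mimics the INSPIRE registry style.
lithology = CodeListValue(
    href="http://inspire.ec.europa.eu/codelist/LithologyValue/sandstone",
    local_term="glauconitic sandstone",
)
print(lithology.href, "|", lithology.local_term)
```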
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier
2015-04-01
This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the data already available at the European level, and with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards, in particular the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax), but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking each data element to a controlled, shared vocabulary. In Europe, INSPIRE defines a shared vocabulary and its associated links to an ontology; for hydrographical information this can be used as a baseline. • Organizational: harmonizing policy aspects. This level of interoperability deals with the operational methodologies and procedures that organizations use to administer their own data and processing capabilities and to share those capabilities with others. This layer is addressed by the adoption of common policy briefs that provide both robust protocols and the flexibility to interact with others. • Data visualization: making data easy to see. The WMS and WMTS standards are the most commonly used geographic information visualization standards for sharing information in web portals. Our solution will incorporate a quality extension of these standards for visualizing data quality as nested layers linked to the different data sets. In the presented approach, the use of standards is twofold: the tools and products should leverage standards wherever possible to ensure interoperability between solution providers, and the platform itself must utilize standards as much as possible, for example to allow integration with other systems through open APIs or the description of available items.
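For the syntactic level described above, catalogue discovery via OGC CSW is the concrete entry point. A minimal sketch using the OWSLib client library; the endpoint URL and search term are assumptions, not WaterInnEU's actual catalogue:

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Hypothetical CSW endpoint (e.g. a GeoNetwork instance).
csw = CatalogueServiceWeb("https://example.org/geonetwork/srv/eng/csw")

# Full-text constraint over all queryable fields.
query = PropertyIsLike("csw:AnyText", "%river discharge%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, "->", rec.title)
```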
Geospatial Standards and the Knowledge Generation Lifecycle
NASA Technical Reports Server (NTRS)
Khalsa, Siri Jodha S.; Ramachandran, Rahul
2014-01-01
Standards play an essential role at each stage in the sequence of processes by which knowledge is generated from geoscience observations, simulations and analysis. This paper provides an introduction to the field of informatics and the knowledge generation lifecycle in the context of the geosciences. In addition, we discuss how the newly formed Earth Science Informatics Technical Committee is helping to advance the application of standards and best practices to make data and data systems more usable and interoperable.
Martínez-Espronceda, Miguel; Trigo, Jesús D; Led, Santiago; Barrón-González, H Gilberto; Redondo, Javier; Baquero, Alfonso; Serrano, Luis
2014-11-01
Experiences applying standards in personal health devices (PHDs) show an inherent trade-off between interoperability and costs (in terms of processing load and development time). Reducing hardware and software costs as well as time-to-market is therefore crucial for standards adoption. The ISO/IEEE 11073 PHD family of standards (also referred to as X73PHD) provides interoperable communication between PHDs and aggregators. Nevertheless, the responsibility for achieving inexpensive implementations of X73PHD on resource-limited microcontrollers falls directly on the developer. Hence, the authors previously presented a methodology based on patterns to implement X73-compliant PHDs on devices with low-voltage, low-power constraints. That version was based on multitasking, which required additional features and resources. This paper therefore presents an event-driven evolution of the patterns-based methodology for cost-effective development of standardized PHDs. A comparison of the two versions showed mean decreases in memory consumption and cycles of latency of 11.59% and 45.95%, respectively. In addition, several enhancements in terms of cost-effectiveness and development time can be derived from the new version of the methodology. The new approach could therefore help in producing cost-effective X73-compliant PHDs, which in turn could foster the adoption of standards.
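The shift from a multitasking to an event-driven design can be pictured as a single dispatch table replacing scheduler-managed tasks. A schematic Python sketch of an event-driven association state machine; the states and events are simplified placeholders, not the standard's full state set or the authors' code:

```python
# Transition table: (current_state, event) -> next_state.
TRANSITIONS = {
    ("disconnected", "transport_connected"): "unassociated",
    ("unassociated", "association_request_sent"): "associating",
    ("associating", "association_accepted"): "operating",
    ("operating", "association_released"): "unassociated",
}

def handle(state: str, event: str) -> str:
    # Events that do not apply in the current state are simply ignored,
    # so no scheduler or per-task stack is needed.
    return TRANSITIONS.get((state, event), state)

state = "disconnected"
for event in ["transport_connected", "association_request_sent", "association_accepted"]:
    state = handle(state, event)
print(state)  # -> operating
```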
The Value of Data and Metadata Standardization for Interoperability in Giovanni
NASA Astrophysics Data System (ADS)
Smit, C.; Hegde, M.; Strub, R. F.; Bryant, K.; Li, A.; Petrenko, M.
2017-12-01
Giovanni (https://giovanni.gsfc.nasa.gov/giovanni/) is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use. Finding and working with dimensions can be convoluted with the CF Conventions. Furthermore, the CF Conventions are silent on machine-friendly descriptive metadata such as the parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be easily found. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about parameters both during parameter selection and after visualization. This poster gives examples of how our metadata and data standards, both external and internal, have both simplified our code base and improved our users' experiences.
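To make the dimension and metadata issue concrete, here is a minimal sketch of reading CF attributes from a NetCDF variable and attaching extra machine-friendly metadata; the file name, variable name, and added attribute names are assumptions, not Giovanni's actual internal convention:

```python
import xarray as xr

# Hypothetical CF-compliant file and variable.
ds = xr.open_dataset("parameter.nc")
var = ds["precipitation"]

# Standard CF metadata, when present.
print(var.attrs.get("standard_name"), var.attrs.get("units"))

# Giovanni-style machine-friendly additions (illustrative names):
var.attrs["quantity_type"] = "Precipitation Rate"
var.attrs["product_short_name"] = "TRMM_3B42"
var.attrs["product_version"] = "7"
```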
Content analysis of physical examination templates in electronic health records using SNOMED CT.
Gøeg, Kirstine Rosenbeck; Chen, Rong; Højen, Anne Randorff; Elberg, Pia
2014-10-01
Most electronic health record (EHR) systems are built on proprietary information models and terminology, which makes achieving semantic interoperability a challenge. Solving interoperability problems requires well-defined standards. In contrast, the need to support clinical work practice requires local customization of EHR systems. Consequently, contrasting goals may be evident in EHR template design, because customization means that local EHR organizations can define their own templates, whereas standardization implies consensus at some level. To explore the complexity of balancing these two goals, this study analyzes the differences and similarities between templates in use today. A similarity analysis was developed on the basis of SNOMED CT and performed on four physical examination templates from Denmark and Sweden. The semantic relationships in SNOMED CT were used to quantify similarities and differences, and the identified similarities were used to investigate the common content of a physical examination template. The analysis showed both similarities and differences between physical examination templates, whose size varied from 18 to 49 fields. In the SNOMED CT analysis, exact matches and terminology similarities were present in all template pairs. The number of exact matches ranged from 7 to 24, while the number of unrelated fields differed considerably, from 1/18 to 22/35. Cross-country comparisons tended to have more unrelated content than within-country comparisons. On the basis of the identified similarities, it was possible to define the common content of a physical examination; nevertheless, a complete view of the physical examination required the inclusion of both exact matches and terminology similarities. This study revealed that a core set of items representing the physical examination templates can be generated when the analysis takes into account not only exact matches but also terminology similarities. This core set of items could be a starting point for standardization and semantic interoperability. However, both unmatched and terminology-matched terms pose a challenge for standardization. Future work will include using local templates as a point of departure in standardization, to see whether local requirements can be maintained in a standardized framework.
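The comparison logic can be sketched as set operations over concept identifiers, assuming each template field has already been mapped to a SNOMED CT concept and that an is-a relation is available. The toy data below is illustrative only, not the study's implementation:

```python
# Toy is-a relation: concept id -> parent concept ids.
PARENTS = {
    "271649006": {"75367002"},  # systolic blood pressure is-a blood pressure
    "271650006": {"75367002"},  # diastolic blood pressure is-a blood pressure
}

def similar(x: str, y: str) -> bool:
    """Terminology similarity: parent/child or shared parent."""
    px, py = PARENTS.get(x, set()), PARENTS.get(y, set())
    return x in py or y in px or bool(px & py)

def compare(a: set, b: set):
    exact = a & b
    related = {x for x in a - exact if any(similar(x, y) for y in b)}
    unrelated = a - exact - related
    return exact, related, unrelated

# Two tiny "templates": systolic BP + heart rate vs. diastolic BP.
print(compare({"271649006", "364075005"}, {"271650006"}))
```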
ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2017-02-15
ImageJ-MATLAB is a lightweight Java library facilitating bidirectional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Availability: freely available extension to ImageJ2 (http://imagej.net/Downloads); installation and use instructions available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. Contact: eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online.
Reference Architecture Model Enabling Standards Interoperability.
Blobel, Bernd
2017-01-01
Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential, even defining, part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.
International Convergence on Geoscience Cyberinfrastructure
NASA Astrophysics Data System (ADS)
Allison, M. L.; Atkinson, R.; Arctur, D. K.; Cox, S.; Jackson, I.; Nativi, S.; Wyborn, L. A.
2012-04-01
There is growing international consensus on addressing the challenges to cyber(e)-infrastructure for the geosciences. These challenges include: creating common standards and protocols; engaging the vast number of distributed data resources; establishing practices for recognition of and respect for intellectual property; developing simple data and resource discovery and access systems; building mechanisms to encourage development of web service tools and workflows for data analysis; brokering the diverse disciplinary service buses; creating sustainable business models for maintenance and evolution of information resources; and integrating the data management life-cycle into the practice of science. Efforts around the world are converging towards the de facto creation of an integrated global digital data network for the geosciences based on common standards and protocols for data discovery and access, and a shared vision of distributed, web-based, open source interoperable data access and integration. Commonalities include the use of Open Geospatial Consortium (OGC) and ISO specifications and standardized data interchange mechanisms. For multidisciplinarity, mediation, adaptation, and profiling services have been successfully introduced to leverage the geoscience standards commonly used by the different geoscience communities, introducing a brokering approach which extends the basic SOA archetype. The principal challenges are less technical than cultural, social, and organizational: before we can make data interoperable, we must make people interoperable. These challenges are being met by increased coordination of development activities (technical, organizational, social) among leaders and practitioners in national and international efforts across the geosciences to foster commonalities across disparate networks. In doing so, we will 1) leverage and share resources and developments, 2) facilitate and enhance emerging technical and structural advances, 3) promote interoperability across scientific domains, 4) support the promulgation and institutionalization of agreed-upon standards, protocols, and practice, 5) enhance knowledge transfer not only across the community but into the domain sciences, 6) lower existing entry barriers for users and data producers, and 7) build on the existing disciplinary infrastructures, leveraging their service buses. All of these objectives are required for establishing a permanent and sustainable cyber(e)-infrastructure for the geosciences. The rationale for this approach is well articulated in the AuScope mission statement: "Many of these problems can only be solved on a national, if not global scale. No single researcher, research institution, discipline or jurisdiction can provide the solutions. We increasingly need to embrace e-Research techniques and use the internet not only to access nationally distributed datasets, instruments and compute infrastructure, but also to build online, 'virtual' communities of globally dispersed researchers." Multidisciplinary interoperability can be successfully pursued by adopting a "system of systems" or "network of networks" philosophy. This approach aims to: (a) supplement but not supplant systems' mandates and governance arrangements; (b) keep the existing capacities as autonomous as possible; (c) lower entry barriers; (d) build incrementally on existing infrastructures (information systems); and (e) incorporate heterogeneous resources by introducing distribution and mediation functionalities.
This approach has been adopted by the European INSPIRE (Infrastructure for Spatial Information in the European Community) initiative and by the international GEOSS (Global Earth Observation System of Systems) programme.
Entity Modeling and Immersive Decision Environments
2011-09-01
Simulation Technologies (REST). Lerman, D. J. (2010). Correct Weather Modeling of non-Standard Days (10F-SIW-004). In Proceedings of the 2010 Fall Simulation...Interoperability Workshop (Fall SIW), SISO. Orlando, FL: SISO. Most flight simulators compute and fly in a weather environment that matches a
Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Annoni, Alessandro
2013-04-01
Global sustainability research requires multi-disciplinary efforts to address the key research challenges and to increase our understanding of the complex relationships between environment and society. For this reason, dependence on ICT systems interoperability is growing rapidly, but, although some relevant technological improvements can be observed, operational interoperable solutions are in practice still lacking. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact, interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it". The main challenges in addressing multi-disciplinary interoperability will be presented, and a set of proposed approaches and solutions briefly introduced.
A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows
NASA Astrophysics Data System (ADS)
Babin, B. L.; Hu, L.
2008-12-01
Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) environmental monitoring systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data-sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The use of Microsoft Windows technologies has presented many challenges to these observing systems as open source tools for interoperability grow, since the current open source tools often require the installation of additional software. In order to make data available in common standard formats, "home grown" software has been developed; one example is software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook for implementing OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
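Since SOS requests are plain HTTP key-value pairs, a client on any platform can exercise such a service regardless of how the server is implemented. A minimal sketch of a SOS 1.0.0 GetObservation request; the endpoint, offering and observed property URI are hypothetical:

```python
import requests

# SOS 1.0.0 key-value-pair GetObservation request.
params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "lumcon_weather_station",   # hypothetical offering id
    "observedProperty": "http://mmisw.org/ont/cf/parameter/air_temperature",
    "responseFormat": 'text/xml;subtype="om/1.0.0"',
}
resp = requests.get("https://example.org/sos", params=params, timeout=30)
print(resp.status_code, resp.headers.get("Content-Type"))
```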
Systems Harmonization and Convergence - the GIGAS Approach
NASA Astrophysics Data System (ADS)
Marchetti, P. G.; Biancalana, A.; Coene, Y.; Uslander, T.
2009-04-01
0.1 Background. The GIGAS Support Action promotes the coherent and interoperable development of the GMES, INSPIRE and GEOSS initiatives through their concerted adoption of standards, protocols, and open architectures. 0.2 Preparing for Coordinated Data Access. The GMES Coordinated Data Access System is under design and implementation. This objective has motivated the definition of interoperability standards between the contributing missions. The following elements have been addressed, with associated papers submitted to OGC: the EO product metadata has been based on the OGC Geography Markup Language, addressing sensor characteristics for optical, radar and atmospheric products; for collection and service discovery, an ISO extension package for CSW ebRIM has been proposed; for the Catalogue Service (CSW), an Earth Observation extension package of CSW ebRIM has been proposed; for feasibility analysis and ordering, an Order interface control document and an Earth Observation profile of the Sensor Planning Service have been proposed; for online data access, an Earth Observation profile of the Web Map Service (WMS) for visualization and evaluation purposes has been proposed; for identity (user) management, the long-term objective is to allow a single sign-on to the Coordinated Data Access system by users registered in the various Earth Observation ground segments, by providing a federated identity across participating ground segments exploiting OASIS standards. 0.3 The GIGAS proposed harmonization approach. The approach proposed by GIGAS is based on three elements: technology watch, comparative analysis, and the shaping of initiatives and standards. This paper concentrates on the methodology for technology watch and comparative analysis. The complexity of the GIGAS scenario, involving huge systems (i.e. GEOSS, INSPIRE, GMES, etc.), entails interaction with different heterogeneous partners, each with specific competences, expertise and know-how. 0.3.1 Technology watch. The proposed methodology is based on an RM-ODP study supported by interoperability use cases and scenarios used to derive requirements. GIGAS will monitor: the evolution of INSPIRE, GMES and GEOSS, comparing the requirements, standards, services, architecture, models, processes and consensus mechanisms of each with the same elements of the other systems under analysis; the standards development activities that are part of the three initiatives, providing the basis for how these initiatives can strategically support consensus and efficient standards development going forward; and the architecture, specifications, innovative concepts and software developments of past or ongoing FP6/FP7 research topics. The RM-ODP approach was selected because most of the architectural approaches to be compared are based on RM-ODP, it supports distributed processing, it aims at fostering interoperability across heterogeneous systems, and it tries to hide distribution from systems developers. However, as most of the systems under consideration have the character of a loosely coupled network of systems and services rather than a "distributed processing system based on interacting objects", the RM-ODP concepts are tailored to the GIGAS needs. The usage of RM-ODP for GIGAS requirements and technology watch is twofold. Architectural analysis is performed for all projects and initiatives; its purpose is to identify possibilities, but also major obstacles, for interoperability.
Furthermore, it identifies the major use cases to be analysed in more detail. Use case implementation analysis describes how selected use cases of the projects and initiatives are implemented in the different architectures; its purpose is to identify technological gaps and concrete interoperability problems, and it is performed only for selected use cases. The output of the technology watch is an RM-ODP-based report containing parallel analyses of the same aspects of the three initiatives, complemented by analysis of relevant FP6/FP7 projects and standardization activities. 0.3.2 Comparative Analysis. Based on the outcomes of the preceding monitoring tasks, GIGAS undertakes a comparative analysis of the solutions, requirements, architecture, models, processes and consensus mechanisms used by INSPIRE, GMES and GEOSS, taking into account the inputs from the monitoring of FP6/FP7 research projects and ongoing standardization activities. Initiative contact points will ensure that the overall policy framework and schedules of each of the three initiatives are factored in. The results of the comparative analysis include: a list of recommendations to GEOSS, INSPIRE and GMES to be expanded and processed in depth in the following shaping phase; the identification of technological gaps to be explored in the shaping phase; guidelines and objectives for the architectural approach within GIGAS; and an analysis of the schedules of the three initiatives and of the FP6/FP7 programmes and standardization activities, identifying key milestones and intervention points.
Friedman, Charles P; Iakovidis, Ilias; Debenedetti, Laurent; Lorenzi, Nancy M
2009-11-01
Countries on both sides of the Atlantic Ocean have invested in health information and communication technologies. Since eHealth challenges cross borders, a European Union-United States of America conference on public policies relating to health IT and eHealth was held October 20-21, 2008 in Paris, France. The conference was organized around four themes: (1) privacy and security, (2) health IT interoperability, (3) deployment and adoption of health IT, and (4) public-private collaborative governance. These four themes framed the discussion over the two days of plenary sessions and workshops, and the key findings of the conference were organized along the same themes. (1) Privacy and security: patients' access to their own data and key elements of a patient identification management framework were discussed. (2) Health IT interoperability: three significant and common interoperability challenges emerged: (a) the need to establish common or compatible standards and clear guidelines for their implementation, (b) the desirability of shared certification criteria, and (c) the need for greater awareness of the importance of interoperability. (3) Deployment and adoption of health IT: three major areas of need emerged: (a) a shared knowledge base and assessment framework, (b) public-private collaboration, and (c) effective organizational change strategies. (4) Public-private collaborative governance: sharing and communication are central to success in this area, and nations can learn from one another about ways to develop harmonious, effective partnerships. The three areas identified as the highest priorities for collaboration were: (1) health data security, (2) developing effective strategies to ensure healthcare professionals' acceptance of health IT tools, and (3) interoperability.
Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data
NASA Astrophysics Data System (ADS)
Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.
2007-12-01
Advances in ecology and environmental science increasingly depend on information from multiple disciplines to tackle broader and more complex questions about the natural world. Such advances, however, are hindered by data heterogeneity, which impedes the ability of researchers to discover, interpret, and integrate relevant data that have been collected by others. Here, we outline two community-building initiatives for improving data interoperability in the ecological and environmental sciences, one that is well established (the Ecological Metadata Language [EML]), and another that is actively underway (a unified model for observations and measurements). EML is a metadata specification developed for the ecology discipline, and is based on prior work done by the Ecological Society of America and associated efforts to ensure a modular and extensible framework to document ecological data. EML "modules" are designed to describe one logical part of the total metadata that should be included with any ecological dataset. EML was developed through a series of working meetings and ongoing discussion forums and email lists, with participation from a broad range of ecological and environmental scientists, as well as computer scientists and software developers. Where possible, EML adopted syntax from metadata standards in other disciplines (e.g., Dublin Core, the Content Standard for Digital Geospatial Metadata, and more). Although EML has not yet been ratified through a standards body, it has become the de facto metadata standard for a large range of ecological data management projects, including the Long Term Ecological Research Network, the National Center for Ecological Analysis and Synthesis, and the Ecological Society of America. The second community-building initiative is based on work through the Scientific Environment for Ecological Knowledge (SEEK) as well as a recent workshop on multi-disciplinary data management. This initiative aims at improving interoperability by describing the semantics of data at the level of observation and measurement (rather than the traditional focus at the level of the data set) and will define the necessary specifications and technologies to facilitate semantic interpretation and integration of observational data for the environmental sciences. As such, this initiative will focus on unifying the various existing approaches for representing and describing observation data (e.g., SEEK's Observation Ontology, CUAHSI's Observation Data Model, and NatureServe's Observation Data Standard, to name a few). Products of this initiative will be compatible with existing standards and build upon recent advances in knowledge representation (e.g., W3C's recommended Web Ontology Language, OWL) that have demonstrated practical utility in enhancing scientific communication and data interoperability in other communities (e.g., the genomics community). A community-sanctioned, extensible, and unified model for observational data will support metadata standards such as EML while reducing the "babel" of scientific dialects that currently impede effective data integration, which will in turn provide a strong foundation for enabling cross-disciplinary synthetic research in the ecological and environmental sciences.
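To illustrate what modular, machine-readable metadata looks like in practice, here is a minimal sketch parsing a hand-written EML-style fragment; the fragment is illustrative only, and real EML documents are far richer:

```python
import xml.etree.ElementTree as ET

# Minimal, hand-written EML-like fragment (illustrative).
EML = """<eml:eml xmlns:eml="eml://ecoinformatics.org/eml-2.1.1"
                 packageId="example.1.1" system="example">
  <dataset>
    <title>Stream temperature at site A, 2006-2007</title>
    <creator><organizationName>Example LTER Site</organizationName></creator>
  </dataset>
</eml:eml>"""

root = ET.fromstring(EML)
# Child elements of the EML root are unqualified, so plain paths work.
print(root.find("./dataset/title").text)
print(root.find("./dataset/creator/organizationName").text)
```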
A standard satellite control reference model
NASA Technical Reports Server (NTRS)
Golden, Constance
1994-01-01
This paper describes a Satellite Control Reference Model that provides the basis for an approach to identify where standards would be beneficial in supporting space operations functions. The background and context for the development of the model and the approach are described. A process for using this reference model to trace top level interoperability directives to specific sets of engineering interface standards that must be implemented to meet these directives is discussed. Issues in developing a 'universal' reference model are also identified.
Implementing a Standards Development Framework for the Coalition Battle Management Language
2013-06-01
and M. Hieb, “Coalition Battle Management (C-BML) Study Group Report”, Paper 05F-SIW-041, Fall Simulation Interoperability Workshop, Sept 2006. [3...J. Abbott, S. Levine, M. Pullen: “Answering The Question Why A BML Standard Has Taken So Long To Be Established?”, Fall 2011 SIW, Orlando USA. [4] K...Heffner et al., “A Systems Engineering Approach to M&S Standards Development: Application to the Coalition Battle Management Language”, 13S-SIW-002
A Bridge to the Future: Observations on Building a Digital Library.
ERIC Educational Resources Information Center
Gaunt, Marianne I.
2002-01-01
The experience of Rutgers University Libraries illustrates the extensive planning, work effort, possibilities, and investment required to develop the digital library. Examines these key areas: organizational structure; staff development needs; facilities and the new digital infrastructure; metadata standards/interoperability; digital collection…
75 FR 11327 - Proposed Establishment of Certification Programs for Health Information Technology
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-10
... may begin demonstrating meaningful use of Certified EHR Technology. The second proposal would... means specific requirements or instructions for implementing a standard. Qualified EHR means an... professionals and eligible hospitals to promote the adoption and meaningful use of interoperable HIT. a...
A multi-service data management platform for scientific oceanographic products
NASA Astrophysics Data System (ADS)
D'Anca, Alessandro; Conte, Laura; Nassisi, Paola; Palazzo, Cosimo; Lecci, Rita; Cretì, Sergio; Mancini, Marco; Nuzzo, Alessandra; Mirto, Maria; Mannarini, Gianandrea; Coppini, Giovanni; Fiore, Sandro; Aloisio, Giovanni
2017-02-01
An efficient, secure and interoperable data platform has been developed in the TESSA project to provide fast navigation of and access to the data stored in the data archive, as well as standards-based metadata management support. The platform mainly targets scientific users and high-level situational sea awareness services such as decision support systems (DSS). The datasets are accessible through three main components: the Data Access Service (DAS), the Metadata Service and the Complex Data Analysis Module (CDAM). The DAS allows access to data stored in the archive by providing interfaces for different protocols and services for downloading, variable selection, data subsetting and map generation. The Metadata Service is the heart of the information system for the TESSA products and completes the overall infrastructure for data and metadata management. This component enables data search and discovery and addresses interoperability by exploiting widely adopted standards for geospatial data. Finally, the CDAM represents the back-end of the TESSA DSS, performing on-demand complex data analysis tasks.
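As an illustration of the kind of subsetting access a DAS-style component can expose, here is a sketch against a hypothetical OPeNDAP endpoint; the URL, variable and coordinate names are assumptions, not the TESSA API:

```python
import xarray as xr

# Hypothetical OPeNDAP endpoint exposed by a data access service.
url = "https://example.org/opendap/mediterranean_sst.nc"
ds = xr.open_dataset(url)  # lazy, network-backed access

# Select a sub-region; only the requested slice is transferred.
subset = ds["sst"].sel(lat=slice(30, 45), lon=slice(10, 25))
subset.to_netcdf("sst_subset.nc")
```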
Park, Yu Rang; Yoon, Young Jo; Kim, Hye Hyeon; Kim, Ju Han
2013-01-01
Achieving semantic interoperability is critical for biomedical data sharing between individuals, organizations and systems. The ISO/IEC 11179 MetaData Registry (MDR) standard has been recognized as one of the solutions for this purpose. The standard model, however, is limited: representing concepts that consist of two or more values, such as blood pressure with its systolic and diastolic components, is not allowed. We addressed the structural limitations of ISO/IEC 11179 with an integrated metadata object model in our previous research. In the present study, we introduce semantic extensions to the model by defining three new types of semantic relationships: dependency, composite and variable relationships. To evaluate our extensions in a real-world setting, we measured the efficiency of metadata reduction by means of mapping to existing metadata. We extracted metadata from the College of American Pathologists Cancer Protocols and then evaluated our extensions. With no semantic loss, one third of the extracted metadata could be successfully eliminated, suggesting a better strategy for implementing clinical MDRs with improved efficiency and utility.
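A composite relationship of the kind proposed, such as blood pressure composed of systolic and diastolic elements, can be sketched as follows; the class and field names are illustrative, not the paper's schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataElement:
    """A single registered data element, as in an ISO/IEC 11179-style MDR."""
    name: str
    datatype: str

@dataclass
class CompositeRelationship:
    """Links a multi-valued concept to the elements that compose it."""
    whole: str
    parts: List[DataElement] = field(default_factory=list)

bp = CompositeRelationship(
    whole="blood pressure",
    parts=[
        DataElement("systolic blood pressure", "integer (mmHg)"),
        DataElement("diastolic blood pressure", "integer (mmHg)"),
    ],
)
print(bp.whole, "->", [p.name for p in bp.parts])
```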
Spyropoulos, B; Tzavaras, A; Zogogianni, D; Botsivaly, M
2013-01-01
The purpose of this paper is to present the design and current development status of an Anesthesia Information Management System (AIMS). For this system, the physical and technical advances depicted in relevant, recently published Industrial Property documents have been taken into account. Additional innovative sensors create a further data load to be managed, and novel wireless data-transmission modes eventually demand compliance with further standards so that interoperability between the AIMS and the existing Hospital Information Systems is sustained. We attempted to define the state of the art concerning the functions, the design prerequisites and the relevant standards of an "emerging" AIMS that combines hardware innovation; real-time data acquisition, processing and display; and the interoperability needed with the other components of the existing Hospital Information Systems. Finally, we report on the design and implementation status of our "real-world" system under development, based on this approach, and discuss the multifarious obstacles encountered during this still ongoing project.
Request redirection paradigm in medical image archive implementation.
Dragan, Dinu; Ivetić, Dragan
2012-08-01
It is widely recognized that JPEG2000 facilitates issues in medical imaging: storage, communication, sharing, remote access, interoperability, and presentation scalability. Therefore, JPEG2000 support was added to the DICOM standard in Supplement 61. Two approaches to supporting JPEG2000 medical images are explicitly defined by the DICOM standard: replacing the DICOM image format with a corresponding JPEG2000 codestream, or using the Pixel Data Provider service (DICOM Supplement 106). The latter assumes a two-step retrieval of the medical image: a DICOM request and response from a DICOM server, followed by a JPIP request and response from a JPEG2000 server. We propose a novel strategy for the transmission of scalable JPEG2000 images extracted from a single codestream over a DICOM network using the DICOM Private Data Element, without sacrificing system interoperability. It employs the request redirection paradigm: a DICOM request and response from the JPEG2000 server through the DICOM server. The paper presents a programming solution for implementing the request redirection paradigm in a DICOM-transparent manner.
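The private-element mechanism that the redirection strategy relies on can be sketched with pydicom; the group/element numbers, creator string and URL layout below are assumptions for illustration, not the paper's published allocation:

```python
from pydicom.dataset import Dataset

ds = Dataset()
# Reserve private block 0x10 in the odd (private) group 0x0009.
ds.add_new(0x00090010, "LO", "REQUEST_REDIRECT")
# Element inside the reserved block carrying the redirected JPIP target.
ds.add_new(0x00091001, "LO", "https://example.org/jpip?target=study1")
print(ds)
```

A DICOM-aware client that does not know the private creator simply ignores these elements, which is what keeps the approach interoperable.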
Development of a space universal modular architecture (SUMO)
NASA Astrophysics Data System (ADS)
Collins, Bernie F.
This concept paper proposes that the space community should develop and implement a universal standard for spacecraft modularity - to improve interoperability of spacecraft components. Pursuing a global industry consensus standard for open and modular spacecraft architecture will encourage trade, remove standards-related market barriers, and in the long run increase both value provided to customers and profitability of the space industrial sector. This concept paper sets out: (1) the goals for a SUMO standard and how it will benefit the space community; (2) background on spacecraft modularity and existing related standards; (3) the proposed technical scope of the current standardization effort; and (4) an approach for creating a SUMO standard.
ACTS 118x: High Speed TCP Interoperability Testing
NASA Technical Reports Server (NTRS)
Brooks, David E.; Buffinton, Craig; Beering, Dave R.; Welch, Arun; Ivancic, William D.; Zernic, Mike; Hoder, Douglas J.
1999-01-01
With the recent explosion of the Internet and the enormous business opportunities available to communication system providers, great interest has developed in improving the efficiency of data transfer over satellite links using the Transmission Control Protocol (TCP) of the Internet Protocol (IP) suite. NASA's ACTS experiments program initiated a series of TCP experiments to demonstrate the scalability of TCP/IP and to determine to what extent the protocol can be optimized over a 622 Mbps satellite link. Through partnerships with government technology-oriented labs and the computer, telecommunication, and satellite industries, NASA Glenn was able to: (1) promote the development of interoperable, high-performance TCP/IP implementations across multiple computing and operating platforms; (2) work with the satellite industry to answer outstanding questions regarding the use of standard protocols (TCP/IP and ATM) for the delivery of advanced data services and for use in spacecraft architectures; and (3) conduct a series of TCP/IP interoperability tests over OC-12 ATM over a satellite network in a multi-vendor environment using ACTS. The experiments' various network configurations and their results are presented.
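A quick bandwidth-delay product (BDP) calculation shows why unmodified TCP cannot fill such a link and why optimizations such as window scaling (RFC 1323) matter; the numbers below are approximate, for illustration:

```python
# BDP = link rate x round-trip time, in bytes.
rate_bps = 622e6   # OC-12 line rate over the satellite link
rtt_s = 0.54       # geostationary round trip, roughly 540 ms

bdp_bytes = rate_bps * rtt_s / 8
print(f"BDP ~ {bdp_bytes / 2**20:.0f} MiB in flight")          # ~40 MiB
# The classic 64 KiB TCP window can keep only a tiny fraction busy:
print(f"64 KiB window utilisation: {64 * 2**10 / bdp_bytes:.4%}")
```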
XDS-I outsourcing proxy: ensuring confidentiality while preserving interoperability.
Ribeiro, Luís S; Viana-Ferreira, Carlos; Oliveira, José Luís; Costa, Carlos
2014-07-01
The interoperability of services and the sharing of health data have been a continuous goal for health professionals, patients, institutions, and policy makers. However, several issues have been hindering this goal, such as incompatible implementations of standards (e.g., HL7, DICOM), multiple ontologies, and security constraints. Cross-enterprise document sharing (XDS) workflows were proposed by Integrating the Healthcare Enterprise (IHE) to address current limitations in exchanging clinical data among organizations. To ensure data protection, XDS actors must be placed in trustworthy domains, which are normally inside such institutions. However, due to rapidly growing IT requirements, the outsourcing of resources to the Cloud is becoming very appealing. This paper presents a software proxy that enables the outsourcing of parts of the XDS architecture while preserving the interoperability, confidentiality, and searchability of clinical information. A key component in our architecture is a new searchable encryption (SE) scheme, Posterior Playfair Searchable Encryption (PPSE), which, besides keeping the same confidentiality levels for the stored data, hides the search patterns from the adversary, an improvement over the other practical state-of-the-art SE schemes.
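For orientation only, here is the basic index-based searchable-encryption idea that schemes like PPSE improve upon; this naive deterministic-token version leaks search patterns, which is exactly the leakage PPSE is designed to hide:

```python
import hmac
import hashlib

KEY = b"shared-secret-key"  # illustrative; real schemes derive per-purpose keys

def trapdoor(keyword: str) -> str:
    # Deterministic keyword token: the same query always yields the same
    # token, so the server learns the search pattern (the weakness PPSE fixes).
    return hmac.new(KEY, keyword.lower().encode(), hashlib.sha256).hexdigest()

# Client builds the encrypted index; the server only ever sees tokens.
index = {trapdoor("radiograph"): ["doc-17", "doc-42"]}

# Server-side lookup using a client-supplied trapdoor.
print(index.get(trapdoor("radiograph"), []))
```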
NASA Technical Reports Server (NTRS)
Lucord, Steve A.; Gully, Sylvain
2009-01-01
The purpose of the PROTOTYPE INTEROPERABILITY DOCUMENT is to document the design and interfaces for the service providers and consumers of a Mission Operations prototype between JSC-OTF and DLR-GSOC. The primary goal is to test the interoperability sections of the CCSDS Spacecraft Monitor & Control (SM&C) Mission Operations (MO) specifications between both control centers. An additional goal is to provide feedback to the Spacecraft Monitor and Control (SM&C) working group through the Review Item Disposition (RID) process. This Prototype is considered a proof of concept and should increase the knowledge base of the CCSDS SM&C Mission Operations standards. No operational capabilities will be provided. The CCSDS Mission Operations (MO) initiative was previously called Spacecraft Monitor and Control (SM&C). The specifications have been renamed to better reflect the scope and overall objectives. The working group retains the name Spacecraft Monitor and Control working group and is under the Mission Operations and Information Services Area (MOIMS) of CCSDS. This document will refer to the specifications as SM&C Mission Operations, Mission Operations or just MO.
Tool and data interoperability in the SSE system
NASA Technical Reports Server (NTRS)
Shotton, Chuck
1988-01-01
Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Topics include industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and the attainment of heterogeneous tool/data interoperability.
Risk Management Considerations for Interoperable Acquisition
2006-08-01
Electronics Engineers (IEEE) to harmonize the standards for software (IEEE 12207) and system (IEEE 15288) life-cycle processes. A goal of this harmonization...management (ISO/IEC 16085) is being generalized to apply to the systems level. The revised, generalized standard will add requirements and guidance for the...risk management. The documents include the following: • ISO/IEC Guide 73: Risk Management—Vocabulary—Guidelines for use in standards [ISO 02
A/E/C CAD Standard, Release 4.0
2009-07-01
Insulating (Transformer) Oil System; Lubrication Oil; Hot Water Heating System; Machine Design; Appendix A: Model File Level/Layer Assignment Tables A51...of the A/E/C CAD Standard are: “Uniform Drawing System”, The Construction Specifications Institute, 99 Canal Center Plaza, Suite 300, Alexandria, VA...FM – Facility Management; GIS – Geographic Information System; IAI – International Alliance for Interoperability; IFC – Industry Foundation
Automated Demand Response for Energy Sustainability Cost and Performance Report
2015-09-01
Install solar thermal system for pool heating in fitness Bldg 325: 2022, $21,359 / $7,199, 3.6 yrs, renewable energy project, p. 124-126. Note: All data...and R. Bienert, 2011. Smart Grid Standards and Systems Interoperability: A Precedent with OpenADR, Lawrence Berkeley National Laboratory, LBNL...response (DR) system at Fort Irwin, CA. This demonstration employed industry-standard OpenADR (Open Automated Demand Response) technology to perform
Commercialization and Standardization Progress Towards an Optical Communications Earth Relay
NASA Technical Reports Server (NTRS)
Edwards, Bernard L.; Israel, David J.
2015-01-01
NASA is planning to launch the next generation of a space-based Earth relay in 2025 to join the current Space Network, consisting of Tracking and Data Relay Satellites in space and the corresponding infrastructure on Earth. While the requirements and architecture for that relay satellite are unknown at this time, NASA is investing in communications technologies that could be deployed to provide new communications services. One of those new technologies is optical communications. The Laser Communications Relay Demonstration (LCRD) project, scheduled for launch in 2018 as a hosted payload on a commercial communications satellite, is a critical pathfinder towards NASA providing optical communications services on the next-generation space-based relay. This paper describes NASA's efforts in the ongoing commercialization of optical communications and the development of interoperability standards. Both are seen as critical to making optical communications a reality on future NASA science and exploration missions. Commercialization is important because NASA would like eventually to be able to simply purchase an entire optical communications terminal from a commercial provider. Interoperability standards are needed to ensure that optical communications terminals developed by one vendor are compatible with the terminals of another. International standards in optical communications would also allow the space missions of one nation to use the infrastructure of another.
Ambient assisted living healthcare frameworks, platforms, standards, and quality attributes.
Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard
2014-03-04
Ambient Assisted Living (AAL) is an emerging multi-disciplinary field that aims to exploit information and communication technologies in personal healthcare and telehealth systems to counter the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality of service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and to investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey collecting usage data and the current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features and ignore many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems are needed to achieve more interoperable and synergetic AAL solutions.
UHF (Ultra High Frequency) Military Satellite Communications Ground Equipment Interoperability.
1986-10-06
crisis management requires interoperability between various services. These short-term crises often arise from unforeseen circumstances in which...Scheduler Qualcomm has prepared an interoperability study for the JTC3A (Reference 15) as a TA/CE for USCINCLANT ROC 5-84 requirements. It has defined a...interoperability is fundamental. A number of operational crises have occurred where interoperable communications or the lack of interoperable
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, as does the Grid-oriented technology that is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all of these problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of geospatial heterogeneous distributed data within a distributed environment, enables the creation and management of large distributed computational jobs, and assures a security level for communication and the transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC web service protocols; the advantages offered by the Grid technology, such as providing secure interoperability between distributed geospatial resources; and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC web service interoperability in general, but specific details are given for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and interoperability between the OGC and Grid standards and protocols are analysed, as they are the basis for solving the communication problems between the two environments. The presentation mainly highlights how the capabilities of the Grid environment and Grid applications can be extended and utilized for geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality and the high computation power and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of the geographical models. In accordance with Service Oriented Architecture concepts and the requirements for interoperability between geospatial and Grid infrastructures, each main piece of functionality is exposed through the enviroGRIDS Portal and, consequently, to end-user applications such as Decision Maker/Citizen oriented applications. The enviroGRIDS portal is the user's single entry point into the system and presents a consistent graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
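Of the OGC services listed, WMS is the simplest to exercise from a client. A minimal OWSLib sketch with a hypothetical endpoint and layer name; the bounding box roughly covers the Black Sea catchment:

```python
from owslib.wms import WebMapService

# Hypothetical WMS endpoint and layer.
wms = WebMapService("https://example.org/wms", version="1.1.1")
img = wms.getmap(
    layers=["land_cover"],
    srs="EPSG:4326",
    bbox=(26.0, 40.0, 42.0, 48.0),  # lon/lat box over the catchment
    size=(800, 500),
    format="image/png",
)
with open("land_cover.png", "wb") as f:
    f.write(img.read())
```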
Towards technical interoperability in telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.
A Standardized Interface for Obtaining Digital Planetary and Heliophysics Time Series Data
NASA Astrophysics Data System (ADS)
Vandegriff, Jon; Weigel, Robert; Faden, Jeremy; King, Todd; Candey, Robert
2016-10-01
We describe a low-level interface for accessing digital Planetary and Heliophysics data, focusing primarily on time-series data from in-situ instruments. As the volume and variety of planetary data have increased, it has become harder to merge diverse datasets into a common analysis environment. We are therefore building low-level, computer-to-computer infrastructure to enable data from different missions or archives to interoperate. The key to enabling interoperability is a simple access interface that standardizes the common capabilities available from any data server: 1. identify the data resources that can be accessed; 2. describe each resource; and 3. get the data from a resource. We have created a standardized way for data servers to perform each of these three activities. We are also developing a standard streaming data format for the actual data content to be returned (i.e., the result of item 3). Our proposed standard access interface is simple enough that it could be implemented on top of or beside existing data services, or it could even be fully implemented by a small data provider as a way to ensure that the provider's holdings can participate in larger data systems or joint analysis with other datasets. We present details of the interface and of the streaming format, including a sample server designed to illustrate the data request and streaming capabilities.
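The three capabilities map naturally onto three HTTP endpoints. A sketch of a client against such an interface; the endpoint and parameter names below are modelled on the description above and are assumptions, not the published specification:

```python
import requests

BASE = "https://example.org/timeseries"  # hypothetical server

# 1. identify the available data resources
catalog = requests.get(f"{BASE}/catalog").json()

# 2. describe one resource (parameters, cadence, time coverage)
info = requests.get(f"{BASE}/info", params={"id": "spacecraft_mag"}).json()

# 3. get the data itself as a stream for a time range
data = requests.get(f"{BASE}/data", params={
    "id": "spacecraft_mag",
    "time.min": "2016-01-01T00:00:00Z",
    "time.max": "2016-01-02T00:00:00Z",
})
print(len(catalog), info.get("parameters"), len(data.text))
```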
Development of NATO's recognized environmental picture
NASA Astrophysics Data System (ADS)
Teufert, John F.; Trabelsi, Mourad
2006-05-01
An important element in fielding a viable, effective NATO Response Force (NRF) is access to meteorological, oceanographic and geospatial data (GEOMETOC) and imagery. Currently, the available GEOMETOC information is highly fragmented. NATO defines the Recognised Environmental Picture (REP) as a controlled information base for GEOMETOC data. The NATO REP proposes an architecture that is both flexible and open, with a focus on enabling a network-centric approach. The key to achieving this is reliance on open, well-recognized standards for both the data exchange protocols and the data formats. Communication and information exchange based on open standards enables system interoperability: diverse systems, each with unique, specialized contributions to an increased understanding of the battlespace, can cooperate within a manageable information sphere. By clearly defining responsibilities for the generation of information, a reduction in data transfer overhead is achieved. REP identifies three main stages in the dissemination of GEOMETOC data: Collection, Fusion (and Analysis) and Publication. A REP architecture was successfully deployed during the NATO Coalition Warrior Interoperability Demonstration (CWID) in Lillehammer, Norway, in June 2005. CWID is an annual event to validate and improve the interoperability of NATO and national consultation, command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR) systems. With a test-case success rate of 84%, the architecture provided relevant GEOMETOC support to the main NRF component headquarters. In 2006, the REP architecture will be deployed and validated during the NATO NRF Steadfast live exercises.
The interoperability force in the ERP field
NASA Astrophysics Data System (ADS)
Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon
2015-04-01
Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems were first located within interoperability frameworks; then the initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP systems acting as information-system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2016-04-01
The increasingly ocean basin level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN) with each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions often using different standards, data formats, technologies etc. that make integration of marine data from these regional systems for the purposes of basin level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. The potential impact for the individual regional data infrastructures of implementing these prototype interoperability solutions is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project which is focussed on regional marine data systems in Europe, Australia, the USA and, more recently, Canada.
Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
Do, Hyoungho
2018-01-01
Objectives: Increasing use of medical devices outside of healthcare facilities inevitably requires connectivity and interoperability between medical devices and healthcare information systems. To this end, standards have been developed and used to provide interoperability between personal health devices (PHDs) and external systems. ISO/IEEE 11073 standards and IHE PCD-01 messages have been used most widely for exchanging device observation data. Recently, transmission of observation data using the HL7 FHIR standard has been devised under the name DoF (Devices on FHIR) and adopted rapidly. We compare and analyze these standards and suggest which standard works best in different device-usage environments. Methods: We generated the messages/resources of the three standards for vital signs observed by a blood pressure monitor and a thermometer, then compared and analyzed their size, contents and exchange processes. Results: The ISO/IEEE 11073 message has the smallest data size, but it cannot carry a key piece of information: patient identity. PCD-01 messages and FHIR resources, by contrast, have fields for patient information. HL7 DoF supports reuse of information units known as resources, and DoF messages are relatively easy to parse since they use the widely known XML and JSON formats. Conclusions: ISO/IEEE 11073 standards are suitable for devices with very limited computing power. IHE PCD-01 and HL7 DoF messages can be used for devices that connect to hospital information systems requiring patient information. When information is reused frequently, DoF is advantageous over PCD-01. PMID:29503752
Lee, Sungkee; Do, Hyoungho
2018-01-01
Increasing use of medical devices outside of healthcare facilities inevitably requires connectivity and interoperability between medical devices and healthcare information systems. To this end, standards have been developed and used to provide interoperability between personal health devices (PHDs) and external systems. ISO/IEEE 11073 standards and IHE PCD-01 messages have been used most widely for exchanging device observation data. Recently, transmission of observation data using the HL7 FHIR standard has been devised under the name DoF (Devices on FHIR) and adopted rapidly. We compare and analyze these standards and suggest which standard works best in different device-usage environments. We generated the messages/resources of the three standards for vital signs observed by a blood pressure monitor and a thermometer, then compared and analyzed their size, contents and exchange processes. The ISO/IEEE 11073 message has the smallest data size, but it cannot carry a key piece of information: patient identity. PCD-01 messages and FHIR resources, by contrast, have fields for patient information. HL7 DoF supports reuse of information units known as resources, and DoF messages are relatively easy to parse since they use the widely known XML and JSON formats. ISO/IEEE 11073 standards are suitable for devices with very limited computing power. IHE PCD-01 and HL7 DoF messages can be used for devices that connect to hospital information systems requiring patient information. When information is reused frequently, DoF is advantageous over PCD-01.
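As an illustration of the point both records make about patient information, a minimal sketch of an HL7 FHIR Observation resource for a blood-pressure reading; the identifiers are hypothetical, and a real Devices-on-FHIR profile adds further required elements:

```python
# Hedged sketch: a minimal FHIR Observation (JSON) carrying a patient reference,
# which a bare ISO/IEEE 11073 message cannot. LOINC codes: 85354-9 = blood
# pressure panel, 8480-6 = systolic, 8462-4 = diastolic.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org",
                         "code": "85354-9",
                         "display": "Blood pressure panel"}]},
    "subject": {"reference": "Patient/example-123"},  # patient info lives here
    "device": {"reference": "Device/bp-monitor-1"},   # hypothetical device id
    "effectiveDateTime": "2018-01-15T09:30:00Z",
    "component": [
        {"code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
         "valueQuantity": {"value": 120, "unit": "mmHg"}},  # systolic
        {"code": {"coding": [{"system": "http://loinc.org", "code": "8462-4"}]},
         "valueQuantity": {"value": 80, "unit": "mmHg"}},   # diastolic
    ],
}
print(json.dumps(observation, indent=2))
```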
Design and Development of a Network-Based Electronic Library.
ERIC Educational Resources Information Center
Larson, Ray R.
1994-01-01
Describes collaboration between the University of California at Berkeley and four other universities to develop interoperable servers containing each participant's Computer Science Technical Reports and to make them available over the Internet using standard protocols. The proposed library architecture, approaches to indexing and retrieval, and…
AN OVERVIEW OF THE INTEROPERABILITY ROADMAP FOR COM/.NET-BASED CAPE-OPEN
The CAPE-OPEN standard interfaces have been designed to permit flexibility and modularization of process modelling environments (PMEs) in order to use process modeling components, such as unit operation or thermodynamic property models, across a range of tools employed in the life...
Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study
2007-06-01
Modeling Notation (BPMN) • Business Process Definition Metamodel (BPDM). A Business Process (BP) is a defined sequence of steps to be executed in...enterprise applications, to evaluate the capabilities of suppliers, and to compare against the competition. BPMN standardizes flowchart diagrams that
Borland, Rob; Barasa, Mourice; Iiams-Hauser, Casey; Velez, Olivia; Kaonga, Nadi Nina; Berg, Matt
2013-01-01
The purpose of this paper is to illustrate the importance of using open source technologies and common standards for interoperability when implementing eHealth systems and illustrate this through case studies, where possible. The sources used to inform this paper draw from the implementation and evaluation of the eHealth Program in the context of the Millennium Villages Project (MVP). As the eHealth Team was tasked to deploy an eHealth architecture, the Millennium Villages Global-Network (MVG-Net), across all fourteen of the MVP sites in Sub-Saharan Africa, the team recognized the need for standards and uniformity but also realized that context would be an important factor. Therefore, the team decided to utilize open source solutions. The MVP implementation of MVG-Net provides a model for those looking to implement informatics solutions across disciplines and countries. Furthermore, there are valuable lessons learned that the eHealth community can benefit from. By sharing lessons learned and developing an accessible, open-source eHealth platform, we believe that we can more efficiently and rapidly achieve the health-related and collaborative Millennium Development Goals (MDGs). PMID:22894051
An overview on STEP-NC compliant controller development
NASA Astrophysics Data System (ADS)
Othman, M. A.; Minhat, M.; Jamaludin, Z.
2017-10-01
The capability of conventional Computer Numerical Control (CNC) machine tools, as the final stage of the manufacturing chain, to fabricate high-quality parts promptly, economically and precisely is undeniable. To date, most CNCs follow the programming standard ISO 6983, also called G & M code. However, in a fluctuating shop-floor environment, the flexibility and interoperability of current CNC systems, and their ability to react dynamically and adaptively, are still limited. This outdated programming language does not explicitly relate program elements to one another and offers no control over arbitrary locations beyond block-by-block motion. To address this limitation, a new standard known as STEP-NC was developed in the late 1990s and formalized as ISO 14649. It adds intelligence to the CNC in terms of interoperability, flexibility, adaptability and openness. This paper presents an overview of the research work that has been done in developing a STEP-NC controller standard and the capabilities of STEP-NC to meet modern manufacturing demands. The review shows that most existing STEP-NC controller prototypes are based on type 1 and type 2 implementation levels; little effort has yet been made to develop type 3 and type 4 STEP-NC compliant controllers.
MSAT and cellular hybrid networking
NASA Astrophysics Data System (ADS)
Baranowsky, Patrick W., II
Westinghouse Electric Corporation is developing both the Communications Ground Segment and the Series 1000 Mobile Phone for American Mobile Satellite Corporation's (AMSC's) Mobile Satellite (MSAT) system. The success of the voice services portion of this system depends, to some extent, upon the interoperability of the cellular network and the satellite circuit-switched communication channels. This paper describes the set of user-selectable cellular interoperable modes (cellular first/satellite second, etc.) provided by the Mobile Phone and describes how they are implemented with the ground segment. Topics including roaming registration and cellular-to-satellite 'seamless' call handoff are discussed, along with the relevant Interim Standard IS-41 Revision B, Cellular Radiotelecommunications Intersystem Operations, and the EIA-553 Mobile Station - Land Station Compatibility Specification.
ACR Imaging IT Reference Guide: Image Sharing: Evolving Solutions in the Age of Interoperability
Erickson, Bradley J.; Choy, Garry
2014-01-01
Interoperability is a major focus of the quickly evolving world of health information technology. Easy yet secure and confidential exchange of imaging exams and the associated reports must be part of the solutions that are implemented. The availability of historical exams is essential to providing a quality interpretation and reducing inappropriate utilization of imaging services. Today, exchange of imaging exams is most often achieved via CD. We describe the virtues of this solution as well as the challenges that have surfaced. Internet- and cloud-based technologies employed for many consumer services can provide a better solution, and vendors are making such solutions available. Standards for internet-based exchange are emerging. Just as radiology converged on DICOM as a standard to store and view images, we need a common exchange standard. We review the existing standards and how they are organized into useful workflows through Integrating the Healthcare Enterprise (IHE) profiles, and discuss IHE and the standards development processes. Healthcare and the domain of radiology must stay current with quickly evolving internet standards. Successful use of the "cloud" will depend upon both the technologies we discuss and the policies put into place around them; we discuss both aspects. The radiology community must lead the way and provide a solution that works for radiologists and clinicians in the electronic medical record (EMR). Lastly, we describe the features we believe radiologists should weigh when adding internet-based exchange solutions to their practice. PMID:25467903
Interoperability in digital electrocardiography: harmonization of ISO/IEEE x73-PHD and SCP-ECG.
Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Serrano, Luis; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José
2010-11-01
The ISO/IEEE 11073 (x73) family of standards is a reference frame for medical device interoperability. A draft for an ECG device specialization (ISO/IEEE 11073-10406-d02) has already been presented to the Personal Health Device (PHD) Working Group, and the Standard Communications Protocol for Computer-Assisted ElectroCardioGraphy (SCP-ECG) Standard for short-term diagnostic ECGs (EN1064:2005+A1:2007) has recently been approved as part of the x73 family (ISO 11073-91064:2009). These factors suggest the coordinated use of these two standards in foreseeable telecardiology environments, and hence the need to harmonize them. Such harmonization is the subject of this paper. Thus, a mapping of the mandatory attributes defined in the second draft of the ISO/IEEE 11073-10406-d02 and the minimum SCP-ECG fields is presented, and various other capabilities of the SCP-ECG Standard (such as the messaging part) are also analyzed from an x73-PHD point of view. As a result, this paper addresses and analyzes the implications of some inconsistencies in the coordinated use of these two standards. Finally, a proof-of-concept implementation of the draft x73-PHD ECG device specialization is presented, along with the conversion from x73-PHD to SCP-ECG. This paper, therefore, provides recommendations for future implementations of telecardiology systems that are compliant with both x73-PHD and SCP-ECG.
Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems
2008-09-01
divisions of the enterprise. Examples of the current I2 are: • a nightly feed of elearning information captured through an automated and...standardized process throughout the enterprise, and • an LMS integrated with SkillSoft, a third-party elearning software system (http...Command (JITC) is responsible for testing all programs that utilize standard interfaces to specific global nets or systems. Many times programs that
Using a logical information model-driven design process in healthcare.
Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen
2011-01-01
A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2016-10-01
above for publication in a peer-reviewed journal; 4. Expand the release of the Clinical Scenario Repository (CSR), also known as "Good Ideas for...the Clinical Scenario Repository (CSR). The CSR pilot with the American Society of Anesthesiologists (ASA) Committee on Patient Safety and Education
Opening up Library Automation Software
ERIC Educational Resources Information Center
Breeding, Marshall
2009-01-01
Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…
Improving Cancer-Related Outcomes with Connected Health - Action Items at a Glance
Action Item 1.1: Health IT stakeholder groups should continue to collaborate to overcome policy and technical barriers to a nationwide, interoperable health IT system. Action Item 1.2: Technical standards for information related to cancer care across the continuum should be developed, tested, disseminated, and adopted.
Leveraging the Semantic Web for Adaptive Education
ERIC Educational Resources Information Center
Kravcik, Milos; Gasevic, Dragan
2007-01-01
In the area of technology-enhanced learning reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches how to overcome these problems--one attempts to do it via standards and the other by means of the Semantic Web. In practice, these approaches…
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2014-10-01
downloads/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDRH/CDRHReports/UCM391521.pdf. Dr. Goldman spoke in multiple panels at this workshop...downloads/AboutFDA/CentersOffices/OfficeofMedicalProductsandTobacco/CDRH/CDRHReports/UCM391521.pdf Arney D, Plourde J, Schrenker R, Mattegunta P
Building the Synergy between Public Sector and Research Data Infrastructures
NASA Astrophysics Data System (ADS)
Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea
2014-05-01
INSPIRE is a European directive aiming to establish an EU-wide spatial data infrastructure giving cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, and monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently about 300K), while all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, made provisions for access to government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability. The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is actively involved in the initiatives promoting cross-sector re-use in INSPIRE and in sustainable approaches to the evolution of technologies, in particular how to support Linked Data in INSPIRE and the use of global persistent identifiers. It is evident that government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which JRC is involved is that government and research data are not two separate worlds: government data are commonly used as a basis for creating scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to the interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities and the sharing of experiences and practices.
NASA Astrophysics Data System (ADS)
Fulker, D. W.; Gallagher, J. H. R.
2015-12-01
OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software-sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.
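A toy sketch of the handler-style dispatch credited above with reducing the N-datatypes-by-M-contexts problem to O(N+M); the names are illustrative, not Hyrax's actual classes:

```python
# Hedged sketch: N input-format readers all map into one common data model,
# and M output encoders all render from it, so adding a format or a client
# context costs one handler rather than N x M pairwise converters.
from typing import Any, Callable, Dict

READERS: Dict[str, Callable[[str], Any]] = {}   # N input-format handlers
ENCODERS: Dict[str, Callable[[Any], str]] = {}  # M output encoders

def reader(fmt):
    def register(fn):
        READERS[fmt] = fn
        return fn
    return register

def encoder(ctx):
    def register(fn):
        ENCODERS[ctx] = fn
        return fn
    return register

@reader("netcdf")
def read_netcdf(path):
    # Toy stand-in for a real NetCDF handler: parse into the common model.
    return {"var": [1, 2, 3]}

@encoder("json")
def encode_json(model):
    import json
    return json.dumps(model)

def serve(path, fmt, ctx):
    """One pipeline: any registered reader feeds any registered encoder."""
    return ENCODERS[ctx](READERS[fmt](path))

print(serve("sample.nc", "netcdf", "json"))
```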
A step-by-step methodology for enterprise interoperability projects
NASA Astrophysics Data System (ADS)
Chalmeta, Ricardo; Pazos, Verónica
2015-05-01
Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving it is an extremely complex process involving different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.
The Future of Geospatial Standards
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Simonis, I.
2016-12-01
The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds, where we can extract a trend for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.
77 FR 26278 - Notice of Issuance of Statement of Federal Financial Accounting Standard 42
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Issuance of Statement of Federal Financial Accounting Standard 42 AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action... Accounting Standards Advisory Board (FASAB) has issued Statement of Federal Financial Accounting Standard 42...
77 FR 33735 - Notice of Issuance of Statement of Federal Financial Accounting Standard 43
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Issuance of Statement of Federal Financial Accounting Standard 43 AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action... Accounting Standards Advisory Board (FASAB) has issued Statement of Federal Financial Accounting Standard 43...
78 FR 2673 - Notice of Issuance of Statement of Federal Financial Accounting Standards 44
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-14
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Issuance of Statement of Federal Financial Accounting Standards 44 AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action... Accounting Standards Advisory Board (FASAB) has issued Statement of Federal Financial Accounting Standard 44...
DICOMweb™: Background and Application of the Web Standard for Medical Imaging.
Genereaux, Brad W; Dennison, Donald K; Ho, Kinson; Horn, Robert; Silver, Elliot Lewis; O'Donnell, Kevin; Kahn, Charles E
2018-05-10
This paper describes why and how DICOM, the standard that has been the basis for medical imaging interoperability around the world for several decades, has been extended into a full web technology-based standard, DICOMweb. At the turn of the century, healthcare embraced information technology, which created new problems and new opportunities for the medical imaging industry; at the same time, web technologies matured and began serving other domains well. This paper describes DICOMweb, how it extended the DICOM standard, and how DICOMweb can be applied to problems facing healthcare applications to address workflow and the changing healthcare climate.
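A minimal sketch of a DICOMweb QIDO-RS study search of the kind the paper describes; the service root is a hypothetical placeholder:

```python
# Hedged sketch: QIDO-RS study search over HTTP. QIDO-RS returns DICOM
# attributes as JSON keyed by 8-hex-digit tag numbers.
import requests

BASE = "https://example.org/dicomweb"  # hypothetical DICOMweb service root

resp = requests.get(
    f"{BASE}/studies",
    params={"PatientID": "12345", "limit": 10},
    headers={"Accept": "application/dicom+json"},
)
for study in resp.json():
    # (0020,000D) is the Study Instance UID tag in DICOM's JSON encoding.
    print(study["0020000D"]["Value"][0])
```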
Building a biomedical cyberinfrastructure for collaborative research.
Schad, Peter A; Mobley, Lee Rivers; Hamilton, Carol M
2011-05-01
For the potential power of genome-wide association studies (GWAS) and translational medicine to be realized, the biomedical research community must adopt standard measures, vocabularies, and systems to establish an extensible biomedical cyberinfrastructure. Incorporating standard measures will greatly facilitate combining and comparing studies via meta-analysis. Incorporating consensus-based and well-established measures into various studies should reduce the variability across studies due to attributes of measurement, making findings across studies more comparable. This article describes two well-established consensus-based approaches to identifying standard measures and systems: PhenX (consensus measures for phenotypes and eXposures), and the Open Geospatial Consortium (OGC). NIH support for these efforts has produced the PhenX Toolkit, an assembled catalog of standard measures for use in GWAS and other large-scale genomic research efforts, and the RTI Spatial Impact Factor Database (SIFD), a comprehensive repository of geo-referenced variables and extensive meta-data that conforms to OGC standards. The need for coordinated development of cyberinfrastructure to support measures and systems that enhance collaboration and data interoperability is clear; this paper includes a discussion of standard protocols for ensuring data compatibility and interoperability. Adopting a cyberinfrastructure that includes standard measures and vocabularies, and open-source systems architecture, such as the two well-established systems discussed here, will enhance the potential of future biomedical and translational research. Establishing and maintaining the cyberinfrastructure will require a fundamental change in the way researchers think about study design, collaboration, and data storage and analysis. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Building a Biomedical Cyberinfrastructure for Collaborative Research
Schad, Peter A.; Mobley, Lee Rivers; Hamilton, Carol M.
2018-01-01
For the potential power of genome-wide association studies (GWAS) and translational medicine to be realized, the biomedical research community must adopt standard measures, vocabularies, and systems to establish an extensible biomedical cyberinfrastructure. Incorporating standard measures will greatly facilitate combining and comparing studies via meta-analysis, which is a means for deriving larger populations, needed for increased statistical power to detect less apparent and more complex associations (gene-environment interactions and polygenic gene-gene interactions). Incorporating consensus-based and well-established measures into various studies should reduce the variability across studies due to attributes of measurement, making findings across studies more comparable. This article describes two consensus-based approaches to establishing standard measures and systems: PhenX (consensus measures for Phenotypes and eXposures), and the Open Geospatial Consortium (OGC). National Institutes of Health support for these efforts has produced the PhenX Toolkit, an assembled catalog of standard measures for use in GWAS and other large-scale genomic research efforts, and the RTI Spatial Impact Factor Database (SIFD), a comprehensive repository of georeferenced variables and extensive metadata that conforms to OGC standards. The need for coordinated development of cyberinfrastructure to support collaboration and data interoperability is clear, and we discuss standard protocols for ensuring data compatibility and interoperability. Adopting a cyberinfrastructure that includes standard measures, vocabularies, and open-source systems architecture will enhance the potential of future biomedical and translational research. Establishing and maintaining the cyberinfrastructure will require a fundamental change in the way researchers think about study design, collaboration, and data storage and analysis. PMID:21521587
Implementing the HL7v3 standard in Croatian primary healthcare domain.
Koncar, Miroslav
2004-01-01
The mission of HL7 Inc. is to provide standards for the exchange, management and integration of data that supports clinical patient care and the management, delivery and evaluation of healthcare services. The scope of this work includes the specifications of flexible, cost-effective approaches, standards, guidelines, methodologies, and related services for interoperability between healthcare information systems. In the field of medical information technologies, HL7 provides the world's most advanced information standards. Versions 1 and 2 of the HL7 standard have on the one hand solved many issues, but on the other demonstrated the size and complexity of the health information sharing problem. As the solution, a complete new methodology has been adopted, which is being encompassed in version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely-coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project, we have decided to go directly to HL7v3. Implementing the HL7v3 standard in healthcare applications represents a challenging task. By using standardized refinement and localization methods we were able to define information models for Croatian primary healthcare domain. The scope of our work includes clinical, financial and administrative data management, where in some cases we were compelled to introduce new HL7v3-compliant models. All of the HL7v3 transactions are digitally signed, using the W3C XML Digital Signature standard.
SCHeMA web-based observation data information system
NASA Astrophysics Data System (ADS)
Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise
2016-04-01
It is well recognized that the need to share ocean data among non-specialized users is constantly increasing. Initiatives built upon international standards will help simplify data processing and dissemination, improve user accessibility, including through web browsers, facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) project is developing an open and modular sensing solution for autonomous in situ high-resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled to master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access and download, as well as interoperability with other projects, through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach ensures data accessibility in compliance with major European directives and recommendations. Being modular, the system allows plug-and-play of commercially available probes as well as new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as an SOS (Sensor Observation Service), provides standard interoperability and access to sensor observations through the O&M standard, as well as sensor descriptions encoded in the Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, describing data in an already harmonized and common standard, is a prerequisite for consistency and interoperability; the SCHeMA SOS has therefore adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing the sensor probes to be managed and configured, the stored observations and metadata to be viewed, and alarms to be handled. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration, a Core Profile interface for data access via the OGC standard, external services such as web services, WMS and WFS, and a data download and query manager) will be presented and illustrated with examples of ongoing tests in coastal and open sea.
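A minimal sketch of an OGC SOS 2.0 KVP GetObservation request of the kind such an interface exposes; the service URL, offering name and observed-property URI are hypothetical placeholders:

```python
# Hedged sketch: fetching O&M-encoded observations from a hypothetical SOS.
import requests

SOS_URL = "https://example.org/schema/sos"  # hypothetical SOS endpoint

resp = requests.get(SOS_URL, params={
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "ChemicalMappingProbe1",                          # hypothetical
    "observedProperty": "http://vocab.example.org/trace_metals",  # hypothetical
    "temporalFilter": "om:phenomenonTime,2016-01-01T00:00:00Z/2016-01-31T23:59:59Z",
})
print(resp.text[:500])  # O&M-encoded observations (XML)
```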
Designing learning management system interoperability in semantic web
NASA Astrophysics Data System (ADS)
Anistyasari, Y.; Sarno, R.; Rochmawati, N.
2018-01-01
The extensive adoption of learning management systems (LMS) has put the focus on interoperability requirements. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies can provide the computational semantics and interoperability required to automate tasks in an LMS. The purpose of this study is to design LMS interoperability on the semantic web, which has not yet been investigated in depth. Moodle is used for the design: several Moodle database tables are enhanced and some features are added. Semantic-web interoperability is provided by exploiting an ontology of the content materials, which is further used as a search tool to match users' queries against available courses. It is concluded that LMS interoperability on the semantic web is feasible.
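A minimal sketch, using rdflib, of the ontology-backed course search the study describes; the ontology file and vocabulary are hypothetical:

```python
# Hedged sketch: match a query term against course topics in an RDF ontology.
from rdflib import Graph

g = Graph()
g.parse("course_ontology.ttl", format="turtle")  # hypothetical ontology file

query = """
PREFIX ex: <http://example.org/lms#>
SELECT ?course WHERE {
    ?course ex:coversTopic ?topic .
    ?topic  ex:label ?label .
    FILTER(CONTAINS(LCASE(?label), "interoperability"))
}
"""
for row in g.query(query):
    print(row.course)  # courses whose topics match the user's query
```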
76 FR 39105 - Notice of Request for Comments on Proposed Deferred Maintenance and Repairs Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-05
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Request for Comments on Proposed Deferred Maintenance and Repairs Standards AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board... Accounting Standards Advisory Board (FASAB) is requesting comments on the Exposure Draft, Deferred...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-10
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Issuance of Statement of Federal Financial Accounting Standard 39, Subsequent Events: Codification of Accounting and Financial Reporting Standards... Programs AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action: Pursuant to 31...
2005-08-01
the Office of the Secretary of Defense chartered the Joint Architecture for Unmanned Ground Systems (JAUGS) Working Group to address these concerns...The JAUGS Working Group was tasked with developing an initial standard for interoperable unmanned ground systems. In 2002, the charter of the...JAUGS Working Group was modified such that their efforts would extend to all unmanned systems, not only ground systems. The standard was
NASA Technical Reports Server (NTRS)
Newman, Doug; Mitchell, Andrew
2016-01-01
During the development of the Common Metadata Repository (CMR) Catalog Service for the Web (CSW) for the Earth Observing System Data and Information System (EOSDIS), a number of best practices came to light. Given that the ESIP (Earth Science Information Partners) Discovery Cluster is committed to interoperability and standards in earth data discovery, this seemed a convenient moment to provide the organization with best practices for this widely used standard, as we previously did for OpenSearch.
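A minimal sketch of a keyword search against the public CMR search API; the endpoint reflects commonly documented CMR usage, and the parameters are illustrative:

```python
# Hedged sketch: keyword discovery of collections via the CMR search API,
# which returns results in an Atom-style JSON feed.
import requests

resp = requests.get(
    "https://cmr.earthdata.nasa.gov/search/collections.json",
    params={"keyword": "sea surface temperature", "page_size": 5},
)
for entry in resp.json()["feed"]["entry"]:
    print(entry["title"])
```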
Defense Standardization Program Journal. April/June 2011
2011-06-01
years, the seven winners have played an integral part in keeping our men and women in uniform safe and in providing them the tools they need to get...supporting our men and women in uniform, helping to multiply capability through interoperability, and saving money for the taxpayer. I hope that reading...committee chair of ASTM B687, "Standard Specification for Brass, Copper, and Chromium-Plated Pipe Nipples" (FSC 4730), to incorporate eight additional
Applicability of IHE/Continua components for PHR systems: learning from experiences.
Urbauer, Philipp; Sauermann, Stefan; Frohner, Matthias; Forjan, Mathias; Pohn, Birgit; Mense, Alexander
2015-04-01
Capturing personal health data using smartphones, PCs or other devices, and the reuse of the data in personal health records (PHR) is becoming more and more attractive for modern health-conscious populations. This paper analyses interoperability specifications targeting standards-based communication of computer systems and personal health devices (e.g. blood pressure monitor) in healthcare from initiatives like Integrating the Healthcare Enterprise (IHE) and Continua Health Alliance driven by industry and healthcare professionals. Furthermore it identifies certain contradictions and gaps in the specifications and suggests possible solutions. Despite these shortcomings, the specifications allow fully functional implementations of PHR systems. Henceforth, both big business and small and medium-sized enterprises (SMEs) can actively contribute to the widespread use of large-scale interoperable PHR systems. Copyright © 2013 Elsevier Ltd. All rights reserved.
Harmonizing clinical terminologies: driving interoperability in healthcare.
Hamm, Russell A; Knoop, Sarah E; Schwarz, Peter; Block, Aaron D; Davis, Warren L
2007-01-01
Internationally, there are countless initiatives to build National Healthcare Information Networks (NHIN) that electronically interconnect healthcare organizations by enhancing and integrating current information technology (IT) capabilities. The realization of such NHINs will enable the simple and immediate exchange of appropriate and vital clinical data among participating organizations. In order for institutions to accurately and automatically exchange information, the electronic clinical documents must make use of established clinical codes, such as those of SNOMED-CT, LOINC and ICD-9 CM. However, there does not exist one universally accepted coding scheme that encapsulates all pertinent clinical information for the purposes of patient care, clinical research and population health reporting. In this paper, we propose a combination of methods and standards that target the harmonization of clinical terminologies and encourage sustainable, interoperable infrastructure for healthcare.
Borlawsky, Tara B.; Dhaval, Rakesh; Hastings, Shannon L.; Payne, Philip R. O.
2009-01-01
In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at-large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part of this initiative. PMID:21347164
Interoperability science cases with the CDPP tools
NASA Astrophysics Data System (ADS)
Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.
2017-12-01
Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, cross-compare observations and modeled data, and finally perform in-depth analysis. Over the years these protocols, including SAMP from the IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, 3DView, ... . This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.
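A minimal sketch of an EPN-TAP discovery query via pyvo, the kind of protocol exchange these tools make invisible to the user; the service URL is a hypothetical placeholder, while the epn_core table and column names follow the EPN-TAP convention:

```python
# Hedged sketch: querying a hypothetical EPN-TAP service with an ADQL query.
import pyvo

service = pyvo.dal.TAPService("https://example.org/tap")  # hypothetical endpoint
results = service.search(
    "SELECT TOP 5 granule_uid, dataproduct_type FROM epn_core "
    "WHERE target_name = 'Mars'"
)
for row in results:
    print(row["granule_uid"])  # identifiers of matching data granules
```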
Borlawsky, Tara B; Dhaval, Rakesh; Hastings, Shannon L; Payne, Philip R O
2009-03-01
In October 2006, the National Institutes of Health launched a new national consortium, funded through Clinical and Translational Science Awards (CTSA), with the primary objective of improving the conduct and efficiency of the inherently multi-disciplinary field of translational research. To help meet this goal, the Ohio State University Center for Clinical and Translational Science has launched a knowledge management initiative that is focused on facilitating widespread semantic interoperability among administrative, basic science, clinical and research computing systems, both internally and among the translational research community at-large, through the integration of domain-specific standard terminologies and ontologies with local annotations. This manuscript describes an agile framework that builds upon prevailing knowledge engineering and semantic interoperability methods, and will be implemented as part of this initiative.
Advances in a distributed approach for ocean model data interoperability
Signell, Richard P.; Snowden, Derrick P.
2014-01-01
An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
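A minimal sketch of the remote, standards-based access path described above: opening a CF-compliant dataset over OPeNDAP with the netCDF4 Python library; the dataset URL and variable name are hypothetical placeholders:

```python
# Hedged sketch: remote subsetting of model output over OPeNDAP. Only the
# requested slices are transferred, not the whole file.
from netCDF4 import Dataset

url = "https://example.org/thredds/dodsC/ocean_model/forecast.nc"  # hypothetical
ds = Dataset(url)  # opens the remote dataset via the DAP protocol

# CF attributes let generic clients interpret the variables.
temp = ds.variables["temp"]  # hypothetical variable name
print(temp.getncattr("standard_name"), temp.getncattr("units"))

surface = temp[0, 0, :, :]  # subset: first time step, top layer
print(surface.shape)
ds.close()
```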
Implementation of Medical Information Exchange System Based on EHR Standard
Han, Soon Hwa; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-01-01
Objectives: To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enables the exchange of different types of data generated by various systems. Methods: To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop the security systems essential for implementing online services, such as encryption of communication, server security, database security, protection against hacking, content security, and network security. Results: The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. Conclusions: This research chooses transfer items and defines document standards that follow CDA standards, such that exchange of CDA documents between different systems becomes possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information. PMID:21818447
[The role of Integrating the Healthcare Enterprise (IHE) in telemedicine].
Bergh, B; Brandner, A; Heiß, J; Kutscha, U; Merzweiler, A; Pahontu, R; Schreiweis, B; Yüksekogul, N; Bronsch, T; Heinze, O
2015-10-01
Telemedicine systems are already used today in a variety of areas to improve patient care. The lack of standardization in these solutions leads to a lack of interoperability between systems, which internationally accepted standards can help resolve. With Integrating the Healthcare Enterprise (IHE), a worldwide initiative of users and vendors is working on applying defined standards to specific use cases by describing those use cases in so-called IHE Profiles. The aim of this work is to determine how telemedicine applications can be implemented using IHE Profiles. Based on a literature review, exemplary telemedicine applications are described and the technical capabilities of IHE Profiles are evaluated. These IHE Profiles are examined for their usability and then evaluated in exemplary telemedicine application architectures. IHE Profiles can be identified as useful both for intersectoral patient records (e.g. the PEHR at Heidelberg) and for point-to-point communication where no patient record is involved. In the area of patient records, the IHE profile "Cross-Enterprise Document Sharing (XDS)" is often used; point-to-point communication can be supported using IHE "Cross-Enterprise Document Media Interchange (XDM)". IHE-based telemedicine applications offer caregivers the possibility of being informed about their patients using data from intersectoral patient records, and reuse of the standardized interfaces in other scenarios offers potential savings.
OSI in the NASA science internet: An analysis
NASA Technical Reports Server (NTRS)
Nitzan, Rebecca
1990-01-01
The Open Systems Interconnection (OSI) protocol suite is the result of a world-wide effort to develop international standards for networking. OSI is formalized through the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). The goal of OSI is to provide interoperability between network products without relying on one particular vendor, and to do so on a multinational basis. The National Institute of Standards and Technology (NIST) has developed a Government OSI Profile (GOSIP) that specifies a subset of the OSI protocols as a Federal Information Processing Standard (FIPS 146). GOSIP compatibility has been adopted as the direction for all U.S. government networks. OSI is extremely diverse, and therefore adherence to a profile will facilitate interoperability within OSI networks. All major computer vendors have indicated current or future support of GOSIP-compliant OSI protocols in their products. The NASA Science Internet (NSI) is an operational network, serving user requirements under NASA's Office of Space Science and Applications. NSI consists of the Space Physics Analysis Network (SPAN), which uses the DECnet protocols, and the NASA Science Network (NSN), which uses TCP/IP protocols. The NSI Project Office is currently working on an OSI integration analysis and strategy. A long-term goal is to integrate SPAN and NSN into one unified network service, using a full OSI protocol suite, which will support the OSSA user community.
Implementation of Medical Information Exchange System Based on EHR Standard.
Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-12-01
To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To ensure that patients' medical information can be exchanged effectively across different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop the security systems essential for the implementation of online services: encryption of communication, server security, database security, protection against hacking, content security, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer displayed the CDA documents via connection with the information systems of related hospitals. This research selected the items to be transferred and defined document standards conforming to the CDA, so that CDA documents could be exchanged between different systems through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.
NASA Technical Reports Server (NTRS)
Cowen, Benjamin
2011-01-01
Simulations are essential for engineering design. These virtual realities provide characteristic data to scientists and engineers so they can understand the details and complications of the desired mission. A standard simulation development package known as Trick is used to develop source code modeling a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for a federation execution, to develop source code for communication between federates, and to foster data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, thereby relying on simulation interoperability. Participants communicated throughout the semester to work out how to create this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot, and Mobile ISRU Plant) as well as for transporting materials to the desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit. Materials that it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from the participating universities will engineer its own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper focuses on the Lunar Rover federate.
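For readers unfamiliar with HLA, a federate repeatedly advances time and publishes attribute updates through the Runtime Infrastructure (RTI). The sketch below mimics that cycle; the real IEEE 1516 API is exposed through an RTI ambassador (typically in C++ or Java), so FakeRTI here is a hypothetical stand-in that only makes the control flow runnable.

```python
# Minimal sketch of a federate's publish/update cycle. FakeRTI is a
# stand-in for the real RTI ambassador of the IEEE 1516 HLA API.
class FakeRTI:
    def publish(self, object_class, attributes):
        print(f"publishing {object_class}: {attributes}")
    def update_attributes(self, instance, values, timestamp):
        print(f"t={timestamp}: {instance} -> {values}")

rti = FakeRTI()
rti.publish("LunarRover", ["position", "velocity"])

position, velocity, t = [0.0, 0.0], [1.0, 0.5], 0.0
for step in range(3):          # simplified time-stepped federation loop
    t += 1.0                   # in real HLA the RTI grants each time advance
    position = [p + v for p, v in zip(position, velocity)]
    rti.update_attributes("rover-1", {"position": position}, t)
```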
Service Oriented Architecture for Wireless Sensor Networks in Agriculture
NASA Astrophysics Data System (ADS)
Sawant, S. A.; Adinarayana, J.; Durbha, S. S.; Tripathy, A. K.; Sudharsan, D.
2012-08-01
Rapid advances in Wireless Sensor Networks (WSNs) for agricultural applications have provided a platform for better decision making in crop planning and management, particularly in precision agriculture. Due to the ever-increasing spread of WSNs there is a need for standards, i.e., a set of specifications and encodings to bring multiple sensor networks onto a common platform. Distributed sensing systems, when brought together, can facilitate better decision making in the agricultural domain. The Open Geospatial Consortium (OGC), through Sensor Web Enablement (SWE), provides guidelines for semantic and syntactic standardization of sensor networks. In this work two distributed sensing systems (Agrisens and FieldServer) were selected to implement OGC SWE standards through a Service Oriented Architecture (SOA) approach. Online interoperable data processing was developed through SWE components such as the Sensor Model Language (SensorML) and the Sensor Observation Service (SOS). An integrated web client was developed to visualize the sensor observations and measurements, enabling systematic retrieval of crop water resource availability and requirements for both sensing devices. Further, the client also has the ability to operate in an interoperable manner with any other OGC-standardized WSN system. The study of WSN systems has shown that the operations and processing capabilities of SOS need to be augmented in order to reason about the collected sensor data and implement modelling services. Also, given the very low-cost availability of WSN systems in the future, it will be possible to implement the OGC-standardized SWE framework for agricultural applications with open-source software tools.
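As a concrete illustration of the SOS component mentioned above, the following is a minimal SOS 1.0.0 GetObservation request over HTTP KVP; the endpoint, offering, and observed-property URN are placeholder values, not the actual Agrisens/FieldServer services.

```python
# A minimal SOS 1.0.0 GetObservation request over HTTP key-value pairs.
import requests

params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "AGRISENS_WEATHER",                 # hypothetical offering id
    "observedProperty": "urn:ogc:def:property:OGC:SoilMoisture",  # hypothetical URN
    "responseFormat": "text/xml;subtype=\"om/1.0.0\"",
}
resp = requests.get("https://sensors.example.org/sos", params=params)
print(resp.status_code)  # body is an O&M ObservationCollection XML document
```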
Mookencherry, Shefali
2012-01-01
It makes strategic and business sense for payers and providers to collaborate on how to take substantial cost out of the healthcare delivery system. Acting independently, neither medical groups, hospitals nor health plans have the optimal mix of resources and incentives to significantly reduce costs. Payers have core assets such as marketing, claims data, claims processing, reimbursement systems and capital. It would be cost prohibitive for all but the largest providers to develop these capabilities in order to compete directly with insurers. Likewise, medical groups and hospitals are positioned to foster financial interdependence among providers and coordinate the continuum of patient illnesses and care settings. Payers and providers should commit to reasonable clinical and cost goals, and share resources to minimize expenses and financial risks. It is in the interest of payers to work closely with providers on risk-management strategies because insurers need synergy with ACOs to remain cost competitive. It is in the interest of ACOs to work collaboratively with payers early on to develop reasonable and effective performance benchmarks. Hence, it is essential to have payer interoperability and data sharing integrated in an ACO model.
76 FR 58510 - Notice of Meeting Schedule for 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-21
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Meeting Schedule for 2012 AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action: Pursuant to 31 U.S.C. 3511(d), the Federal... October, 2010, notice is hereby given that the Federal Accounting Standards Advisory Board (FASAB) will...
77 FR 71792 - Notice of Meeting Schedule for 2013
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Meeting Schedule for 2013 AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action: Pursuant to 31 U.S.C. 3511(d), the Federal... October, 2010, notice is hereby given that the Federal Accounting Standards Advisory Board (FASAB) will...
da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V
2013-01-01
The ability to apply standard and interoperable solutions for implementing and managing medical registries, as well as to aggregate, reproduce, and access data sets from legacy formats and platforms using advanced standard formats and operating systems, is crucial for both clinical healthcare and biomedical research settings. Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. We developed a device registry framework involving the following steps: (1) data standards definition and representation of the research workflow; (2) development of electronic case report forms using REDCap (Research Electronic Data Capture); (3) data collection according to the clinical research workflow; (4) data augmentation by enriching the registry database with local electronic health records, a governmental database, and linked open data collections; (5) data quality control; and (6) data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology / American Heart Association Clinical Data Standards, as well as variables derived from cardiac device randomized trials and the Clinical Data Interchange Standards Consortium. Local interoperability was established between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the reimbursement values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT) we found 130 clinical trials that are potentially correlated with our pacemaker registry. This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a reusable framework. Such an approach has the potential to facilitate data integration between healthcare and research settings and to serve as a useful framework for other biomedical registries.
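Step (4)'s local interoperability hinges on programmatic access to the registry data. A minimal sketch of exporting records through REDCap's standard API is shown below; the URL and token are placeholders, and the exact export parameters used by the authors' project are not known.

```python
# Exporting registry records via the standard REDCap API for downstream
# augmentation with EHR or governmental data.
import requests

payload = {
    "token": "REPLACE_WITH_API_TOKEN",  # project-specific REDCap token
    "content": "record",
    "format": "json",
    "type": "flat",
}
resp = requests.post("https://redcap.example.edu/api/", data=payload)
records = resp.json()  # one dict per case-report-form record
print(len(records), "records exported")
```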
[HL7 standard--features, principles, and methodology].
Koncar, Miroslav
2005-01-01
The mission of the non-profit organization HL7 Inc. is to provide standards for the exchange, management, and integration of data that support clinical patient care and the management, delivery, and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology has been adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was to go directly to Version 3. The target scope of work includes clinical, financial, and administrative data management in the domain of healthcare processes. By using the standardized HL7v3 methodology we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is in a constant process of studying different legacy applications, laying a solid foundation for their integration into an HL7-enabled communication environment.
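To make the RIM idea concrete: every v3 message is a graph of Acts connected to Entities through Participations and Roles. The fragment below sketches that shape; it is illustrative only, not a conformant HL7 v3 instance, and the element names are simplified.

```python
# Skeletal RIM-style structure: Act (observation) -> Participation
# (performer) -> Role (assignedEntity). Illustrative, non-conformant.
import xml.etree.ElementTree as ET

obs = ET.Element("observation", classCode="OBS", moodCode="EVN")
# codeSystem is the LOINC OID; 8480-6 is systolic blood pressure
ET.SubElement(obs, "code", code="8480-6", codeSystem="2.16.840.1.113883.6.1")
ET.SubElement(obs, "value", value="120", unit="mm[Hg]")
perf = ET.SubElement(obs, "performer", typeCode="PRF")   # Participation
role = ET.SubElement(perf, "assignedEntity")             # Role
ET.SubElement(role, "id", root="2.16.840.1.113883.19", extension="dr-0001")

print(ET.tostring(obs, encoding="unicode"))
```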
The ISO Edi Conceptual Model Activity and Its Relationship to OSI.
ERIC Educational Resources Information Center
Fincher, Judith A.
1990-01-01
The EDI conceptual model is being developed to define common structures, services, and processes that syntax-specific standards like X12 and EDIFACT could adopt. Open Systems Interconnection (OSI) is of interest to Electronic Data Interchange (EDI) because of its potential to help enable global interoperability across EDI functional groups. A…
Making Network Markets in Education: The Development of Data Infrastructure in Australian Schooling
ERIC Educational Resources Information Center
Sellar, Sam
2017-01-01
This paper examines the development of data infrastructure in Australian schooling with a specific focus on interoperability standards that help to make new markets for education data. The conceptual framework combines insights from studies of infrastructure, economic markets and digital data. The case of the Australian National Schools…
A Method for Evaluating and Standardizing Ontologies
ERIC Educational Resources Information Center
Seyed, Ali Patrice
2012-01-01
The Open Biomedical Ontology (OBO) Foundry initiative is a collaborative effort for developing interoperable, science-based ontologies. The Basic Formal Ontology (BFO) serves as the upper ontology for the domain-level ontologies of OBO. BFO is an upper ontology of types as conceived by defenders of realism. Among the ontologies developed for OBO…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-25
... Grid Cyber Security AGENCY: National Institute of Standards and Technology (NIST), Department of... and Technology (NIST) seeks comments on draft NISTIR 7628 Rev. 1, Guidelines for Smart Grid Cyber... (formerly the Cyber Security Working Group) of the Smart Grid Interoperability Panel. The document has been...
BabeLO--An Extensible Converter of Programming Exercises Formats
ERIC Educational Resources Information Center
Queiros, R.; Leal, J. P.
2013-01-01
In the last two decades there has been a proliferation of programming exercise formats, which hinders interoperability in automatic assessment. In the absence of a widely accepted standard, a pragmatic solution is to convert content among the existing formats. BabeLO is a programming exercise converter providing services to a network of heterogeneous…
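A converter like this typically routes every format through a pivot representation, so supporting a new format requires only one importer and one exporter rather than pairwise converters. The sketch below shows the idea with invented "format A" and "format B" structures; it is not BabeLO's actual format model.

```python
# Pivot-based conversion: A -> pivot -> B. Both formats are invented
# stand-ins to illustrate the pattern.
def from_format_a(src: dict) -> dict:
    """Map hypothetical 'format A' fields onto a pivot exercise model."""
    return {"title": src["name"], "statement": src["text"], "tests": src["cases"]}

def to_format_b(pivot: dict) -> dict:
    """Serialize the pivot model into hypothetical 'format B' fields."""
    return {"heading": pivot["title"], "body": pivot["statement"],
            "testcases": pivot["tests"]}

exercise_a = {"name": "FizzBuzz", "text": "Print ...", "cases": [("1", "1")]}
print(to_format_b(from_format_a(exercise_a)))
```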
Medical Device Plug-and-Play (MD PnP) Interoperability Standardization Program Development
2009-08-01
TATRC/DoD; Sandy Weininger, FDA/CDRH; ICE-PAC/Industry; Tracy Rausch, DocBox Inc.; Ken Fuchs, Draeger Medical; Carl Wallroth, Draeger Medical; …Systems (LSTAT); Paul Jones, FDA/CDRH; Kamran Sayrafian-Pour, NIST. A white paper from the MD PnP Program, rev. July 2009.
HIPAA Readiness Collaborative in Hawaii.
Chun, Marva; Forbes, Susan; Gose, Steven; Kumabe, Brenda; Loo, Jeffrey; Nichols, Lorraine; Rosa, Luis; Sherrill, Laura; Turner, Jim
2002-01-01
The vision of Hawaii's HIPAA Readiness Collaborative (HRC) effort is to realize the positive potential of HIPAA through a collaborative process that engages the entire healthcare delivery system. Goals include reducing the cost of healthcare through streamlining, reducing the cost of HIPAA implementation for HRC participants, and improving the interoperability between facilities through use of standard technologies.
Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel
2013-03-01
The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated into Health Information Systems (HIS). Several proposals have been published to date to permit CDSS to gather patient data from HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS resultant knowledge. We propose a solution for facilitating semantic interoperability for rule-based CDSS, focusing on standardized input and output documents conforming to an HL7-CDA wrapper. We define the HL7-CDA restrictions in an HL7-CDA implementation guide. Patient data and rule inference results are mapped to and from the CDSS, respectively, by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can have clinical and legal validity. The proposed solution is being applied in a CDSS providing patient-specific recommendations for the care management of outpatients with diabetes mellitus.
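The binding-file idea can be illustrated with a toy example: an external mapping ties locations in the CDA document to the rule engine's input variables, so the CDSS core never parses CDA itself. The element paths, variable names, and CDA fragment below are illustrative, not the authors' implementation guide.

```python
# Binding sketch: CDSS variable -> element path in a CDA-like document.
import xml.etree.ElementTree as ET

CDA_FRAGMENT = """<observation>
  <code code="2339-0"/><value value="154" unit="mg/dL"/>
</observation>"""

BINDING = {"glucose_mg_dl": "./value"}   # invented binding entry

def bind_inputs(cda_xml: str) -> dict:
    """Resolve each bound path and hand plain values to the rule engine."""
    root = ET.fromstring(cda_xml)
    return {var: float(root.find(path).get("value"))
            for var, path in BINDING.items()}

print(bind_inputs(CDA_FRAGMENT))  # {'glucose_mg_dl': 154.0}
```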
Anani, Nadim; Mazya, Michael V; Chen, Rong; Prazeres Moreira, Tiago; Bill, Olivier; Ahmed, Niaz; Wahlgren, Nils; Koch, Sabine
2017-01-10
Interoperability standards intend to standardise health information, clinical practice guidelines intend to standardise care procedures, and patient data registries are vital for monitoring quality of care and for clinical research. This study combines all three: it uses interoperability specifications to model guideline knowledge and applies the result to registry data. We applied the openEHR Guideline Definition Language (GDL) to data from 18,400 European patients in the Safe Implementation of Treatments in Stroke (SITS) registry to retrospectively check their compliance with European recommendations for acute stroke treatment. Comparing compliance rates obtained with GDL to those obtained by conventional statistical data analysis yielded a complete match, suggesting that GDL technology is reliable for guideline compliance checking. The successful application of a standard guideline formalism to a large patient registry dataset is an important step toward widespread implementation of computer-interpretable guidelines in clinical practice and registry-based research. Application of the methodology gave important results on the evolution of stroke care in Europe, important both for quality of care monitoring and clinical research.
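The essence of such compliance checking is a rule evaluated over each registry record. The sketch below shows the idea in plain Python (the study itself used openEHR GDL, not Python); the 4.5-hour thrombolysis window reflects European stroke recommendations, while the record fields are invented.

```python
# Guideline-style rule applied across registry rows, then aggregated
# into a compliance rate.
from datetime import datetime, timedelta

def compliant(onset: datetime, treated: datetime) -> bool:
    """True if IV thrombolysis started within 4.5 h of symptom onset."""
    return treated - onset <= timedelta(hours=4.5)

registry = [  # invented records standing in for SITS rows
    {"onset": datetime(2017, 1, 1, 8, 0), "treated": datetime(2017, 1, 1, 11, 0)},
    {"onset": datetime(2017, 1, 1, 8, 0), "treated": datetime(2017, 1, 1, 13, 30)},
]
rate = sum(compliant(r["onset"], r["treated"]) for r in registry) / len(registry)
print(f"compliance rate: {rate:.0%}")  # 50%
```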
OntoCR: A CEN/ISO-13606 clinical repository based on ontologies.
Lozano-Rubí, Raimundo; Muñoz Carrero, Adolfo; Serrano Balazote, Pablo; Pastor, Xavier
2016-04-01
To design a new semantically interoperable clinical repository, based on ontologies, conforming to the CEN/ISO 13606 standard. The approach followed is to extend OntoCRF, a framework for the development of clinical repositories based on ontologies. The meta-model of OntoCRF has been extended by incorporating an OWL model integrating the CEN/ISO 13606 and ISO 21090 standards and the SNOMED CT structure. This approach has demonstrated a complete evaluation cycle involving the creation of the meta-model in OWL format, the creation of a simple test application, and the communication of standardized extracts to another organization. Using a CEN/ISO 13606-based system, an indefinite number of archetypes can be merged (and reused) to build new applications. Our approach, based on the use of ontologies, keeps data storage independent of content specification. With this approach, relational technology can be used for storage while maintaining extensibility. The present work demonstrates that it is possible to build a native CEN/ISO 13606 repository for the storage of clinical data. We have demonstrated semantic interoperability of clinical information using CEN/ISO 13606 extracts.
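As a rough illustration of representing standard-derived classes in OWL, the rdflib sketch below declares an archetype-like class; the namespace and class names are invented and do not reproduce the actual OntoCR meta-model.

```python
# Declaring an archetype-derived OWL class with rdflib; names invented.
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/ontocr/")
g = Graph()
g.add((EX.BloodPressureEntry, RDF.type, OWL.Class))
g.add((EX.BloodPressureEntry, RDFS.subClassOf, EX.ISO13606Entry))
g.add((EX.BloodPressureEntry, RDFS.label, Literal("Blood pressure ENTRY")))

print(g.serialize(format="turtle"))
```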
Building a VO-compliant Radio Astronomical DAta Model for Single-dish radio telescopes (RADAMS)
NASA Astrophysics Data System (ADS)
Santander-Vela, Juan de Dios; García, Emilio; Leon, Stephane; Espigares, Victor; Ruiz, José Enrique; Verdes-Montenegro, Lourdes; Solano, Enrique
2012-11-01
The Virtual Observatory (VO) is becoming the de facto standard for astronomical data publication. However, the number of radio astronomical archives is still low in general, and the amount of radio astronomical data available through the VO is even lower. In order to facilitate the building of new radio astronomical archives, while at the same time easing their interoperability with the VO framework, we have developed a VO-compliant data model which provides interoperable data semantics for radio data. That model, which we call the Radio Astronomical DAta Model for Single-dish (RADAMS), has been built using standards of (and recommendations from) the International Virtual Observatory Alliance (IVOA). This article describes the RADAMS and its components, including archived entities and their relationships to VO metadata. We show that by using IVOA principles and concepts, the effort needed both for the development of the archives and for their VO compatibility has been lowered, and the joint development of two radio astronomical archives has been possible. We plan to adapt RADAMS to deal with interferometry data in the future.
75 FR 20364 - Notice of Meeting Schedule for 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Meeting Schedule for 2011 AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action: Pursuant to 31 U.S.C. 3511(d), the Federal..., 2004, notice is hereby given that the Federal Accounting Standards Advisory Board (FASAB) will meet on...
NASA Astrophysics Data System (ADS)
Asmi, Ari; Powers, Lindsay
2015-04-01
Research Infrastructures (RIs) are major long-term investments supporting innovative, bottom-up research activities. In environmental research, they range from high-atmosphere radars to field observation networks and coordinated laboratory facilities. The Earth system is highly interactive, and each part of the system is interconnected across spatial and disciplinary borders. However, due to practical and historical reasons, RIs are built from disciplinary points of view and separately in different parts of the world, with differing standards, policies, methods, and research cultures. This heterogeneity provides the diversity necessary to study the complex Earth system, but makes cross-disciplinary and/or global interoperability a challenge. Global actions towards better interoperability are surfacing, especially in the EU and US. For example, recent mandates within the US government prioritize open data for federal agencies and federally funded science, and encourage collaboration among agencies to reduce duplication of efforts and increase efficient use of resources. There are several existing initiatives working toward these goals (e.g., COOPEUS, EarthCube, RDA, ICSU-WDS, DataONE, ESIP, USGEO, GEO). However, there is no cohesive framework to coordinate efforts among these and other entities. COOPEUS and EarthCube have now begun to map the landscape of interoperability efforts across earth science domains. The COOPEUS mapping effort describes the EU and US landscape of environmental research infrastructures in order to: identify gaps in services (data provision) necessary to address societal priorities; provide guidance for the development of future research infrastructures; and identify opportunities for RIs to collaborate on issues of common interest. The EarthCube mapping effort identifies opportunities to engage a broader community by identifying scientific domain organizations and entities. We present the current state of the landscape analysis in order to help create a sustainable effort towards removing barriers to interoperability on a global scale.
WMS and WFS Standards Implementation of Weather Data
NASA Astrophysics Data System (ADS)
Armstrong, M.
2005-12-01
CustomWeather is a private weather company that delivers global weather data products. CustomWeather has built a mapping platform according to OGC standards; currently, both a Web Map Service (WMS) and a Web Feature Service (WFS) are supported. Supporting open geospatial standards has led to a number of positive changes, both internally to CustomWeather's processes and for the clients accessing the data. Quite a number of challenges surfaced during this process, particularly with respect to combining a wide variety of raw model and sensor data into a single delivery platform. Open standards have, however, made the delivery of very different data products rather seamless. The discussion will address the issues faced in building an OGC-based mapping platform along with the limitations encountered. While the availability of these data products through open standards is still very young, there have already been many adopters in the utility and navigation industries. The discussion will take a closer look at the different approaches taken by these two industries as they utilize interoperability standards with existing data. Insight will be given into applications already taking advantage of this new technology and how it is affecting decision-making processes. CustomWeather has observed considerable interest in, and potential benefit from, this technology in developing countries. Weather data is a key element in disaster management. Interoperability is literally opening up a world of data and has the potential to quickly enable functionality that would otherwise take considerable time to implement. The discussion will briefly touch on our experience.
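For reference, a WMS GetMap request is a plain HTTP query that any OGC-conformant client can issue. The sketch below uses a placeholder endpoint and layer name rather than CustomWeather's actual services.

```python
# A standard WMS 1.1.1 GetMap request; endpoint and layer are placeholders.
import requests

params = {
    "service": "WMS", "version": "1.1.1", "request": "GetMap",
    "layers": "temperature_surface",      # hypothetical weather layer
    "styles": "",                         # default styling
    "srs": "EPSG:4326",
    "bbox": "-125,25,-65,50",             # minx,miny,maxx,maxy (lon/lat)
    "width": "800", "height": "400",
    "format": "image/png",
}
resp = requests.get("https://wms.example.com/wms", params=params)
open("weather.png", "wb").write(resp.content)  # rendered map image
```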
A Story of a Crashed Plane in US-Mexican border
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Hobona, Gobe; Vretanos, Peter; Peterson, Perry
2013-04-01
A plane has crashed on the US-Mexican border. The search and rescue command center planner needs to find information about the crash site, a mountain; nearby mountains for the establishment of a communications tower; and ranches for setting up a local incident center. Events like this occur all over the world, and exchanging information seamlessly is key to saving lives and preventing further disasters. This abstract describes an interoperability testbed that applied this scenario using technologies based on Open Geospatial Consortium (OGC) standards. The OGC, which has about 500 members, serves as a global forum for collaboration between developers and users of spatial data products and services, and advances the development of international standards for geospatial interoperability. The OGC Interoperability Program conducts international interoperability testbeds, such as OGC Web Services Phase 9 (OWS-9), that encourage rapid development, testing, validation, demonstration, and adoption of open, consensus-based standards and best practices. The Cross-Community Interoperability (CCI) thread in OWS-9 advanced the Web Feature Service for Gazetteers (WFS-G) by providing a Single Point of Entry Global Gazetteer (SPEGG), where a user can submit a single query and access global geographic names data across multiple Federal names databases. Currently, users must make two queries with differing input parameters against two separate databases to obtain authoritative cross-border geographic names data. The gazetteers in this scenario were GNIS and GNS. GNIS, the Geographic Names Information System, is managed by USGS; it was first developed in 1964 and contains information about domestic and Antarctic names. GNS, the GeoNET Names Server, provides the Geographic Names Data Base (GNDB) and is managed by the National Geospatial-Intelligence Agency (NGA); GNS has been in service since 1994, and serves names for areas outside the United States and its dependent areas, as well as names for undersea features. The following challenges were advanced: cascaded WFS-G servers (allowing multiple WFSs to be queried through a "parent" WFS), name-query filters (e.g., fuzzy search, text search), handling of multilingualism and diacritics, advanced spatial constraints (e.g., radial search and nearest neighbor), and semantically mediated feature types (e.g., mountain vs. hill). To enable semantic mediation, a series of semantic mappings was defined between the NGA GNS, the USGS GNIS, and the Alexandria Digital Library (ADL) Gazetteer. The mappings were encoded in the Web Ontology Language (OWL) so that they could be used by semantic web technologies. The semantic mappings were then published for ingestion into a semantic mediator that used the mappings to associate location types from one gazetteer with location types in another. The semantic mediator was then able to transform requests on the fly, providing a single-point-of-entry WFS-G to multiple gazetteers. The presentation will include a live demonstration of the work performed, highlight the main developments, and discuss future development.
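The single-point-of-entry idea means a client issues one WFS GetFeature request and the cascading WFS-G fans it out to GNIS and GNS behind the scenes. The sketch below shows such a request; the endpoint and feature type name are placeholders for the OWS-9 services, and the fuzzy-name filter encoding is omitted.

```python
# One GetFeature query against a cascading gazetteer WFS-G.
import requests

params = {
    "service": "WFS", "version": "1.1.0", "request": "GetFeature",
    "typename": "gaz:SI_LocationInstance",   # hypothetical feature type name
    "maxFeatures": "10",
}
resp = requests.get("https://gazetteer.example.org/wfs", params=params)
print(resp.text[:200])  # GML FeatureCollection of matching place names
```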
NASA Astrophysics Data System (ADS)
Simonis, Ingo
2015-04-01
Traditional Spatial Data Infrastructures (SDIs) focus on aspects such as the description and discovery of geospatial data, the integration of these data into processing workflows, and the representation of fusion or other data analysis results. Though many interoperability agreements still need to be worked out to achieve a satisfying level of interoperability within large-scale initiatives such as INSPIRE, new technologies, use cases, and requirements are constantly emerging from the user community. This paper focuses on three aspects that have come up recently: the integration of social media data into SDIs, synchronization between datasets used by field workers in shared-resource environments, and the generation and maintenance of data for mixed online/offline situations that can be easily packed, delivered, modified, and synchronized with reference data sets. The work described in this paper results from the latest testbed executed by the Open Geospatial Consortium (OGC). The testbed is part of the Interoperability Program (IP), which constitutes a significant part of the OGC standards development process. The IP has a number of instruments to enhance geospatial standards and technologies, such as testbeds, pilot projects, interoperability experiments, and interoperability expert services. These activities are designed to encourage rapid development, testing, validation, demonstration, and adoption of open, consensus-based standards and best practices. The latest global activity, Testbed-11, aims at exploring new technologies and architectural approaches to enrich and extend traditional spatial data infrastructures with data from social media, improved data synchronization, and the capability to take data to the field in new synchronized data containers called GeoPackages. Social media sources are a valuable supplement for providing up-to-date information in distributed environments. Following an uncoordinated crowdsourcing approach, social media data can be both overwhelming in volume and questionable in accuracy and legitimacy. Testbed-11 explores how best to make use of such sources of information and how to deal with the issues inherent in data from platforms such as OpenStreetMap, Twitter, tumblr, flickr, Snapchat, Facebook, Instagram, YouTube, Vimeo, Panoramio, Pinterest, Picasa, or storyful. Further important aspects highlighted here are the synchronization of data and the capability to take complex data sets of any size to the field on mobile devices while keeping them in sync with reference data stores. In emergency management situations in particular, it is crucial to ensure properly synchronized data sets across different types of data stores and applications. Often data is taken to the field on mobile devices, where it gets updated or annotated. Though bandwidth continually improves, requirements on data quality and complexity grow in parallel, and intermittent connectivity is paired with high security requirements that have to be fulfilled. This paper discusses the latest approaches using synchronization services and synchronized GeoPackages, the new container format for geospatial data.
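Because a GeoPackage is a single SQLite file, inspecting one requires nothing beyond a standard library. The sketch below lists the layers recorded in the gpkg_contents table, which the GeoPackage specification requires; the file name is a placeholder.

```python
# Listing the layers of a GeoPackage via its required gpkg_contents table.
import sqlite3

conn = sqlite3.connect("field_data.gpkg")  # placeholder file name
for table, dtype, ident in conn.execute(
        "SELECT table_name, data_type, identifier FROM gpkg_contents"):
    print(f"{table}: {dtype} ({ident})")
conn.close()
```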
Smart Grid Demonstration Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Craig; Carroll, Paul; Bell, Abigail
The National Rural Electric Cooperative Association (NRECA) organized the NRECA-U.S. Department of Energy (DOE) Smart Grid Demonstration Project (DE-OE0000222) to install and study a broad range of advanced smart grid technologies in a demonstration that spanned 23 electric cooperatives in 12 states. More than 205,444 pieces of electronic equipment and more than 100,000 minor items (brackets, labels, mounting hardware, fiber optic cable, etc.) were installed to upgrade and enhance the efficiency, reliability, and resiliency of the power networks at the participating co-ops. The objective of this project was to build a path for other electric utilities, and particularly electrical cooperatives, to adopt emerging smart grid technology when it can improve utility operations, thus advancing the co-ops' familiarity and comfort with such technology. Specifically, the project executed multiple subprojects employing a range of emerging smart grid technologies to test their cost-effectiveness and, where the technology demonstrated value, provided case studies that will enable other electric utilities, particularly electric cooperatives, to use these technologies. NRECA structured the project according to the following three areas: demonstration of smart grid technology; advancement of standards to enable the interoperability of components; and improvement of grid cyber security. We termed these three areas Technology Deployment Study, Interoperability, and Cyber Security. Although the deployment of technology and studying the demonstration projects at co-ops accounted for the largest portion of the project budget by far, we see our accomplishments in each of the areas as critical to advancing the smart grid. All project deliverables have been published. Technology Deployment Study: the deliverable was a set of 11 single-topic technical reports in areas related to the listed technologies. Each of these reports has already been submitted to DOE, distributed to co-ops, and posted for universal access at www.nreca.coop/smartgrid. This research is available for widespread distribution to both cooperative members and non-members. These reports are listed in Table 1.2. Interoperability: the deliverable in this area was the advancement of the MultiSpeak™ interoperability standard from version 4.0 to version 5.0, and improvement of the MultiSpeak™ documentation to include more than 100 use cases. This deliverable substantially expanded the scope and usability of MultiSpeak™, the most widely deployed utility interoperability standard, now in use by more than 900 utilities. MultiSpeak™ documentation can be accessed only at www.multispeak.org. Cyber Security: NRECA's starting point was to develop cyber security tools that incorporated succinct guidance on best practices. The deliverables were: cyber security extensions to MultiSpeak™, which allow more secure message exchanges; a Guide to Developing a Cyber Security and Risk Mitigation Plan; a Cyber Security Risk Mitigation Checklist; a Cyber Security Plan Template that co-ops can use to create their own cyber security plans; and Security Questions for Smart Grid Vendors.
The International Planetary Data Alliance (IPDA)
NASA Astrophysics Data System (ADS)
Stein, Thomas; Gopala Krishna, Barla; Crichton, Daniel J.
2016-07-01
The International Planetary Data Alliance (IPDA) is a close association of partners with the aim of improving the quality of planetary science data and services to the end users of space-based instrumentation. The specific mission of the IPDA is to facilitate global access to, and exchange of, high quality scientific data products managed across international boundaries. Ensuring proper capture, accessibility and availability of the data is the task of the individual member space agencies. The IPDA is focused on developing an international standard that allows discovery, query, access, and usage of such data across international planetary data archive systems. While trends in other areas of space science are concentrating on the sharing of science data from diverse standards and collection methods, the IPDA concentrates on promoting governing data standards that drive common methods for collecting and describing planetary science data across the international community. This approach better supports the long-term goal of easing data sharing across system and agency boundaries. An initial starting point for developing such a standard will be the internationalization of NASA's Planetary Data System (PDS) PDS4 standard. The IPDA was formed in 2006 with the purpose of adopting standards and developing collaborations across agencies to ensure data is captured in common formats. It has grown to a dozen member agencies represented by a number of different groups through the IPDA Steering Committee. Member agencies include: the Armenian Astronomical Society, China National Space Agency (CNSA), European Space Agency (ESA), German Aerospace Center (DLR), Indian Space Research Organisation (ISRO), Italian Space Agency (ASI), Japan Aerospace Exploration Agency (JAXA), National Aeronautics and Space Administration (NASA), National Centre for Space Studies (CNES), Space Research Institute (IKI), UAE Space Agency, and UK Space Agency. The IPDA Steering Committee oversees the execution of projects and coordinates international collaboration. In executing its mission, the IPDA conducts a number of focused projects to enable interoperability, the construction of compatible archives, and the operation of the IPDA as a whole. These projects have helped to establish the IPDA and to move the collaboration forward. A key project currently underway is the implementation of the PDS4 data standard. Given the international focus, it has been critical that the PDS and the IPDA collaborate on its development. Many other projects have also been completed successfully, including the IPDA Requirements Document, Data Dictionary Modelling, ESA Registry Integration, the Tools Registry, and several demonstrations of interoperability protocols applied to specific missions and data sets (PDS4/PDAP (Planetary Data Access Protocol), Venus Express interoperability). The IPDA has grown significantly since its first meetings in November 2006. The Steering Committee today comprises 28 members from 24 countries or international organizations. In addition, a technical expert group composed of 20 members from participating countries provides supportive input on technical and compatibility issues.
A number of IPDA projects are ongoing, including the creation of a Memorandum of Understanding (MOU) template for international missions; the investigation of IPDA/IVOA (International Virtual Observatory Alliance) interaction; the PDS4 implementation project; the development of international registries to enable registration and search of data, tools, and services; and the Chandrayaan-1 interoperability project with PDAP. In addition, the IPDA continues its outreach activities, being present or represented at national and international levels and at meetings such as COSPAR, AGU, EPSC, and EGU. Further information on IPDA activities, standards, and tools is available at http://www.planetarydata.org. Tool and service developers are encouraged to register their products at the IPDA web site.
Millonig, Marsha K
2009-01-01
To convene a diverse group of stakeholders to discuss medication therapy management (MTM) documentation and billing standardization and its interoperability within the health care system. More than 70 stakeholders from pharmacy, health information systems, insurers/payers, quality, and standard-setting organizations met on October 7-8, 2008, in Bethesda, MD. The American Pharmacists Association (APhA) organized the invitational conference to facilitate discussion on strategic directions for meeting the current market need for MTM documentation and billing interoperability and the future market need for MTM integration into electronic health records (EHRs). APhA recently adopted policy that specifically addresses technology barriers and encourages the use and development of standardized systems for the documentation and billing of MTM services. Day 1 of the conference featured six foundational presentations on health information technology (HIT) trends, perspectives on MTM from the profession and the Centers for Medicare & Medicaid Services, health care quality and medication-related outcome measures, integrating MTM workflow in EHRs, and the current state of MTM operationalization in practice. After hearing presentations on day 1 and having the opportunity to pose questions to each speaker, conference participants were divided into three breakout groups on day 2. Each group met three times for 60 minutes and discussed five questions from the perspective of a patient, provider, or payer. Three facilitators met with each of the groups and led discussion from one perspective (i.e., patient, provider, payer). Participants then reconvened as a complete group for a discussion on next steps. HIT is expected to assist in delivering safe, effective, efficient, coordinated care as health professionals strive to improve the quality of care and outcomes for individual patients. The pharmacy profession is actively contributing to quality patient care through MTM services focused on identifying and preventing medication-related problems, improving medication use, and optimizing individual therapeutic outcomes. As MTM programs continue to expand within the health care system, one important limiting factor is the lack of standardization for documentation and billing of MTM services. This lack of interoperability between technology systems, software, and system platforms presents a barrier to MTM service delivery for patients. APhA convened this invitational conference to identify strategic directions for addressing MTM documentation and billing standardization and interoperability. Participants viewed the meeting as highly successful in bringing together a unique, wide-ranging set of stakeholders, including the government, regulators, standards organizations, other health professions, technology firms, professional organizations, and practitioners, to share perspectives. They strongly encouraged the Association to continue this unique stakeholder dialogue. Participants provided a number of next-step suggestions for APhA to consider as a result of the event. Participants noted the pharmacy profession's success in building information technology systems for product transactions with systematic, organized, methodical thinking, and the need to apply this success to patient services. A unique opportunity exists for the profession to influence and lead the HIT community in creating a workable health technology solution for MTM services. 
Reaching consensus on minimum data sets for each functional area--clinical, billing, quality improvement--would be a very important short-term gain. Further, participants said it was imperative for pharmacists and the pharmacy community at large to become actively engaged in HIT standards development efforts.
Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542
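For illustration, the standards-based objects described here can be inspected with the open-source pydicom library; the file name below is a placeholder, and whether a given object carries a Real World Value Mapping Sequence depends on how it was encoded.

```python
# Inspecting a DICOM object produced by a quantitative-imaging workflow.
import pydicom

ds = pydicom.dcmread("pet_rwvm.dcm")  # placeholder file name
print(ds.Modality, ds.SOPClassUID)

# Real World Value Mapping Sequence, if present: how value rescaling
# (e.g., for SUV) is encoded in DICOM.
for item in ds.get("RealWorldValueMappingSequence", []):
    print(item.get("LUTLabel"), item.get("RealWorldValueSlope"))
```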
Macho, Jorge Berzosa; Montón, Luis Gardeazabal; Rodriguez, Roberto Cortiñas
2017-08-01
The Cyber Physical Systems (CPS) paradigm is based on the deployment of interconnected heterogeneous devices and systems, so interoperability is at the heart of any CPS architecture design. In this sense, the adoption of standard and generic data formats for data representation and communication, e.g., XML or JSON, effectively addresses the interoperability problem among heterogeneous systems. Nevertheless, the verbosity of those standard data formats usually demands system resources that might suppose an overload for the resource-constrained devices that are typically deployed in CPS. In this work we present Context- and Template-based Compression (CTC), a data compression approach targeted to resource-constrained devices, which allows reducing the resources needed to transmit, store and process data models. Additionally, we provide a benchmark evaluation and comparison with current implementations of the Efficient XML Interchange (EXI) processor, which is promoted by the World Wide Web Consortium (W3C), and it is the most prominent XML compression mechanism nowadays. Interestingly, the results from the evaluation show that CTC outperforms EXI implementations in terms of memory usage and speed, keeping similar compression rates. As a conclusion, CTC is shown to be a good candidate for managing standard data model representation formats in CPS composed of resource-constrained devices.
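The general principle behind template-based compression (shown below in a deliberately simplified form, not the authors' actual CTC algorithm) is that sender and receiver share a message template, so only the variable values cross the wire.

```python
# Shared-template compression: transmit values only, re-expand on arrival.
TEMPLATE = '<reading sensor="{sid}"><temp>{t}</temp><hum>{h}</hum></reading>'
FIELDS = ("sid", "t", "h")  # field order known to both sides

def compress(values: dict) -> list:
    """Pack just the values, in the agreed field order."""
    return [values[f] for f in FIELDS]

def decompress(packed: list) -> str:
    """Re-expand the values against the shared template on the receiver."""
    return TEMPLATE.format(**dict(zip(FIELDS, packed)))

msg = {"sid": "n42", "t": "21.5", "h": "63"}
wire = compress(msg)                 # ['n42', '21.5', '63'] vs. the full XML
assert decompress(wire) == TEMPLATE.format(**msg)
print(wire)
```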
Montón, Luis Gardeazabal
2017-01-01
The Cyber Physical Systems (CPS) paradigm is based on the deployment of interconnected heterogeneous devices and systems, so interoperability is at the heart of any CPS architecture design. In this sense, the adoption of standard and generic data formats for data representation and communication, e.g., XML or JSON, effectively addresses the interoperability problem among heterogeneous systems. Nevertheless, the verbosity of those standard data formats usually demands system resources that might suppose an overload for the resource-constrained devices that are typically deployed in CPS. In this work we present Context- and Template-based Compression (CTC), a data compression approach targeted to resource-constrained devices, which allows reducing the resources needed to transmit, store and process data models. Additionally, we provide a benchmark evaluation and comparison with current implementations of the Efficient XML Interchange (EXI) processor, which is promoted by the World Wide Web Consortium (W3C), and it is the most prominent XML compression mechanism nowadays. Interestingly, the results from the evaluation show that CTC outperforms EXI implementations in terms of memory usage and speed, keeping similar compression rates. As a conclusion, CTC is shown to be a good candidate for managing standard data model representation formats in CPS composed of resource-constrained devices. PMID:28763013
Integrated multi-sensor package (IMSP) for unmanned vehicle operations
NASA Astrophysics Data System (ADS)
Crow, Eddie C.; Reichard, Karl; Rogan, Chris; Callen, Jeff; Seifert, Elwood
2007-10-01
This paper describes recent efforts to develop integrated multi-sensor payloads for small robotic platforms, for improved operator situational awareness and ultimately for greater robot autonomy. The focus is on enhancements to perception through the integration of electro-optic, acoustic, and other sensors for navigation and inspection. The goals are to provide easier control and operation of the robot through fusion of multiple sensor outputs, to improve interoperability of the sensor payload package across multiple platforms through the use of open standards and architectures, and to reduce integration costs through embedded sensor data processing and fusion within the sensor payload package. The solutions investigated in this project, to be discussed, include: improved capture, processing, and display of sensor data from multiple, non-commensurate sensors; an extensible architecture to support plug-and-play of integrated sensor packages; built-in health, power, and system status monitoring using embedded diagnostics/prognostics; sensor payload integration into standard product forms for optimized size, weight, and power; and the use of the open Joint Architecture for Unmanned Systems (JAUS) / Society of Automotive Engineers (SAE) AS-4 interoperability standard. This project is in the first of its three years. This paper discusses the applicability of each of the solutions in terms of its projected impact on reducing operational time for the robot and teleoperator.
Marcelo, A; Adejumo, A; Luna, D
2011-01-01
To describe the issues surrounding health informatics in developing countries and the challenges faced by practitioners in building internal capacity. From these issues, the authors propose cost-effective strategies that can fast-track health informatics development in these low- to medium-income countries (LMICs). The authors conducted a review of the literature and consulted key opinion leaders who have experience with health informatics implementations around the world. Despite geographic and cultural differences, many LMICs share similar challenges and opportunities in developing health informatics. Partnerships, standards, and interoperability are well-known components of successful informatics programs. Establishing partnerships can comprise formal inter-institutional collaborations on training and research, collaborative open source software development, and effective use of social networking. Lacking legacy systems, LMICs can discuss standards and interoperability more openly and have greater potential for success. Lastly, since cellphones are pervasive in developing countries, they can be leveraged as access points for delivering and documenting health services in remote, under-served areas. Mobile health, or mHealth, gives LMICs a unique opportunity to leapfrog most issues that have plagued health informatics in developed countries. By employing this proposed roadmap, LMICs can develop capacity for health informatics using appropriate and cost-effective technologies.
An Information System for European culture collections: the way forward.
Casaregola, Serge; Vasilenko, Alexander; Romano, Paolo; Robert, Vincent; Ozerskaya, Svetlana; Kopf, Anna; Glöckner, Frank O; Smith, David
2016-01-01
Culture collections contain indispensable information about the microorganisms preserved in their repositories, such as taxonomical descriptions, origins, physiological and biochemical characteristics, bibliographic references, etc. However, the information currently accessible in databases rarely adheres to common standard protocols. The resultant heterogeneity between culture collections, in terms of both content and format, notably hampers microorganism-based research and development (R&D). The optimized exploitation of these resources thus requires standardized, and simplified, access to the associated information. To this end, and in the interest of supporting R&D in the fields of agriculture, health and biotechnology, a pan-European distributed research infrastructure, MIRRI, including over 40 public culture collections and research institutes from 19 European countries, was established. A prime objective of MIRRI is to unite, and provide universal access to, the fragmented and untapped resources, information, and expertise available in European public collections of microorganisms; a key component of this is to develop a dynamic Information System. For the first time, both culture collection curators and their users have been consulted, and their feedback concerning the needs and requirements for collection databases and data accessibility utilised. Users primarily noted that databases were not interoperable, thus rendering a global search of multiple databases impossible. Unreliable, out-of-date and, in particular, non-homogeneous taxonomic information was also considered to be a major obstacle to searching microbial data efficiently. Moreover, complex searches are rarely possible in online databases, thus limiting the extent of search queries. Curators also consider that overall harmonization (including Standard Operating Procedures, data structure, and software tools) is necessary to facilitate their work and to make high-quality data easily accessible to their users. Clearly, the needs of culture collection curators coincide with those of users on the crucial point of database interoperability. In this regard, and in order to design an appropriate Information System, important aspects on which the culture collection community should focus include: the interoperability of data sets with the ontologies to be used; setting best practice in data management; and the definition of an appropriate data standard.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Issuance of Statement of Federal Financial Accounting Standard 37, Social Insurance: Additional Requirements for Management's Discussion and Analysis and Basic Financial Statements AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice...
Collaborative development of predictive toxicology applications.
Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia
2010-08-31
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436
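Since the OpenTox services are plain REST endpoints, any HTTP client can exercise them. A minimal Python sketch follows, assuming a hypothetical service root and dataset identifier (real deployments publish their own URIs):

```python
# Minimal sketch of calling an OpenTox-style REST service; the base URL and
# dataset id below are hypothetical placeholders, not a real deployment.
import requests

BASE = "https://opentox.example.org"  # hypothetical OpenTox service root

# OpenTox services negotiate representations via the HTTP Accept header.
resp = requests.get(
    f"{BASE}/dataset/1",
    headers={"Accept": "application/rdf+xml"},
    timeout=30,
)
resp.raise_for_status()
print(resp.headers.get("Content-Type"))
print(resp.text[:200])  # first part of the returned RDF payload
```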
Supply Chain Interoperability Measurement
2015-06-19
Supply Chain Interoperability Measurement. Dissertation, June 2015, by Christos E. Chalyvidis (BS, MSc), Major, Hellenic Air Force; report ENS-DS-15-J-001; presented to the Faculty of the Department of Operational Sciences; committee chair: Dr. A. W. Johnson.
Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui
2015-01-01
Introduction: The semantic interoperability of electronic healthcare record (EHR) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, which is illustrated for the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Methods: Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as a common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. Results: We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. Conclusions: We have been able to transform CEM into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems. PMID:25670753
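Because both information models are expressed in OWL, the template-driven re-typing step can be pictured in a few lines of rdflib. This is our illustrative sketch, not the authors' implementation, and every URI and file name in it is hypothetical:

```python
# Illustrative sketch of mapping instances of one OWL model onto another
# with rdflib. All URIs and file names below are hypothetical.
from rdflib import Graph, Namespace, RDF

CEM = Namespace("http://example.org/cem#")
OEHR = Namespace("http://example.org/openehr#")

# A transformation template: source class -> target class.
TEMPLATE = {CEM.SystolicBP: OEHR.BloodPressureArchetypeNode}

src, dst = Graph(), Graph()
src.parse("cem_instances.owl", format="xml")  # hypothetical input file

for s_cls, t_cls in TEMPLATE.items():
    for inst in src.subjects(RDF.type, s_cls):
        dst.add((inst, RDF.type, t_cls))  # re-type under the target model

dst.serialize("openehr_instances.owl", format="xml")
```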
Lowering Entry Barriers for Multidisciplinary Cyber(e)-Infrastructures
NASA Astrophysics Data System (ADS)
Nativi, S.
2012-04-01
Multidisciplinarity is more and more important to study the Earth System and address Global Changes. To achieve that, multidisciplinary cyber(e)-infrastructures are an important instrument. In recent years, several European, US and international initiatives have been started to build multidisciplinary infrastructures, including: the Spatial Information in the European Community (INSPIRE), the Global Monitoring for Environment and Security (GMES), the Data Observation Network for Earth (DataOne), and the Global Earth Observation System of Systems (GEOSS). The majority of these initiatives are developing service-based digital infrastructures that ask scientific Communities (i.e. disciplinary Users and data Producers) to implement a set of standards for information interoperability. For scientific Communities, this has represented an entry barrier which has proved, in several cases, to be high. In fact, neither data Producers nor Users seem willing to invest precious resources in becoming experts on interoperability solutions; on the contrary, they are focused on developing disciplinary and thematic capacities. Therefore, an important research topic is lowering the entry barriers for joining multidisciplinary cyber(e)-infrastructures. This presentation will introduce a new approach to achieving the multidisciplinary interoperability underpinning multidisciplinary infrastructures while lowering the present entry barriers for both Users and data Producers. This is called the Brokering approach: it extends the service-based paradigm by introducing a new Brokering layer, or cloud, which is in charge of managing all the interoperability complexity (e.g. data discovery, access, and use), thus easing Users' and Producers' burden. This approach has been successfully trialled in the framework of several European FP7 projects and in GEOSS.
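The idea can be caricatured in a few lines: a broker fans a query out to heterogeneous catalogues and maps each provider's schema onto one common result format, so neither Users nor Producers have to change anything. Everything in this sketch (adapters, field names, records) is invented for illustration:

```python
# Toy sketch of the Brokering approach: the broker, not the user or the data
# provider, absorbs interoperability differences. All names are invented.
from typing import Callable

def catalog_a(query: str) -> list[dict]:
    return [{"ttl": "SST grid", "link": "http://a.example/1"}]   # provider A's schema

def catalog_b(query: str) -> list[dict]:
    return [{"name": "Rain gauge", "url": "http://b.example/9"}]  # provider B's schema

# Each adapter pairs a fetch function with a map from common field names
# to that provider's own field names.
ADAPTERS: list[tuple[Callable[[str], list[dict]], dict]] = [
    (catalog_a, {"title": "ttl", "href": "link"}),
    (catalog_b, {"title": "name", "href": "url"}),
]

def broker_search(query: str) -> list[dict]:
    """Fan out a query and normalize every provider's records to one schema."""
    results = []
    for fetch, field_map in ADAPTERS:
        for rec in fetch(query):
            results.append({common: rec[own] for common, own in field_map.items()})
    return results

print(broker_search("precipitation"))
```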
Slotwiner, David J
2016-10-01
The anticipated advantages of electronic health records (EHRs)-improved efficiency and the ability to share information across the healthcare enterprise-have so far failed to materialize. There is growing recognition that interoperability holds the key to unlocking the greatest value of EHRs. Health information technology (HIT) systems including EHRs must be able to share data and be able to interpret the shared data. This requires a controlled vocabulary with explicit definitions (data elements) as well as protocols to communicate the context in which each data element is being used (syntactic structure). Cardiac implantable electronic devices (CIEDs) provide a clear example of the challenges faced by clinicians when data is not interoperable. The proprietary data formats created by each CIED manufacturer, as well as the multiple sources of data generated by CIEDs (hospital, office, remote monitoring, acute care setting), make it challenging to aggregate even a single patient's data into an EHR. The Heart Rhythm Society and CIED manufacturers have collaborated to develop and implement international standard-based specifications for interoperability that provide an end-to-end solution, enabling structured data to be communicated from CIED to a report generation system, EHR, research database, referring physician, registry, patient portal, and beyond. EHR and other health information technology vendors have been slow to implement these tools, in large part, because there have been no financial incentives for them to do so. It is incumbent upon us, as clinicians, to insist that the tools of interoperability be a prerequisite for the purchase of any and all health information technology systems.
ISAIA: Interoperable Systems for Archival Information Access
NASA Technical Reports Server (NTRS)
Hanisch, Robert J.
2002-01-01
The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we hoped would ultimately come from this project was a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more focused communities within space science could use the same technologies and standards to provide more specialized services. A major challenge to searching for data across a broad community is that information that describes some data products is either not relevant to other data or not applicable in the same way. Some previous metadata standard development efforts (e.g., in the earth science and library communities) have produced standards that are very large and difficult to support. To address this problem, we studied how a standard may be divided into separable pieces. Data providers that wish to participate in interoperable searches can support only those parts of the standard that are relevant to them. We prototyped a top-level metadata standard that was small and applicable to all space science data.
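The 'separable pieces' idea can be sketched as a small core profile that every provider supports plus optional discipline-specific modules. The following Python sketch is purely illustrative; all field and module names are invented:

```python
# Sketch of a modular metadata standard: a minimal core every data provider
# supports, plus optional discipline modules. All names/values are invented.
from dataclasses import dataclass, field

@dataclass
class CoreResource:
    """The small top-level standard, applicable to all space science data."""
    title: str
    archive: str
    access_url: str
    modules: dict[str, dict] = field(default_factory=dict)  # optional pieces

solar = CoreResource("SOHO EIT images", "SDAC", "https://example.gov/soho")
solar.modules["astronomy"] = {"wavelength_nm": 19.5, "target": "Sun"}
# A planetary archive would instead attach a "planetary" module and simply
# omit astronomy-specific fields; the core stays searchable either way.
print(solar)
```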
A Metadata Standard for Hydroinformatic Data Conforming to International Standards
NASA Astrophysics Data System (ADS)
Notay, Vikram; Carstens, Georg; Lehfeldt, Rainer
2017-04-01
The affordable availability of computing power and digital storage has been a boon for the scientific community. The hydroinformatics community has also benefitted from the so-called digital revolution, which has enabled the tackling of more and more complex physical phenomena using hydroinformatic models, instruments, sensors, etc. With models getting more and more complex, computational domains getting larger and the resolution of computational grids and measurement data getting finer, a large amount of data is generated and consumed in any hydroinformatics-related project. The ubiquitous availability of the internet also contributes to this phenomenon, with data being collected through sensor networks connected to telecommunications networks and the internet long before the term Internet of Things existed. Although generally beneficial, this exponential increase in the number of available datasets gives rise to the need to describe the data in a standardised way, not only to provide a quick overview of the data but also to facilitate the interoperability of data from different sources. The Federal Waterways Engineering and Research Institute (BAW) is a federal authority of the German Federal Ministry of Transport and Digital Infrastructure. BAW acts as a consultant for the safe and efficient operation of the German waterways. As part of its consultation role, BAW operates a number of physical and numerical models for sections of inland and marine waterways. In order to uniformly describe the data produced and consumed by these models throughout BAW and to ensure interoperability with other federal and state institutes on the one hand and with EU countries on the other, a metadata profile for hydroinformatic data has been developed at BAW. The metadata profile is composed in its entirety using the ISO 19115 international standard for metadata related to geographic information. Due to the widespread use of the ISO 19115 standard in the existing geodata infrastructure worldwide, the profile provides a means to describe hydroinformatic data that conforms to existing metadata standards. Additionally, the EU and German national standards INSPIRE and GDI-DE have been considered to ensure interoperability on an international and national level. Finally, elements of the GovData profile of the Federal Government of Germany have been integrated to be able to participate in its Open Data initiative. All these factors make the metadata profile developed at BAW highly suitable for describing hydroinformatic data in particular and physical state variables in general. Further details about this metadata profile will be presented at the conference. Acknowledgements: The authors would like to thank Christoph Wosniok and Peter Schade for their contributions towards the development of this metadata standard.
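For readers unfamiliar with ISO 19115 records, the following heavily simplified sketch shows their general shape using only the Python standard library; a real BAW profile record carries many more elements, and the identifier here is made up:

```python
# Heavily simplified sketch of an ISO 19115/19139-style metadata record.
# A real profile record contains many more mandatory elements.
import xml.etree.ElementTree as ET

GMD = "http://www.isotc211.org/2005/gmd"  # ISO 19139 metadata namespace
GCO = "http://www.isotc211.org/2005/gco"  # ISO 19139 basic types namespace
ET.register_namespace("gmd", GMD)
ET.register_namespace("gco", GCO)

md = ET.Element(f"{{{GMD}}}MD_Metadata")
ident = ET.SubElement(md, f"{{{GMD}}}fileIdentifier")
ET.SubElement(ident, f"{{{GCO}}}CharacterString").text = "baw-2017-0001"  # made-up id

ET.ElementTree(md).write("record.xml", xml_declaration=True, encoding="utf-8")
```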
NASA Astrophysics Data System (ADS)
Booth, N. L.; Brodaric, B.; Lucido, J. M.; Kuo, I.; Boisvert, E.; Cunningham, W. L.
2011-12-01
The need for a national groundwater monitoring network within the United States is profound and has been recognized by organizations outside government as a major data gap for managing ground-water resources. Our country's communities, industries, agriculture, energy production and critical ecosystems rely on water being available in adequate quantity and suitable quality. To meet this need, the Subcommittee on Ground Water, established by the Federal Advisory Committee on Water Information, created a National Ground Water Monitoring Network (NGWMN), envisioned as a voluntary, integrated system of data collection, management and reporting that will provide the data needed to address present and future ground-water management questions raised by Congress, Federal, State and Tribal agencies and the public. The NGWMN Data Portal is the means by which policy makers, academics and the public will be able to access ground-water data from disparate data sources through one seamless web-based application. Data systems in the United States exist at many organizational and geographic levels, and differing vocabularies and data structures have prevented data sharing and reuse. The data portal will facilitate the retrieval of and access to groundwater data on an as-needed basis from multiple, dispersed data repositories, allowing the data to continue to be housed and managed by the data provider while being accessible for the purposes of the national monitoring network. This work leverages Open Geospatial Consortium (OGC) data exchange standards and information models. To advance these standards for supporting the exchange of ground water information, an OGC Interoperability Experiment was organized among international participants from government, academia and the private sector. The experiment focused on ground water data exchange across the U.S. / Canadian border. WaterML2.0, an evolving international standard for water observations, encodes ground water levels and is exchanged using the OGC Sensor Observation Service (SOS) standard. Ground Water Markup Language (GWML) encodes well log, lithology and construction information and is exchanged using the OGC Web Feature Service (WFS) standard. Within the NGWMN Data Portal, data exchange between distributed data provider repositories is achieved through the use of these web services and a central mediation hub, which performs both format (syntactic) and nomenclature (semantic) mediation, conforming heterogeneous inputs into common standards-based outputs. Through these common standards, interoperability between the U.S. NGWMN and Canada's Groundwater Information Network (GIN) is achieved, advancing a ground water virtual observatory across North America.
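Retrieving water-level observations of this kind reduces to a standard SOS 2.0 GetObservation request. The sketch below asks for WaterML2.0 output; the endpoint, well identifier and property name are placeholders rather than the NGWMN's actual values:

```python
# Sketch of an OGC SOS 2.0 KVP GetObservation request for WaterML2.0 output.
# Endpoint and identifiers are placeholders, not the NGWMN's actual ones.
import requests

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "featureOfInterest": "USGS.430406089332901",   # hypothetical well id
    "observedProperty": "GroundwaterLevel",        # hypothetical property
    "responseFormat": "http://www.opengis.net/waterml/2.0",
}
r = requests.get("https://sos.example.gov/sos", params=params, timeout=60)
r.raise_for_status()
print(r.text[:300])  # start of the WaterML2.0 XML document
```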
Federal Register 2010, 2011, 2012, 2013, 2014
2015-08-03
...] Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments AGENCY: Food... workshop entitled ``FDA/CDC/NLM Workshop on Promoting Semantic Interoperability of Laboratory Data.'' The... to promoting the semantic interoperability of laboratory data between in vitro diagnostic devices and...
ERIC Educational Resources Information Center
Hill, Linda L.; Crosier, Scott J.; Smith, Terrence R.; Goodchild, Michael; Iannella, Renato; Erickson, John S.; Reich, Vicky; Rosenthal, David S. H.
2001-01-01
Includes five articles. Topics include requirements for a content standard to describe computational models; architectures for digital rights management systems; access control for digital information objects; LOCKSS (Lots of Copies Keep Stuff Safe) that allows libraries to run Web caches for specific journals; and a Web site from the U.S.…
2012-04-01
Acronym list fragment: Systems Concepts and Integration; SET: Sensors and Electronics Technology; SISO: Simulation Interoperability Standards Organization; SIW: Simulation Interoperability Workshop. Timeline fragment: September 2006, SISO Standards Activity Committee approved beginning IEEE balloting, held in conjunction with the 2006 Fall SIW; October 2006, IEEE Project...; ...019 published June 2008, Edinburgh, UK, held in conjunction with the 2008 Euro-SIW; September 2008, Laurel, MD, US; December 2008, work on Composite Model.
The Units Ontology: a tool for integrating units of measurement in science
Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert
2012-01-01
Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization caters for the reporting, exchange, processing, reproducibility and integration of quantitative measurements. Ontologies are a means of facilitating the integration of data and knowledge, allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently being used in many scientific resources for the standardized description of units of measurement. PMID:23060432
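Because UO is published as OWL, any RDF toolkit can inspect it. A minimal rdflib sketch follows; the OBO Library PURL used here is the commonly cited address for UO, but check it against your own mirror:

```python
# Sketch: read the Units Ontology with rdflib and list a few unit labels.
# The PURL below is the OBO Library address commonly used for UO; verify it
# against your own mirror before relying on it.
from rdflib import Graph, RDFS

g = Graph()
g.parse("http://purl.obolibrary.org/obo/uo.owl", format="xml")

for i, (term, label) in enumerate(g.subject_objects(RDFS.label)):
    print(term, "->", label)
    if i >= 9:  # show the first ten labels only
        break
```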
NASA Astrophysics Data System (ADS)
Evans, Ben; Wyborn, Lesley; Druken, Kelsey; Richards, Clare; Trenham, Claire; Wang, Jingbo; Rozas Larraondo, Pablo; Steer, Adam; Smillie, Jon
2017-04-01
The National Computational Infrastructure (NCI) facility hosts one of Australia's largest repositories (10+ PBytes) of research data collections spanning datasets from climate, coasts, oceans, and geophysics through to astronomy, bioinformatics, and the social sciences domains. The data are obtained from national and international sources, spanning a wide range of gridded and ungridded (i.e., line surveys, point clouds) data, and raster imagery, as well as diverse coordinate reference projections and resolutions. Rather than managing these data assets as a digital library, whereby users can discover and download files to personal servers (similar to borrowing 'books' from a 'library'), NCI has built an extensive and well-integrated research data platform, the National Environmental Research Data Interoperability Platform (NERDIP, http://nci.org.au/data-collections/nerdip/). The NERDIP architecture enables programmatic access to data via standards-compliant services for high performance data analysis, and provides a flexible cloud-based environment to facilitate the next generation of transdisciplinary scientific research across all data domains. To improve use of modern scalable data infrastructures that are focused on efficient data analysis, the data organisation needs to be carefully managed including performance evaluations of projections and coordinate systems, data encoding standards and formats. A complication is that we have often found multiple domain vocabularies and ontologies are associated with equivalent datasets. It is not practical for individual dataset managers to determine which standards are best to apply to their dataset as this could impact accessibility and interoperability. Instead, they need to work with data custodians across interrelated communities and, in partnership with the data repository, the international scientific community to determine the most useful approach. For the data repository, this approach is essential to enable different disciplines and research communities to invoke new forms of analysis and discovery in an increasingly complex data-rich environment. Driven by the heterogeneity of Earth and environmental datasets, NCI developed a Data Quality/Data Assurance Strategy to ensure consistency is maintained within and across all datasets, as well as functionality testing to ensure smooth interoperability between products, tools, and services. This is particularly so for collections that contain data generated from multiple data acquisition campaigns, often using instruments and models that have evolved over time. By implementing the NCI Data Quality Strategy we have seen progressive improvement in the integration and quality of the datasets across the different subject domains, and through this, the ease by which the users can access data from this major data infrastructure. By both adhering to international standards and also contributing to extensions of these standards, data from the NCI NERDIP platform can be federated with data from other globally distributed data repositories and infrastructures. The NCI approach builds on our experience working with the astronomy and climate science communities, which have been internationally coordinating such interoperability standards within their disciplines for some years. 
The results of our work so far demonstrate that more could be done in the Earth science, solid earth and environmental communities, particularly through establishing better linkages between international and national community efforts such as EPOS, ENVRIplus, EarthCube, AuScope and the Research Data Alliance.
Federal Register 2010, 2011, 2012, 2013, 2014
2016-10-04
...] Workshop on Promoting Semantic Interoperability of Laboratory Data; Public Workshop; Request for Comments... Semantic Interoperability of Laboratory Data.'' The purpose of this public workshop is to receive and... Semantic Interoperability of Laboratory Data.'' Received comments will be placed in the docket and, except...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-24
... Transmission Vegetation Management Reliability Standard; Notice of Compliance Filing Take notice that on July 12, 2013, the North American Electric Reliability Corporation (NERC), pursuant to Order No. 777 \\1... Reliability Standard FAC-003-2 to its Web site. \\1\\ Revisions to Reliability Standard for Transmission...
48 CFR 52.230-1 - Cost Accounting Standards Notices and Certification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System Cost Accounting Standards... Provisions and Clauses 52.230-1 Cost Accounting Standards Notices and Certification. As prescribed in 30.201-3, insert the following provisions: Cost Accounting Standards Notices and Certification (OCT 2008...
78 FR 21929 - Transmission Relay Loadability Reliability Standard; Notice of Compliance Filing
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-12
... Relay Loadability Reliability Standard; Notice of Compliance Filing Take notice that on February 19... Relay Loadability Reliability Standard, Order No. 733, 130 FERC ] 61,221 (2010) (Order No. 733); order..., 136 FERC ] 61,185 (2011). \\2\\ Transmission Relay Loadability Reliability Standard, 138 FERC ] 61,197...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Release of the Exposure Draft of Technical... Accounting Standards Advisory Board. ACTION: Notice. Board Action: Pursuant to 31 U.S.C. 3511(d), the Federal... October, 2010, notice is hereby given that the Federal Accounting Standards Advisory Board (FASAB) has...
Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101
2012-07-01
interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that... IOPs need to specify; however, the mobility and actuation related interoperable messages can be used to provide stair climbing. Also... interoperability can enable management of different poses or modes, one of which may be stair climbing.
ERIC Educational Resources Information Center
Dietze, Stefan; Gugliotta, Alessio; Domingue, John
2009-01-01
Current E-Learning technologies primarily follow a data and metadata-centric paradigm by providing the learner with composite content containing the learning resources and the learning process description, usually based on specific metadata standards such as ADL SCORM or IMS Learning Design. Due to the design-time binding of learning resources,…
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2015-10-01
implemented connectivity to an EHR as an HL7 FHIR gateway from OpenICE. The monitor we used can measure heart rate from EKG, from the pulse oximeter, or...
Delivery of QTIv2 Question Types
ERIC Educational Resources Information Center
Wills, Gary B.; Davis, Hugh C.; Gilbert, Lester; Hare, Jonathon; Howard, Yvonne; Jeyes, Steve; Millard, David; Sherratt, Robert
2009-01-01
The IMS Question and Test Interoperability (QTI) standard identifies 16 different question types which may be used in online assessment. While some partial implementations exist, the R2Q2 project has developed a complete solution that renders and responds to all 16 question types as specified. In addition, care has been taken in the R2Q2 project…
2012-08-01
VICTORY Overview: the VICTORY technical approach incorporates open standards and shared services (VICTORY services... availability). VICTORY Shared Services: VICTORY core services (Position, Orientation, Direction-of-Travel, and Time); Service...
Medical Device Plug-and-Play Interoperability Standards and Technology Leadership
2012-10-01
External Network Pump Adapter; PulseOx Adapter. The MD MP3 cart (Medical Device Mobile PnP Prototype Platform) is a platform for the development of smart pump control algorithms; it includes... delivery with bounded latency. Got MDCF code to run on the BeagleBoard development boards we are...
EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is provided in Simple Knowledge Organization System (SKOS) format, a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable.
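Since SKOS is itself an RDF vocabulary, the taxonomy can be read with standard RDF tooling. The sketch below assumes a local copy of the SKOS export under a placeholder file name:

```python
# Sketch of reading a SKOS vocabulary such as the EPA Web Taxonomy with
# rdflib; the file name is a placeholder for whatever export you download.
from rdflib import Graph
from rdflib.namespace import SKOS

g = Graph()
g.parse("epa_web_taxonomy.rdf")  # placeholder path to the SKOS export

# Walk every concept that carries a preferred label and print its broader
# (parent) concepts, i.e. the taxonomy's hierarchy edges.
for concept in g.subjects(SKOS.prefLabel, None):
    label = g.value(concept, SKOS.prefLabel)
    for broader in g.objects(concept, SKOS.broader):
        print(f"{label} -> broader -> {broader}")
```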
Interoperability Trends in Extravehicular Activity (EVA) Space Operations for the 21st Century
NASA Technical Reports Server (NTRS)
Miller, Gerald E.
1999-01-01
No other space operations in the 21st century more comprehensively embody the challenges and dependencies of interoperability than EVA. This discipline is already functioning at an unparalleled level of interagency, inter-organizational and international cooperation. This trend will only increase as space programs endeavor to expand in the face of shrinking budgets. Among the topics examined in this paper are hardware-oriented issues. Differences in design standards among various space participants dictate differences in the EVA tools that must be manufactured, flown and maintained on-orbit. Presently only two types of functional space suits exist in the world. However, three versions of functional airlocks are in operation. Of the three airlocks, only the International Space Station (ISS) Joint Airlock can accommodate both types of suits. Due to functional differences in the suits, completely different operating protocols are required for each. Should additional space suit or airlock designs become available, the complexity will increase. The lessons learned as a result of designing and operating within such a system are explored. This paper also examines the non-hardware challenges presented by interoperability for a discipline that is as uniquely dependent upon the individual as EVA. Operation of space suits (essentially single-person spacecraft) by persons whose native language is not that of the suits' designers is explored. The intricacies of shared mission planning, shared control and shared execution of joint EVAs are explained. For example, once ISS is fully functional, the potential exists for two crewmembers of different nationality to be wearing suits manufactured and controlled by a third nation, while operating within an airlock manufactured and controlled by a fourth nation, in an effort to perform tasks upon hardware belonging to a fifth nation. Everything from training issues, to procedures development and writing, to real-time operations is addressed. Finally, this paper looks to the management challenges presented by interoperability in general. With budgets being reduced among all space-faring nations, the need to expand cooperation in the highly expensive field of human space operations is only going to intensify. The question facing management is not if the trend toward interoperation will continue, but how to best facilitate its doing so. Real-world EVA interoperability experience throughout the Shuttle/Mir and ISS Programs is discussed to illustrate the challenges and
Modeling Interoperable Information Systems with 3LGM² and IHE.
Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A
2015-01-01
Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. Objectives: To link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Methods: Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). Results: IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. Conclusions: The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE developers can use or develop IHE profiles systematically. In order to improve the usability and handling of the IHE-master-model and its usage as a reference model, some further refinements have to be done. Evaluating the use of the IHE-master-model by information managers and IHE developers is subject to further research.
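The concept assignment at the heart of this approach can be caricatured as a lookup table from IHE concepts to the three 3LGM² layers. The sketch below is our invented illustration, not the tool's actual schema, and the layer assignments are assumptions:

```python
# Toy data structure (our illustration, not the 3LGM2 tool's schema) showing
# how IHE concepts might be assigned to the three 3LGM2 layers. The specific
# assignments below are assumptions made for demonstration purposes.
IHE_TO_3LGM2 = {
    "integration profile": "domain layer (enterprise functions)",
    "actor": "logical tool layer (application components)",
    "transaction": "logical tool layer (communication links)",
    "message standard": "physical tool layer (transmission)",
}

for ihe_concept, layer in IHE_TO_3LGM2.items():
    print(f"IHE {ihe_concept!r} -> {layer}")
```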
Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.
Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José
2016-04-01
The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators (e.g. health appliances, routers) in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators (e.g. smartphones, tablets), the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), IHE profiles related to these applications, and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value.
Samal, Lipika; D'Amore, John D; Bates, David W; Wright, Adam
2017-11-01
Clinical decision support tools for risk prediction are readily available, but typically require workflow interruptions and manual data entry, so they are rarely used. Due to new data interoperability standards for electronic health records (EHRs), other options are available. As a clinical case study, we sought to build a scalable, web-based system that would automate the calculation of kidney failure risk and display clinical decision support to users in primary care practices. We developed a single-page application, web server, database, and application programming interface to calculate and display kidney failure risk. Data were extracted from the EHR using the Consolidated Clinical Document Architecture interoperability standard for Continuity of Care Documents (CCDs). EHR users were presented with a noninterruptive alert on the patient's summary screen and a hyperlink to details and recommendations provided through a web application. Clinic schedules and CCDs were retrieved using existing application programming interfaces to the EHR, and we provided a clinical decision support hyperlink to the EHR as a service. We debugged a series of terminology and technical issues. The application was validated with data from 255 patients and subsequently deployed to 10 primary care clinics where, over the course of 1 year, 569,533 CCD documents were processed. We validated the use of interoperable documents and open-source components to develop a low-cost tool for automated clinical decision support. Since Consolidated Clinical Document Architecture-based data extraction extends to any certified EHR, this demonstrates a successful modular approach to clinical decision support.
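A CCD is an XML document in the HL7 v3 namespace, so the extraction step needs nothing beyond the standard library. The sketch below pulls one lab observation by LOINC code; the file name is a placeholder, and the exact location of observations varies between EHR exports:

```python
# Sketch of pulling one lab value out of a CCD with the standard library.
# CCDs use the HL7 v3 namespace; the file name is a placeholder and document
# layouts vary between EHRs, so treat this as a pattern, not a parser.
import xml.etree.ElementTree as ET

NS = {"v3": "urn:hl7-org:v3"}
root = ET.parse("patient_ccd.xml").getroot()  # placeholder file

for obs in root.iter(f"{{{NS['v3']}}}observation"):
    code = obs.find("v3:code", NS)
    value = obs.find("v3:value", NS)
    if code is None or value is None:
        continue
    if code.get("code") == "2160-0":  # LOINC 2160-0: serum creatinine
        print("Serum creatinine:", value.get("value"), value.get("unit"))
```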
Investigating the capabilities of semantic enrichment of 3D CityEngine data
NASA Astrophysics Data System (ADS)
Solou, Dimitra; Dimopoulou, Efi
2016-08-01
In recent years, the development of technology and the lifting of several technical limitations have brought the third dimension to the fore. The complexity of urban environments and the strong need for land administration intensify the need for a three-dimensional cadastral system. Despite the progress in the field of geographic information systems and 3D modeling techniques, there is no fully digital 3D cadastre. The existing geographic information systems and the different methods of three-dimensional modeling allow for better management, visualization and dissemination of information. Nevertheless, these opportunities cannot be fully exploited because of deficiencies in standardization and interoperability in these systems. Within this context, CityGML was developed as an international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. CityGML defines geometry and topology for city modeling, also focusing on the semantic aspects of 3D city information. The aim of CityGML is to reach a common terminology, addressing the imperative need for interoperability and data integration given the number of available geographic information systems and modeling techniques. The aim of this paper is to develop an application for managing the semantic information of a model generated by procedural modeling. The model was initially implemented in ESRI's CityEngine software and then imported into the ArcGIS environment. The final goal was the semantic enrichment of the original model and its conversion to CityGML format. Semantic information management and interoperability proved feasible using ESRI's 3DCities Project tools, since their database structure supports adding semantic information to the CityEngine model and automatically converting it to CityGML for advanced analysis and visualization in different application areas.
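The semantic enrichment described here amounts to attaching typed attributes to CityGML features. The sketch below writes a minimal CityGML 2.0 building carrying one generic attribute; geometry is omitted, and the attribute name and value are invented:

```python
# Sketch of attaching a semantic attribute to a CityGML 2.0 building via
# generic attributes; geometry is omitted and the attribute is invented.
import xml.etree.ElementTree as ET

CORE = "http://www.opengis.net/citygml/2.0"
BLDG = "http://www.opengis.net/citygml/building/2.0"
GEN = "http://www.opengis.net/citygml/generics/2.0"
for prefix, uri in (("core", CORE), ("bldg", BLDG), ("gen", GEN)):
    ET.register_namespace(prefix, uri)

model = ET.Element(f"{{{CORE}}}CityModel")
member = ET.SubElement(model, f"{{{CORE}}}cityObjectMember")
building = ET.SubElement(member, f"{{{BLDG}}}Building")

# A generic string attribute carries the semantic enrichment.
attr = ET.SubElement(building, f"{{{GEN}}}stringAttribute", name="landUse")
ET.SubElement(attr, f"{{{GEN}}}value").text = "residential"  # example value

ET.ElementTree(model).write("building.gml", xml_declaration=True, encoding="utf-8")
```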
NASA Astrophysics Data System (ADS)
Messner, Richard A.; Hludik, Frank; Crowley, Todd A.; Vidacic, Dragan; Stetson, Barrett; Nadel, Lawrence D.; Nichols, Linda J.; Harris, Carol
2004-08-01
This paper describes the results of a collaborative effort between the University of New Hampshire (UNH) and the Mitretek Systems (MTS) Center for Criminal Justice Technology (CCJT). Mitretek conducted an investigation into the impact of anticipated biometrically encoded driver licenses (DLs) on law enforcement. As part of this activity, Mitretek teamed with UNH to leverage the results of UNH's Project54 and develop a pilot Driver License Interoperability Test Bed to explore both implementation and operational aspects associated with reading and authenticating biometrically encoded DLs in law enforcement scenarios. The test bed enables the exploration of new methods, techniques (both hardware and software), and standards in a structured fashion. Spearheaded by the American Association of Motor Vehicle Administrators (AAMVA) and the International Committee for Information Technology Standards Technical Group M1 (INCITS-M1) initiatives, standards involving both DLs and biometrics, respectively, are evolving at a rapid pace. In order to ensure that the proposed standards will provide for interstate interoperability and proper functionality for the law enforcement community, it is critical to investigate the implementation and deployment issues surrounding biometrically encoded DLs. The test bed described in this paper addresses this and will provide valuable feedback to the standards organizations, the states, and law enforcement officials with respect to implementation and functional issues that are exposed through exploration of actual test systems. The knowledge gained was incorporated into a report prepared by MTS to describe the anticipated impact of biometrically encoded DLs on law enforcement practice.
Progress toward Modular UAS for Geoscience Applications
NASA Astrophysics Data System (ADS)
Dahlgren, R. P.; Clark, M. A.; Comstock, R. J.; Fladeland, M.; Gascot, H., III; Haig, T. H.; Lam, S. J.; Mazhari, A. A.; Palomares, R. R.; Pinsker, E. A.; Prathipati, R. T.; Sagaga, J.; Thurling, J. S.; Travers, S. V.
2017-12-01
Small Unmanned Aerial Systems (UAS) have become accepted tools for geoscience, ecology, agriculture, disaster response, land management, and industry. A variety of consumer UAS options exist as science and engineering payload platforms, but their incompatibilities with one another contribute to high operational costs compared with those of piloted aircraft. This research explores the concept of modular UAS, demonstrating airframes that can be reconfigured in the field for experimental optimization, to enable multi-mission support, facilitate rapid repair, or respond to changing field conditions. Modular UAS is revolutionary in allowing aircraft to be optimized around the payload, reversing the conventional wisdom of designing the payload to accommodate an unmodifiable aircraft. UAS that are reconfigurable like Legos™ are ideal for airborne science service providers, system integrators, instrument designers and end users to fulfill a wide range of geoscience experiments. Modular UAS facilitate the adoption of open-source software and rapid prototyping technology where design reuse is important in the context of a highly regulated industry like aerospace. The industry is now at a stage where consolidation, acquisition, and attrition will reduce the number of small manufacturers, with a reduction of innovation and motivation to reduce costs. Modularity leads to interface specifications, which can evolve into de facto or formal standards which contain minimum (but sufficient) details such that multiple vendors can then design to those standards and demonstrate interoperability. At that stage, vendor coopetition leads to robust interface standards, interoperability standards and multi-source agreements which in turn drive costs down significantly.
Interoperability In The New Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.
2015-12-01
As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly used globally. For this purpose, the development of the new Planetary Science Archive (PSA), by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through third-party tools, for example those of the NASA Planetary Data System (PDS), the VESPA project of the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA includes a GeoServer (an open-source map server), the goal of which is to support use cases such as the distribution of search results and the sharing and processing of data through an OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats such as Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
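Because the archive sits behind a GeoServer, pulling results in one of the listed formats is an ordinary WFS call. The sketch below requests JSON output; the endpoint and feature type are placeholders, not the PSA's published ones:

```python
# Sketch of a WFS 2.0 GetFeature request asking for JSON output (a format
# GeoServer supports). Endpoint and feature type name are placeholders.
import requests

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "psa:observations",   # hypothetical feature type
    "outputFormat": "application/json",
    "count": 10,
}
r = requests.get("https://psa.example.int/geoserver/wfs", params=params, timeout=60)
r.raise_for_status()
for feature in r.json().get("features", []):
    print(feature["id"])
```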
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
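A WPS-hosted model is discovered and described with two standard requests before any execution. The sketch below issues those calls over plain HTTP; the service URL and process identifier are placeholders:

```python
# Sketch of discovering a shared geospatial model through OGC WPS 1.0.0 with
# plain HTTP. The service URL and process identifier are placeholders.
import requests

WPS = "https://models.example.org/wps"  # hypothetical WPS endpoint

# 1) GetCapabilities lists the processes the service offers.
caps = requests.get(WPS, params={"service": "WPS", "version": "1.0.0",
                                 "request": "GetCapabilities"}, timeout=60)
caps.raise_for_status()

# 2) DescribeProcess returns the inputs/outputs of one process.
desc = requests.get(WPS, params={"service": "WPS", "version": "1.0.0",
                                 "request": "DescribeProcess",
                                 "identifier": "WetlandHydrologyModel"},  # placeholder
                    timeout=60)
desc.raise_for_status()
print(desc.text[:300])  # start of the process description document
```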
Validation results of specifications for motion control interoperability
NASA Astrophysics Data System (ADS)
Szabo, Sandor; Proctor, Frederick M.
1997-01-01
The National Institute of Standards and Technology (NIST) is participating in the Department of Energy Technologies Enabling Agile Manufacturing (TEAM) program to establish interface standards for machine tool, robot, and coordinate measuring machine controllers. At NIST, the focus is to validate potential application programming interfaces (APIs) that make it possible to exchange machine controller components with a minimal impact on the rest of the system. This validation is taking place in the enhanced machine controller (EMC) consortium and is in cooperation with users and vendors of motion control equipment. An area of interest is motion control, including closed-loop control of individual axes and coordinated path planning. Initial tests of the motion control APIs are complete. The APIs were implemented on two commercial motion control boards that run on two different machine tools. The results for a baseline set of APIs look promising, but several issues were raised. These include resolving differing approaches in how motions are programmed and defining a standard measurement of performance for motion control. This paper starts with a summary of the process used in developing a set of specifications for motion control interoperability. Next, the EMC architecture and its classification of motion control APIs into two classes, Servo Control and Trajectory Planning, are reviewed. Selected APIs are presented to explain the basic functionality and some of the major issues involved in porting the APIs to other motion controllers. The paper concludes with a summary of the main issues and ways to continue the standards process.
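The two-class split the paper reviews can be rendered as two small interfaces. The Python sketch below is our illustrative reading of that division, not the NIST EMC specification itself:

```python
# Illustrative rendering (not the NIST specification) of the two API classes
# the EMC work describes: a servo layer for closed-loop control of individual
# axes, and a trajectory layer for coordinated path planning.
from abc import ABC, abstractmethod

class ServoControl(ABC):
    """Closed-loop control of a single axis."""

    @abstractmethod
    def set_setpoint(self, position: float) -> None:
        """Command the axis toward a position (units are implementation-defined)."""

    @abstractmethod
    def read_feedback(self) -> float:
        """Return the measured axis position."""

class TrajectoryPlanner(ABC):
    """Coordinated path planning across multiple axes."""

    @abstractmethod
    def move_linear(self, target: list[float], feed_rate: float) -> None:
        """Plan a straight-line coordinated move to a multi-axis target."""

    @abstractmethod
    def stop(self) -> None:
        """Halt motion on all axes."""
```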