Interoperability and information discovery
Christian, E.
2001-01-01
In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.
Tool and data interoperability in the SSE system
NASA Technical Reports Server (NTRS)
Shotton, Chuck
1988-01-01
Information is given in viewgraph form on tool and data interoperability in the Software Support Environment (SSE). Information is given on industry problems, SSE system interoperability issues, SSE solutions to tool and data interoperability, and attainment of heterogeneous tool/data interoperability.
Juzwishin, Donald W M
2009-01-01
Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.
Interoperability of Information Systems Managed and Used by the Local Health Departments.
Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet
2016-01-01
In the post-Affordable Care Act era, marked by interorganizational collaborations and the availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by local health departments (LHDs). The objectives of this study were to describe the level of interoperability of LHD information systems and to identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were drawn from a key informant interview study of LHD informatics staff from across the United States; these data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated that some of their systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
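To make the reported adjusted odds ratios concrete, here is a minimal, hypothetical sketch (not the study's code) of how AORs arise from a multivariable logistic regression: each AOR is the exponential of a fitted coefficient. The predictor names and synthetic data are stand-ins for the survey items; it assumes numpy and statsmodels are installed.

```python
# Hedged sketch: AOR = exp(beta) from a multivariable logistic regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 324  # matches the survey's number of completed responses

# Hypothetical binary predictors: leadership support, budget control, data control.
X = rng.integers(0, 2, size=(n, 3)).astype(float)
logit = -1.0 + 1.26 * X[:, 0] + 0.91 * X[:, 1] + 0.84 * X[:, 2]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))  # 1 = systems interoperable

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)

# Exponentiating each coefficient yields the adjusted odds ratio.
for name, beta in zip(["const", "leadership", "budget", "data_control"], fit.params):
    print(f"{name}: AOR = {np.exp(beta):.2f}")
```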
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
Enabling interoperability in planetary sciences and heliophysics: The case for an information model
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.
2018-01-01
The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO)-level standards for trusted digital archives, information model development, and metadata registries. Where controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, that is, additional related context for the terms. The information model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example, agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type most directly enabled by an information model.
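A minimal sketch of the distinction the abstract draws: a controlled vocabulary fixes shared terms, while an ontology adds the relationships that give those terms context. The terms and relations below are illustrative, not taken from PDS4.

```python
# Controlled vocabulary: agreement on terms only.
controlled_vocabulary = {"target", "instrument", "observation"}

# Ontology: the same terms plus semantic relationships (illustrative).
ontology = {
    "observation": {"is_a": "product", "produced_by": "instrument"},
    "instrument": {"is_a": "system_component", "observes": "target"},
    "target": {"is_a": "physical_object"},
}

def related_context(term: str) -> dict:
    """Return the semantic context an ontology adds beyond the bare term."""
    if term not in controlled_vocabulary:
        raise KeyError(f"{term!r} is not in the shared vocabulary")
    return ontology.get(term, {})

print(related_context("instrument"))  # {'is_a': 'system_component', 'observes': 'target'}
```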
Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun
2007-11-01
Traditional electronic health record (EHR) data are produced by various hospital information systems and could not exist independently of those systems until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is composed of forms, components, sections, and elements; data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
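A hedged sketch of the nested structure the abstract describes (forms contain components, which contain sections, which contain elements). The tag and attribute names are illustrative; the actual TMT schema may differ.

```python
# Illustrative XML template mirroring the form/component/section/element nesting.
import xml.etree.ElementTree as ET

form = ET.Element("form", name="discharge_summary")
component = ET.SubElement(form, "component", name="medication")
section = ET.SubElement(component, "section", name="prescription")
# An element referencing a code set (hypothetical example code) and data type.
element = ET.SubElement(section, "element", code="ATC:N02BE01", dataType="CD")
element.text = "paracetamol"  # narrative block alongside the coded value

print(ET.tostring(form, encoding="unicode"))
```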
On the formal definition of the systems' interoperability capability: an anthropomorphic approach
NASA Astrophysics Data System (ADS)
Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav
2017-03-01
The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to interoperability problems. In response, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics, and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumed to be a message from another system), make an informed decision about this perception, and consequently articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach excludes the assumption of awareness of co-existence between the two interoperating systems. It thus establishes links between research on the interoperability of systems and on intelligent software agents, one of the systems' digital identities.
Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A
2008-02-01
One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).
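An illustrative sketch of the two interoperability layers the abstract distinguishes: retrieving structured data through a shared interface (syntactic) and resolving its coded terms against a terminology service (semantic). The service, record, and code below are hypothetical stand-ins, not the actual caCORE/EVS API.

```python
# Stand-in terminology service (the code shown is illustrative).
TERMINOLOGY = {"C4872": "Breast Carcinoma"}

def fetch_record(record_id: str) -> dict:
    # Syntactic interoperability: a shared API yields structured data.
    return {"id": record_id, "diagnosis_code": "C4872"}

def interpret(record: dict) -> str:
    # Semantic interoperability: shared terminology yields shared meaning.
    return TERMINOLOGY[record["diagnosis_code"]]

record = fetch_record("patient-001")
print(interpret(record))  # Breast Carcinoma
```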
The interoperability force in the ERP field
NASA Astrophysics Data System (ADS)
Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon
2015-04-01
Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems were first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP systems acting as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards, and resolving it demands a mediation system that accurately interprets data in different heterogeneous formats. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and a Personalized-Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. Using a pattern-oriented approach drawing on the MBO, the transformation process achieved over 90% accuracy in conversions between CDA and vMR. The proposed mediation system improves the overall communication process between HISs, providing accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
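A hedged sketch of the mediation idea: mappings held in an ontology (here a plain dict standing in for the MBO) drive transformation between two standard formats. The field paths are illustrative, not the real CDA or vMR element paths.

```python
# Stand-in for MBO mappings: source document path -> target document path.
MBO_MAPPINGS = {
    "recordTarget/patient/name": "patient/demographics/name",
    "observation/value": "observationResult/value",
}

def transform(cda_like: dict) -> dict:
    """Apply stored mappings to convert a CDA-like record to a vMR-like record."""
    vmr_like = {}
    for src, dst in MBO_MAPPINGS.items():
        if src in cda_like:
            vmr_like[dst] = cda_like[src]
    return vmr_like

doc = {"recordTarget/patient/name": "Jane Doe", "observation/value": "120 mmHg"}
print(transform(doc))
```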
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system, to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, and to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured to integrate other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
The Next Generation of Interoperability Agents in Healthcare
Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José
2014-01-01
Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in addressing interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems: the functions provided by the systems that handle interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents, through an interface that allows the user to create new agents easily and to monitor their activities in real time, is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA. PMID:24840351
Capurro, Daniel; Echeverry, Aisen; Figueroa, Rosa; Guiñez, Sergio; Taramasco, Carla; Galindo, César; Avendaño, Angélica; García, Alejandra; Härtel, Steffen
2017-01-01
Despite the continuous technical advancements around health information standards, a critical component to their widespread adoption involves political agreement between a diverse set of stakeholders. Countries that have addressed this issue have used diverse strategies. In this vision paper we present the path that Chile is taking to establish a national program to implement health information standards and achieve interoperability. The Chilean government established an inter-agency program to define the current interoperability situation, existing gaps, barriers, and facilitators for interoperable health information systems. As an answer to the identified issues, the government decided to fund a consortium of Chilean universities to create the National Center for Health Information Systems. This consortium should encourage the interaction between all health care stakeholders, both public and private, to advance the selection of national standards and define certification procedures for software and human resources in health information technologies.
Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.
2014-01-01
Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential to using immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time was less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports accorded with immunization guidelines, and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452
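A minimal sketch of the exchange pattern described: an immunization record serialized as XML and posted to a partner IIS over a web service, as in an HL7 v3-style SOA call. The endpoint URL and element names are hypothetical, not the real HL7 v3 schema.

```python
# Hedged sketch of a synchronous record exchange between two IISs.
import urllib.request
import xml.etree.ElementTree as ET

record = ET.Element("immunizationRecord")
ET.SubElement(record, "patientId").text = "infant-0042"
ET.SubElement(record, "vaccineCode").text = "OPV"
ET.SubElement(record, "administeredOn").text = "2014-03-01"
payload = ET.tostring(record)

req = urllib.request.Request(
    "http://iis.example.org/exchange",  # hypothetical partner IIS endpoint
    data=payload,
    headers={"Content-Type": "application/xml"},
)
# urllib.request.urlopen(req)  # synchronous call; turnaround measured per record
```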
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
Designing learning management system interoperability in semantic web
NASA Astrophysics Data System (ADS)
Anistyasari, Y.; Sarno, R.; Rochmawati, N.
2018-01-01
The extensive adoption of learning management systems (LMS) has put the focus on interoperability requirements. Interoperability is the ability of different computer systems, applications, or services to communicate, share, and exchange data, information, and knowledge in a precise, effective, and consistent way. Semantic web technology and the use of ontologies can provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated deeply. Moodle is utilized to design the interoperability: several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology over content materials. The ontology is further utilized as a search tool to match users' queries with available courses. It is concluded that LMS interoperability in the Semantic Web is feasible.
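A sketch of the matching step described above: a query is expanded through a small ontology (synonyms and related concepts) before being matched against course descriptions. The ontology content and course records are invented for illustration.

```python
# Hedged sketch: ontology-assisted course search.
ONTOLOGY = {
    "databases": {"sql", "data modelling", "databases"},
    "networking": {"tcp/ip", "routing", "networking"},
}

COURSES = {
    "CS201": "relational data modelling and sql practice",
    "CS305": "routing protocols and tcp/ip fundamentals",
}

def search(query: str) -> list[str]:
    """Expand the query via the ontology, then match against course text."""
    terms = ONTOLOGY.get(query, {query})
    return [cid for cid, desc in COURSES.items()
            if any(term in desc for term in terms)]

print(search("databases"))   # ['CS201']
print(search("networking"))  # ['CS305']
```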
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented, as well as secure and safe. To enable semantic interoperability, a unified process for defining and implementing the architecture, i.e., the structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e., the information used and its interpretation, algorithms, etc., has to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts, and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to be the most successful standard for semantic interoperability, HL7 has been analyzed against the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3, and EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is close collaboration between the teams involved, guaranteeing convergence between the competing approaches.
The Long Road to Semantic Interoperability in Support of Public Health: Experiences from Two States
Vreeman, Daniel J.; Grannis, Shaun J.
2014-01-01
Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. PMID:24680985
Security and privacy of EHR systems--ethical, social and legal requirements.
Kluge, Eike-Henner W
2003-01-01
This paper addresses social, ethical and legal concerns about security and privacy that arise in the development of international interoperable health information systems. The paper deals with these concerns under four rubrics: the ethical status of electronic health records, the social and legal embedding of interoperable health information systems, the overall information requirements of healthcare as such, and the role of health information professionals as facilitators. It argues that these concerns can be met if the development of interoperability protocols is guided by the seven basic principles of information ethics that are enunciated in the IMIA Code of Ethics for Health Information Professionals and that are central to the ethical treatment of electronic health records.
NASA Technical Reports Server (NTRS)
Stephens, J. Briscoe; Grider, Gary W.
1992-01-01
These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Division's hardware and software requirements in terms of communications between personal and visualization workstations and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and for management not familiar with these activities, and as background documentation to support requests for resources and support requirements.
Myneni, Sahiti; Patel, Vimla L.
2009-01-01
Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900
The Importance of State and Context in Safe Interoperable Medical Systems
Jaffe, Michael B.; Robkin, Michael; Rausch, Tracy; Arney, David; Goldman, Julian M.
2016-01-01
This paper describes why “device state” and “patient context” information are necessary components of device models for safe interoperability. This paper includes a discussion of the importance of describing the roles of devices with respect to interactions (including human user workflows involving devices, and device to device communication) within a system, particularly those intended for use at the point-of-care, and how this role information is communicated. In addition, it describes the importance of clinical scenarios in creating device models for interoperable devices. PMID:27730013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.
The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to the integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower-voltage-level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.
Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...
Rafael Moreno-Sanchez
2006-01-01
The aim of this paper is to provide a conceptual framework for the session "The role of web-based Geographic Information Systems in supporting sustainable management." The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...
Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M
2014-01-01
The objectives of this review were to identify shared criteria used throughout the world in the implementation of interoperable national health information systems (NHIS) and to provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.
Oluoch, Tom; Muturi, David; Kiriinya, Rose; Waruru, Anthony; Lanyo, Kevin; Nguni, Robert; Ojwang, James; Waters, Keith P; Richards, Janise
2015-01-01
Sub-Saharan Africa (SSA) bears the heaviest burden of the HIV epidemic. Health workers play a critical role in the scale-up of HIV programs. SSA also has the weakest information and communication technology (ICT) infrastructure globally. Implementing interoperable national health information systems (HIS) is a challenge, even in developed countries. Countries in resource-limited settings have yet to demonstrate that interoperable systems can be achieved and can improve quality of healthcare through enhanced data availability and use in the deployment of the health workforce. We established interoperable HIS integrating a Master Facility List (MFL), District Health Information Software (DHIS2), and Human Resources Information Systems (HRIS) through application programming interfaces (APIs). We abstracted data on HIV care, health worker deployment, and health facility geo-coordinates. Over 95% of data elements were exchanged between the MFL-DHIS and HRIS-DHIS interfaces. The correlations between the number of HIV-positive clients and the numbers of nurses and clinical officers in 2013 were R² = 0.251 and R² = 0.261, respectively. Wrong MFL codes, data-type mismatches, and hyphens in legacy data were key causes of data transmission errors. The lack of information exchange standards for aggregate data made programming time-consuming.
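An illustrative sketch of the data-quality issues reported above: wrong MFL codes, data-type mismatches, and hyphens in legacy identifiers breaking the MFL-DHIS2-HRIS exchange. The facility-code format and master list are hypothetical.

```python
# Hedged sketch: normalize and validate facility codes before transmission.
MASTER_FACILITY_LIST = {"10001", "10002", "10345"}

def clean_code(raw) -> str | None:
    code = str(raw).replace("-", "").strip()   # legacy hyphens, type mismatch
    if code.isdigit() and code in MASTER_FACILITY_LIST:
        return code
    return None  # wrong MFL code: reject rather than transmit a bad record

for raw in ["10001", "10-002", 10345, "99999"]:
    print(raw, "->", clean_code(raw))
```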
Generic Educational Knowledge Representation for Adaptive and Cognitive Systems
ERIC Educational Resources Information Center
Caravantes, Arturo; Galan, Ramon
2011-01-01
The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…
Weininger, Sandy; Jaffe, Michael B; Goldman, Julian M
2017-01-01
Medical device and health information technology systems are increasingly interdependent, with users demanding increased interoperability. Related safety standards must be developed taking these systems' perspective into account. In this article, we describe the current development of medical device standards and the need for these standards to address medical device informatics. Medical device information should be gathered from a broad range of clinical scenarios to lay the foundation for safe medical device interoperability. Five clinical examples show how medical device informatics principles, if applied in the development of medical device standards, could help facilitate the development of safe interoperable medical device systems. These examples illustrate the clinical implications of the failure to capture important signals and device attributes. We provide recommendations relating to the coordination between historically separate standards development groups, some of which focus on safety and effectiveness and others on health informatics. We identify the need for a shared understanding among stakeholders and describe organizational structures to promote cooperation such that device-to-device interactions and related safety information are considered during standards development.
A common type system for clinical natural language processing.
Wu, Stephen T; Kaggal, Vinod C; Dligach, Dmitriy; Masanz, James J; Chen, Pei; Becker, Lee; Chapman, Wendy W; Savova, Guergana K; Liu, Hongfang; Chute, Christopher G
2013-01-03
One challenge in reusing clinical data stored in electronic medical records is that these data are heterogeneous. Clinical Natural Language Processing (NLP) plays an important role in transforming information in clinical text to a standard representation that is comparable and interoperable. Information may be processed and shared when a type system specifies the allowable data structures. Therefore, we aim to define a common type system for clinical NLP that enables interoperability between structured and unstructured data generated in different clinical settings. We describe a common type system for clinical NLP that has an end target of deep semantics based on Clinical Element Models (CEMs), thus interoperating with structured data and accommodating diverse NLP approaches. The type system has been implemented in UIMA (Unstructured Information Management Architecture) and is fully functional in a popular open-source clinical NLP system, cTAKES (clinical Text Analysis and Knowledge Extraction System) versions 2.0 and later. We have created a type system that targets deep semantics, thereby allowing NLP systems to encapsulate knowledge from text and share it alongside heterogeneous clinical data sources. Rather than the surface semantics that are typically the end product of NLP algorithms, CEM-based semantics explicitly build in deep clinical semantics as the point of interoperability with more structured data types.
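A hedged sketch of what a "type system" buys you: NLP output constrained to a declared structure so it can sit alongside structured, coded data. These classes are illustrative, not the actual UIMA/cTAKES or CEM type definitions, and the example code is a placeholder.

```python
# Illustrative type system: an NLP mention bound to a deep-semantic concept.
from dataclasses import dataclass

@dataclass
class CodedConcept:          # CEM-style, deep-semantic target
    code: str
    coding_scheme: str

@dataclass
class MedicationMention:     # what the NLP pipeline emits from raw text
    begin: int               # character offsets into the source note
    end: int
    concept: CodedConcept    # the point of interoperability with coded data

mention = MedicationMention(12, 21, CodedConcept("197361", "RxNorm"))
print(mention)
```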
EUnetHTA information management system: development and lessons learned.
Chalon, Patrice X; Kraemer, Peter
2014-11-01
The aim of this study was to describe the techniques used to achieve consensus on common standards to be implemented in the EUnetHTA Information Management System (IMS), and to describe how interoperability between tools was explored. Three face-to-face meetings were organized to identify and agree on common standards for the development of online tools. Two tools were created to demonstrate the added value of implementing interoperability standards at local levels. Developers of tools outside EUnetHTA were identified and contacted. Four common standards were agreed on by consensus, and consequently all EUnetHTA tools have been modified or designed accordingly. RDF Site Summary (RSS) has demonstrated good potential to support rapid dissemination of HTA information. Contacts outside EUnetHTA resulted in direct collaboration (HTA glossary, HTAi Vortal), evaluation of options for interoperability between tools (CRD HTA database), and a formal framework to prepare cooperation on concrete projects (INAHTA projects database). While entitled a project on IT infrastructure, the work program was also about people: when having to agree on complex topics, fostering a cohesive group dynamic and hosting face-to-face meetings brings added value and enhances understanding between partners. The adoption of widespread standards enhanced the homogeneity of the EUnetHTA tools and should thus contribute to their wider use and, therefore, to the general objective of EUnetHTA. The initiatives on interoperability of systems need to be developed further to support a general interoperable information system that could benefit the whole HTA community.
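A minimal sketch of the RSS dissemination idea mentioned above: publishing new HTA records as an RSS 2.0 feed that any standard reader can poll. The channel metadata and URLs are placeholders.

```python
# Hedged sketch: build a tiny RSS 2.0 feed of newly published HTA records.
import xml.etree.ElementTree as ET

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "New HTA reports"
ET.SubElement(channel, "link").text = "http://example.org/hta"
ET.SubElement(channel, "description").text = "Recently published assessments"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Rapid assessment: example technology"
ET.SubElement(item, "link").text = "http://example.org/hta/123"

print(ET.tostring(rss, encoding="unicode"))
```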
Semantic and syntactic interoperability in online processing of big Earth observation data.
Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea
2018-01-01
The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability, using a standardised interface that can be used by all types of clients, and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data, because data management, low-level interactions, and specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example, and describe three use cases for extracting changes from EO images, demonstrating usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
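A hedged sketch of a "world model" applied to a 3D spatio-temporal data cube (time, y, x): a formal rule describing an entity ("persistent vegetation loss") evaluated over the cube. The thresholds and band semantics are invented for illustration; it assumes numpy is installed.

```python
# Illustrative world-model rule over a (time, y, x) data cube.
import numpy as np

rng = np.random.default_rng(1)
ndvi_cube = rng.uniform(0.0, 0.9, size=(12, 50, 50))  # 12 time steps of NDVI

# World-model rule: "vegetation present" means NDVI above 0.3.
vegetated = ndvi_cube > 0.3

# Change entity: vegetated in the first 3 steps, bare in the last 3 steps.
loss = vegetated[:3].all(axis=0) & ~vegetated[-3:].any(axis=0)
print("pixels flagged as vegetation loss:", int(loss.sum()))
```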
Semantic and syntactic interoperability in online processing of big Earth observation data
Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea
2018-01-01
ABSTRACT The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements to efficiently extract valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models, (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as enabler for (1) technical interoperability using a standardised interface to be used by all types of clients and (2) allowing experts from different domains to develop complex analyses together as collaborative effort. Users are connecting the world models online to the data, which are maintained in a centralised storage as 3D spatio-temporal data cubes. It allows also non-experts to extract valuable information from EO data because data management, low-level interactions or specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover). PMID:29387171
Blazona, Bojan; Koncar, Miroslav
2006-01-01
Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern healthcare information systems. Interoperability covers interchange at both the data and communication layers. In this context we identified the HL7 standard as the world's leading medical information and communication technology (ICT) standard for the business layer in healthcare information systems, and we explored the ability to exchange clinical documents with minimal changes to integrated healthcare information systems (IHCIS). We examined the ability of the HL7 Clinical Document Architecture (CDA) to integrate a DICOM-based radiology information system with an HL7-based IHCIS. We introduced the use of the WADO service for interconnection to the IHCIS and, finally, CDA rendering in widely used web browsers.
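A sketch of the WADO interconnection described above: a CDA document can reference an image on the PACS through a WADO-URI request, which a browser can dereference directly. The query parameters follow the standard WADO-URI pattern; the host and UIDs are placeholders.

```python
# Hedged sketch: construct a WADO-URI request for a referenced DICOM object.
from urllib.parse import urlencode

params = {
    "requestType": "WADO",
    "studyUID": "1.2.840.113619.2.1",      # placeholder UIDs
    "seriesUID": "1.2.840.113619.2.1.1",
    "objectUID": "1.2.840.113619.2.1.1.1",
    "contentType": "image/jpeg",           # browser-renderable, not raw DICOM
}
wado_url = "http://pacs.example.org/wado?" + urlencode(params)
print(wado_url)  # embed this URL as the image reference inside a CDA document
```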
Multi-disciplinary interoperability challenges (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Annoni, Alessandro
2013-04-01
Global sustainability research requires multi-disciplinary efforts to address the key research challenges and to increase our understanding of the complex relationships between environment and society. For this reason, dependence on the interoperability of ICT systems is growing rapidly; but despite relevant technological improvements, operational interoperable solutions are still lacking in practice. Among the causes is the absence of a generally accepted definition of "interoperability" in all its broader aspects. In fact, interoperability is just a concept, and the more popular definitions do not address all the challenges of realizing operational interoperable solutions. The problem becomes even more complex when multi-disciplinary interoperability is required, because in that case solutions for the interoperability of different interoperable solutions must be envisaged. In this lecture the following definition will be used: "interoperability is the ability to exchange information and to use it." The main challenges of addressing multi-disciplinary interoperability will be presented, and a set of proposed approaches and solutions shortly introduced.
Achieving Interoperability in GEOSS - How Close Are We?
NASA Astrophysics Data System (ADS)
Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.
2010-12-01
A primary goal of the Global Earth Observation System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery of and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across them? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study and directions for further work.
ERIC Educational Resources Information Center
Aburto, Rafael
2014-01-01
This qualitative study examined efforts by military organizations and federal agencies to improve information sharing, interoperability, and systems integration in all business practices. More specifically, a survey instrument with six open-ended and eight demographic questions was used to measure the perceived progress, issues, and challenges of…
Capturing Essential Information to Achieve Safe Interoperability.
Weininger, Sandy; Jaffe, Michael B; Rausch, Tracy; Goldman, Julian M
2017-01-01
In this article, we describe the role of "clinical scenario" information in assuring the safety of interoperable systems, as well as the system's ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, thereby improving communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system, using a documentation tool called medical device interface data sheets, and on mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration, to achieve safe interoperability. Portions of the analysis of a clinical scenario for a "patient-controlled analgesia safety interlock" are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a learning health system to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions that address vexing patient safety problems and enable clinical innovation. A Web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™ (MD PnP Program), is described.
ERIC Educational Resources Information Center
Akpabio, Akpabio Enebong Ema
2013-01-01
Despite huge growth in hospital technology systems, there remains a dearth of literature examining health care administrators' perceptions of the efficacy of interoperable EHR systems. A qualitative research methodology was used in this multiple-case study to investigate the application of diffusion of innovations theory and the technology…
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). The lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology, which uses a reference model that defines a general-purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
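A minimal sketch of the two-level idea behind such a generic design methodology, assuming illustrative names (this is not CHISTAR's actual API): a generic reference-model node type carries the data, while an archetype constrains which clinical attributes are valid.

```python
# Sketch: generic reference model + archetype constraints, illustrative only.
from dataclasses import dataclass, field

@dataclass
class Node:
    """Generic reference-model node: a named value with typed children."""
    name: str
    value: object = None
    children: list = field(default_factory=list)

# An archetype constrains which attributes a clinical entry may carry and
# what ranges are valid; shown here as a plain dict for brevity.
BLOOD_PRESSURE_ARCHETYPE = {
    "systolic": {"type": float, "min": 0, "max": 300},
    "diastolic": {"type": float, "min": 0, "max": 200},
}

def validate(entry: Node, archetype: dict) -> list:
    """Return a list of constraint violations of the entry against the archetype."""
    errors = []
    for child in entry.children:
        rule = archetype.get(child.name)
        if rule is None:
            errors.append(f"unexpected attribute: {child.name}")
        elif not isinstance(child.value, rule["type"]):
            errors.append(f"{child.name}: expected {rule['type'].__name__}")
        elif not rule["min"] <= child.value <= rule["max"]:
            errors.append(f"{child.name}: {child.value} out of range")
    return errors

bp = Node("blood_pressure", children=[Node("systolic", 122.0), Node("diastolic", 81.0)])
print(validate(bp, BLOOD_PRESSURE_ARCHETYPE))  # -> []
```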
Seeking the Path to Metadata Nirvana
NASA Astrophysics Data System (ADS)
Graybeal, J.
2008-12-01
Scientists have always found reusing other scientists' data challenging. Computers did not fundamentally change the problem, but they enabled more and larger instances of it. In fact, by removing human mediation and time delays from the data sharing process, computers emphasize the contextual information that must be exchanged in order to exchange and reuse data. This requirement for contextual information has two faces: "interoperability" when talking about systems, and "the metadata problem" when talking about data. As much as any single organization, the Marine Metadata Interoperability (MMI) project has been tagged with the mission "Solve the metadata problem." Of course, if that goal were achieved, then sustained, interoperable data systems for interdisciplinary observing networks could be easily built -- pesky metadata differences, like which protocol to use for data exchange, or what the data actually measure, would be a thing of the past. Alas, as you might imagine, there will always be complexities and incompatibilities that are not addressed, and data systems that are not interoperable, even within a science discipline. So should we throw up our hands and surrender to the inevitable? Not at all. Rather, we try to minimize metadata problems as much as we can, and here we are making steady progress, despite natural forces that pull in the other direction. Computer systems let us work with more complexity, build community knowledge and collaborations, and preserve and publish our progress and (dis-)agreements. Funding organizations, science communities, and technologists see the importance of interoperable systems and metadata, and direct resources toward them. With these new approaches and resources, projects like IPY and MMI can simultaneously define, display, and promote effective strategies for sustainable, interoperable data systems. This presentation will outline the role metadata plays in durable interoperable data systems, for better or worse. It will describe times when "just choosing a standard" can work, and when it probably won't. And it will point out signs that suggest a metadata storm is coming to your community project, and how you might avoid it. From these lessons we will seek a path to producing interoperable, interdisciplinary, metadata-enlightened environment observing systems.
The Health Service Bus: an architecture and case study in achieving interoperability in healthcare.
Ryan, Amanda; Eklund, Peter
2010-01-01
Interoperability in healthcare is a requirement for effective communication between entities, to ensure timely access to up-to-date patient information and medical knowledge, and thus to facilitate consistent patient care. An interoperability framework called the Health Service Bus (HSB), based on the Enterprise Service Bus (ESB) middleware software architecture, is presented here as a solution to all three levels of interoperability as defined by the HL7 EHR Interoperability Work Group in their definitive white paper "Coming to Terms". A prototype HSB system was implemented based on the Mule open-source ESB and is outlined and discussed, followed by a clinically based example.
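The bus idea underlying an ESB-style architecture can be sketched as a toy publish/subscribe router. Topic and message names are invented for illustration; this is not the Mule-based prototype itself.

```python
# Minimal sketch of the bus pattern behind a Health Service Bus: services
# register endpoints on a shared bus that routes messages by topic.
from collections import defaultdict

class ServiceBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic: str, handler):
        """Register a service endpoint for a message topic."""
        self._handlers[topic].append(handler)

    def publish(self, topic: str, message: dict):
        """Route a message to every service subscribed to the topic."""
        for handler in self._handlers[topic]:
            handler(message)

bus = ServiceBus()
bus.subscribe("lab.result", lambda m: print("EHR stores:", m))
bus.subscribe("lab.result", lambda m: print("Alert service checks:", m))
bus.publish("lab.result", {"patient": "12345", "test": "HbA1c", "value": 6.8})
```

The point of the pattern is that producers and consumers share only topics and message formats, never direct interfaces to one another.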
Operational Interoperability Challenges on the Example of GEOSS and WIS
NASA Astrophysics Data System (ADS)
Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.
2015-12-01
The following poster highlights the operational interoperability challenges using the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products, and services, but with different metadata management concepts. While WIS has strong governance, with its own metadata profile for its hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS - as an operational system - follows a roadmap with committed backward compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be reached across these different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.
FLTSATCOM interoperability applications
NASA Astrophysics Data System (ADS)
Woolford, Lynn
A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.
IHE cross-enterprise document sharing for imaging: interoperability testing software.
Noumeir, Rita; Renaud, Bérubé
2010-09-21
With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. The EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems; interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners and that provides test data and test plans. In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also present the challenges encountered and discuss the chosen design solutions. The EHR is being deployed in several countries, and the EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations, by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or resolve implementation difficulties.
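To make "software that simulates communication partners" concrete, here is a minimal sketch of a simulated registry endpoint that checks incoming submissions against a test plan. The JSON transport and field names are simplifying assumptions; real XDS-I transactions use ebXML/SOAP messaging with far richer metadata.

```python
# Sketch: stand in for one communication partner and verify a submission.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

REQUIRED_FIELDS = {"patientId", "studyInstanceUID", "repositoryUniqueId"}

class SimulatedRegistry(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        submission = json.loads(body)
        missing = REQUIRED_FIELDS - submission.keys()
        status, verdict = (200, "PASS") if not missing else \
                          (400, f"FAIL: missing {sorted(missing)}")
        self.send_response(status)
        self.end_headers()
        self.wfile.write(verdict.encode())

if __name__ == "__main__":
    # The system under test points its document submission at this simulator.
    HTTPServer(("localhost", 8080), SimulatedRegistry).serve_forever()
```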
Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges
NASA Astrophysics Data System (ADS)
Ryan, B. J.
2015-12-01
Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets for a number of society's key environmental issues. It was recognized that having ready access to observations from multiple systems was a prerequisite for both environmental decision-making and economic development. From the very start, it was also recognized that the sheer complexity of the Earth system cannot be captured by any single observation system, and that a federated, interoperable approach was necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015) and approve implementation strategies and mechanisms for the second decade (2016 to 2025), respectively. The approved implementation strategies and mechanisms are intended to advance GEOSS development, thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental, and/or political in nature, and these will be discussed as well. With the emergence of the Sustainable Development Goals (SDGs), the World Conference on Disaster Risk Reduction (WCDRR), and the United Nations Framework Convention on Climate Change (UNFCCC) having occurred this year, it will be essential that the interoperability challenges described herein, regardless of their nature, be expeditiously addressed so that Earth observations can indeed inform societal decision-making.
Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd
2014-01-01
Healthcare information is distributed across multiple heterogeneous and autonomous systems, and access to, and sharing of, distributed information sources is a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete, and semi-automatic transformation service from relational databases to the Web Ontology Language (OWL). The proposed service makes use of an algorithm that can transform several data models from different domains, deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
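A minimal sketch of the table-to-class flavor of such a transformation, assuming an illustrative two-table schema: tables become OWL classes, columns become datatype properties, and foreign keys become object properties. The paper's actual algorithm, which relies mainly on inheritance rules, goes well beyond this.

```python
# Sketch: relational schema -> OWL classes/properties, emitted as Turtle.
SCHEMA = {
    "Patient":   {"columns": ["id", "name", "birth_date"], "fks": {}},
    "Encounter": {"columns": ["id", "date"], "fks": {"patient_id": "Patient"}},
}

def rdb_to_owl(schema: dict, base: str = "http://example.org/onto#") -> str:
    lines = [
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
        f"@prefix : <{base}> .",
        "",
    ]
    for table, spec in schema.items():
        lines.append(f":{table} a owl:Class .")
        for col in spec["columns"]:
            lines.append(f":{table}_{col} a owl:DatatypeProperty ; rdfs:domain :{table} .")
        for fk, target in spec["fks"].items():
            lines.append(f":{table}_{fk} a owl:ObjectProperty ; "
                         f"rdfs:domain :{table} ; rdfs:range :{target} .")
    return "\n".join(lines)

print(rdb_to_owl(SCHEMA))
```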
Scientific Digital Libraries, Interoperability, and Ontologies
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.
2009-01-01
Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Patten, K.
2014-12-01
The geosciences are leading the development of free, interoperable, open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG) in compliance with international standards and protocols, to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of NGDS serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software package, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open-source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as the National Geothermal Data System (NGDS), officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.
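Catalogue-level discovery of the kind USGIN enables is commonly exposed through OGC CSW. The sketch below issues a CSW 2.0.2 GetRecords query; the endpoint and search term are assumptions, while the KVP parameters follow the CSW specification.

```python
# Sketch: query an OGC CSW catalogue for geoscience metadata records.
from urllib.parse import urlencode
from urllib.request import urlopen

CSW_ENDPOINT = "https://example.org/csw"  # assumed catalogue endpoint

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "elementSetName": "summary",
    "resultType": "results",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "AnyText LIKE '%geothermal%'",
}
with urlopen(f"{CSW_ENDPOINT}?{urlencode(params)}") as resp:
    print(resp.read()[:300])  # XML listing of matching metadata records
```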
The GEOSS solution for enabling data interoperability and integrative research.
Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola
2014-03-01
Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system whereby it can work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have focused on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standard models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, and a possible roadmap is introduced for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.
Interoperability in Personalized Adaptive Learning
ERIC Educational Resources Information Center
Aroyo, Lora; Dolog, Peter; Houben, Geert-Jan; Kravcik, Milos; Naeve, Ambjorn; Nilsson, Mikael; Wild, Fridolin
2006-01-01
Personalized adaptive learning requires semantic-based and context-aware systems to manage the Web knowledge efficiently as well as to achieve semantic interoperability between heterogeneous information resources and services. The technological and conceptual differences can be bridged either by means of standards or via approaches based on the…
Dynamic Business Networks: A Headache for Sustainable Systems Interoperability
NASA Astrophysics Data System (ADS)
Agostinho, Carlos; Jardim-Goncalves, Ricardo
Collaborative networked environments emerged with the spread of the internet, helping to overcome past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency increases across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, in an increasingly complex and dynamic global market, models change frequently to meet new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
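The contrast with hardcoded morphisms can be sketched by keeping the mappings as external, versioned data, so they can evolve with the network. Model and field names below are invented for illustration.

```python
# Sketch: externalized model-to-model morphisms instead of hardcoded mappings.
FIELD_MAPPINGS = {  # (source model, target model) -> field morphism
    ("ERP-A", "STEP-core"): {"part_no": "product_id", "desc": "name"},
    ("ERP-B", "STEP-core"): {"sku": "product_id", "label": "name"},
}

def translate(record: dict, source: str, target: str) -> dict:
    """Apply the externally defined morphism; unmapped fields are dropped."""
    morphism = FIELD_MAPPINGS[(source, target)]
    return {dst: record[src] for src, dst in morphism.items() if src in record}

# When a partner's model changes, only the mapping table is updated,
# not the systems that exchange the data.
print(translate({"part_no": "X-42", "desc": "bearing"}, "ERP-A", "STEP-core"))
```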
Report on the Second Catalog Interoperability Workshop
NASA Technical Reports Server (NTRS)
Thieman, James R.; James, Mary E.
1988-01-01
The events, resolutions, and recommendations of the Second Catalog Interoperability Workshop, held at JPL in January 1988, are discussed. This workshop dealt with the issues of standardization and communication among directories, catalogs, and inventories in the earth and space science data management environment. The Directory Interchange Format, being constructed as a standard for the exchange of directory information among participating data systems, is discussed. Involvement in the interoperability effort by NASA, NOAA, USGS, and NSF is described, and plans for future interoperability are considered. The NASA Master Directory prototype is presented and critiqued, and options for additional capabilities are debated.
Maturity model for enterprise interoperability
NASA Astrophysics Data System (ADS)
Guédria, Wided; Naudet, Yannick; Chen, David
2015-01-01
Historically, progress occurs when entities communicate, share information, and together create something that no one could have accomplished alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises seeking to exploit market opportunities, to meet their own objectives of cooperation, or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years, and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among other things, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes existing maturity models into account while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
Potential interoperability problems facing multi-site radiation oncology centers in The Netherlands
NASA Astrophysics Data System (ADS)
Scheurleer, J.; Koken, Ph; Wessel, R.
2014-03-01
Aim: To identify potential interoperability problems facing multi-site Radiation Oncology (RO) departments in the Netherlands, and solutions for unambiguous multi-system workflows. The specific challenges confronting the RO department of VUmc (RO-VUmc), which is soon to open a satellite department, were characterized. Methods: A nationwide questionnaire survey was conducted to identify possible interoperability problems and solutions. Further detailed information was obtained through in-depth interviews at 3 Dutch RO institutes that already operate on more than one site. Results: The survey had a 100% response rate (n=21). Altogether 95 interoperability problems were described. Most reported problems were at the strategic and semantic levels. The majority were DICOM(-RT) and HL7 related (n=65), primarily between treatment planning and verification systems or between departmental and hospital systems. Seven were identified as being relevant for RO-VUmc. Departments have overcome interoperability problems with their own solutions or with tailor-made vendor solutions. There was little knowledge about, or utilization of, solutions developed by Integrating the Healthcare Enterprise Radiation Oncology (IHE-RO). Conclusions: Although interoperability problems are still common, solutions have been identified. Awareness of IHE-RO needs to be raised. No major new interoperability problems are predicted as RO-VUmc develops into a multi-site department.
Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat
2015-01-01
Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, like the dual-model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how the dual-model methodology can be applied to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify the approach by creating a testing data server that supports both FHIR resources and archetypes.
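For readers unfamiliar with FHIR's modeling style, here is a plain-JSON sketch of a FHIR Observation carrying a LOINC-coded concept. The values are illustrative; an archetype defined over FHIR would constrain such a resource further.

```python
# Sketch: a FHIR Observation resource built as plain JSON.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8480-6",  # LOINC code for systolic blood pressure
            "display": "Systolic blood pressure",
        }]
    },
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {
        "value": 122,
        "unit": "mmHg",
        "system": "http://unitsofmeasure.org",
        "code": "mm[Hg]",
    },
}
print(json.dumps(observation, indent=2))
```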
NASA Technical Reports Server (NTRS)
Bradley, Arthur; Dubowsky, Steven; Quinn, Roger; Marzwell, Neville
2005-01-01
Robots that operate independently of one another will not be adequate to accomplish the future exploration tasks of long-distance autonomous navigation, habitat construction, resource discovery, and material handling. Such activities will require that systems widely share information, plan and divide complex tasks, share common resources, and physically cooperate to manipulate objects. Recognizing the need for interoperable robots to accomplish the new exploration initiative, NASA's Office of Exploration Systems Research & Technology recently funded the development of the Joint Technical Architecture for Robotic Systems (JTARS). The JTARS charter is to identify the interface standards necessary to achieve interoperability among space robots. A JTARS working group (JTARS-WG) has been established, comprising recognized leaders in the field of space robotics, including representatives from seven NASA centers along with academia and private industry. The working group's early accomplishments include addressing key issues required for interoperability, defining which systems are within the project's scope, and framing the JTARS manuals around classes of robotic systems.
Fundamental Data Standards for Science Data System Interoperability and Data Correlation
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Gopala Krishna, Barla; Rye, Elizabeth; Crichton, Daniel
The advent of the Web and languages such as XML has brought an explosion of online science data repositories and the promise of correlated data and interoperable systems. However, there have been relatively few successes in meeting the expectations of science users in the internet age. For example, a Google-like search for images of Mars will return many highly derived and appropriately tagged images but largely ignore the majority of images in most online image repositories. Once data are retrieved, users are further frustrated by poor data descriptions, arcane formats, and badly organized ancillary information. A wealth of research indicates that shared information models are needed to enable system interoperability and data correlation. At a more fundamental level, however, data correlation and system interoperability depend on a relatively small set of shared data standards. A common data dictionary standard, for example, allows the controlled vocabulary used in a science repository to be shared with potential collaborators. Common data registry and product identification standards enable systems to efficiently find, locate, and retrieve data products and their metadata from remote repositories. Information content standards define categories of descriptive data that help make the data products scientifically useful to users who were not part of the original team that produced the data. The Planetary Data System (PDS) has a plan to move the PDS to a fully online, federated system. This plan addresses new demands on the system, including increasing data volume, numbers of missions, and complexity of missions. A key component of this plan is the upgrade of the PDS data standards. The adoption of the core PDS data standards by the International Planetary Data Alliance (IPDA) adds the element of international cooperation to the plan. This presentation will provide an overview of the fundamental data standards being adopted by the PDS that transcend science domains and that will help to meet the PDS's and IPDA's system interoperability and data correlation requirements.
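The role of a common data dictionary standard can be sketched as a controlled-vocabulary check on product labels. The dictionary entry and permitted values below are illustrative inventions, not actual PDS definitions.

```python
# Sketch: a shared data dictionary lets two repositories agree on keywords.
DATA_DICTIONARY = {
    "TARGET_NAME": {
        "definition": "Name of the body observed, from the approved target list.",
        "permissible_values": {"MARS", "EUROPA", "67P/CHURYUMOV-GERASIMENKO"},
    },
}

def validate_label(label: dict) -> list:
    """Check a product label's keywords against the shared dictionary."""
    problems = []
    for key, value in label.items():
        entry = DATA_DICTIONARY.get(key)
        if entry is None:
            problems.append(f"{key}: not in shared dictionary")
        elif value not in entry["permissible_values"]:
            problems.append(f"{key}: '{value}' not a permitted value")
    return problems

print(validate_label({"TARGET_NAME": "MARS"}))      # -> []
print(validate_label({"TARGET_NAME": "THE MOON"}))  # -> flagged
```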
Managing interoperability and complexity in health systems.
Bouamrane, M-M; Tao, C; Sarkar, I N
2015-01-01
In recent years, we have witnessed substantial progress in the use of clinical informatics systems to support clinicians during episodes of care, manage specialised domain knowledge, perform complex clinical data analysis, and improve the management of health organisations' resources. However, the vision of fully integrated health information ecosystems, which provide relevant information and useful knowledge at the point of care, remains elusive. This journal Focus Theme reviews some of the enduring challenges of interoperability and complexity in clinical informatics systems. Furthermore, a range of approaches is proposed to address, harness, and resolve some of the many remaining issues towards greater integration of health information systems and the extraction of useful or new knowledge from heterogeneous electronic data repositories.
Laplante-Lévesque, Ariane; Abrams, Harvey; Bülow, Maja; Lunner, Thomas; Nelson, John; Riis, Søren Kamaric; Vanpoucke, Filiep
2016-10-01
This article describes the perspectives of hearing device manufacturers regarding the exciting developments that the Internet makes possible. Specifically, it proposes to join forces toward interoperability and standardization of Internet and audiology. A summary of why such a collaborative effort is required is provided from historical and scientific perspectives. A roadmap toward interoperability and standardization is proposed. Information and communication technologies improve the flow of health care data and pave the way to better health care. However, hearing-related products, features, and services are notoriously heterogeneous and incompatible with other health care systems (no interoperability). Standardization is the process of developing and implementing technical standards (e.g., Noah hearing database). All parties involved in interoperability and standardization realize mutual gains by making mutually consistent decisions. De jure (officially endorsed) standards can be developed in collaboration with large national health care systems as well as spokespeople for hearing care professionals and hearing device users. The roadmap covers mutual collaboration; data privacy, security, and ownership; compliance with current regulations; scalability and modularity; and the scope of interoperability and standards. We propose to join forces to pave the way to the interoperable Internet and audiology products, features, and services that the world needs.
Pyke, Christopher R; Madan, Isaac
2013-08-01
The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism for assigning identifiers to whole buildings and interior spaces for the purposes of interoperability, data exchange, and integration. This paper describes a systematic process for identifying activities occurring at a building or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.
Orlova, Anna O; Dunnagan, Mark; Finitzo, Terese; Higgins, Michael; Watkins, Todd; Tien, Allen; Beales, Steven
2005-01-01
Information exchange, enabled by computable interoperability, is the key to many of the initiatives underway, including the development of Regional Health Information Exchanges, Regional Health Information Organizations, and the National Health Information Network. These initiatives must include public health as a full partner in the emerging transformation of our nation's healthcare system through the adoption and use of information technology. An electronic health record - public health (EHR-PH) system prototype was developed to demonstrate the feasibility of electronic data transfer from a health care provider, i.e., hospital or ambulatory care settings, to multiple customized public health systems, including a Newborn Metabolic Screening Registry, a Newborn Hearing Screening Registry, an Immunization Registry, and a Communicable Disease Registry, using HL7 messaging standards. Our EHR-PH system prototype can be considered a distributed EHR-based RHIE/RHIO model - a principal element of a potential technical architecture for an NHIN.
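The kind of HL7 messaging involved can be sketched by composing a v2 immunization message (VXU^V04) for transmission to a registry. Facility names, identifiers, the patient, and the vaccine record are invented placeholders.

```python
# Sketch: compose an HL7 v2 VXU^V04 message, segment by segment.
SEP = "|"

def segment(*fields) -> str:
    return SEP.join(fields)

msh = segment("MSH", "^~\\&", "EHR", "HOSPITAL", "REGISTRY", "STATE",
              "20050101120000", "", "VXU^V04", "MSG0001", "P", "2.4")
pid = segment("PID", "1", "", "123456^^^HOSP^MR", "", "DOE^JANE", "",
              "20041225", "F")
rxa = segment("RXA", "0", "1", "20050101", "", "08^HEPB-PEDIATRIC^CVX",
              "0.5", "mL")

message = "\r".join([msh, pid, rxa])  # HL7 v2 segments are CR-separated
print(message.replace("\r", "\n"))
```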
Smart Grid Interoperability Maturity Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Levinson, Alex; Mater, J.
2010-04-28
The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful in explaining the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step in building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.
Personalized-detailed clinical model for data interoperability among clinical standards.
Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung
2013-08-01
Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners seeking to provision telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among the different heterogeneous healthcare standards with which they are compliant. Relating healthcare organization data to these different heterogeneous standards is necessary to achieve the goal of data-level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that create the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider the transformation of instances of the electronic health record (EHR) standards openEHR and HL7 CDA using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in the transformation of instances between these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea, and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for the HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and of monitoring of the daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mapping generation with the openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.
Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O
2008-11-06
In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.
Enhancing security and improving interoperability in healthcare information systems.
Gritzalis, D A
1998-01-01
Security is a key issue in healthcare information systems, since most aspects of security become of considerable or even critical importance when handling healthcare information. In addition, the intense need for information exchange has revealed interoperability of systems and applications as another key issue. Standardization can play an important role in addressing both of these issues. In this paper, relevant standardization activities are briefly presented, and existing and emerging healthcare information security standards are identified and critically analysed. The analysis is based on a framework developed for this purpose. The identification of gaps and inconsistencies in current standardization, the description of conflicts between standards and legislation, and the analysis of the implications of these standards for user organizations are the main results of this paper.
Smart Grid Interoperability Maturity Model Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Drummond, R.; Giroti, Tony
The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability across an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.
Design and Implementation of e-Health System Based on Semantic Sensor Network Using IETF YANG.
Jin, Wenquan; Kim, Do Hyeun
2018-02-20
Healthcare services can now be delivered effectively to patients anytime and anywhere using e-Health systems. e-Health systems are developed through Information and Communication Technologies (ICT) that involve sensors, mobiles, and web-based applications for the delivery of healthcare services and information. Remote healthcare is an important purpose of the e-Health system. Usually, an e-Health system includes heterogeneous sensors from diverse manufacturers producing data in different formats. Device interoperability and data normalization are challenging tasks that need research attention. Several solutions proposed in the literature are based on manual interpretation through explicit programming. However, programmatically implementing the interpretation of the data sender and data receiver in the e-Health system for data transmission is counterproductive, as modification will be required for each new device added into the system. In this paper, an e-Health system with a Semantic Sensor Network (SSN) is proposed to address the device interoperability issue. In the proposed system, we have used IETF YANG for modeling the semantic e-Health data to represent the information of e-Health sensors. This modeling scheme helps in provisioning semantic interoperability between devices and expressing the sensing data in a user-friendly manner. For this purpose, we have developed an ontology for e-Health data that supports different styles of data formats. The ontology is defined in YANG for provisioning semantic interpretation of sensing data in the system by constructing meta-models of e-Health sensors. The proposed approach assists in the auto-configuration of e-Health sensors and querying the sensor network with semantic interoperability support for the e-Health system.
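The model-driven interpretation idea can be sketched with a meta-model that stands in for a compiled YANG module. The module structure, leaf names, units, and ranges below are assumptions for illustration, not the paper's actual YANG definitions.

```python
# Sketch: a meta-model (mirroring what a YANG module would define for an
# e-Health sensor) drives validation and normalization of raw readings.
SENSOR_MODEL = {  # stand-in for a compiled YANG meta-model
    "heart-rate-sensor": {
        "value": {"type": float, "unit": "beats/min", "range": (20, 250)},
        "timestamp": {"type": str},
    },
}

def interpret(sensor_type: str, raw: dict) -> dict:
    """Validate a raw reading against the meta-model and attach semantics."""
    model = SENSOR_MODEL[sensor_type]
    reading = {}
    for leaf, spec in model.items():
        value = spec["type"](raw[leaf])
        if "range" in spec:
            lo, hi = spec["range"]
            if not lo <= value <= hi:
                raise ValueError(f"{leaf}={value} outside {spec['range']}")
        reading[leaf] = {"value": value, "unit": spec.get("unit")}
    return reading

# Any client that knows the model can interpret the data without
# device-specific code.
print(interpret("heart-rate-sensor",
                {"value": "72", "timestamp": "2018-02-20T10:00:00Z"}))
```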
Blazona, Bojan; Koncar, Miroslav
2007-12-01
Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. However, this requirement represents one of the major challenges for Information and Communication Technology (ICT) solutions, as systems today use diverse technologies, proprietary protocols, and communication standards which are often not interoperable. Radiology Information Systems (RIS) are among the main producers of clinical information in healthcare settings; they communicate using the widely adopted DICOM (Digital Imaging and COmmunications in Medicine) standard but in very few cases can efficiently integrate information of interest with other systems. In this context we identified the HL7 standard as the world's leading medical ICT standard, envisioned to provide the umbrella for medical data semantic interoperability, which among other things represents the cornerstone of Croatia's National Integrated Healthcare Information System (IHCIS). The aim was to explore the ability to integrate and exchange RIS-originated data with Hospital Information Systems based on HL7's CDA (Clinical Document Architecture) standard. We explored the ability of the HL7 CDA specifications and methodology to address the need to integrate RIS into HL7-based healthcare information systems. We introduced the use of a WADO service interconnected to the IHCIS and, finally, CDA rendering in widely used web browsers. The outcome of our pilot work supports our original assumption that the HL7 standard is able to adopt radiology data into integrated healthcare systems. Uniform DICOM-to-CDA translation scripts and business processes within the IHCIS are desirable and cost-effective with regard to the use of supporting IHCIS services aligned to SOA.
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as the business, information, computational, engineering, and technology viewpoints, has to be considered. This paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standards-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface model, the message server model, and the mediator model. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise JavaBeans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by HIS-DF and supported by HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented, and standards-based health information systems.
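The mediator model's appeal can be sketched in a few lines: each system registers converters to and from a canonical model, so an exchange needs two conversions rather than one mapping per system pair (2n converters instead of n(n-1) point-to-point mappings for n systems). The canonical form below is a toy stand-in for HL7 v3 artifacts, with invented system and field names.

```python
# Sketch: mediator-based integration versus point-to-point interfaces.
class Mediator:
    def __init__(self):
        self.to_canonical = {}    # system -> converter into canonical form
        self.from_canonical = {}  # system -> converter out of canonical form

    def register(self, name, inbound, outbound):
        self.to_canonical[name] = inbound
        self.from_canonical[name] = outbound

    def route(self, source, target, message):
        """Two conversions per exchange, regardless of how many systems exist."""
        canonical = self.to_canonical[source](message)
        return self.from_canonical[target](canonical)

m = Mediator()
m.register("lab", lambda x: {"obs": x["result"]}, lambda c: {"result": c["obs"]})
m.register("ehr", lambda x: {"obs": x["value"]}, lambda c: {"value": c["obs"]})
print(m.route("lab", "ehr", {"result": "positive"}))  # {'value': 'positive'}
```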
Phillips, Joshua; Chilukuri, Ram; Fragoso, Gilberto; Warzel, Denise; Covitz, Peter A
2006-01-06
Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has emerged as a key enabling technology for caBIG. The caCORE SDK substantially lowers the barrier to implementing systems that are syntactically and semantically interoperable by providing workflow and automation tools that standardize and expedite modeling, development, and deployment. It has gained acceptance among developers in the caBIG program, and is expected to provide a common mechanism for creating data service nodes on the data grid that is under development.
Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano
2012-08-01
Information technologies (ITs) have now entered the everyday workflow of a variety of healthcare providers, each with a certain degree of independence. This independence may be the cause of difficulties in interoperability between information systems, which can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, which has been able, over the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region and providing full access to clinical and health-related documents independently of the healthcare organization that generated the document itself. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, a political and operative push towards the adoption of the Health Level 7 (HL7) standard within individual hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of these regional interoperability specifications enabled communication among heterogeneous systems in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE Patient Administration Management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180,000 first aid medical reports, and 58,000 discharge summaries. Hence, although work is still in progress, the Lombardy Region healthcare system is a fully interoperable social healthcare system connecting patients, healthcare providers, healthcare organizations, and healthcare professionals in a large and heterogeneous territory through the implementation of international health standards.
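For readers unfamiliar with the message traffic such an infrastructure carries, the sketch below composes and parses a simplified, HL7 v2-style pipe-delimited admission (ADT^A01) message in Python. The segment and field layout is deliberately abbreviated and is not the Lombardy regional specification.

    # Build and parse a simplified HL7 v2-style ADT^A01 message.
    # Layout is illustrative only, not a complete HL7 implementation.
    FIELD_SEP = "|"
    SEGMENT_SEP = "\r"

    def build_adt_a01(patient_id: str, family: str, given: str) -> str:
        msh = FIELD_SEP.join(["MSH", "^~\\&", "HIS", "HOSP_A", "RIS", "HOSP_B",
                              "20120801120000", "", "ADT^A01", "MSG00001", "P", "2.5"])
        pid = FIELD_SEP.join(["PID", "1", "", patient_id, "", f"{family}^{given}"])
        return SEGMENT_SEP.join([msh, pid])

    def parse_segments(message: str) -> dict:
        # Index each segment by its three-letter identifier for simple lookup.
        return {seg.split(FIELD_SEP, 1)[0]: seg.split(FIELD_SEP)
                for seg in message.split(SEGMENT_SEP)}

    segments = parse_segments(build_adt_a01("123456", "Rossi", "Maria"))
    print(segments["PID"][5])   # -> Rossi^Maria

Because both sides agree on segment identifiers and field positions, heterogeneous systems can exchange such messages without sharing any internal data structures.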
Look who's talking. A guide to interoperability groups and resources.
2011-06-01
There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.
A SOA-Based Platform to Support Clinical Data Sharing.
Gazzarata, R; Giannini, B; Giacomini, M
2017-01-01
The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the "Interoperable" Tier, the authors adopted the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with the LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is a set of standardized cloud services connected through standardized interfaces to the client applications involved. The system was approved by medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the "Interoperable" Tier, the current solution actually covers the "Connected" Tier, due to local hospital policy restrictions.
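To make the semantic-interoperability layer concrete, the fragment below (a hypothetical structure, not the authors' actual HL7 Version 3 message definitions) shows how an observation can carry a LOINC code alongside its value, so that a receiving system resolves the meaning from the shared vocabulary rather than from a local field name.

    # A LOINC-coded observation: the code system and code travel with the
    # value, so any receiver interprets "2339-0" (glucose, mass/volume,
    # blood) the same way.
    import xml.etree.ElementTree as ET

    obs = ET.Element("observation")
    ET.SubElement(obs, "code",
                  code="2339-0",
                  codeSystem="2.16.840.1.113883.6.1",  # OID of the LOINC system
                  displayName="Glucose [Mass/volume] in Blood")
    ET.SubElement(obs, "value", value="95", unit="mg/dL")
    print(ET.tostring(obs, encoding="unicode"))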
Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.
Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert
2017-08-01
Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry (fresh produce) have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good.
Semantically Interoperable XML Data
Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel
2013-01-01
XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups.
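The following Python sketch illustrates the semantic-validation idea under simplified assumptions: each XML element name is mapped, in a hypothetical registry (a real system would consult an ontology or common-data-element service), to a concept identifier and a permitted value set, and validation flags values outside that set even when the document is structurally valid.

    # Semantic validation sketch: schema validity is assumed to be checked
    # upstream; here we verify that annotated elements carry permitted codes.
    import xml.etree.ElementTree as ET

    # Hypothetical registry: element name -> (concept ID, permitted codes).
    CDE_REGISTRY = {
        "laterality": ("C25185", {"left", "right", "bilateral"}),
    }

    def semantic_errors(xml_text: str) -> list:
        errors = []
        for elem in ET.fromstring(xml_text).iter():
            if elem.tag in CDE_REGISTRY:
                concept_id, permitted = CDE_REGISTRY[elem.tag]
                if elem.text not in permitted:
                    errors.append(f"{elem.tag} ({concept_id}): '{elem.text}' "
                                  f"not in {sorted(permitted)}")
        return errors

    doc = "<annotation><laterality>leftish</laterality></annotation>"
    print(semantic_errors(doc))   # flags the non-permitted coded value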
Inter-organizational future proof EHR systems. A review of the security and privacy related issues.
van der Linden, Helma; Kalra, Dipak; Hasman, Arie; Talmon, Jan
2009-03-01
Identification and analysis of privacy and security related issues that occur when health information is exchanged between health care organizations. Based on a generic scenario, questions were formulated to reveal the issues that occur, and possible answers were verified in the literature. Ensuring secure health information exchange across organizations requires a standardization of security measures that goes beyond organizational boundaries, such as global definitions of professional roles, global standards for patient consent, and semantically interoperable audit logs. To fully address the privacy and security issues in interoperable EHRs and the long-life virtual EHR, it is necessary to realize a paradigm shift from storing all incoming information in a local system to retrieving information from external systems whenever that information is deemed necessary for the care of the patient.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method, and suggest the necessary tooling, for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism for specializing reference meta-models so that they can be adapted to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models, and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, and semi-automatic code generation. The approach presented is also applicable for harmonizing different standard specifications.
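UML profiling itself is done in modeling tools, but its effect, attaching reference-model semantics to application classes, can be approximated in code. Below is a hypothetical Python analogue in which a decorator plays the role of a stereotype, recording which HL7 RIM class a domain class specializes; it sketches the annotation idea only, not the paper's Eclipse-based tooling.

    # A decorator standing in for a UML stereotype: it tags a domain class
    # with the reference-model (e.g., HL7 RIM) class it is annotated with.
    def stereotype(rim_class: str):
        def apply(cls):
            cls.__rim_class__ = rim_class
            return cls
        return apply

    @stereotype("Observation")
    class BloodPressureReading:
        def __init__(self, systolic: int, diastolic: int):
            self.systolic = systolic
            self.diastolic = diastolic

    print(BloodPressureReading.__rim_class__)   # -> Observation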
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.
Orlova, Anna O.; Dunnagan, Mark; Finitzo, Terese; Higgins, Michael; Watkins, Todd; Tien, Allen; Beales, Steven
2005-01-01
Information exchange, enabled by computable interoperability, is the key to many of the initiatives underway, including the development of Regional Health Information Exchanges, Regional Health Information Organizations, and the National Health Information Network. These initiatives must include public health as a full partner in the emerging transformation of our nation's healthcare system through the adoption and use of information technology. An electronic health record - public health (EHR-PH) system prototype was developed to demonstrate the feasibility of electronic data transfer from a health care provider, i.e., hospital or ambulatory care settings, to multiple customized public health systems, which include a Newborn Metabolic Screening Registry, a Newborn Hearing Screening Registry, an Immunization Registry, and a Communicable Disease Registry, using HL7 messaging standards. Our EHR-PH system prototype can be considered a distributed EHR-based RHIE/RHIO model, a principal element for a potential technical architecture for an NHIN.
Slotwiner, David J
2016-10-01
The anticipated advantages of electronic health records (EHRs), improved efficiency and the ability to share information across the healthcare enterprise, have so far failed to materialize. There is growing recognition that interoperability holds the key to unlocking the greatest value of EHRs. Health information technology (HIT) systems, including EHRs, must be able both to share data and to interpret the shared data. This requires a controlled vocabulary with explicit definitions (data elements) as well as protocols to communicate the context in which each data element is being used (syntactic structure). Cardiac implantable electronic devices (CIEDs) provide a clear example of the challenges faced by clinicians when data are not interoperable. The proprietary data formats created by each CIED manufacturer, as well as the multiple sources of data generated by CIEDs (hospital, office, remote monitoring, acute care setting), make it challenging to aggregate even a single patient's data into an EHR. The Heart Rhythm Society and CIED manufacturers have collaborated to develop and implement international standards-based specifications for interoperability that provide an end-to-end solution, enabling structured data to be communicated from the CIED to a report generation system, EHR, research database, referring physician, registry, patient portal, and beyond. EHR and other health information technology vendors have been slow to implement these tools, in large part because there have been no financial incentives for them to do so. It is incumbent upon us, as clinicians, to insist that the tools of interoperability be a prerequisite for the purchase of any and all health information technology systems.
NASA Astrophysics Data System (ADS)
Mueller, Wolfgang; Mueller, Henning; Marchand-Maillet, Stephane; Pun, Thierry; Squire, David M.; Pecenovic, Zoran; Giess, Christoph; de Vries, Arjen P.
2000-10-01
While in the area of relational databases interoperability is ensured by common communication protocols (e.g., ODBC/JDBC using SQL), Content Based Image Retrieval Systems (CBIRS) and other multimedia retrieval systems lack both a common query language and a common communication protocol. Besides its obvious short-term convenience, interoperability of systems is crucial for the exchange and analysis of user data. In this paper, we present and describe an extensible XML-based query markup language called MRML (Multimedia Retrieval Markup Language). MRML is primarily designed to ensure interoperability between different content-based multimedia retrieval systems. Further, MRML allows researchers to preserve their freedom to extend their systems as needed. MRML encapsulates multimedia queries in a way that enables multimedia (MM) query languages, MM content descriptions, MM query engines, and MM user interfaces to grow independently of each other, reaching a maximum of interoperability while ensuring a maximum of freedom for the developer. To benefit from this, only a few simple design principles have to be respected when extending MRML for one's private needs. The design of extensions within the MRML framework is described in detail in the paper. MRML has been implemented and tested for the CBIRS Viper, using the user interface Snake Charmer. Both are part of the GNU project and can be downloaded at our site.
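The fragment below sketches what an MRML-style query-by-example message could look like. Element and attribute names are illustrative approximations (the authoritative schema is the one published with MRML itself), but they convey the encapsulation idea: the query carries example images and relevance weights in XML, independent of any particular query engine.

    # Compose an MRML-like XML query: example images with relevance weights,
    # wrapped so that any compliant engine can interpret the request.
    # Element names are illustrative, not the published MRML schema.
    import xml.etree.ElementTree as ET

    mrml = ET.Element("mrml", {"session-id": "s1"})
    step = ET.SubElement(mrml, "query-step", {"algorithm": "default"})
    for url, relevance in [("http://example.org/img1.jpg", "1.0"),
                           ("http://example.org/img7.jpg", "-1.0")]:
        ET.SubElement(step, "user-relevance-element",
                      {"image-location": url, "user-relevance": relevance})
    print(ET.tostring(mrml, encoding="unicode"))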
Secure and interoperable communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David
2016-05-01
The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information, and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations is presented which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability to support inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases, and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European primary care. TRANSFoRm utilizes a unified structural/terminological interoperability framework based on the local-as-view mediation paradigm. Such an approach mandates that the global information model describe the domain of interest independently of the data sources to be explored. Following a requirements analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with the various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as the core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
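As a rough sketch of what a computable eligibility criterion over a mediated model can look like, consider the Python fragment below. The class names are hypothetical stand-ins for CDIM classes (the real ones are defined in the project artifacts); the ICD-10 and LOINC codes are the standard ones for type 2 diabetes and HbA1c.

    # Eligibility criteria expressed against a global model rather than any
    # source schema; with local-as-view mediation, the mediator rewrites
    # these criteria into queries over each local source.
    from dataclasses import dataclass

    @dataclass
    class Criterion:
        model_class: str     # class in the global information model (hypothetical)
        terminology: str     # bound terminology
        code: str            # concept code
        operator: str
        value: float

    # "Type 2 diabetes patients with HbA1c above 7%"
    criteria = [
        Criterion("DiagnosisItem", "ICD-10", "E11", "present", 1),
        Criterion("LaboratoryResultItem", "LOINC", "4548-4", ">", 7.0),
    ]

    for c in criteria:
        print(f"{c.model_class}: {c.terminology} {c.code} {c.operator} {c.value}")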
Improving the Interoperability and Usability of NASA Earth Observation Data
NASA Astrophysics Data System (ADS)
Walter, J.; Berrick, S. W.; Murphy, K. J.; Mitchell, A. E.; Tilmes, C.
2014-12-01
NASA's Earth Science Data and Information System Project (ESDIS) is charged with managing, maintaining, and evolving NASA's Earth Observing System Data and Information System (EOSDIS) and is responsible for processing, archiving, and distributing NASA Earth Science data. The system supports a multitude of missions and serves diverse science research and other user communities. While NASA has made, and continues to make, great strides in the discoverability and accessibility of its earth observation data holdings, issues associated with data interoperability and usability still present significant challenges to realizing the full scientific and societal benefits of these data. This concern has been articulated by multiple government agencies, both U.S. and international, as well as other non-governmental organizations around the world. Among these is the White House Office of Science and Technology Policy, which, in response, has launched the Big Earth Data Initiative and the Climate Data Initiative to address these concerns for U.S. government agencies. This presentation will describe NASA's approach for addressing data interoperability and usability issues with our earth observation data.
Integrated care: an Information Model for Patient Safety and Vigilance Reporting Systems.
Rodrigues, Jean-Marie; Schulz, Stefan; Souvignet, Julien
2015-01-01
Quality management information systems for safety as a whole, or for specific vigilances, share the same information types but are not interoperable. An international initiative is working to develop an integrated information model for patient safety and vigilance reporting to support a global approach to health care quality.
Modeling Interoperable Information Systems with 3LGM² and IHE.
Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A
2015-01-01
Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. The objectives are to link IHE concepts to 3LGM² concepts within the 3LGM² tool, to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models, and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors, and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE master model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE master model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE master model and the functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE master model. Thus information managers and IHE developers can use or develop IHE profiles systematically. In order to improve the usability and handling of the IHE master model and its usage as a reference model, some further refinements have to be made. Evaluating the use of the IHE master model by information managers and IHE developers is subject to further research.
Economic impact of a nationwide interoperable e-Health system using the PENG evaluation tool.
Parv, L; Saluse, J; Aaviksoo, A; Tiik, M; Sepper, R; Ross, P
2012-01-01
The aim of this paper is to evaluate the costs and benefits of the Estonian interoperable health information exchange system. In addition, a framework is built for follow-up monitoring and analysis of a nationwide HIE system. The PENG evaluation tool was used to map and quantify the costs and benefits arising from type II diabetic patient management for patients, providers, and society. The analysis concludes with a quantification based on real costs and potential benefits identified by a panel of experts. Setting up a countrywide interoperable eHealth system incurs a large initial investment; however, if the system works seamlessly, benefits will surpass costs within three years. The results show that while society stands to benefit the most, the costs will be borne mainly by the healthcare providers. Therefore, new government policies should be devised to encourage providers to invest, to ensure society-wide benefits.
Reis, Zilma Silveira Nogueira; Maia, Thais Abreu; Marcolino, Milena Soriano; Becerra-Posada, Francisco; Novillo-Ortiz, David; Ribeiro, Antonio Luiz Pinho
2017-08-29
Electronic health (eHealth) interventions may improve the quality of care by providing timely, accessible information about one patient or an entire population. Electronic patient care information forms the nucleus of computerized health information systems. However, interoperability among systems depends on the adoption of information standards. Additionally, investing in technology systems requires cost-effectiveness studies to ensure the sustainability of processes for stakeholders. The objective of this study was to assess the cost-effectiveness of the use of electronically available inpatient data systems, health information exchange, or standards to support interoperability among systems. An overview of systematic reviews was conducted, assessing the MEDLINE, Cochrane Library, LILACS, and IEEE Library databases to identify relevant studies published through February 2016. The search was supplemented by citations from the selected papers. The primary outcome was cost-effectiveness, and the secondary outcome was the impact on quality of care. Independent reviewers selected studies, and disagreement was resolved by consensus. The quality of the included studies was evaluated using a measurement tool to assess systematic reviews (AMSTAR). The primary search identified 286 papers, and two papers were manually included. A total of 211 were systematic reviews. From the 20 studies that were selected after screening the title and abstract, 14 were deemed ineligible, and six met the inclusion criteria. The interventions did not show a measurable effect on cost-effectiveness. Despite the limited number of studies, the heterogeneity of electronic systems reported, and the types of intervention in hospital routines, it was possible to identify some preliminary benefits in quality of care. Hospital information systems, along with information sharing, had the potential to improve clinical practice by reducing staff errors or incidents, improving automated harm detection, monitoring infections more effectively, and enhancing the continuity of care during physician handoffs. This review identified some benefits in the quality of care but did not provide evidence that the implementation of eHealth interventions had a measurable impact on cost-effectiveness in hospital settings. However, further evidence is needed to infer the impact of standards adoption or interoperability on the cost benefits of health care; this in turn requires further research.
Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji
2013-12-01
The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims to present the challenges of building quality-labeled clinical archetypes and the challenges of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects, and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.
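An archetype is, in essence, a named set of agreed constraints over a generic reference model. The toy Python sketch below conveys the idea only; real archetypes are written in ADL against the openEHR or ISO 13606 reference models, with far richer structure than this hypothetical dictionary form.

    # Toy archetype: agreed constraints over a generic entry structure.
    ARCHETYPE_BP = {
        "systolic":  {"units": "mm[Hg]", "range": (0, 300)},
        "diastolic": {"units": "mm[Hg]", "range": (0, 200)},
    }

    def validate(entry: dict, archetype: dict) -> list:
        errors = [f"unexpected element: {k}" for k in entry if k not in archetype]
        for name, constraint in archetype.items():
            if name not in entry:
                errors.append(f"missing element: {name}")
                continue
            lo, hi = constraint["range"]
            if not lo <= entry[name]["magnitude"] <= hi:
                errors.append(f"{name} out of range")
            if entry[name]["units"] != constraint["units"]:
                errors.append(f"{name} units mismatch")
        return errors

    reading = {"systolic": {"magnitude": 122, "units": "mm[Hg]"},
               "diastolic": {"magnitude": 78, "units": "mm[Hg]"}}
    print(validate(reading, ARCHETYPE_BP))   # -> [] (conforms)

Because sending and receiving systems validate against the same archetype, a conforming entry carries the same clinical meaning wherever it travels.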
Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J
2008-09-01
The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.
Spyropoulos, B; Tzavaras, A; Zogogianni, D; Botsivaly, M
2013-01-01
The purpose of this paper is to present the design and the current development status of an Anesthesia Information Management System (AIMS). For this system, the physical and technical advances depicted in relevant, recently published Industrial Property documents have been taken into account. Additional innovative sensors create a further data load to be managed. Novel wireless data-transmission modes eventually demand compliance with further standards, so that interoperability between the AIMS and the existing Hospital Information Systems is sustained. We attempted to define the state of the art concerning the functions, design prerequisites, and relevant standards of an "emerging" AIMS that combines hardware innovation with real-time data acquisition, processing, and display, while enabling the necessary interoperability with the other components of the existing Hospital Information Systems. Finally, based on this approach, we report on the design and implementation status of our "real-world" system under development and discuss the multifarious obstacles encountered during this still-ongoing project.
NASA Astrophysics Data System (ADS)
Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael
2015-04-01
We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries, and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination, and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities, and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental, and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources and dissimilar IT infrastructures (data, hardware, software, and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed, interoperable water information systems that are user friendly, easily accessible, and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage, and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate the scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel, and agency managers about the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage, and sharing, as well as mature FOSS projects, for the creation of interoperable Web-based information systems in the water domain. A case study is used to illustrate how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.
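As one concrete instance of the Open Specifications mentioned above, a flood-extent layer published through the OGC Web Map Service (WMS) interface can be retrieved by any compliant client with a single HTTP request. The sketch below builds such a GetMap URL; the server address and layer name are hypothetical.

    # Build an OGC WMS 1.3.0 GetMap request; the parameter names are defined
    # by the WMS specification, which is what makes the interface open.
    from urllib.parse import urlencode

    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": "flood_extent",        # hypothetical layer name
        "CRS": "EPSG:4326",
        "BBOX": "45.0,8.5,46.0,9.5",     # lat/lon axis order for EPSG:4326 in WMS 1.3.0
        "WIDTH": "800", "HEIGHT": "800",
        "FORMAT": "image/png",
    }
    print("https://example.org/geoserver/wms?" + urlencode(params))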
Garcia, Macarena C; Garrett, Nedra Y; Singletary, Vivian; Brown, Sheereen; Hennessy-Burt, Tamara; Haney, Gillian; Link, Kimberly; Tripp, Jennifer; Mac Kenzie, William R; Yoon, Paula
2017-12-07
State and local public health agencies collect and use surveillance data to identify outbreaks, track cases, investigate causes, and implement measures to protect the public's health, through various surveillance systems and data exchange practices. The purpose of this assessment was to better understand current practices at state and local public health agencies for collecting, managing, processing, reporting, and exchanging notifiable disease surveillance information. Over an 18-month period (January 2014-June 2015), we evaluated the process of data exchange between surveillance systems, reporting burdens, and challenges within 3 states (California, Idaho, and Massachusetts) that were using 3 different reporting systems. All 3 states use a combination of paper-based and electronic information systems for managing and exchanging data on reportable conditions within the state. The flow of data from local jurisdictions to the state health departments varies considerably. When state and local information systems are not interoperable, manual duplicative data entry and other work-arounds are often required. The results of the assessment show the complexity of disease reporting at the state and local levels, and the multiple systems, processes, and resources engaged in preparing, processing, and transmitting data that limit interoperability and decrease efficiency. Through this structured assessment, the Centers for Disease Control and Prevention (CDC) has a better understanding of the complexities of surveillance using commercial off-the-shelf data systems (California and Massachusetts) and the CDC-developed National Electronic Disease Surveillance System Base System. More efficient data exchange and use of data will help facilitate interoperability among national notifiable disease surveillance systems.
Food product tracing technology capabilities and interoperability.
Bhatt, Tejas; Zhang, Jianrong Janet
2013-12-01
Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined whether these systems can work together to better secure the food supply (their interoperability). The Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems would be lacking and that there would be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise identify the supply-chain partner who provided it, prior to sharing this information with product tracing technology providers. The 9 traceability solution providers who agreed to participate in this project have their systems deployed in a wide range of sectors within the food industry, including but not limited to livestock, dairy, produce, fruits, seafood, meat, and pork, as well as in the pharmaceutical, automotive, retail, and other industries. Some have also been implemented across the globe, including in Canada, China, the USA, Norway, and the EU, among others. This broad commercial use ensures that the findings of this work are applicable to a broad spectrum of the food system. Six of the 9 participants successfully completed the data entry phase of this test. To verify successful data entry for these 6, a demo or screenshots of the data set from each system's user interface was requested. Only 4 of the 6 were able to provide us with this evidence for verification. Of the 6 that completed data entry and moved on to the scenarios phase of the test, 5 were able to provide us with responses to the scenarios. Time metrics were useful for evaluating the scalability and usability of each technology. Scalability was derived from the time it took to enter the nonstandardized data set into the system (ranging from 7 to 11 days). Usability was derived from the time it took to query the scenarios and provide the results (from a few hours to a week).
Time was measured as the number of days it took for the participants to respond after we supplied them all the information they would need to successfully execute each test or scenario. Two of the technology solution providers successfully implemented and participated in a proof-of-concept interoperable framework during Year 2 of this study. While not required, they also demonstrated this interoperability capability on the FSMA-mandated food product tracing pilots for the U.S. FDA. This has significant real-world impact, since the demonstration of interoperability enables the U.S. FDA to obtain evidence on the importance and impact of data sharing moving forward. Another real-world accomplishment is the modification or upgrade of commercial technology solutions to enhance or implement interoperability. As these systems are deployed by clients in the food industry, interoperability will no longer be an afterthought but will be built into their traceability systems. In turn, industry and regulators will better understand the capabilities of the currently available technologies, and the technology provider community will identify ways in which their systems may be further developed to increase interoperability and utility.
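The analytical core of such systems, tracing a lot backward to its sources and forward to its recipients, reduces to traversal over shipment records. The Python sketch below (hypothetical data layout, not any participating vendor's system) shows the backward direction, including the expansion of a finished product into its ingredient lots.

    # Trace-back over shipment records: each record states that a lot moved
    # from one supply-chain node to another. Illustrative data only.
    from collections import defaultdict

    shipments = [  # (lot, from_node, to_node)
        ("tomato-L1", "FarmA", "ProcessorX"),
        ("salsa-L9", "ProcessorX", "DistributorD"),
        ("salsa-L9", "DistributorD", "Restaurant7"),
    ]
    derived = {"salsa-L9": ["tomato-L1"]}   # ingredient lots per finished lot

    backward = defaultdict(list)
    for lot, src, dst in shipments:
        backward[(lot, dst)].append(src)

    def trace_back(lot: str, node: str) -> list:
        """Walk upstream through receipts, expanding into ingredient lots."""
        sources = []
        for src in backward.get((lot, node), []):
            sources.append((lot, src))
            sources.extend(trace_back(lot, src))        # keep walking upstream
            for ingredient in derived.get(lot, []):     # switch to ingredient lots
                sources.extend(trace_back(ingredient, src))
        return sources

    # From the restaurant back to the farm that supplied the tomato lot:
    print(trace_back("salsa-L9", "Restaurant7"))

Interoperability testing of the kind described above amounts to asking whether several such systems, each holding only part of the shipment records, can jointly reconstruct this traversal.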
A Domain-Specific Language for Aviation Domain Interoperability
ERIC Educational Resources Information Center
Comitz, Paul
2013-01-01
Modern information systems require a flexible, scalable, and upgradeable infrastructure that allows communication and collaboration between heterogeneous information processing and computing environments. Aviation systems from different organizations often use differing representations and distribution policies for the same data and messages,…
Towards semantic interoperability for electronic health records.
Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam
2007-01-01
In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. It is the objective of this paper to briefly describe this approach and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. We analysed current approaches to EHR systems, terminology, and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability, both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes need to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use, and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.
Investigating the capabilities of semantic enrichment of 3D CityEngine data
NASA Astrophysics Data System (ADS)
Solou, Dimitra; Dimopoulou, Efi
2016-08-01
In recent years, the development of technology and the lifting of several technical limitations have brought the third dimension to the fore. The complexity of urban environments and the strong need for land administration intensify the need for a three-dimensional cadastral system. Despite progress in the field of geographic information systems and 3D modeling techniques, there is no fully digital 3D cadastre. Existing geographic information systems and the different methods of three-dimensional modeling allow for better management, visualization, and dissemination of information. Nevertheless, these opportunities cannot be fully exploited because of deficiencies in standardization and interoperability among these systems. Within this context, CityGML was developed as an international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. CityGML defines geometry and topology for city modeling, also focusing on the semantic aspects of 3D city information. The scope of CityGML is to reach a common terminology, also addressing the imperative need for interoperability and data integration, taking into account the number of available geographic information systems and modeling techniques. The aim of this paper is to develop an application for managing the semantic information of a model generated by procedural modeling. The model was initially implemented in ESRI's CityEngine software and then imported into the ArcGIS environment. The final goal was the original model's semantic enrichment and its conversion to the CityGML format. Semantic information management and interoperability proved feasible using the ESRI 3DCities Project tools, since its database structure supports adding semantic information to the CityEngine model and hence automatic conversion to CityGML for advanced analysis and visualization in different application areas.
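To give a flavor of the target format, the snippet below emits a skeletal CityGML-style building element carrying one generic semantic attribute. The namespaces follow CityGML 2.0, but the structure is heavily simplified: a real document would also require GML geometry and a CityModel wrapper.

    # Emit a skeletal CityGML-like building with one semantic attribute.
    import xml.etree.ElementTree as ET

    BLDG = "http://www.opengis.net/citygml/building/2.0"
    GEN = "http://www.opengis.net/citygml/generics/2.0"
    ET.register_namespace("bldg", BLDG)
    ET.register_namespace("gen", GEN)

    building = ET.Element(f"{{{BLDG}}}Building")
    attr = ET.SubElement(building, f"{{{GEN}}}stringAttribute", name="ownershipType")
    ET.SubElement(attr, f"{{{GEN}}}value").text = "private"
    print(ET.tostring(building, encoding="unicode"))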
Seebregts, Christopher; Dane, Pierre; Parsons, Annie Neo; Fogwill, Thomas; Rogers, Debbie; Bekker, Marcha; Shaw, Vincent; Barron, Peter
2018-01-01
MomConnect is a national initiative coordinated by the South African National Department of Health that sends text-based mobile phone messages free of charge to pregnant women who voluntarily register at any public healthcare facility in South Africa. We describe the system design and architecture of the MomConnect technical platform, planned as a nationally scalable and extensible initiative. It uses a health information exchange that can connect any standards-compliant electronic front-end application to any standards-compliant electronic back-end database. The implementation of the MomConnect technical platform, in turn, is a national reference application for electronic interoperability in line with the South African National Health Normative Standards Framework. The use of open content and messaging standards enables the architecture to include any application adhering to the selected standards. Its national implementation at scale demonstrates both the use of this technology and a key objective of global health information systems, which is to achieve implementation scale. Initially, the system's limited clinical information allowed the architecture to focus on the base standards and profiles for interoperability in a resource-constrained environment with limited connectivity and infrastructural capacity. Maintenance of the system requires the mobilisation of national resources. Future work aims to use the standard interfaces to include data from additional applications, as well as to extend and interface the framework with other public health information systems in South Africa. The development of this platform has also shown the benefits of interoperability at both an organisational and a technical level in South Africa.
NASA's Geospatial Interoperability Office (GIO) Program
NASA Technical Reports Server (NTRS)
Weir, Patricia
2004-01-01
NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support, and understanding of Earth and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities, the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group, chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI), and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Office (CIO). With these agency-level requirements, GIO leads, brokers and facilitates efforts to develop, implement, influence and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium and ISO TC211. The OGC has made considerable progress with regard to relations with other open standards bodies, namely ISO, W3C and OASIS. ISO TC211 is the Geographic and Geomatics Information technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
Today, increasing numbers of intermittent generation sources (e.g., wind and photovoltaic) and new mobile intermittent loads (e.g., electric vehicles) can significantly affect traditional utility business practices and operations. At the same time, a growing number of technologies and devices, from appliances to lighting systems, are being deployed at consumer premises that have more sophisticated controls and information that remain underused for anything beyond basic building equipment operations. The intersection of these two drivers is an untapped opportunity and underused resource that, if appropriately configured and realized in open standards, can provide significant energy efficiency and commensurate savings on utility bills, enhanced and lower cost reliability to utilities, and national economic benefits in the creation of new markets, sectors, and businesses being fueled by the seamless coordination of energy and information through device and technology interoperability. Or, as the Quadrennial Energy Review puts it, “A plethora of both consumer-level and grid-level devices are either in the market, under development, or at the conceptual stage. When tied together through the information technology that is increasingly being deployed on electric utilities’ distribution grids, they can be an important enabling part of the emerging grid of the future. However, what is missing is the ability for all of these devices to coordinate and communicate their operations with the grid, and among themselves, in a common language — an open standard.” In this paper, we define interoperability as the ability to exchange actionable information between two or more systems within a home or building, or across and within organizational boundaries. Interoperability relies on the shared meaning of the exchanged information, with agreed-upon expectations and consequences, for the response to the information exchange.
Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.
Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P
2016-11-14
The integration of heterogeneous electronic health record systems by building an interoperable nationwide electronic health record system provides indisputable benefits in health care, such as superior health information quality, prevention of medical errors and cost savings. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches: the centralized architecture and the distributed architecture. The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system along with the national public administration data communication network infrastructure, achieving EHR integration at an acceptable implementation cost.
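As an illustration of the semi-distributed idea described above, the following sketch keeps full records in local provider stores while a central node indexes only their locations. All class and method names are invented for this example and are not from the paper.

```python
# Sketch of the semi-distributed pattern: a central index stores only
# pointers (patient ID -> holding providers), while full records stay
# in each provider's local store and are fetched on demand.

class LocalEHRStore:
    """A provider's local record store; clinical content stays here."""
    def __init__(self, provider_id):
        self.provider_id = provider_id
        self._records = {}          # patient_id -> clinical record

    def save(self, patient_id, record):
        self._records[patient_id] = record

    def fetch(self, patient_id):
        return self._records.get(patient_id)

class CentralIndex:
    """Central node: holds record locations, not clinical content."""
    def __init__(self):
        self._locations = {}        # patient_id -> set of provider_ids

    def register(self, patient_id, provider_id):
        self._locations.setdefault(patient_id, set()).add(provider_id)

    def locate(self, patient_id):
        return self._locations.get(patient_id, set())

# A nationwide query touches the index once, then only the providers
# that actually hold data for the patient, balancing load while
# avoiding a single central store of all clinical content.
index = CentralIndex()
hospital = LocalEHRStore("hospital-A")
hospital.save("P123", {"allergies": ["penicillin"]})
index.register("P123", hospital.provider_id)

stores = {"hospital-A": hospital}
records = [stores[p].fetch("P123") for p in index.locate("P123")]
print(records)
```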
Flexible solution for interoperable cloud healthcare systems.
Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena
2012-01-01
It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.
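A hedged sketch of the configuration idea behind such a control: a declarative mapping from local database tables and fields to target sections of HL7 CDA or CCD documents. The table names, columns and section paths below are illustrative only, not the paper's actual mapping.

```python
# Sketch of a declarative mapping from local database fields to target
# sections of HL7 CDA / CCD documents. All names are illustrative.

FIELD_MAP = {
    # (local table, local column)  ->  (standard, target section)
    ("patients", "family_name"):  ("CDA", "recordTarget/patient/name/family"),
    ("patients", "birth_date"):   ("CDA", "recordTarget/patient/birthTime"),
    ("allergies", "substance"):   ("CCD", "Allergies section/participant"),
    ("medications", "drug_code"): ("CCD", "Medications section/code"),
}

def describe_mapping(field_map):
    """Print the mapping an administrator would review and adjust."""
    for (table, column), (standard, section) in field_map.items():
        print(f"{table}.{column:12s} -> {standard}: {section}")

describe_mapping(FIELD_MAP)
```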
Marschollek, Michael; Wolf, Klaus-H; Bott, Oliver-J; Geisler, Mirko; Plischke, Maik; Ludwig, Wolfram; Hornberger, Andreas; Haux, Reinhold
2007-01-01
Despite the abundance of past home care projects and the maturity of the technologies used, there is no widespread dissemination as yet. The absence of accepted standards and thus interoperability and the inadequate integration into transinstitutional health information systems (tHIS) are perceived as key factors. Based on the respective literature and previous experiences in home care projects we propose an architectural model for home care as part of a transinstitutional health information system using the HL7 clinical document architecture (CDA) as well as the HL7 Arden Syntax for Medical Logic Systems. In two short case studies we describe the practical realization of the architecture as well as first experiences. Our work can be regarded as a first step towards an interoperable - and in our view sustainable - home care architecture based on a prominent document standard from the health information system domain.
eHealth integration and interoperability issues: towards a solution through enterprise architecture.
Adenuga, Olugbenga A; Kekwaletswe, Ray M; Coleman, Alfred
2015-01-01
Investments in healthcare information and communication technology (ICT) and health information systems (HIS) continue to increase. This is creating immense pressure on healthcare ICT and HIS to deliver and show the significance of such investments in technology. This study finds that integration and interoperability contribute largely to the failure of ICT and HIS investment in healthcare, resulting in the need for a healthcare architecture for eHealth. The study proposes an eHealth architectural model that accommodates requirements based on healthcare needs and on system, implementer, and hardware requirements. The model is adaptable and examines the developer's and user's views that such systems hold high hopes for their potential to change traditional organizational design, intelligence, and decision-making.
NASA Astrophysics Data System (ADS)
Fox, P. A.; Diviacco, P.; Busato, A.
2016-12-01
Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of such collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other. Researchers also need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration also relies on the interoperability of the people. Smaller, synchronous and face-to-face settings for researchers are known to enhance people interoperability. However, changing settings, whether geographically or temporally, or increasing the team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven as necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective where science is mostly a social construct, conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches and aims are proposed as a means for enhancing people interoperability in computer-based settings. The contribution will provide details on the approach of augmenting and interfacing knowledge representation frameworks to the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st century. A web-based collaborative portal that integrates both approaches has been developed and will be presented. Reports will be given on initial tests, which have encouraging results.
77 FR 37001 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... of the Interoperability Services Layer, Attn: Ron Chen, 400 Gigling Road, Seaside, CA 93955. Title; Associated Form; and OMB Number: Interoperability Services Layer; OMB Control Number 0704-TBD. Needs and Uses... INFORMATION: Summary of Information Collection IoLS (Interoperability Layer Services) is an application in a...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, Dave; Stephan, Eric G.; Wang, Weimin
Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.
NASA Astrophysics Data System (ADS)
2018-01-01
The large amount of data generated by modern space missions calls for a change in the organization of data distribution and access procedures. Although long-term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all interoperable is starting, including automated workflows to process related data from different sources.
Kalra, Dipak; Kobayashi, Shinji
2013-01-01
Objectives: The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims at presenting the challenges of building quality-labelled clinical archetypes and the challenges towards achieving semantic interoperability between EHRs. Methods: Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. Results: The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity and quality-label them for wide adoption. Standardizing clinical terms within EHRs using clinical terminology like the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Conclusions: Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.
Wollersheim, Dennis; Sari, Anny; Rahayu, Wenny
Health Information Managers (HIMs) are responsible for overseeing health information. The change management necessary during the transition to electronic health records (EHRs) is substantial and ongoing. Archetype-based EHRs are a core health information system component which solves many of the problems that arise during this period of change. Archetypes are models of clinical content, and they have many beneficial properties. They are interoperable, both between settings and through time. They are more amenable to change than conventional paradigms, and their design is congruent with clinical practice. This paper is an overview of the current archetype literature relevant to Health Information Managers. The literature on the usage of archetypes for electronic health record storage was sourced from the English-language sections of ScienceDirect, IEEE Xplore, PubMed, Google Scholar, the ACM Digital Library and other databases, looking at the current areas of archetype research, appropriate usage, and future research. We also used reference lists from the cited papers, papers referenced by the openEHR website, and the recommendations of experts in the area. Criteria for inclusion were (a) that studies covered archetype research and (b) that they were studies of archetype use, archetype system design, or archetype effectiveness. The 47 papers included show wide and increasing worldwide archetype usage, in a variety of medical domains. Most of the papers noted that archetypes are an appropriate solution for future-proof and interoperable medical data storage. We conclude that archetypes are a suitable solution for the complex problem of electronic health record storage and interoperability.
The role of markup for enabling interoperability in health informatics.
McKeever, Steve; Johnson, David
2015-01-01
Interoperability is the faculty of making information systems work together. In this paper we distinguish a number of different forms that interoperability can take and show how they are realized in a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early-stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion of future possibilities that big data and further standardization will enable.
Exploring a model-driven architecture (MDA) approach to health care information systems development.
Raghupathi, Wullianallur; Umar, Amjad
2008-05-01
To explore the potential of the model-driven architecture (MDA) in health care information systems development, an MDA is conceptualized and developed for a health clinic system to track patient information. A prototype of the MDA is implemented using an advanced MDA tool. The UML provides the underlying modeling support in the form of the class diagram. Transformation rules from the platform-independent model (PIM) to the platform-specific model (PSM) are applied to generate the prototype application from the model. The result of the research is a complete MDA methodology for developing health care information systems. Additional insights gained include the development of transformation rules and documentation of the challenges in the application of MDA to health care. Design guidelines for future MDA applications are described. The model has the potential for generalizability. The overall approach supports limited interoperability and portability. The research demonstrates the applicability of the MDA approach to health care information systems development. When properly implemented, it has the potential to overcome the challenges of platform (vendor) dependency, lack of open standards, interoperability, portability, scalability, and the high cost of implementation.
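The PIM-to-PSM step can be illustrated with a toy transformation: a platform-independent entity model is turned into a platform-specific artifact (here, a SQL DDL statement) by simple type-mapping rules. The model and the rules below are invented for illustration and are not the paper's actual rule set.

```python
# Toy illustration of the MDA pipeline: a platform-independent model
# (PIM) is transformed by explicit rules into a platform-specific
# artifact (PSM), here a SQL table definition.

PIM = {
    "entity": "Patient",
    "attributes": [
        {"name": "id",         "type": "Identifier"},
        {"name": "full_name",  "type": "String"},
        {"name": "birth_date", "type": "Date"},
    ],
}

TYPE_RULES = {               # transformation rules: PIM type -> SQL type
    "Identifier": "INTEGER PRIMARY KEY",
    "String": "VARCHAR(255)",
    "Date": "DATE",
}

def pim_to_sql(model):
    """Apply the type-mapping rules to emit platform-specific DDL."""
    cols = ",\n  ".join(f'{a["name"]} {TYPE_RULES[a["type"]]}'
                        for a in model["attributes"])
    return f'CREATE TABLE {model["entity"].lower()} (\n  {cols}\n);'

print(pim_to_sql(PIM))
```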
Integrating Data and Networks: Human Factors
NASA Astrophysics Data System (ADS)
Chen, R. S.
2012-12-01
The development of technical linkages and interoperability between scientific networks is a necessary but not sufficient step towards integrated use and application of networked data and information for scientific and societal benefit. A range of "human factors" must also be addressed to ensure the long-term integration, sustainability, and utility of both the interoperable networks themselves and the scientific data and information to which they provide access. These human factors encompass the behavior of both individual humans and human institutions, and include system governance, a common framework for intellectual property rights and data sharing, consensus on terminology, metadata, and quality control processes, agreement on key system metrics and milestones, the compatibility of "business models" in the short and long term, harmonization of incentives for cooperation, and minimization of disincentives. Experience with several national and international initiatives and research programs such as the International Polar Year, the Group on Earth Observations, the NASA Earth Observing Data and Information System, the U.S. National Spatial Data Infrastructure, the Global Earthquake Model, and the United Nations Spatial Data Infrastructure provide a range of lessons regarding these human factors. Ongoing changes in science, technology, institutions, relationships, and even culture are creating both opportunities and challenges for expanded interoperability of scientific networks and significant improvement in data integration to advance science and the use of scientific data and information to achieve benefits for society as a whole.
A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows
NASA Astrophysics Data System (ADS)
Babin, B. L.; Hu, L.
2008-12-01
The Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) Environmental Monitoring Systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data-sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies presented many challenges to these observing systems as open source tools for interoperability grow. The current open source tools often require the installation of additional software. In order to make data available in common standards formats, "home grown" software has been developed. One example of this is the development of software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).
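As a rough sketch of such "home grown" standards output, the snippet below serializes a single station observation as XML, using Python rather than the ASP the authors describe. The element names are illustrative only, not the actual NDBC or SOS schema.

```python
# Sketch: serialize one station observation as XML for exchange.
# Element names are illustrative placeholders, not a real schema.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def observation_xml(station_id, parameter, value, units):
    obs = ET.Element("observation")
    ET.SubElement(obs, "station").text = station_id
    ET.SubElement(obs, "time").text = datetime.now(timezone.utc).isoformat()
    measure = ET.SubElement(obs, "measure",
                            {"parameter": parameter, "units": units})
    measure.text = str(value)
    return ET.tostring(obs, encoding="unicode")

print(observation_xml("LUMCON-01", "water_temperature", 28.4, "degC"))
```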
Extending the GI Brokering Suite to Support New Interoperability Specifications
NASA Astrophysics Data System (ADS)
Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.
2014-12-01
The GI brokering suite provides the discovery, access, and semantic brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth science context too. Therefore, the GI brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for well-used specifications introduced to implement Linked Data, the Semantic Web and specific community needs. Among others, these include DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs; CKAN, a data management system for data distribution, particularly used by public administrations; CERIF, used by CRIS (Current Research Information System) instances; and the Hyrax server, a scientific dataset publishing component. This presentation will discuss these and other recent GI suite extensions implemented to support new interoperability protocols in use by the Earth science communities.
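For a flavor of what DCAT support involves, here is a minimal sketch of a catalog record built with the third-party rdflib package (assuming version 6 or later, which bundles the DCAT and DCTERMS namespaces and returns a string from serialize). The dataset URI and metadata values are invented for illustration.

```python
# Minimal sketch of a DCAT dataset description, assuming rdflib >= 6.
# The dataset URI and literal values are invented examples.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCAT, DCTERMS, RDF

g = Graph()
ds = URIRef("http://example.org/datasets/sea-surface-temp")

g.add((ds, RDF.type, DCAT.Dataset))
g.add((ds, DCTERMS.title, Literal("Sea surface temperature, daily")))
g.add((ds, DCTERMS.description,
       Literal("Gridded SST fields published through a brokered catalog.")))
g.add((ds, DCAT.keyword, Literal("oceanography")))

print(g.serialize(format="turtle"))   # emits the record as Turtle RDF
```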
Benefits of Enterprise Ontology for the Development of ICT-Based Value Networks
NASA Astrophysics Data System (ADS)
Albani, Antonia; Dietz, Jan L. G.
The competitiveness of value networks is highly dependent on the cooperation between business partners and the interoperability of their information systems. Innovations in information and communication technology (ICT), primarily the emergence of the Internet, offer possibilities to increase the interoperability of information systems and therefore enable inter-enterprise cooperation. For the design of inter-enterprise information systems, the concept of business component appears to be very promising. However, the identification of business components is strongly dependent on the appropriateness and the quality of the underlying business domain model. The ontological model of an enterprise - or an enterprise network - as presented in this article, is a high-quality and very adequate business domain model. It provides all essential information that is necessary for the design of the supporting information systems, and at a level of abstraction that makes it also understandable for business people. The application of enterprise ontology for the identification of business components is clarified. To exemplify our approach, a practical case is taken from the domain of strategic supply network development. By doing this, a widespread problem of the practical application of inter-enterprise information systems is being addressed.
Search for supporting methodologies - Or how to support SEI for 35 years
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Masline, Richard C.
1991-01-01
Concepts relevant to the development of an evolvable information management system are examined in terms of support for the Space Exploration Initiative. The issues of interoperability within NASA and industry initiatives are studied, including the Open Systems Interconnection standard and the operating system of the Open Software Foundation. The requirements of partitioning functionality into separate areas are determined, with attention given to the infrastructure required to ensure system-wide compliance. The need for a decision-making context is key to the distributed implementation of the program, and this environment is concluded to be the next step in developing an evolvable, interoperable, and securable support network.
Development of NATO's recognized environmental picture
NASA Astrophysics Data System (ADS)
Teufert, John F.; Trabelsi, Mourad
2006-05-01
An important element for the fielding of a viable, effective NATO Response Force (NRF) is access to meteorological, oceanographic and geospatial data (GEOMETOC) and imagery. Currently, the available GEOMETOC information suffers from being very fragmented. NATO defines the Recognised Environmental Picture (REP) as a controlled information base for GEOMETOC data. The NATO REP proposes an architecture that is both flexible and open. The focus lies on enabling a network-centric approach. The key to achieving this is relying on open, well-recognized standards that apply to both the data exchange protocols and the data formats. Communication and information exchange based on open standards enables system interoperability. Diverse systems, each with unique, specialized contributions to an increased understanding of the battlespace, can now cooperate in a manageable information sphere. By clearly defining responsibilities in the generation of information, a reduction in data transfer overhead is achieved. The REP identifies three main stages in the dissemination of GEOMETOC data: Collection, Fusion (and Analysis) and Publication. A REP architecture was successfully deployed during the NATO Coalition Warrior Interoperability Demonstration (CWID) in Lillehammer, Norway in June 2005. CWID is an annual event to validate and improve the interoperability of NATO and national consultation and command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) systems. With a test case success rate of 84%, it was able to provide relevant GEOMETOC support to the main NRF component headquarters. In 2006, the REP architecture will be deployed and validated during the NATO NRF Steadfast live exercises.
Interoperable web applications for sharing data and products of the International DORIS Service
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.
2017-12-01
The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with users. Metadata and interoperable web applications are proposed to explore, visualize and download the key products, such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their websites. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables and to get information about the tracking ground stations and the tracked satellites. We discuss the future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the need for the IAG Services to adopt a common metadata thesaurus to describe data and products, and interoperability standards to share them.
NASA Astrophysics Data System (ADS)
Orellana, Diego A.; Salas, Alberto A.; Solarz, Pablo F.; Medina Ruiz, Luis; Rotger, Viviana I.
2016-04-01
The production of clinical information about each patient is constantly increasing, and it is noteworthy that this information is created in different formats and at diverse points of care, resulting in fragmented, incomplete, inaccurate and isolated health information. The use of health information technology has been promoted as having a decisive impact on improving the efficiency, cost-effectiveness, quality and safety of medical care delivery. However, in developing countries the utilization of health information technology is insufficient and lacking in standards, among other problems. In the present work we evaluate the framework EHRGen, based on the openEHR standard, as a means to achieve the generation and availability of patient-centered information. The framework has been evaluated through the tools provided for final users, that is, without the intervention of computer experts. It makes the openEHR ideas easier to adopt and provides an open-source basis with a set of services, although some limitations in its current state work against interoperability and usability. However, despite the described limitations with respect to usability and semantic interoperability, EHRGen is, at least regionally, a considerable step toward EHR adoption and interoperability, and it should therefore be supported by academic and administrative institutions.
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and obtaining valuable information from them are challenging issues in these systems from a technical and scientific point of view. Ever-increasing population growth in urban areas has caused certain problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, employing sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The system presented here uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure, presenting an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran's air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system also provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in timely sensor observations.
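The AQI sub-index computation mentioned above is a linear interpolation between breakpoint pairs. A sketch for CO follows; the breakpoints are the commonly published US EPA values for 8-hour CO concentrations in ppm, so treat them as indicative rather than authoritative.

```python
# AQI sub-index for CO via the standard linear-interpolation formula:
#   AQI = (Ihi - Ilo) / (Chi - Clo) * (C - Clo) + Ilo
# Breakpoints follow the commonly published US EPA 8-hour CO table.

CO_BREAKPOINTS = [
    # (conc_lo, conc_hi, aqi_lo, aqi_hi)
    (0.0,   4.4,   0,  50),
    (4.5,   9.4,  51, 100),
    (9.5,  12.4, 101, 150),
    (12.5, 15.4, 151, 200),
    (15.5, 30.4, 201, 300),
    (30.5, 40.4, 301, 400),
    (40.5, 50.4, 401, 500),
]

def co_aqi(concentration_ppm):
    """Map an 8-hour CO concentration (ppm) to its AQI sub-index."""
    for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
        if c_lo <= concentration_ppm <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo)
                         * (concentration_ppm - c_lo) + i_lo)
    raise ValueError("concentration outside AQI scale")

# e.g. a sensor reading of 10.2 ppm maps into the 101-150 band
print(co_aqi(10.2))   # -> 113
```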
An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2010-10-01
The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.
Tyndall, Timothy; Tyndall, Ayami
2018-01-01
Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide data provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a blockchain or directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the inability to map complex datasets and to enable interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored, we improve exchange and facilitate discovery on a multi-national level through open-source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
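A minimal sketch of a provenance signature that "moves with the data": a canonical serialization of the record is signed with a keyed hash and can be re-verified later. The record layout and the symmetric key handling are illustrative; a production system would use asymmetric signatures and managed keys.

```python
# Sketch: sign a canonical serialization of a record so that tampering
# anywhere downstream is detectable. Illustrative only; real systems
# would use asymmetric signatures and proper key management.
import hashlib
import hmac
import json

SECRET_KEY = b"demo-key-not-for-production"   # illustrative placeholder

def sign_record(record: dict) -> str:
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hmac.new(SECRET_KEY, canonical.encode(), hashlib.sha256).hexdigest()

def verify_record(record: dict, signature: str) -> bool:
    return hmac.compare_digest(sign_record(record), signature)

record = {"source": "clinic-42", "resource": "Patient/P123",
          "exported": "2018-03-01T12:00:00Z"}
sig = sign_record(record)
print(verify_record(record, sig))    # True
record["source"] = "tampered"
print(verify_record(record, sig))    # False
```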
RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service
NASA Astrophysics Data System (ADS)
Yang, Chao; Chen, Nengcheng; Di, Liping
2012-10-01
Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it, separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous geoprocessing workflows.
Clinical data interoperability based on archetype transformation.
Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2011-10-01
The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation.
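A toy illustration of the mapping idea (not the authors' actual Semantic Web and Model-driven Engineering machinery): clinical content expressed against openEHR-style paths is re-expressed against ISO 13606-style paths via an explicit path mapping. The paths shown are simplified placeholders, not real archetype definitions.

```python
# Toy sketch: re-express the same clinical content against the paths
# of the other standard via an explicit mapping. Paths are simplified
# placeholders, not real archetypes.

OPENEHR_TO_13606 = {
    "blood_pressure/systolic":  "ELEMENT[systolic]/value",
    "blood_pressure/diastolic": "ELEMENT[diastolic]/value",
}

def transform(openehr_data):
    """Map openEHR-style paths onto ISO 13606-style paths."""
    return {OPENEHR_TO_13606[path]: value
            for path, value in openehr_data.items()
            if path in OPENEHR_TO_13606}

source = {"blood_pressure/systolic": 120, "blood_pressure/diastolic": 80}
print(transform(source))
```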
A Proposed Information Architecture for Telehealth System Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.
1999-04-20
We propose an object-oriented information architecture for telemedicine systems that promotes secure 'plug-and-play' interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a 'lego-like' fashion to achieve the desired device or system functionality. Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.
Warfighter IT Interoperability Standards Study
2012-07-22
Survey questions included: i) the exchange of data (e.g. messages) between systems; ii) the process used to validate and certify semantic interoperability between systems. One response noted there was no requirement to validate and certify semantic interoperability; another noted the DLS program exchanges data with ... Related topics include semantics, testing for system compliance with data models, and verifying and certifying interoperability using data.
Implementation of Medical Information Exchange System Based on EHR Standard.
Han, Soon Hwa; Lee, Min Ho; Kim, Sang Guk; Jeong, Jun Yong; Lee, Bi Na; Choi, Myeong Seon; Kim, Il Kon; Park, Woo Sung; Ha, Kyooseob; Cho, Eunyoung; Kim, Yoon; Bae, Jae Bong
2010-12-01
To develop effective ways of sharing patients' medical information, we developed a new medical information exchange system (MIES) based on a registry server, which enabled us to exchange different types of data generated by various systems. To assure that patients' medical information can be effectively exchanged under different system environments, we adopted the standardized data transfer methods and terminologies suggested by the Center for Interoperable Electronic Healthcare Record (CIEHR) of Korea in order to guarantee interoperability. Regarding information security, MIES followed the security guidelines suggested by the CIEHR of Korea. This study aimed to develop essential security systems for the implementation of online services, such as encryption of communication, server security, database security, protection against hacking, content security, and network security. The registry server managed information exchange as well as the registration information of the clinical document architecture (CDA) documents, and the CDA Transfer Server was used to locate and transmit the proper CDA document from the relevant repository. The CDA viewer showed the CDA documents via connection with the information systems of related hospitals. This research chose transfer items and defined document standards that follow CDA standards, such that the exchange of CDA documents between different systems became possible through ebXML. The proposed MIES was designed as an independent central registry server model in order to guarantee the essential security of patients' medical information.
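The registry pattern at the heart of MIES can be sketched in a few lines: the registry holds only metadata about where each CDA document lives, and retrieval goes to the holding repository. The identifiers and repository URLs below are invented examples.

```python
# Sketch of the registry pattern: the registry stores only document
# locations; a transfer component fetches each CDA document from the
# repository that holds it. All identifiers and URLs are invented.

REGISTRY = {
    # document_id -> (patient_id, repository holding the document)
    "CDA-0001": ("P123", "http://hospital-a.example.org/repo"),
    "CDA-0002": ("P123", "http://clinic-b.example.org/repo"),
}

def locate_documents(patient_id):
    """Return (document_id, repository) pairs for a patient."""
    return [(doc_id, repo)
            for doc_id, (pid, repo) in REGISTRY.items()
            if pid == patient_id]

for doc_id, repo in locate_documents("P123"):
    print(f"fetch {doc_id} from {repo}")  # viewer retrieves and renders
```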
Greater Yellowstone regional traveler and weather information system evaluation plan
DOT National Transportation Integrated Search
2002-04-19
The ITS Integration Program is being conducted to accelerate the integration and interoperability of intelligent transportation systems in metropolitan and rural areas. Projects approved for funding have been assessed as supporting the improvements o...
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.
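Invoking such a WPS operation typically reduces to a standard key-value Execute request. The sketch below uses the third-party requests package; the endpoint URL, process identifier and input encoding are hypothetical placeholders, not the actual IDEE services.

```python
# Sketch of a WPS 1.0.0 Execute request with standard key-value
# parameters, assuming the third-party requests package. The endpoint
# and the process identifier are hypothetical placeholders.
import requests

WPS_ENDPOINT = "http://idee.example.es/wps"   # hypothetical endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "ElevationProfile",                 # hypothetical process
    "datainputs": "line=-3.7,40.4,-3.6,40.5;crs=EPSG:4326",
}

response = requests.get(WPS_ENDPOINT, params=params, timeout=30)
print(response.status_code)
print(response.text[:400])   # WPS responses are XML documents
```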
Expanding the scope of health information systems. Challenges and developments.
Kuhn, K A; Wurst, S H R; Bott, O J; Giuse, D A
2006-01-01
To identify current challenges and developments in health information systems, reports on HIS, eHealth and process support were analyzed, and core problems and challenges were identified. Health information systems are extending their scope towards regional networks and health IT infrastructures. Integration, interoperability and interaction design are still today's core problems. Additional problems arise through the integration of genetic information into the health care process. There are noticeable trends towards solutions for these problems.
Interoperability Context-Setting Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widergren, Steven E.; Hardin, Dave; Ambrosio, Ron
2007-01-31
As the deployment of automation technology advances, it touches upon many areas of our corporate and personal lives. A trend is emerging where systems are growing to the extent that integration is taking place with other systems to provide even greater capabilities more efficiently and effectively. GridWise™ provides a vision for this type of integration as it applies to the electric system. Imagine a time in the not too distant future when homeowners can offer the management of their electricity demand to participate in a more efficient and environmentally friendly operation of the electric power grid. They will do this using technology that acts on their behalf in response to information from other components of the electric system. This technology will recognize their preferences for parameters such as comfort and the price of energy to form responses that optimize the local need to a signal that satisfies a higher-level need in the grid. For example, consider a particularly hot day with air stagnation in an area with a significant dependence on wind generation. To manage the forecasted peak electricity demand, the bulk system operator issues a critical peak price warning. Their automation systems alert electric service providers who distribute electricity from the wholesale electricity system to consumers. In response, the electric service providers use their automation systems to inform consumers of impending price increases for electricity. This information is passed to an energy management system at the premises, which acts on the homeowner’s behalf, to adjust the electricity usage of the onsite equipment (which might include generation from such sources as a fuel cell). The objective of such a system is to honor the agreement with the electricity service provider and reduce the homeowner’s bill while keeping the occupants as comfortable as possible. This will include actions such as moving the thermostat on the heating, ventilation, and air-conditioning (HVAC) unit up several degrees. The resulting load reduction becomes part of an aggregated response from the electricity service provider to the bulk system operator who is now in a better position to manage total system load with available generation. Looking across the electric system, from generating plants, to transmission substations, to the distribution system, to factories, office parks, and buildings, automation is growing, and the opportunities for unleashing new value propositions are exciting. How can we facilitate this change and do so in a way that ensures the reliability of electric resources for the wellbeing of our economy and security? The GridWise Architecture Council (GWAC) mission is to enable interoperability among the many entities that interact with the electric power system. A good definition of interoperability is, “The capability of two or more networks, systems, devices, applications, or components to exchange information between them and to use the information so exchanged.” As a step in the direction of enabling interoperability, the GWAC proposes a context-setting framework to organize concepts and terminology so that interoperability issues can be identified and debated, improvements to address issues articulated, and actions prioritized and coordinated across the electric power community.
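The premises-side logic in this scenario can be caricatured in a few lines: an energy manager widens the cooling setpoint when a price signal arrives, within comfort limits the homeowner configured. All names and numbers are illustrative, not part of the GWAC framework.

```python
# Toy sketch of the premises-side response: trade comfort against cost
# within homeowner-configured limits. All values are illustrative.

def respond_to_price(current_setpoint_f, price_signal,
                     comfort_limit_f=78.0):
    """Return the new cooling setpoint for a given price signal."""
    offsets = {"normal": 0.0, "high": 2.0, "critical_peak": 4.0}
    proposed = current_setpoint_f + offsets.get(price_signal, 0.0)
    return min(proposed, comfort_limit_f)   # honor comfort preference

print(respond_to_price(72.0, "normal"))         # 72.0
print(respond_to_price(72.0, "critical_peak"))  # 76.0
print(respond_to_price(76.0, "critical_peak"))  # 78.0, capped
```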
An EarthCube Roadmap for Cross-Domain Interoperability in the Geosciences: Governance Aspects
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Couch, A.; Richard, S. M.; Valentine, D. W.; Stocks, K.; Murphy, P.; Lehnert, K. A.
2012-12-01
The goal of cross-domain interoperability is to enable reuse of data and models outside the original context in which these data and models are collected and used and to facilitate analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries. A new research initiative of the U.S. National Science Foundation, called EarthCube, is developing a roadmap to address challenges of interoperability in the earth sciences and create a blueprint for community-guided cyberinfrastructure accessible to a broad range of geoscience researchers and students. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place for such secondary or derivative use of information to be both scientifically sound and technically feasible. In this initial assessment we consider the following four basic infrastructure components that need to be present to enable cross-domain interoperability in the geosciences: metadata catalogs (at the appropriate community-defined granularity) that provide standard discovery services over datasets, data access services, models and other resources of the domain; vocabularies that support unambiguous interpretation of domain resources and metadata; services used to access data repositories and other resources including models, visualizations and workflows; and formal information models that define the structure and semantics of the information returned on service requests. General standards for these components have been proposed; they form the backbone of large-scale integration activities in the geosciences. By utilizing these standards, EarthCube research designs can take advantage of data discovery across disciplines using the commonality in key data characteristics related to shared models of spatial features, time measurements, and observations. Data can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data services can be used to transparently compile composite data products. Key questions addressed in this presentation are: (1) How to define and assess readiness of existing domain information systems for cross-domain re-use? (2) How to determine EarthCube development priorities given a multitude of use cases that involve cross-domain data flows? and (3) How to involve a wider community of geoscientists in the development and curation of cross-domain resources and incorporate community feedback in the CI design? Answering them involves consideration of governance mechanisms for cross-domain interoperability: while domain information systems and projects have developed governance mechanisms, managing cross-domain CI resources and supporting cross-domain information re-use has not been the development focus at the scale of the geosciences. We present a cross-domain readiness model for enabling effective communication among scientists, governance bodies, and information providers. We also present an initial readiness assessment and a cross-domain connectivity map for the geosciences, and outline processes for eliciting user requirements, setting priorities, and obtaining community consensus.
Improving Interoperability between Registries and EHRs
Blumenthal, Seth
2018-01-01
National performance measurement needs clinical data that track the performance of multidisciplinary teams across episodes of care. Clinical registries are ideal platforms for this work due to their capture of structured, specific data across specialties. Because registries collect data at a national level, and registry data are captured in a consistent structure and format within each registry, registry data are useful for measurement and analysis “out of the box”. Registry business models are hampered by the cost of collecting data from EHRs and other source systems and abstracting or mapping them to fit registry data models. The National Quality Registry Network (NQRN) has launched Registries on FHIR, an initiative to lower barriers to achieving semantic interoperability between registries and source data systems. In 2017 Registries on FHIR conducted an information-gathering campaign to learn where registries want better interoperability, and how to go about improving it. PMID:29888033
Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh;
2014-01-01
The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a suite of related industry standards, evaluating both the capabilities of individual tools and their interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.
Air Ground Integration and the Brigade Combat Team
2013-06-13
Theater Air Control System; TADIL-J: Tactical Digital Information Link-J; TAGS: Theater Air Ground System; TAIS: Tactical Air Integration System; TBMCS: Theater… during planning and execution. This system interacts with the Theater Battle Management Core System (TBMCS) used by the JAOC to build and disseminate… control nodes within the AAGS, in conjunction with the interoperability with the TBMCS and Army mission command systems, facilitates information flow during…
Principles of data integration and interoperability in the GEO Biodiversity Observation Network
NASA Astrophysics Data System (ADS)
Saarenmaa, Hannu; Ó Tuama, Éamonn
2010-05-01
The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity, and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing the various scales of biodiversity, in particular from the individual and species levels to the ecosystems level, has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture, to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks are a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry-of-registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow systems will also help expedite the delivery of products and reports that are required for integrated assessment of data from many disciplines. Going beyond the Service Oriented Architectures which are now mainstream, these challenges have lately been addressed in the business world by adopting what is called a Semantic Enterprise Architecture. Semantic portals have been built, in particular, to address interoperability across domains, where users may not be familiar with the concepts of all networks. We will discuss the applicability of these approaches for building the global GEO BON.
Interoperability Outlook in the Big Data Future
NASA Astrophysics Data System (ADS)
Kuo, K. S.; Ramachandran, R.
2015-12-01
The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data and Information System (EOSDIS) doubtless propelled the interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, two decades later, interoperability obviously still feels wanting. We believe the inadequate interoperability we experience is a result of the current practice that data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis service right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center" interoperability is almost guaranteed because data, analysis, and results can all be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.
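One concrete illustration of the compute-to-data contrast described above: with a server-side access protocol such as OPeNDAP, only the requested subset crosses the network instead of whole files. The sketch below uses the pydap client; the endpoint URL and variable name are hypothetical.

```python
# Server-side subsetting with the pydap client (hypothetical URL/variable):
# only the requested slice travels over the network, not the whole file.
from pydap.client import open_url

ds = open_url("http://example.org/opendap/airs_nrt.nc")  # hypothetical
subset = ds["Temperature"][0:10, 100:200, 100:200]       # remote subset
```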
NASA Technical Reports Server (NTRS)
Subrahmanian, V. S.
1994-01-01
An architecture called hybrid knowledge system (HKS) is described that can be used to interoperate between a specification of the control laws describing a physical system; a collection of databases, knowledge bases and/or other data structures reflecting information about the world in which the controlled physical system resides; observations (e.g., sensor information) from the external world; and actions that must be taken in response to external observations.
Lowering Entry Barriers for Multidisciplinary Cyber(e)-Infrastructures
NASA Astrophysics Data System (ADS)
Nativi, S.
2012-04-01
Multidisciplinarity is more and more important to study the Earth System and address global change. To achieve that, multidisciplinary cyber(e)-infrastructures are an important instrument. In recent years, several European, US and international initiatives have been started to build multidisciplinary infrastructures, including: the Spatial Information in the European Community (INSPIRE), the Global Monitoring for Environment and Security (GMES), the Data Observation Network for Earth (DataONE), and the Global Earth Observation System of Systems (GEOSS). The majority of these initiatives are developing service-based digital infrastructures asking scientific communities (i.e. disciplinary users and data producers) to implement a set of standards for information interoperability. For scientific communities, this has represented an entry barrier which, in several cases, has proved to be high. In fact, both data producers and users do not seem to be willing to invest precious resources to become experts on interoperability solutions; on the contrary, they are focused on developing disciplinary and thematic capacities. Therefore, an important research topic is lowering entry barriers for joining multidisciplinary cyber(e)-infrastructures. This presentation will introduce a new approach to achieve the multidisciplinary interoperability underpinning multidisciplinary infrastructures while lowering the present entry barriers for both users and data producers. This is called the Brokering approach: it extends the service-based paradigm by introducing a new Brokering layer or cloud which is in charge of managing all the interoperability complexity (e.g. data discovery, access, and use), thus easing users' and producers' burden. This approach was successfully tested in the framework of several European FP7 projects and in GEOSS.
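A minimal sketch of the brokering idea follows: a mediation layer that hides each community's discovery interface behind one call, so neither users nor producers have to implement the other side's standards. The class and adapter names are illustrative, not the actual GEOSS broker API.

```python
class Broker:
    """Mediation layer fanning one query out to heterogeneous catalogs."""

    def __init__(self):
        self._adapters = {}  # community/protocol name -> search callable

    def register(self, name, adapter):
        self._adapters[name] = adapter

    def discover(self, keyword):
        # Query every registered catalog and merge results into one
        # common record shape, hiding protocol differences from the user.
        results = []
        for name, adapter in self._adapters.items():
            for record in adapter(keyword):
                results.append({"source": name, "title": record})
        return results

broker = Broker()
broker.register("csw", lambda kw: [f"CSW record about {kw}"])
broker.register("oai-pmh", lambda kw: [f"OAI-PMH record about {kw}"])
print(broker.discover("sea surface temperature"))
```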
A logical approach to semantic interoperability in healthcare.
Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni
2011-01-01
Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
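To illustrate the kind of structure-plus-terminology binding such a LIM harmonises, here is a small, hypothetical sketch: a record element whose slot in the structure (an ISO 13606-style path) carries its meaning through a SNOMED CT concept binding. The field names are invented; only the SNOMED CT code shown is a real concept identifier.

```python
from dataclasses import dataclass

@dataclass
class CodedText:
    code: str      # concept identifier in the terminology
    system: str    # terminology system URI
    display: str   # human-readable label

@dataclass
class LimElement:
    path: str         # position in the record structure (13606-style)
    value: CodedText  # terminology binding carries the semantics

diagnosis = LimElement(
    path="composition/problem_list/diagnosis",
    value=CodedText("38341003", "http://snomed.info/sct", "Hypertension"),
)
```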
Measures for interoperability of phenotypic data: minimum information requirements and formatting.
Ćwiek-Kupczyńska, Hanna; Altmann, Thomas; Arend, Daniel; Arnaud, Elizabeth; Chen, Dijun; Cornut, Guillaume; Fiorani, Fabio; Frohmberg, Wojciech; Junker, Astrid; Klukas, Christian; Lange, Matthias; Mazurek, Cezary; Nafissi, Anahita; Neveu, Pascal; van Oeveren, Jan; Pommier, Cyril; Poorter, Hendrik; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Scholz, Uwe; van Schriek, Marco; Seren, Ümit; Usadel, Björn; Weise, Stephan; Kersey, Paul; Krajewski, Paweł
2016-01-01
Plant phenotypic data holds a wealth of information which, when accurately analysed and linked to other data types, brings to light the knowledge about the mechanisms of life. As phenotyping is a field of research comprising manifold, diverse and time-consuming experiments, the findings can be fostered by reusing and combining existing datasets. Their correct interpretation, and thus replicability, comparability and interoperability, is possible provided that the collected observations are equipped with an adequate set of metadata. So far there have been no common standards governing phenotypic data description, which has hampered data exchange and reuse. In this paper we propose guidelines for proper handling of the information about plant phenotyping experiments, in terms of both the recommended content of the description and its formatting. We provide a document called "Minimum Information About a Plant Phenotyping Experiment", which specifies what information about each experiment should be given, and a Phenotyping Configuration for the ISA-Tab format, which allows this information to be organised practically within a dataset. We provide examples of ISA-Tab-formatted phenotypic data, and a general description of a few systems where the recommendations have been implemented. Acceptance of the rules described in this paper by the plant phenotyping community will help to achieve findable, accessible, interoperable and reusable data.
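As a toy illustration of the ISA-Tab side of the recommendation, the sketch below writes one tab-separated study row in an ISA-Tab-like layout; the column set is simplified and does not reproduce the full MIAPPE-recommended Phenotyping Configuration.

```python
import csv

rows = [
    {"Source Name": "plant_001", "Characteristics[genotype]": "Col-0",
     "Assay Name": "leaf_area_day14", "Trait": "leaf area",
     "Value": "12.4", "Unit": "cm2"},
]
with open("s_phenotyping.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys(), delimiter="\t")
    writer.writeheader()   # ISA-Tab files are tab-separated text
    writer.writerows(rows)
```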
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Niskar, Amanda Sue
2005-01-01
The Centers for Disease Control and Prevention (CDC) is coordinating HELIX-Atlanta to provide information regarding the five-county Metropolitan Atlanta Area (Clayton, Cobb, DeKalb, Fulton, and Gwinnett) via a network of integrated environmental monitoring and public health data systems so that all sectors can take action to prevent and control environmentally related health effects. The HELIX-Atlanta Network is a tool to access interoperable information systems with optional information technology linkage functionality driven by scientific rationale. HELIX-Atlanta is a collaborative effort with local, state, federal, and academic partners, including the NASA Marshall Space Flight Center. The HELIX-Atlanta Partners identified the following initial focus areas: childhood lead poisoning, short-latency cancers, developmental disabilities, birth defects, vital records, respiratory health, age of housing, remote sensing data, and environmental monitoring. HELIX-Atlanta Partners identified and evaluated information systems containing information on the above focus areas. The information system evaluations resulted in recommendations for what resources would be needed to interoperate selected information systems in compliance with the CDC Public Health Information Network (PHIN). This presentation will discuss the collaborative process of building a network that links health and environment data for information exchange, including NASA remote sensing data, for use in HELIX-Atlanta.
PIML: the Pathogen Information Markup Language.
He, Yongqun; Vines, Richard R; Wattam, Alice R; Abramochkin, Georgiy V; Dickerman, Allan W; Eckart, J Dana; Sobral, Bruno W S
2005-01-01
A vast amount of information about human, animal and plant pathogens has been acquired, stored and displayed in varied formats through different resources, both electronically and otherwise. However, there is no community standard format for organizing this information or agreement on machine-readable format(s) for data exchange, thereby hampering interoperation efforts across information systems harboring such infectious disease data. The Pathogen Information Markup Language (PIML) is a free, open, XML-based format for representing pathogen information. XSLT-based visual presentations of valid PIML documents were developed and can be accessed through the PathInfo website or as part of the interoperable web services federation known as ToolBus/PathPort. Currently, detailed PIML documents are available for 21 pathogens deemed of high priority with regard to public health and national biological defense. A dynamic query system allows simple queries as well as comparisons among these pathogens. Continuing efforts are being made to include other groups supporting PIML and to develop more PIML documents. All PIML-related information is accessible from http://www.vbi.vt.edu/pathport/pathinfo/
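The sketch below builds a minimal PIML-like XML fragment with Python's ElementTree; the element names are invented for illustration and do not follow the published PIML schema.

```python
import xml.etree.ElementTree as ET

pathogen = ET.Element("pathogen", {"name": "Bacillus anthracis"})
ET.SubElement(pathogen, "classification").text = "bacterium"
ET.SubElement(pathogen, "hostRange").text = "human, cattle"
print(ET.tostring(pathogen, encoding="unicode"))
```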
Empowering open systems through cross-platform interoperability
NASA Astrophysics Data System (ADS)
Lyke, James C.
2014-06-01
Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.
Pape-Haugaard, Louise; Frank, Lars
2011-01-01
A major obstacle to ensuring ubiquitous information is the use of heterogeneous systems in eHealth. The objective of this paper is to illustrate how an architecture for distributed eHealth databases can be designed without sacrificing the characteristic features of traditional sustainable databases. The approach is first to explain traditional architectures for central and homogeneous distributed database computing, followed by a possible approach to using an architectural framework to obtain sustainability across disparate systems, i.e. heterogeneous databases, concluded with a discussion. It is seen that, through a method of using relaxed ACID properties on a service-oriented architecture, it is possible to achieve the data consistency which is essential when ensuring sustainable interoperability.
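One common way to realize relaxed ACID properties across autonomous services is compensating actions: each local update commits immediately, and a compensation undoes it if a later step fails, trading momentary global consistency for eventual consistency. The sketch below is illustrative only; the method names are invented.

```python
def transfer_record(source_db, target_db, record):
    source_db.mark_in_transfer(record)        # local update, committed at once
    try:
        target_db.insert(record)              # second autonomous system
    except Exception:
        source_db.unmark_in_transfer(record)  # compensation restores consistency
        raise
    source_db.mark_transferred(record)        # global state converges eventually
```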
Reference Architecture Model Enabling Standards Interoperability.
Blobel, Bernd
2017-01-01
Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, but also the individual context of all actors involved. So the system must properly reflect the environment in front of and around the computer as an essential and even defining part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.
A Model for Nationwide Patient Tracking
2009-09-01
Interoperability between Modules and Other Systems … Figure 18. Ideal System for EMS … Other agencies do not employ patient tracking at all. Overall, patient tracking needs to be redefined, so that agencies do not see it as a …
Future Interoperability of Camp Protection Systems (FICAPS)
NASA Astrophysics Data System (ADS)
Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann
2013-05-01
The FICAPS Project has been established as a project of the European Defence Agency, based on an initiative of Germany and France. The goal of this project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between Camp Protection Systems and their equipment from the different nations involved in multinational missions. These guidelines shall allow for: • Real-time information exchange between equipments and systems of different suppliers and nations (even via SatCom), • Quick and easy replacement of equipments (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering the operational and logistic costs and making the system highly available, • Enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the Human Machine Interface, HMI) without costly and time-consuming validation and test on system level (validation and test can be done on equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations, showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios, were performed. Demonstrations included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other force protection and ISR (Intelligence, Surveillance, Reconnaissance) tasks, not only due to its flexibility but also due to the chosen interfacing.
NASA Astrophysics Data System (ADS)
Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois
2017-04-01
Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations of disparate information that do not at first appear to be intuitively obvious: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and the interactions (choreographies) of all participants involved as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services, communicating via service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces the tight coupling at data level with a flexible dependency on loosely coupled services. The main component of the interoperability model is the comprehensive semantic description of the information, business logic and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge with the application of domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts, for example, would be interested in geological models or borehole measurements at a certain depth, based on which it is possible to correlate and verify seismic profiles. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It has to be emphasised that this approach is scalable to the greatest possible extent: all information necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or resources (service endpoints) the information can be retrieved from. The entire infrastructure communication is subject to a broker-based business logic integration platform where the information exchanged between involved participants is managed on the basis of standardised dictionaries, repositories, and registries.
This approach also enables the development of Systems-of-Systems (SoS), which allow autonomous, large-scale, concurrent, and distributed systems to collaborate while cooperatively interacting as a collective in a common environment.
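As a small illustration of the "resources referenced via vocabularies and knowledge bases" idea, the rdflib sketch below records a statement linking a dataset concept to a service endpoint it can be retrieved from; all URIs are hypothetical.

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/epos#")  # hypothetical vocabulary
g = Graph()
g.add((EX.SeismicProfileCatalog, RDF.type, EX.Dataset))
g.add((EX.SeismicProfileCatalog, EX.endpoint, Literal("http://example.org/wfs")))

# A client resolves the concept to an endpoint instead of hard-coding it.
for _, _, endpoint in g.triples((EX.SeismicProfileCatalog, EX.endpoint, None)):
    print(endpoint)
```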
Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos
2016-01-01
Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely, the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack such background information. PMID:27123451
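A toy illustration of rule-based terminology mediation of the kind described above: a lookup that rewrites a local EHR code into a research vocabulary code during extraction. The mapping table is invented and far simpler than the SALUS Semantic Resource Set.

```python
# Invented mapping table: (local system, code) -> (research system, code).
LOCAL_TO_RESEARCH = {
    ("ICD-10", "I10"): ("MedDRA", "10020772"),  # hypothetical pairing
}

def mediate(system, code):
    """Rewrite a local EHR code for the research side, if a rule exists."""
    return LOCAL_TO_RESEARCH.get((system, code), (system, code))

print(mediate("ICD-10", "I10"))
```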
Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli
2011-01-01
The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.
Using Ontologies to Formalize Services Specifications in Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann
2004-01-01
One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that captures the semantics of a domain and supports deriving meaningful information by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.
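As a minimal illustration of ontology-mediated agent communication, the rdflib sketch below asserts a tiny class hierarchy that two agents could share; the class and namespace names are invented.

```python
from rdflib import Graph, Namespace, RDF, RDFS, OWL

AG = Namespace("http://example.org/mas#")  # invented ontology namespace
g = Graph()
g.add((AG.BookFlight, RDF.type, OWL.Class))
g.add((AG.BookFlight, RDFS.subClassOf, AG.TravelService))

# An agent receiving a "BookFlight" request can check the shared ontology
# and route it as a TravelService without bilateral agreements.
print((AG.BookFlight, RDFS.subClassOf, AG.TravelService) in g)
```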
NASA Astrophysics Data System (ADS)
Glavev, Victor
2016-12-01
The types of software applications used by public administrations can be divided into three main groups: document management systems, record management systems and business process systems. Each one of them generates outputs that can be used as input data to the others. This is the main reason that exchange of data between these three groups requires well-defined models to be followed. There are also many other reasons, which will be discussed in the paper. Interoperability is a key aspect when those models are implemented, especially when there are different manufacturers of systems in the area of software applications used by public authorities. The report includes examples of implementation of models for exchange of data between software systems deployed in one of the biggest administrations in Bulgaria.
Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.
Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher
2017-08-01
Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.
A review on digital ECG formats and the relationships between them.
Trigo, Jesús Daniel; Alesanco, Alvaro; Martínez, Ignacio; García, José
2012-05-01
A plethora of digital ECG formats have been proposed and implemented. This heterogeneity hinders the design and development of interoperable systems and entails critical integration issues for healthcare information systems. This paper aims to provide a comprehensive overview of the current state of affairs of the interoperable exchange of digital ECG signals. This includes 1) a review of existing digital ECG formats, 2) a collection of applications and cardiology settings using such formats, 3) a compilation of the relationships between such formats, and 4) a reflection on the current situation and foreseeable future of the interoperable exchange of digital ECG signals. The objectives have been approached by completing and updating previous reviews on the topic through appropriate database mining. 39 digital ECG formats, 56 applications, tools or implementation experiences, 47 mappings/converters, and 6 relationships between such formats have been found in the literature. The creation and generalization of a single standardized ECG format is a desirable goal. However, this unification requires political commitment and international cooperation among different standardization bodies. Ongoing ontology-based approaches covering the ECG domain have recently emerged as a promising alternative for reaching fully fledged ECG interoperability in the near future.
Measuring interoperable EHR adoption and maturity: a Canadian example.
Gheorghiu, Bobby; Hagens, Simon
2016-01-25
An interoperable electronic health record is a secure consolidated record of an individual's health history and care, designed to facilitate authorized information sharing across the care continuum. Each Canadian province and territory has implemented such a system, and for all, measuring adoption is essential to understanding progress and optimizing use in order to realize intended benefits. About 250,000 health professionals (approximately half of Canada's anticipated potential physician, nurse, pharmacist, and administrative users) indicated that they electronically access data, such as those found in provincial/territorial lab or drug information systems, in 2015. Trends suggest further growth as maturity of use increases. There is strong interest in health information exchange through the iEHR in Canada, and continued growth in adoption is expected. Central to managing the evolution of digital health is access to robust data about who is using solutions, how they are used, where and when. Stakeholders such as government, program leads, and health system administrators must critically assess progress and achievement of benefits to inform future strategic and operational decisions.
Digital document imaging systems: An overview and guide
NASA Technical Reports Server (NTRS)
1990-01-01
This is an aid to NASA managers in planning the selection of a Digital Document Imaging System (DDIS) as a possible solution for document information processing and storage. Intended to serve as a manager's guide, this document contains basic information on digital imaging systems, technology, equipment standards, issues of interoperability and interconnectivity, and issues related to selecting appropriate imaging equipment based upon well defined needs.
Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I
2017-10-01
Stagnation in hospitals' adoption of data integration functionalities, coupled with a reduction in the number of operational health information exchanges, could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, out of the 27 variables, 15 achieved loading values greater than 0.548 at P < .001, and as such were validated as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, in that decreases in the hospitals' adoption of one would lead to decreases in the adoption of the others. These results show that capability for high-quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.
Standard Information Models for Representing Adverse Sensitivity Information in Clinical Documents.
Topaz, M; Seger, D L; Goss, F; Lai, K; Slight, S P; Lau, J J; Nandigam, H; Zhou, L
2016-01-01
Adverse sensitivity (e.g., allergy and intolerance) information is a critical component of any electronic health record system. While several standards exist for structured entry of adverse sensitivity information, many clinicians record this data as free text. This study aimed to 1) identify and compare existing common adverse sensitivity information models, and 2) evaluate the coverage of the adverse sensitivity information models for representing allergy information on a subset of inpatient and outpatient adverse sensitivity clinical notes. We compared four common adverse sensitivity information models: the Health Level 7 Allergy and Intolerance Domain Analysis Model, HL7-DAM; the Fast Healthcare Interoperability Resources, FHIR; the Consolidated Continuity of Care Document, C-CDA; and openEHR, and evaluated their coverage on a corpus of inpatient and outpatient notes (n = 120). We found that allergy specialists' notes had the highest frequency of adverse sensitivity attributes per note, whereas emergency department notes had the fewest attributes. Overall, the models had many similarities in the central attributes, which covered between 75% and 95% of the adverse sensitivity information contained within the notes. However, representations of some attributes (especially the value sets) were not well aligned between the models, which is likely to present an obstacle for achieving data interoperability. Also, adverse sensitivity exceptions were not well represented among the information models. Although we found that common adverse sensitivity models cover a significant portion of relevant information in the clinical notes, our results highlight areas needing to be reconciled between the standards for data interoperability.
Interoperation of an UHF RFID Reader and a TCP/IP Device via Wired and Wireless Links
Lee, Sang Hoon; Jin, Ik Soo
2011-01-01
A main application in radio frequency identification (RFID) sensor networks is the function that processes real-time tag information after gathering the required data from multiple RFID tags. The component technologies surrounding an RFID reader, called the interrogator, which has a tag chip, processors, a coupling antenna, and a power management system, have advanced significantly over the last decade. This paper presents a system implementation for interoperation between an UHF RFID reader and a TCP/IP device that is used as a gateway. The proposed system consists of an UHF RFID tag, an UHF RFID reader, an RF end-device, an RF coordinator, and a TCP/IP I/F. The UHF RFID reader, operating at 915 MHz, is compatible with EPC Class-0/Gen1, Class-1/Gen1 and 2, and ISO18000-6B. In particular, the UHF RFID reader can be combined with the RF end-device/coordinator for a ZigBee (IEEE 802.15.4) interface, which is a low-power wireless standard. The TCP/IP device communicates with the RFID reader via wired links. On the other hand, it is connected to the ZigBee end-device via wireless links. The web-based test results show that the developed system can remotely recognize information from multiple tags through the interoperation between the RFID reader and the TCP/IP device. PMID:22346665
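The gateway role described above can be sketched generically: a TCP/IP service that accepts a stream of tag reads and makes them available to networked clients. The framing (one EPC identifier per line) is invented for illustration and is not the paper's protocol.

```python
import socketserver

class TagRelayHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Assume one EPC tag identifier per line from the reader side.
        for line in self.rfile:
            epc = line.strip().decode("ascii", errors="replace")
            print(f"tag read: {epc}")

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 9000), TagRelayHandler) as srv:
        srv.serve_forever()
```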
System and methods of resource usage using an interoperable management framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.
A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework are standardized, such as the operational semantics, while other areas are left free of standards, necessitating choice and innovation to achieve a balance of flexibility and usability for interoperability in usage management systems.
An Interoperable, Agricultural Information System Based on Satellite Remote Sensing Data
NASA Technical Reports Server (NTRS)
Teng, William; Chiu, Long; Doraiswamy, Paul; Kempler, Steven; Liu, Zhong; Pham, Long; Rui, Hualan
2005-01-01
Monitoring global agricultural crop conditions during the growing season and estimating potential seasonal production are critically important for market development of U.S. agricultural products and for global food security. The Goddard Space Flight Center Earth Sciences Data and Information Services Center Distributed Active Archive Center (GES DISC DAAC) is developing an Agricultural Information System (AIS), evolved from an existing TRMM Online Visualization and Analysis System (TOVAS), which will operationally provide satellite remote sensing data products (e.g., rainfall) and services. The data products will include crop condition and yield prediction maps, generated from a crop growth model with satellite data inputs, in collaboration with the USDA Agricultural Research Service. The AIS will enable remote, interoperable access to distributed data, by using the GrADS-DODS Server (GDS) and by being compliant with Open GIS Consortium standards. Users will be able to download individual files, perform interactive online analysis, as well as receive operational data flows. AIS outputs will be integrated into existing operational decision support systems for global crop monitoring, such as those of the USDA Foreign Agricultural Service and the U.N. World Food Program.
Dixon, Brian E; Gamache, Roland E; Grannis, Shaun J
2013-05-01
To summarize the literature describing computer-based interventions aimed at improving bidirectional communication between clinical care and public health. A systematic review of English articles using MEDLINE and Google Scholar. Search terms included public health, epidemiology, electronic health records, decision support, expert systems, and decision-making. Only articles that described the communication of information regarding emerging health threats from public health agencies to clinicians or provider organizations were included. Each article was independently reviewed by two authors. Ten peer-reviewed articles highlight a nascent but promising area of research and practice related to alerting clinicians about emerging threats. The current literature suggests that additional research and development in bidirectional communication infrastructure should focus on defining a coherent architecture, improving interoperability, establishing clear governance, and creating usable systems that will effectively deliver targeted, specific information to clinicians in support of patient and population decision-making. Increasingly available clinical information systems make it possible to deliver timely, relevant knowledge to frontline clinicians in support of population health. Future work should focus on developing a flexible, interoperable infrastructure for bidirectional communications capable of integrating public health knowledge into clinical systems and workflows.
Big Data in the Earth Observing System Data and Information System
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Baynes, Katie; McInerney, Mark
2016-01-01
Approaches being pursued for the Earth Observing System Data and Information System (EOSDIS) to address the challenges of Big Data were presented to the NASA Big Data Task Force. Cloud prototypes are underway to tackle the volume challenge of Big Data. However, advances in computer hardware or cloud computing will not help (much) with variety. Rather, interoperability standards, conventions, and community engagement are the key to addressing variety.
Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J
2014-01-01
The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
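The Web-service pattern described above reduces integration to an HTTP exchange. Below is a hedged sketch of such a call: a BioC-style XML document POSTed to a hypothetical NER endpoint; the URL and the exact BioC fields are simplified assumptions, not the challenge's actual service contract.

```python
import requests

bioc_xml = """<?xml version="1.0"?>
<collection><document><id>d1</id>
<passage><offset>0</offset><text>Aspirin reduces fever.</text></passage>
</document></collection>"""

resp = requests.post(
    "http://example.org/ner/chemical",  # hypothetical service URL
    data=bioc_xml.encode("utf-8"),
    headers={"Content-Type": "application/xml"},
    timeout=60,
)
print(resp.status_code, resp.text[:200])
```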
ERIC Educational Resources Information Center
Stoltzfus, Kimberly Ann
2012-01-01
The problem of information sharing and coordination was made starkly evident by the September 11th attacks. September 11th illuminated the problems that justice agencies had in sharing information in a timely and collaborative nature without an interoperable data-sharing system. A number of government audits and justice agency leaders have sought…
The MADE Reference Information Model for Interoperable Pervasive Telemedicine Systems.
Fung, Nick L S; Jones, Valerie M; Hermens, Hermie J
2017-03-23
The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the MobiGuide project. To validate our RIM, we applied it to a clinical guideline for patients with gestational diabetes mellitus (GDM). The RIM is derived from a generic data flow model of disease management, which comprises a network of four types of concurrent processes: Monitoring (M), Analysis (A), Decision (D) and Effectuation (E). The resulting MADE RIM, which was specified using the formal Vienna Development Method (VDM), includes six main, high-level data types representing measurements, observations, abstractions, action plans, action instructions and control instructions. The authors applied the MADE RIM to the complete GDM guideline and derived from it a domain information model (DIM) comprising 61 archetypes, specifically 1 measurement, 8 observation, 10 abstraction, 18 action plan, 3 action instruction and 21 control instruction archetypes. It was observed that there are six generic patterns for transforming different guideline elements into MADE archetypes, although a direct mapping does not exist in some cases. The most notable examples are notifications to the patient and/or clinician as well as decision conditions which pertain to specific stages in the therapy. The results provide evidence that the MADE RIM is suitable for modelling clinical data in the design of pervasive telemedicine systems. Together with the other components of the MADE language, the MADE RIM supports the development of pervasive telemedicine systems that are interoperable and independent of particular clinical applications.
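One way to picture the six MADE data types is as plain record types; the dataclass sketch below is an illustrative simplification of the VDM-specified RIM, with invented field names.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Measurement:          # raw input handled by Monitoring
    code: str
    value: float
    time: datetime

@dataclass
class Observation:          # interpreted finding
    code: str
    present: bool
    time: datetime

@dataclass
class Abstraction:          # higher-level state over an interval
    code: str
    start: datetime
    end: datetime

@dataclass
class ActionPlan:           # Decision output: what should happen
    goal: str

@dataclass
class ActionInstruction:    # Effectuation: concrete step for patient/clinician
    directive: str

@dataclass
class ControlInstruction:   # reconfigures the MADE processes themselves
    target_process: str     # "M", "A", "D" or "E"
    command: str
```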
NASA Technical Reports Server (NTRS)
Maluf, David A.; Koga, Dennis (Technical Monitor)
2002-01-01
This presentation discusses NASA's proposed NETMARK knowledge management tool, which aims 'to control and interoperate with every block in a document, email, spreadsheet, power point, database, etc. across the lifecycle'. Topics covered include: system software and hardware requirements, seamless information systems, computer architecture issues, and potential benefits to NETMARK users.
Finet, Philippe; Gibaud, Bernard; Dameron, Olivier; Le Bouquin Jeannès, Régine
2016-03-01
The number of patients with complications associated with chronic diseases increases with the ageing population. In particular, complex chronic wounds raise the re-admission rate in hospitals. In this context, the implementation of a telemedicine application in Basse-Normandie, France, helps reduce hospital stays and transport. This application requires a new collaboration among general practitioners, private duty nurses and hospital staff. However, the main constraint mentioned by the users of this system is the lack of interoperability between the information system of this application and the various partners' information systems. To improve medical data exchange, the authors propose a new implementation based on interoperable clinical documents and a digital document repository that manages document sharing among the telemedicine application's users. They then show that this technical solution is suitable for any telemedicine application and any document sharing system in a healthcare facility or network.
Economic Perspective on Cloud Computing: Three Essays
ERIC Educational Resources Information Center
Dutt, Abhijit
2013-01-01
Improvements in Information Technology (IT) infrastructure and standardization of interoperability standards among heterogeneous Information System (IS) applications have brought a paradigm shift in the way an IS application can be used and delivered. Not only can an IS application be built from standardized components, but parts of it can…
Meaningful use of health information technology and declines in in-hospital adverse drug events.
Furukawa, Michael F; Spector, William D; Rhona Limcangco, M; Encinosa, William E
2017-07-01
Nationwide initiatives have promoted greater adoption of health information technology as a means to reduce adverse drug events (ADEs). Hospital adoption of electronic health records with Meaningful Use (MU) capabilities expected to improve medication safety has grown rapidly. However, evidence that MU capabilities are associated with declines in in-hospital ADEs is lacking. Data came from the 2010-2013 Medicare Patient Safety Monitoring System and the 2008-2013 Healthcare Information and Management Systems Society (HIMSS) Analytics Database. Two-level random intercept logistic regression was used to estimate the association of MU capabilities and occurrence of ADEs, adjusting for patient characteristics, hospital characteristics, and year of observation. Rates of in-hospital ADEs declined by 19% from 2010 to 2013. Adoption of MU capabilities was associated with 11% lower odds of an ADE (95% confidence interval [CI], 0.84-0.96). Interoperability capability was associated with 19% lower odds of an ADE (95% CI, 0.67-0.98). Adoption of MU capabilities explained 22% of the observed reduction in ADEs, or about 67,000 ADEs averted by MU. Concurrent with the rapid uptake of MU and interoperability, occurrence of in-hospital ADEs declined significantly from 2010 to 2013. MU capabilities and interoperability were associated with lower occurrence of ADEs, but the effects did not vary by experience with MU. About one-fifth of the decline in ADEs from 2010 to 2013 was attributable to MU capabilities. Findings support the contention that adoption of MU capabilities and interoperability spurred by the Health Information Technology for Economic and Clinical Health Act contributed in part to the recent decline in ADEs. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the United States.
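As a quick check on how the reported effect sizes translate into "percent lower odds", the sketch below works through the arithmetic. The point estimate of 0.89 is implied by the reported "11% lower odds"; the CI bounds are the ones given in the abstract.

```python
# Mapping an adjusted odds ratio (AOR) to "percent lower odds", using the
# values reported in the abstract (MU capabilities: 11% lower odds,
# 95% CI 0.84-0.96, implying an AOR point estimate of about 0.89).
import math

aor = 0.89                      # implied point estimate for MU capabilities
ci_low, ci_high = 0.84, 0.96    # reported 95% confidence interval

pct_lower = (1 - aor) * 100
print(f"{pct_lower:.0f}% lower odds of an ADE")        # -> 11% lower odds

# The same interval on the log-odds (regression coefficient) scale:
beta, b_low, b_high = (math.log(x) for x in (aor, ci_low, ci_high))
print(f"beta = {beta:.3f}, 95% CI [{b_low:.3f}, {b_high:.3f}]")
```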
Security and Privacy in a DACS.
Delgado, Jaime; Llorente, Silvia; Pàmies, Martí; Vilalta, Josep
2016-01-01
The management of electronic health records (EHR) in general, and clinical documents in particular, is becoming a key issue in the daily work of Healthcare Organizations (HO). The need to provide secure and private access to, and storage for, clinical documents, together with the need for HOs to interoperate, raises a number of issues that are difficult to solve. Many systems are in place to manage EHRs and documents. Some of these Healthcare Information Systems (HIS) follow standards in their document structure and communications protocols, but many do not. In fact, they are mostly proprietary and do not interoperate. Our proposal to solve the current situation is the use of a DACS (Document Archiving and Communication System) to provide security, privacy and standardized access to clinical documents.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.
The objectives of this Congressional hearing on high definition information systems were: (1) to receive testimony on standards for systems that permit interoperability between the computer, communications, and broadcasting industries; (2) to examine the implications of the Grand Alliance, an agreement by high definition television (HDTV)…
Information Systems Security Management: A Review and a Classification of the ISO Standards
NASA Astrophysics Data System (ADS)
Tsohou, Aggeliki; Kokolakis, Spyros; Lambrinoudakis, Costas; Gritzalis, Stefanos
The need for common understanding and agreement on functional and non-functional requirements is well known and understood by information system designers. This is necessary both for designing the "correct" system and for achieving interoperability with other systems. Security is perhaps the best example of this need. If the involved parties do not share the same understanding of the security requirements, and the implemented security mechanisms do not comply with globally accepted rules and practices, then the resulting system will not necessarily achieve the desired security level, and it will be very difficult for it to interoperate securely with other systems. The role and contribution of international standards to the design and implementation of security mechanisms is therefore dominant. In this paper we provide a state-of-the-art review of information security management standards published by the International Organization for Standardization and the International Electrotechnical Commission. Such an analysis is meaningful to security practitioners for efficient management of information security. Moreover, the classification of the standards into the clauses of ISO/IEC 27001:2005 that results from our analysis is expected to provide assistance in dealing with the plethora of security standards.
Information Management Challenges in Achieving Coalition Interoperability
2001-12-01
…by J. Dyer. Session I, Architectures and Standards: Fundamental Issues (Chairman: Dr. I. White, UK), includes "Planning for Interoperability" by W.M. Gentleman, addressing a framework regarded as a crucial step toward achieving coalition C4I interoperability. Topics to be covered include: 1) maintaining secure interoperability; 2) command system interfaces…
42 CFR 495.330 - Termination of FFP for failure to provide access to information.
Code of Federal Regulations, 2010 CFR
2010-10-01
... RECORD TECHNOLOGY INCENTIVE PROGRAM Requirements Specific to the Medicaid Program § 495.330 Termination... HIT planning and implementation efforts, and the systems used to interoperate with electronic HIT...
Incorporating Brokers within Collaboration Environments
NASA Astrophysics Data System (ADS)
Rajasekar, A.; Moore, R.; de Torcy, A.
2013-12-01
A collaboration environment, such as the integrated Rule Oriented Data System (iRODS - http://irods.diceresearch.org), provides interoperability mechanisms for accessing storage systems, authentication systems, messaging systems, information catalogs, networks, and policy engines from a wide variety of clients. The interoperability mechanisms function as brokers, translating actions requested by clients into the protocol required by a specific technology. The iRODS data grid is used to enable collaborative research within the hydrology, seismology, earth science, climate, oceanography, plant biology, astronomy, physics, and genomics disciplines. Although each domain has unique resources, data formats, semantics, and protocols, the iRODS system provides a generic framework that is capable of managing collaborative research initiatives that span multiple disciplines. Each interoperability mechanism (broker) is linked to a name space that enables unified access across the heterogeneous systems. The collaboration environment provides not only support for brokers, but also support for virtualization of name spaces for users, files, collections, storage systems, metadata, and policies. The broker enables access to data or information in a remote system using the appropriate protocol, while the collaboration environment provides a uniform naming convention for accessing and manipulating each object. Within the NSF DataNet Federation Consortium project (http://www.datafed.org), three basic types of interoperability mechanisms have been identified and applied: 1) drivers for managing manipulation at the remote resource (such as data subsetting), 2) micro-services that execute the protocol required by the remote resource, and 3) policies for controlling the execution. For example, drivers have been written for manipulating NetCDF and HDF formatted files within THREDDS servers. Micro-services have been written that manage interactions with the CUAHSI data repository, the DataONE information catalog, and the GeoBrain broker. Policies have been written that manage the transfer of messages between an iRODS message queue and the Advanced Message Queuing Protocol. Examples of these brokering mechanisms will be presented. The DFC collaboration environment serves as the intermediary between community resources and compute grids, enabling reproducible data-driven research. It is possible to create an analysis workflow that retrieves data subsets from a remote server, assembles the required input files, automates the execution of the workflow, automatically tracks the provenance of the workflow, and shares the input files, workflow, and output files. A collaborator can re-execute a shared workflow, compare results, change input files, and re-execute an analysis.
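The three mechanism types lend themselves to a compact sketch. The following is not the actual iRODS API; all class and function names are hypothetical. It wires together a driver, a micro-service, and a policy engine, using the abstract's iRODS-to-AMQP message bridge as the example event.

```python
# Generic sketch of the three brokering mechanism types (names hypothetical):
# drivers manipulate data at a remote resource, micro-services speak a remote
# protocol, and policies control which micro-service handles which event.
from typing import Callable, Dict

class Driver:
    """Performs an operation (e.g., data subsetting) at the remote resource."""
    def subset(self, object_path: str, query: str) -> bytes:
        raise NotImplementedError   # e.g., a NetCDF subsetter for THREDDS

class MicroService:
    """Executes the protocol required by one remote system."""
    def __init__(self, name: str, handler: Callable[..., object]):
        self.name, self.handler = name, handler

class PolicyEngine:
    """Maps events to the micro-service allowed to handle them."""
    def __init__(self):
        self.rules: Dict[str, MicroService] = {}
    def on(self, event: str, service: MicroService):
        self.rules[event] = service
    def dispatch(self, event: str, **kwargs):
        return self.rules[event].handler(**kwargs)

# Example policy: forward a queued message to an AMQP broker.
def forward_to_amqp(message: str):
    print(f"AMQP publish: {message}")   # stand-in for a real AMQP client call

engine = PolicyEngine()
engine.on("message.enqueued", MicroService("amqp-bridge", forward_to_amqp))
engine.dispatch("message.enqueued", message="dataset replicated")
```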
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard and interoperable services improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards, and achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications include the Web Coverage Service (WCS), which serves the data themselves; the Web Map Service (WMS), which serves pictures of data; the Web Map Tile Service (WMTS), which serves pictures of data tiles; and Styled Layer Descriptors (SLD), which provide rendered styles.
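A WMS GetMap request is the simplest of these interfaces to exercise. Below is a minimal sketch: the service URL and layer name are hypothetical placeholders, while the query parameter names themselves (SERVICE, REQUEST, LAYERS, ...) come from the OGC WMS 1.3.0 specification.

```python
# Minimal WMS 1.3.0 GetMap request; endpoint and layer are hypothetical.
import requests

WMS_URL = "https://example.gov/wms"     # hypothetical GES DISC-style endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "precipitation_daily",    # hypothetical layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",          # WMS 1.3.0 axis order for EPSG:4326
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TRANSPARENT": "TRUE",
}

resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("map.png", "wb") as f:
    f.write(resp.content)               # a rendered "picture of the data"
```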
Kobayashi, Shinji; Kume, Naoto; Yoshihara, Hiroyuki
2015-01-01
In 2001, we developed an EHR system for regional healthcare information exchange and for providing individual patient data to patients. This system was adopted in three regions in Japan. We also developed a Medical Markup Language (MML) standard for inter- and intra-hospital communications. The system was built on a legacy platform, however, and had not been appropriately maintained or updated to meet clinical requirements. To reduce future maintenance costs, we reconstructed the EHR system using archetype technology on the Ruby on Rails platform, and generated MML-equivalent forms from archetypes. The system was deployed as a cloud-based system for preliminary use as a regional EHR. The system can now keep pace with new requirements while maintaining semantic interoperability through archetype technology. It is also more flexible than the legacy EHR system.
Galarraga, M; Serrano, L; Martinez, I; de Toledo, P; Reynolds, Melvin
2007-01-01
Advances in Information and Communication Technologies (ICT) are bringing new opportunities and use cases in the field of systems and Personal Health Devices used for the telemonitoring of citizens in home or mobile scenarios. At a time of such challenges, this review arises from the need to identify robust technical telemonitoring solutions that are both open and interoperable. These systems demand standardized solutions to be cost effective and to take advantage of standardized operation and interoperability. Thus, the fundamental challenge is to design plug-&-play devices that, either as individual elements or as components, can be incorporated in a simple way into different Telecare systems, perhaps configuring a personal user network. Moreover, there is increasing market pressure from companies not traditionally involved in medical markets, asking for a standard for Personal Health Devices in anticipation of a vast demand for telemonitoring, wellness, Ambient Assisted Living (AAL) and e-health applications. However, the newly emerging situations imply very strict requirements for the protocols involved in the communication. The ISO/IEEE 11073 family of standards is adapting in order to face this challenge and appears to be the best-positioned set of international standards to reach this goal. This work presents an updated survey of these standards, tracking the changes being made, and aims to serve as a starting point for those who want to familiarize themselves with them.
ERIC Educational Resources Information Center
Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.
Interoperability and data-exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through Web-enabled technologies. Programs such as the NASA's Scientific and Technical Information (STI)…
A semantically-aided architecture for a web-based monitoring system for carotid atherosclerosis.
Kolias, Vassileios D; Stamou, Giorgos; Golemati, Spyretta; Stoitsis, Giannis; Gkekas, Christos D; Liapis, Christos D; Nikita, Konstantina S
2015-08-01
Carotid atherosclerosis is a multifactorial disease and its clinical diagnosis depends on the evaluation of heterogeneous clinical data, such as imaging exams, biochemical tests and the patient's clinical history. The lack of interoperability between Health Information Systems (HIS) does not allow physicians to acquire all the necessary data for the diagnostic process. In this paper, a semantically-aided architecture is proposed for a web-based monitoring system for carotid atherosclerosis that is able to gather and unify heterogeneous data with the use of an ontology and to create a common interface for data access, enhancing the interoperability of HIS. The architecture is based on an application ontology of carotid atherosclerosis that is used to (a) integrate heterogeneous data sources on the basis of semantic representation and ontological reasoning and (b) access the critical information using SPARQL query rewriting and ontology-based data access services. The architecture was tested over a carotid atherosclerosis dataset consisting of the imaging exams and the clinical profiles of 233 patients, using a set of complex queries constructed by the physicians. The proposed architecture was evaluated with respect to the complexity of the queries that the physicians could make and the retrieval speed. The proposed architecture gave promising results in terms of interoperability, data integration of heterogeneous sources in an ontology-based way, and expanded capabilities of query and retrieval in HIS.
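To illustrate the ontology-based data access style the paper describes, here is a small sketch of a SPARQL query run with rdflib. The namespace and property names (hasExam, stenosisDegree) are hypothetical stand-ins, not the paper's actual carotid atherosclerosis ontology, and the ontology file is assumed to exist locally.

```python
# Sketch of ontology-based data access via SPARQL with rdflib; the ontology
# IRI and property names are hypothetical illustrations.
from rdflib import Graph

g = Graph()
g.parse("carotid_ontology.owl", format="xml")   # hypothetical ontology + data

query = """
PREFIX ca: <http://example.org/carotid#>
SELECT ?patient ?degree
WHERE {
    ?patient ca:hasExam ?exam .
    ?exam    ca:stenosisDegree ?degree .
    FILTER (?degree > 70)                # e.g., severe stenosis
}
"""

for patient, degree in g.query(query):
    print(patient, degree)
```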
Beštek, Mate; Stanimirović, Dalibor
2017-08-09
The main aims of the paper are to characterize and examine potential approaches to interoperability. These include openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, the possibilities for their incorporation into the eHealth environment, and the identification of the main success factors in the field that are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper presents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded in information retrieval, with a special focus on research and charting of existing experience in the field and of sources, both electronic and written, that cover interoperability concepts and related implementation issues. The paper addresses the following complementary inquiries: 1. scrutiny of the potential approaches that could alleviate the pertinent interoperability issues in the Slovenian eHealth context; 2. analysis of the possibilities (requirements) for their inclusion in the construction of individual eHealth solutions; 3. identification and charting of the main success factors in the interoperability field that critically influence the efficient development and implementation of eHealth projects. The insights provided and the success factors identified could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable an eHealth-based transformation of the health system, especially in countries that are still in an early phase of eHealth planning and development and are often confronted with differing interests, requirements, and contending strategies.
Lemnos interoperable security project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halbgewachs, Ronald D.
2010-03-01
With the Lemnos framework, interoperability of control security equipment is straightforward. To obtain interoperability between proprietary security appliance units, one or both vendors must now write cumbersome 'translation code.' If one party changes something, the translation code 'breaks.' The Lemnos project is developing and testing a framework that uses widely available security functions and protocols like IPsec - to form a secure communications channel - and Syslog, to exchange security log messages. Using this model, security appliances from two or more different vendors can clearly and securely exchange information, helping to better protect the total system. Simplify regulatory compliance in a complicated security environment by leveraging the Lemnos framework. As an electric utility, are you struggling to implement the NERC CIP standards and other regulations? Are you weighing the misery of multiple management interfaces against committing to a ubiquitous single-vendor solution? When vendors build their security appliances to interoperate using the Lemnos framework, it becomes practical to match best-of-breed offerings from an assortment of vendors to your specific control systems needs. The Lemnos project is developing and testing a framework that uses widely available open-source security functions and protocols like IPsec and Syslog to create a secure communications channel between appliances in order to exchange security data.
Samal, Lipika; Dykes, Patricia C; Greenberg, Jeffrey O; Hasan, Omar; Venkatesh, Arjun K; Volk, Lynn A; Bates, David W
2016-04-22
Health information technology (HIT) could improve care coordination by providing clinicians with remote access to information, improving legibility, and allowing asynchronous communication, among other mechanisms. We sought to determine, from a clinician perspective, how care is coordinated and to what extent HIT is involved when transitioning patients between emergency departments, acute care hospitals, skilled nursing facilities, and home health agencies in settings across the United States. We performed a qualitative study with clinicians and information technology professionals from six regions of the U.S. chosen as national leaders in HIT. We analyzed data through a two-person consensus approach, assigning responses to each of nine care coordination activities. We also conducted a literature review of MEDLINE®, CINAHL®, and Embase, analyzing the results of studies that examined interventions to improve information transfer during transitions of care. We enrolled 29 respondents from 17 organizations and conducted six focus groups. Respondents reported how HIT is currently used for care coordination activities. HIT is currently used to monitor patients and to align systems-level resources with population needs. However, we identified multiple areas where the lack of interoperability leads to inefficient processes and missing data. Additionally, the literature review identified ten intervention studies that address information transfer, seven of which employed HIT and three of which utilized other communication methods such as telephone calls, faxed records, and nurse case management. Significant care coordination gaps exist due to the lack of interoperability across the United States. We must design, evaluate, and incentivize the use of HIT for care coordination. We should focus on the domains where we found the largest gaps: information transfer, systems to monitor patients, tools to support patients' self-management goals, and tools to link patients and their caregivers with community resources.
A distributed framework for health information exchange using smartphone technologies.
Abdulnabi, Mohamed; Al-Haiqi, Ahmed; Kiah, M L M; Zaidan, A A; Zaidan, B B; Hussain, Muzammil
2017-05-01
Nationwide health information exchange (NHIE) continues to be a persistent concern for government agencies, despite the many efforts and the conceived benefits of sharing patient data among healthcare providers. Difficulties in ensuring global connectivity and interoperability, and concerns about security, have always hampered the government from successfully deploying NHIE. By looking at NHIE from a fresh perspective and bearing in mind the pervasiveness and power of modern mobile platforms, this paper proposes a new approach to NHIE that builds on the notion of consumer-mediated HIE, albeit without the focus on central health record banks. With the growing acceptance of smartphones as reliable, indispensable, and most personal devices, we suggest taking the concept of mobile personal health records (PHRs installed on smartphones) to the next level. We envision mPHRs that take the form of distributed storage units for health information, under the full control and direct possession of patients, who can have ready access to their personal data whenever needed. However, for the actual exchange of data with health information systems managed by healthcare providers, the latter have to be interoperable with patient-carried mPHRs. The computer industry long ago solved a similar problem of interoperability between peripheral devices and operating systems. We borrow from that solution the idea of providing special interfaces between mPHRs and provider systems. This interface enables the two entities to communicate with no change to either end. The design and operation of the proposed approach are explained. Additional pointers on potential implementations are provided, and issues that pertain to any solution for implementing NHIE are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
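The device-driver analogy can be sketched as a thin adapter interface: provider code is written against a generic mPHR contract, and a per-vendor "driver" translates it, so neither side changes. All class and method names below are hypothetical illustrations of the idea, not the paper's design.

```python
# Sketch of the device-driver analogy for mPHR/provider interoperability;
# all names are hypothetical.
from abc import ABC, abstractmethod

class MPHRInterface(ABC):
    """What a provider system needs from any mPHR, regardless of vendor."""
    @abstractmethod
    def read_records(self, patient_id: str) -> list: ...
    @abstractmethod
    def write_record(self, patient_id: str, record: dict) -> None: ...

class VendorXAdapter(MPHRInterface):
    """'Driver' translating the generic interface to one vendor's phone app."""
    def read_records(self, patient_id: str) -> list:
        return [{"type": "allergy", "value": "penicillin"}]   # stub transport
    def write_record(self, patient_id: str, record: dict) -> None:
        print(f"pushed to phone of {patient_id}: {record}")

def import_into_his(phr: MPHRInterface, patient_id: str):
    # Provider-side code depends only on the interface, never on the vendor.
    for record in phr.read_records(patient_id):
        print("HIS ingest:", record)

import_into_his(VendorXAdapter(), "patient-001")
```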
A SOA-Based Platform to Support Clinical Data Sharing
Gazzarata, R; Giannini, B; Giacomini, M
2017-01-01
The eSource Data Interchange Group, part of the Clinical Data Interchange Standards Consortium, proposed five scenarios to guide stakeholders in the development of solutions for the capture of eSource data. The fifth scenario was subdivided into four tiers to adapt the functionality of electronic health records to support clinical research. In order to develop a system belonging to the “Interoperable” Tier, the authors decided to adopt the service-oriented architecture paradigm to support technical interoperability, Health Level Seven Version 3 messages combined with the LOINC (Logical Observation Identifiers Names and Codes) vocabulary to ensure semantic interoperability, and Healthcare Services Specification Project standards to provide process interoperability. The developed architecture enhances the integration between patient-care practice and medical research, allowing clinical data sharing between two hospital information systems and four clinical data management systems/clinical registries. The core is formed by a set of standardized cloud services connected through standardized interfaces, involving client applications. The system was approved by the medical staff, since it reduces the workload for the management of clinical trials. Although this architecture can realize the “Interoperable” Tier, the current solution actually covers the “Connected” Tier, due to local hospital policy restrictions. © 2017 R. Gazzarata et al. PMID:29065576
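The semantic layer the platform relies on pairs a message structure with a controlled vocabulary. The sketch below is a simplified illustration of a LOINC-coded observation wrapped for exchange; it is not a schema-valid HL7 Version 3 message, and the helper name is hypothetical. LOINC 2345-7 (serum/plasma glucose) and the LOINC code-system OID are real values.

```python
# Schematic LOINC-coded observation; simplified, not schema-valid HL7 v3.
import xml.etree.ElementTree as ET

def loinc_observation(code: str, display: str, value: str, unit: str) -> bytes:
    obs = ET.Element("observation", classCode="OBS", moodCode="EVN")
    ET.SubElement(obs, "code",
                  code=code, displayName=display,
                  codeSystem="2.16.840.1.113883.6.1")   # OID identifying LOINC
    ET.SubElement(obs, "value", value=value, unit=unit)
    return ET.tostring(obs, encoding="utf-8")

# LOINC 2345-7 = Glucose [Mass/volume] in Serum or Plasma
print(loinc_observation("2345-7", "Glucose SerPl-mCnc", "98", "mg/dL").decode())
```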
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.
2006-01-01
To date, very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering for the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation Initiative is imperative. The number of corporations, developer personnel, system interfaces, and people interfaces will require standardization and registries on a scale not currently envisioned. It would take only one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is a registry a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we identify and define a registry process that is compatible with the Constellation Initiative and with other, unrelated space activities and organizations. We then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies comprises those currently in need of expansion, namely the assignment of satellite designations and the process that controls assignments. Second, we analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we analyze the current CCSDS working group and birds-of-a-feather activities to ascertain registry requirements. Lastly, we identify technologies that are either currently under the auspices of another
D-ATM, a working example of health care interoperability: From dirt path to gravel road.
DeClaris, John-William
2009-01-01
For many years, there have been calls for interoperability within health care systems. The technology currently exists and is already used in business areas such as banking and commerce. Yet the question remains: why has interoperability not been achieved in health care? This paper examines issues encountered and successes achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government-funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry, and to show how government can help.
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and authenticated by the Ministry of Information Industry, the Ministry of Science and Technology and the Ministry of Construction, P.R. China, in 2002. After five years of construction, the informatization level of Dongying has reached an advanced degree. To advance the Digital Dongying initiative and to realize geospatial data sharing, geographic information sharing standards were drawn up and put into practice. Secondly, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed; it is a highly integrated platform combining WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM and related technologies. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of Digital Dongying geospatial data have come into practice and good results have been obtained. However, a strong leadership group is necessary for data sharing and interoperation.
Self-Assembling Texts & Courses of Study.
ERIC Educational Resources Information Center
Gibson, David
This paper describes the development of an interoperable meta-database system--a system of applications using metadata--that is intended to facilitate learner-centered collaboration, access to learning resources, and the fitness of channels of information to the emerging needs of learners at both individual and group levels. Highlights include:…
Why Digital Data Collections Are Important
ERIC Educational Resources Information Center
Mitchell, Erik T.
2012-01-01
The silo is a well-worn metaphor in information systems used to illustrate separateness, isolation, and lack of connectivity. Through the many iterations of system development, libraries, archives, and museums (LAMs) have sought to avoid silos and find the sweet spot between interface design and metadata interoperability. This effort is being…
Enabling the Interoperability of Large-Scale Legacy Systems
2008-01-01
…information retrieval systems (Salton and McGill 1983). We use this method because, in the schema mapping task, only one instance per class is… References include: …(2001). A survey of approaches to automatic schema matching. The VLDB Journal, 10, 334-350; and Salton, G., & McGill, M.J. (1983). Introduction to…
An open, interoperable, and scalable prehospital information technology network architecture.
Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy
2011-01-01
Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the difficulty of correlating field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and its integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.
Creating and Sharing Understanding: GEOSS and ArcGIS Online
NASA Astrophysics Data System (ADS)
White, C. E.; Hogeweg, M.; Foust, J.
2014-12-01
The GEOSS program brokers various forms of Earth observation data and information via its online platform, the Discovery and Access Broker (DAB). The platform connects relevant information systems and infrastructures throughout the world. Esri and the National Research Council of Italy Institute of Atmospheric Pollution Research (CNR-IIA) are building two-way technology between the DAB framework and ArcGIS Online using the ArcGIS Online API. Developers will engineer Esri and DAB interfaces and build interoperable web services that connect the two systems. This collaboration makes GEOSS earth observation data and services available to the ArcGIS Online community, and makes ArcGIS Online a significant part of the GEOSS DAB infrastructure. ArcGIS Online subscribers can discover and access the resources published by GEOSS, use GEOSS data services, and build applications. Making GEOSS content available in ArcGIS Online increases opportunities for scientists in other communities to visualize information in greater context. Moreover, because the platform supports both authoritative and crowd-sourced information, GEOSS members can build networks into other disciplines. This talk will discuss the power of interoperable service architectures that make such a collaboration possible, and the results thus far.
Standard-compliant real-time transmission of ECGs: harmonization of ISO/IEEE 11073-PHD and SCP-ECG.
Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José
2009-01-01
Ambient assisted living and integrated care in an aging society are based on the vision of a lifelong Electronic Health Record, which calls for interoperability of healthcare information systems and medical devices. For medical devices this aim can be achieved by the consistent implementation of harmonized international interoperability standards. The ISO/IEEE 11073 (x73) family of standards is a reference standard for medical device interoperability. Its Personal Health Device (PHD) version covers several devices, but an ECG device specialization is not yet available. On the other hand, the SCP-ECG standard for short-term diagnostic ECGs (EN1064) has recently been approved as the international standard ISO/IEEE 11073-91064:2009. In this paper, the relationships between a proposed x73-PHD model for an ECG device and the fields of the SCP-ECG standard are investigated. A proof-of-concept implementation of the proposed x73-PHD ECG model is also presented, identifying open issues to be addressed by standards development for wider adoption of the x73-PHD standards.
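The kind of field-level correspondence the paper investigates can be sketched as a simple mapping table. The attribute and section names below are hypothetical examples chosen for illustration; they are not the actual mapping tables from the paper or from either standard.

```python
# Illustrative mapping sketch between hypothetical x73-PHD ECG attributes and
# hypothetical SCP-ECG fields; not the standards' real tables.
X73_TO_SCP = {
    "patient-demographics":  "Section1.PatientData",
    "ecg-waveform-samples":  "Section6.RhythmData",
    "sample-rate":           "Section6.SampleTimeInterval",
    "lead-configuration":    "Section3.LeadDefinition",
}

def map_x73_record(x73_record: dict) -> dict:
    """Re-key an x73-style record into SCP-ECG-style field names."""
    return {X73_TO_SCP[k]: v for k, v in x73_record.items() if k in X73_TO_SCP}

scp = map_x73_record({"sample-rate": 500, "lead-configuration": "12-lead"})
print(scp)
```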
System architecture of communication infrastructures for PPDR organisations
NASA Astrophysics Data System (ADS)
Müller, Wilmuth
2017-04-01
The growing number of events affecting public safety and security (PS and S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the organizations responsible for PS and S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organizations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies do not provide broadband capability, which is a major limitation in supporting new services and hence new information flows, and they currently have no successor. There is also no known standard that addresses the interoperability of these technologies. The paper at hand provides an approach to tackle the above-mentioned aspects by defining an Enterprise Architecture (EA) for PPDR organizations and a system architecture for next-generation PPDR communication networks supporting a variety of applications and services on broadband networks, including the ability to conduct inter-system, inter-agency and cross-border operations. The Open Safety and Security Architecture Framework (OSSAF) provides a framework and approach to coordinate the perspectives of different types of stakeholders within a PS and S organization. It aims at bridging the silos in the chain of command and leveraging interoperability between PPDR organizations. The framework incorporates concepts from several mature enterprise architecture frameworks, including the NATO Architecture Framework (NAF). However, OSSAF does not provide details on how NAF should be used for describing the OSSAF perspectives and views. In this contribution, a mapping of NAF elements to OSSAF views is provided. Based on this mapping, an EA for PPDR organizations with a focus on communication infrastructure related capabilities is presented. Following the capability modeling, a system architecture for secure and interoperable communication infrastructures for PPDR organizations is presented. This architecture was implemented within a project sponsored by the European Union and successfully demonstrated in a live validation exercise in June 2016.
A Semantic Web-based System for Managing Clinical Archetypes.
Fernandez-Breis, Jesualdo Tomas; Menarguez-Tortosa, Marcos; Martinez-Costa, Catalina; Fernandez-Breis, Eneko; Herrero-Sempere, Jose; Moner, David; Sanchez, Jesus; Valencia-Garcia, Rafael; Robles, Montserrat
2008-01-01
Archetypes facilitate the sharing of clinical knowledge and are therefore a basic tool for achieving interoperability between healthcare information systems. In this paper, a Semantic Web system for managing archetypes is presented. This system allows for the semantic annotation of archetypes, as well as for performing semantic searches. The current system is capable of working with both ISO 13606 and openEHR archetypes.
Integrating hospital information systems in healthcare institutions: a mediation architecture.
El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian
2012-10-01
Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most of the previous studies have dealt with only one or two of these factors, and this makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to ensure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with 3 levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secure exchange of healthcare data. The notion of a medical ontology is introduced to solve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.
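The Mediator/Adapter pattern the paper names can be sketched compactly: per-system Adapters translate local records to a shared XML form, and a Mediator routes messages between registered endpoints. The class names follow the paper's terminology, but the message format and method names below are simplified assumptions for illustration.

```python
# Sketch of the mediation pattern (Mediator + Adapter, XML exchange format);
# message structure and method names are simplified assumptions.
import xml.etree.ElementTree as ET

class Adapter:
    """Translates one local HIS format to the common XML exchange form."""
    def to_common_xml(self, record: dict) -> ET.Element:
        msg = ET.Element("clinicalMessage")
        for key, value in record.items():
            ET.SubElement(msg, key).text = str(value)
        return msg

class Mediator:
    """Routes common-format messages between registered HIS endpoints."""
    def __init__(self):
        self.endpoints = {}
    def register(self, name: str, adapter: Adapter, receive):
        self.endpoints[name] = (adapter, receive)
    def send(self, source: str, target: str, record: dict):
        adapter, _ = self.endpoints[source]
        _, receive = self.endpoints[target]
        receive(ET.tostring(adapter.to_common_xml(record)))

mediator = Mediator()
mediator.register("lab-his", Adapter(), receive=lambda xml: None)
mediator.register("ward-his", Adapter(), receive=lambda xml: print(xml.decode()))
mediator.send("lab-his", "ward-his", {"patientId": "123", "test": "CBC"})
```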
Requirements and Solutions for Personalized Health Systems.
Blobel, Bernd; Ruotsalainen, Pekka; Lopez, Diego M; Oemig, Frank
2017-01-01
Organizational, methodological and technological paradigm changes enable a precise, personalized, predictive, preventive and participative approach to health and social services, supported by multiple actors from different domains with diverse levels of knowledge and skills. Interoperability has to advance beyond Information and Communication Technologies (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The paper introduces and compares personalized health definitions, summarizes requirements and principles for pHealth systems, and considers intelligent interoperability. It addresses knowledge representation and harmonization, decision intelligence, and usability as crucial issues in pHealth. On this basis, a system-theoretical, ontology-based, policy-driven reference architecture model for open and intelligent pHealth ecosystems, and its transformation into an appropriate ICT design and implementation, is proposed.
75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010... directs the development of a framework to achieve interoperability of smart grid devices and systems...
Examining the Relationship between Electronic Health Record Interoperability and Quality Management
ERIC Educational Resources Information Center
Purcell, Bernice M.
2013-01-01
A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…
Dixon, Brian E; Gamache, Roland E; Grannis, Shaun J
2013-01-01
Objective: To summarize the literature describing computer-based interventions aimed at improving bidirectional communication between clinical and public health. Materials and Methods: A systematic review of English articles using MEDLINE and Google Scholar. Search terms included public health, epidemiology, electronic health records, decision support, expert systems, and decision-making. Only articles that described the communication of information regarding emerging health threats from public health agencies to clinicians or provider organizations were included. Each article was independently reviewed by two authors. Results: Ten peer-reviewed articles highlight a nascent but promising area of research and practice related to alerting clinicians about emerging threats. Current literature suggests that additional research and development in bidirectional communication infrastructure should focus on defining a coherent architecture, improving interoperability, establishing clear governance, and creating usable systems that will effectively deliver targeted, specific information to clinicians in support of patient and population decision-making. Conclusions: Increasingly available clinical information systems make it possible to deliver timely, relevant knowledge to frontline clinicians in support of population health. Future work should focus on developing a flexible, interoperable infrastructure for bidirectional communications capable of integrating public health knowledge into clinical systems and workflows. PMID:23467470
NASA Astrophysics Data System (ADS)
McDonald, K. R.; Faundeen, J. L.; Petiteville, I.
2005-12-01
The Committee on Earth Observation Satellites (CEOS) was established in 1984 in response to a recommendation from the Economic Summit of Industrialized Nations Working Group on Growth, Technology, and Employment's Panel of Experts on Satellite Remote Sensing. CEOS participants are Members, national or international governmental organizations that operate civil spaceborne Earth observation satellites, and Associates, governmental organizations with civil space programs in development or international scientific or governmental bodies that have an interest in and support CEOS objectives. The primary objective of CEOS is to optimize the benefits of satellite Earth observations through cooperation of its participants in mission planning and in the development of compatible data products, formats, services, applications and policies. To pursue its objectives, CEOS establishes working groups and associated subgroups that focus on relevant areas of interest. While the structure of CEOS has evolved over its lifetime, today there are three permanent working groups. One is the Working Group on Calibration and Validation, which addresses sensor-specific calibration and validation and geophysical parameter validation. The second is the Working Group on Education, Training, and Capacity Building, which facilitates activities that enhance international education and training in Earth observation techniques, data analysis, interpretation and applications, with a particular focus on developing countries. The third permanent working group is the Working Group on Information Systems and Services (WGISS). The purpose of WGISS is to promote collaboration in the development of the systems and services, based on international standards, that manage and supply the Earth observation data and information from participating agencies' missions. WGISS places great emphasis on the use of demonstration projects involving user groups to solve the critical interoperability issues associated with the achievement of global services, and its structure reflects that objective. The Technology and Services Subgroup initiates tasks to explore emerging technologies that can be employed to create data and information systems and to develop interoperable services. The interests of the subgroup span the full range of the information processing chain, from the initial ingestion of satellite data into archives through to the incorporation of derived information into end-user applications. The subgroup has overseen the creation of an Interoperable Directory Network and an Interoperable Catalog System and has tasks investigating the use of new technologies such as Web Services, Grid, and Open Geographical Information Systems to provide enhanced capabilities. The WGISS Projects and Applications Subgroup works with outside organizations to understand their requirements and then helps them exploit the tools and services available through WGISS and its members and associates. WGISS has instituted the concept of a WGISS Test Facility to test and develop information systems and services prototypes collaboratively with these organizations to meet their specific requirements. This approach has the dual benefit of addressing the real information systems and services needs of science and applications projects and of helping WGISS to expand and improve its capabilities based on the experience and lessons learned from working with the projects.
Ontological modeling of electronic health information exchange.
McMurray, J; Zhu, L; McKillop, I; Chen, H
2015-08-01
Investments of resources to purposively improve the movement of information between health system providers are currently made with imperfect information. No inventories of system-level electronic health information flows currently exist, nor do measures of inter-organizational electronic information exchange. Using Protégé 4, an open-source OWL Web ontology language editor and knowledge-based framework, we formalized a model that decomposes inter-organizational electronic health information flow into derivative concepts such as diversity, breadth, volume, structure, standardization and connectivity. The ontology was populated with data from a regional health system and the flows were measured. Individual instances' properties were inferred from their class associations as determined by their data and object property rules. It was also possible to visualize interoperability activity for regional analysis and planning purposes. A property called Impact was created from the total number of patients or clients that a health entity in the region served in a year, and the total number of health service providers or organizations with whom it exchanged information in support of clinical decision-making, diagnosis or treatment. Identifying providers with a high Impact but low Interoperability score could assist planners and policy-makers to optimize technology investments intended to electronically share patient information across the continuum of care. Finally, we demonstrated how linked ontologies were used to identify logical inconsistencies in self-reported data for the study. Copyright © 2015 Elsevier Inc. All rights reserved.
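The planning analysis the authors describe (flagging providers whose patient volume is high but whose exchange-partner count is low) reduces to a simple screen once the two properties are computed. The sketch below is hypothetical in its data and thresholds; in the paper these properties are derived from the OWL ontology rather than a flat list.

```python
# Hypothetical screen for high-Impact / low-Interoperability providers;
# the paper derives these properties from an OWL ontology instead.
providers = [
    # (name, patients served per year, electronic exchange partners)
    ("Regional Hospital A", 120_000, 2),
    ("Community Clinic B",    8_000, 9),
    ("Imaging Centre C",     45_000, 1),
]

IMPACT_THRESHOLD = 40_000       # illustrative cut-offs, not from the paper
INTEROP_THRESHOLD = 3

for name, impact, interop in providers:
    if impact >= IMPACT_THRESHOLD and interop < INTEROP_THRESHOLD:
        print(f"{name}: high impact ({impact} patients), "
              f"low interoperability ({interop} partners)")
```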
NASA Astrophysics Data System (ADS)
Lucido, J. M.; Booth, N.
2014-12-01
Interoperable sharing of groundwater data across international borders is essential for the proper management of global water resources. However, storage and management of groundwater data are often distributed across many agencies or organizations. Furthermore, these data may be represented in disparate proprietary formats, posing a significant challenge for integration. For this reason, standard data models are required to achieve interoperability across geographical and political boundaries. The GroundWater Markup Language 1.0 (GWML1) was developed in 2010 as an extension of the Geography Markup Language (GML) in order to support groundwater data exchange within Spatial Data Infrastructures (SDI). In 2013, development of GWML2 was initiated under the sponsorship of the Open Geospatial Consortium (OGC) for intended adoption by the international community as the authoritative standard for the transfer of groundwater feature data, including data about water wells, aquifers, and related entities. GWML2 harmonizes GWML1 and the EU's INSPIRE models related to geology and hydrogeology. Additionally, an interoperability experiment was initiated to test the model for commercial, technical, scientific, and policy use cases. The scientific use case focuses on the delivery of data required as input to computational flow modeling software used to determine the flow of groundwater within a particular aquifer system. It involves the delivery of properties associated with hydrogeologic units, observations related to those units, and information about the related aquifers. To test this use case, web services are being implemented using GWML2 and WaterML2, the authoritative standard for water time-series observations, in order to serve USGS water well and hydrogeologic data via standard OGC protocols. Furthermore, integration of these data into a computational groundwater flow model will be tested. This submission will present the GWML2 information model and the results of the interoperability experiment, with particular emphasis on the scientific use case.
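Serving GWML2-encoded features "via standard OGC protocols" typically means a WFS endpoint. The sketch below shows such a request: the service URL and feature type name are hypothetical placeholders, while the query parameters are standard WFS 2.0.

```python
# Sketch of a WFS 2.0 GetFeature request for GWML2-style groundwater features;
# endpoint and feature type name are hypothetical.
import requests

WFS_URL = "https://example.usgs.gov/wfs"        # hypothetical service URL

params = {
    "SERVICE": "WFS",
    "VERSION": "2.0.0",
    "REQUEST": "GetFeature",
    "TYPENAMES": "gwml2:GW_Well",               # hypothetical feature type name
    "COUNT": "10",
}

resp = requests.get(WFS_URL, params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:500])                          # GML/XML describing water wells
```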
The role of architecture and ontology for interoperability.
Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka
2010-01-01
Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. To enable the corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication-protocol to an architecture-centric approach, mastering ontology coordination challenges.
NASA Astrophysics Data System (ADS)
Gilman, Charles R.; Aparicio, Manuel; Barry, J.; Durniak, Timothy; Lam, Herman; Ramnath, Rajiv
1997-12-01
An enterprise's ability to deliver new products quickly and efficiently to market is critical for competitive success. While manufacturers recognize the need for speed and flexibility to compete in this marketplace, companies do not have the time or capital to move to new automation technologies. The National Industrial Information Infrastructure Protocols Consortium's Solutions for MES Adaptable Replicable Technology (NIIIP SMART) subgroup is developing an information infrastructure to enable the integration and interoperation of Manufacturing Execution Systems (MES) and Enterprise Information Systems within an enterprise or among enterprises. The goal of these developments is an adaptable, affordable, reconfigurable, integratable manufacturing system. Key innovative aspects of NIIIP SMART are: (1) design of an industry-standard object model that represents the diverse aspects of MES; (2) design of a distributed object network to support real-time information sharing; (3) product data exchange based on STEP and EXPRESS (ISO 10303); (4) application of workflow and knowledge management technologies to enact manufacturing and business procedures and policy; (5) application of intelligent agents to support emergent factories. This paper illustrates how these technologies have been incorporated into the NIIIP SMART system architecture to enable the integration and interoperation of existing tools and future MES applications in a 'plug and play' environment.
NASA Astrophysics Data System (ADS)
Benaben, Frederick; Mu, Wenxin; Boissel-Dallier, Nicolas; Barthe-Delanoe, Anne-Marie; Zribi, Sarah; Pingaud, Herve
2015-08-01
The Mediation Information System Engineering project is currently finishing its second iteration (MISE 2.0). The main objective of this scientific project is to provide any emerging collaborative situation with methods and tools to deploy a Mediation Information System (MIS). MISE 2.0 aims at defining and designing a service-based platform, dedicated to initiating and supporting the interoperability of collaborative situations among potential partners. This MISE 2.0 platform implements a model-driven engineering approach to the design of a service-oriented MIS dedicated to supporting the collaborative situation. This approach is structured in three layers, each providing its own key innovative points: (i) the gathering of individual and collaborative knowledge to provide appropriate collaborative business behaviour (key point: knowledge management, including semantics, exploitation and capitalisation); (ii) the deployment of a mediation information system able to computerise the previously deduced collaborative processes (key point: the automatic generation of collaborative workflows, including connection with existing devices or services); and (iii) the management of the agility of the obtained collaborative network of organisations (key point: supervision of collaborative situations and relevant exploitation of the gathered data). MISE covers business issues (through BPM), technical issues (through an SOA) and agility issues of collaborative situations (through EDA).
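A minimal sketch of the workflow-generation idea in layer (ii), with invented activity names, partner identifiers, and service URLs: the deduced collaborative process is bound to concrete partner services to yield an executable workflow.

    # Illustrative sketch only: bind each activity of a deduced collaborative
    # process to a partner service. All names and URLs are invented.
    from dataclasses import dataclass

    @dataclass
    class Activity:
        name: str
        partner: str

    # A collaborative process deduced from gathered knowledge (layer i).
    process = [
        Activity("assess_situation", "fire_brigade"),
        Activity("dispatch_resources", "logistics_provider"),
        Activity("report_status", "coordination_cell"),
    ]

    # Registry of concrete services exposed by partners (layer ii).
    service_registry = {
        ("fire_brigade", "assess_situation"): "https://example.org/fb/assess",
        ("logistics_provider", "dispatch_resources"): "https://example.org/lp/dispatch",
        ("coordination_cell", "report_status"): "https://example.org/cc/report",
    }

    # Generate a workflow as an ordered list of service invocations.
    workflow = [service_registry[(a.partner, a.name)] for a in process]
    print(workflow)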
Daskalakis, S; Mantas, J
2009-01-01
This study evaluates a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics was empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and partially a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.
Hovenga, Evelyn J S; Grain, Heather
2013-01-01
Health information provides the foundation for all decision making in healthcare, whether clinical at the bedside or at a national government level. This information is generally collected as part of systems which support administrative or clinical workflow and practice. This chapter describes the many and varied features of systems such as electronic health records (EHRs), how they fit with health information systems, and how they collectively manage information flow. Systems engineering methods and tools are described, together with their use to suit the health industry. The focus is on the need for suitable system architectures and semantic interoperability. These concepts and their relevance to the health industry are explained. The relationship and requirements for appropriate data governance in these systems are also considered.
U.S. Navy Interoperability with its High-End Allies
2000-10-01
Precision weapons require tremendous amounts of information from multiple sensors. Information is first used to plan missions. Then, when the weapon is...programmed and launched, information must be continuously transmitted at very high rates of speed. The U.S. has developed systems capable of...liberal, on the assumption that advanced sensors can provide sufficient information to judge the severity of incoming threats. U.S. allies develop
Text mining meets workflow: linking U-Compare with Taverna
Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia
2010-01-01
Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690
DOT National Transportation Integrated Search
2012-08-01
The United States Department of Transportation (U.S. DOT) has initiated a multimodal connected vehicle research initiative (hereafter referred to as the Initiative) that aims to enable safe, interoperable, networked, wireless communications among...
Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaidon, Clement; Poplawski, Michael
This report is the first in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs). It explores the diversity of such interfaces in several connected lighting systems, characterizes the extent of interoperability that they provide, and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.
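To make the diversity concrete, here is a hedged sketch of two hypothetical vendor APIs for the same dimming capability; every endpoint, payload field, and unit convention below is invented, which is precisely the kind of variation such a study has to characterize.

    # Sketch: the same "set dimming level" capability behind two invented APIs.
    import json
    import urllib.request

    def set_level_vendor_a(device_id: str, level: float) -> None:
        # Vendor A (hypothetical): REST PUT with a 0-100 integer "brightness".
        body = json.dumps({"brightness": int(level * 100)}).encode()
        req = urllib.request.Request(
            f"https://lighting-a.example.org/devices/{device_id}",
            data=body, method="PUT",
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

    def set_level_vendor_b(device_id: str, level: float) -> None:
        # Vendor B (hypothetical): POST to an actions endpoint, 0.0-1.0 "dim".
        body = json.dumps({"action": "dim", "value": level}).encode()
        req = urllib.request.Request(
            f"https://lighting-b.example.org/api/{device_id}/actions",
            data=body, method="POST",
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

An interoperability layer has to paper over exactly these differences in verb, payload shape, and value range before applications can treat the two systems uniformly.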
Interoperability challenges in river discharge modelling: A cross domain application scenario
NASA Astrophysics Data System (ADS)
Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin
2018-06-01
River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks, including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (water, weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes the definition and standardization of domain-specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF), which predicts river discharge based on weather forecast information, in the context of GEOSS.
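A small sketch of the validation step the scenario implies, assuming hypothetical file names and column labels: observed discharge (as distributed by GRDC) is aligned with forecast discharge (as produced by ECMWF) on a common time axis.

    # Sketch only: align observed and forecast discharge for one station.
    # File names, station id, and column labels are assumptions.
    import pandas as pd

    obs = pd.read_csv("grdc_station_6335020.csv", parse_dates=["date"])
    fcst = pd.read_csv("ecmwf_forecast_6335020.csv", parse_dates=["date"])

    merged = obs.merge(fcst, on="date", suffixes=("_obs", "_fcst"))
    merged["error"] = merged["discharge_fcst"] - merged["discharge_obs"]
    print(merged[["date", "discharge_obs", "discharge_fcst", "error"]].head())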
Design and implementation of a health data interoperability mediator.
Kuo, Mu-Hsing; Kushniruk, Andre William; Borycki, Elizabeth Marie
2010-01-01
The objective of this study is to design and implement a common-gateway oriented mediator to solve the health data interoperability problems that exist among heterogeneous health information systems. The proposed mediator has three main components: (1) a Synonym Dictionary (SD) that stores a set of global metadata and terminologies to serve as the mapping intermediary, (2) a Semantic Mapping Engine (SME) that can be used to map metadata and instance semantics, and (3) a DB-to-XML module that translates source health data stored in a database into XML format and back. A routine admission notification data exchange scenario is used to test the efficiency and feasibility of the proposed mediator. The study results show that the proposed mediator can make health information exchange more efficient.
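A minimal sketch of the Synonym Dictionary idea, with invented local field names: source-system metadata from two heterogeneous systems is rewritten into a shared global vocabulary, after which the records become directly comparable.

    # Sketch of synonym-dictionary-based metadata mapping; the local field
    # names below are invented and do not reproduce the paper's dictionary.
    SYNONYM_DICTIONARY = {
        "pat_name": "patient_name",   # hospital A's label
        "PatientNm": "patient_name",  # hospital B's label
        "adm_dt": "admission_date",
        "AdmissionDate": "admission_date",
    }

    def to_global(record: dict) -> dict:
        """Rewrite a source record into the global metadata vocabulary."""
        return {SYNONYM_DICTIONARY.get(k, k): v for k, v in record.items()}

    source_a = {"pat_name": "Jane Doe", "adm_dt": "2010-03-01"}
    source_b = {"PatientNm": "Jane Doe", "AdmissionDate": "2010-03-01"}
    assert to_global(source_a) == to_global(source_b)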
Bridging data models and terminologies to support adverse drug event reporting using EHR data.
Declerck, G; Hussain, S; Daniel, C; Yuksel, M; Laleci, G B; Twagirumukiza, M; Jaulent, M-C
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The SALUS project aims at building an interoperability platform and a dedicated toolkit to enable secondary use of electronic health record (EHR) data for post-marketing drug surveillance. An important component of this toolkit is a drug-related adverse event (AE) reporting system designed to facilitate and accelerate the reporting process using automatic prepopulation mechanisms. The objective is to demonstrate the SALUS approach for establishing syntactic and semantic interoperability for AE reporting. Standard (e.g. HL7 CDA-CCD) and proprietary EHR data models are mapped to the E2B(R2) data model via the SALUS Common Information Model. Terminology mapping and terminology reasoning services are designed to ensure the automatic conversion of source EHR terminologies (e.g. ICD-9-CM, ICD-10, LOINC or SNOMED-CT) to the target terminology MedDRA, which is expected in AE reporting forms. A validated set of terminology mappings is used to ensure the reliability of the reasoning mechanisms. The percentage of data elements of a standard E2B report that can be completed automatically has been estimated for two pilot sites. In the best scenario (i.e. when the available fields in the EHR have actually been filled), only 36% (pilot site 1) and 38% (pilot site 2) of E2B data elements remain to be filled manually. In addition, most of these data elements need not be filled in every report. The SALUS platform's interoperability solutions enable partial automation of the AE reporting process, which could contribute to improving current spontaneous reporting practices and reducing under-reporting, currently one major obstacle in the process of acquisition of pharmacovigilance data.
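The prepopulation mechanism can be pictured as below; the ICD-10 entry, MedDRA code, and field names are illustrative stand-ins, not the validated SALUS mappings.

    # Sketch of terminology-mapped prepopulation of an adverse-event report.
    # The mapping table and field names are illustrative only.
    ICD10_TO_MEDDRA = {
        "R51": ("10019211", "Headache"),  # example entry, for illustration
    }

    def prepopulate_reaction(ehr_diagnosis_code: str) -> dict:
        meddra = ICD10_TO_MEDDRA.get(ehr_diagnosis_code)
        if meddra is None:
            # Element left blank for manual completion by the reporter.
            return {"reaction_meddra_code": None, "reaction_term": None}
        code, term = meddra
        return {"reaction_meddra_code": code, "reaction_term": term}

    print(prepopulate_reaction("R51"))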
Lee, Jaehoon; Hulse, Nathan C; Wood, Grant M; Oniki, Thomas A; Huff, Stanley M
2016-01-01
In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging full pedigree-based family health history (FHH) information across multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing the FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM elements and existing FHIR resources. Based on the analysis, we developed a profile that consists of 1) FHIR resources for essential FHH data elements, 2) extensions for additional elements that were not covered by the resources, and 3) a structured definition to integrate patient and family member information in a FHIR message. We implemented the profile using an open-source FHIR framework and validated it using patient-entered FHH data captured through a locally developed FHH tool.
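For orientation, a FamilyMemberHistory resource of the kind such a profile builds on might look as follows when assembled as a plain Python dict; the base elements follow the standard FHIR resource, while the pedigree-link extension URL is a hypothetical placeholder for the profile's extensions.

    # Sketch of a FHIR FamilyMemberHistory instance; the extension URL is
    # hypothetical, the other elements follow the standard resource.
    import json

    family_history = {
        "resourceType": "FamilyMemberHistory",
        "status": "completed",
        "patient": {"reference": "Patient/example"},
        "relationship": {
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/v3-RoleCode",
                "code": "MTH",
                "display": "mother",
            }]
        },
        "condition": [{
            "code": {
                "coding": [{
                    "system": "http://snomed.info/sct",
                    "code": "22298006",
                    "display": "Myocardial infarction",
                }]
            },
            "onsetAge": {"value": 56, "unit": "a"},
        }],
        # Hypothetical extension linking this record into a full pedigree.
        "extension": [{
            "url": "http://example.org/fhir/StructureDefinition/pedigree-link",
            "valueReference": {"reference": "FamilyMemberHistory/grandmother"},
        }],
    }
    print(json.dumps(family_history, indent=2))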
CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype
NASA Technical Reports Server (NTRS)
Lucord, Steve; Martinez, Lindolfo
2009-01-01
We are entering a new era in space exploration. Reduced operating budgets require innovative solutions to leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service-oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with the Deutsches Zentrum für Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service-oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in an SOA. The OTF will implement the service providers described in the SM&C Mission Operations standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitoring and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. An SOA facilitates the reuse of software components. The prototype will leverage the capabilities of existing legacy systems. Various custom applications and middleware solutions will be combined into one system providing the illusion of a set of homogeneous services. This paper will document our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results and potential cost-saving benefits.
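One way to picture the insulation of applications from the transport layer is an abstract transport interface, sketched below; the class and topic names are invented and do not reproduce the CCSDS service definitions.

    # Illustrative sketch (not the CCSDS specification itself): a monitor-
    # and-control service written against an abstract transport, so DDS, a
    # message bus, or a legacy bridge can be swapped underneath.
    from abc import ABC, abstractmethod

    class Transport(ABC):
        @abstractmethod
        def send(self, topic: str, payload: bytes) -> None: ...

        @abstractmethod
        def receive(self, topic: str) -> bytes: ...

    class ParameterService:
        """Service provider exposing spacecraft telemetry parameters."""

        def __init__(self, transport: Transport):
            self.transport = transport

        def publish_parameter(self, name: str, value: float) -> None:
            # The encoding is a placeholder; real standards define the layout.
            self.transport.send("sm-and-c/parameters", f"{name}={value}".encode())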
2006-06-01
systems. Cyberspace is the electronic medium of net-centric operations, communications systems, and computers, in which horizontal integration and online...will be interoperable, more robust, responsive, and able to support faster spacecraft initialization times. This Integrated Satellite Control...horizontally and vertically integrated information through machine-to-machine conversations enabled by a peer-based network of sensors, command
Gay, Valerie; Leijdekkers, Peter
2015-11-18
A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple's smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user together with official health records collected by health professionals. Mobile apps are well positioned to play an important role in the aggregation since they can tap into these official and informal health and data silos. The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place. For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems. The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users use the app worldwide to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to be able to exchange data with servers, and lack of standards usage, such as Health Level Seven, for medical information exchange. By limiting the negative effects of health data silos, mobile apps can offer a better holistic view of health and fitness data. Data can then be analyzed to offer better and more personalized advice and care.
Knowledge Discovery from Biomedical Ontologies in Cross Domains.
Shen, Feichen; Lee, Yugyung
2016-01-01
In recent years, there has been an increasing demand for the sharing and integration of medical data in biomedical research. Improving a health care system requires supporting the integration of data by facilitating semantically interoperable systems and practices. Semantic interoperability is difficult to achieve in these systems because the conceptual models underlying datasets are not fully exploited. In this paper, we propose a semantic framework, called Medical Knowledge Discovery and Data Mining (MedKDD), that aims to build a topic hierarchy and support semantic interoperability between different ontologies. For this purpose, we focus on the discovery of semantic patterns about the association of relations in the heterogeneous information network representing different types of objects and relationships in multiple biological ontologies, and on the creation of a topic hierarchy through the analysis of the discovered patterns. These patterns are used to cluster heterogeneous information networks into a set of smaller topic graphs in a hierarchical manner and then to conduct cross-domain knowledge discovery from the multiple biological ontologies. Thus, the patterns make a greater contribution to knowledge discovery across multiple ontologies. We have demonstrated cross-domain knowledge discovery in the MedKDD framework using a case study with 9 primary biological ontologies from Bio2RDF and compared it with SLAP, a cross-domain query processing approach. We have confirmed the effectiveness of the MedKDD framework in knowledge discovery from multiple medical ontologies.
Concept of information technology of monitoring and decision-making support
NASA Astrophysics Data System (ADS)
Kovalenko, Aleksandr S.; Tymchyk, Sergey V.; Kostyshyn, Sergey V.; Zlepko, Sergey M.; Wójcik, Waldemar; Kalizhanova, Aliya; Burlibay, Aron; Kozbekova, Ainur
2017-08-01
The concept of an information technology for monitoring and decision-making support to assess the health of students is presented. The preconditions of the concept are given, and its goal and purpose are formulated. The subject area of the concept is proposed to be considered as a set of problems grouped into 8 categories, which in turn necessitates applying basic principles when creating the technology, ranging from the principles of "first head" and "systems approach" to the principles of "interoperability" and "system integration". The content of the information support of the IT, its position in the segment of a single information space, and the stages of its creation are described. Criteria are proposed to evaluate the efficiency of the developed IT system.
Information Interaction Study for DER and DMS Interoperability
NASA Astrophysics Data System (ADS)
Liu, Haitao; Lu, Yiming; Lv, Guangxian; Liu, Peng; Chen, Yu; Zhang, Xinhui
The Common Information Model (CIM) is an abstract data model that can be used to represent the major objects in Distribution Management System (DMS) applications. Because the CIM does not model Distributed Energy Resources (DERs), it cannot meet the requirements of DER operation and management for DMS advanced applications. DER modeling was studied from a system point of view, and the article initially proposes an extended CIM information model. By analyzing the basic structure of message interaction between DMS and DER, a bidirectional message-mapping method based on data exchange is proposed.
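The extension idea can be sketched as follows, with invented class and attribute names standing in for the paper's actual extended model: a DER concept is added as a specialization of a simplified CIM-style base class.

    # Sketch of extending a CIM-style model with a DER concept; names and
    # attributes are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class PowerSystemResource:        # simplified stand-in for a CIM base class
        mrid: str
        name: str

    @dataclass
    class DERUnit(PowerSystemResource):   # proposed extension concept
        rated_power_kw: float
        energy_source: str                # e.g. "photovoltaic", "battery"
        controllable: bool

    pv = DERUnit(mrid="der-001", name="RooftopPV-7",
                 rated_power_kw=25.0, energy_source="photovoltaic",
                 controllable=True)
    print(pv)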
Approaching semantic interoperability in Health Level Seven
Alschuler, Liora
2010-01-01
‘Semantic Interoperability’ is a driving objective behind many of Health Level Seven's standards. The objective in this paper is to take a step back, and consider what semantic interoperability means, assess whether or not it has been achieved, and, if not, determine what concrete next steps can be taken to get closer. A framework for measuring semantic interoperability is proposed, using a technique called the ‘Single Logical Information Model’ framework, which relies on an operational definition of semantic interoperability and an understanding that interoperability improves incrementally. Whether semantic interoperability tomorrow will enable one computer to talk to another, much as one person can talk to another person, is a matter for speculation. It is assumed, however, that what gets measured gets improved, and in that spirit this framework is offered as a means to improvement. PMID:21106995
Turning Interoperability Operational with GST
NASA Astrophysics Data System (ADS)
Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha
2013-04-01
GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its perspective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), cf. http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially and perhaps additionally temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards and how it applies and extends services towards interoperability in the Earth sciences.
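As a small illustration of consuming one of the OGC services mentioned above, the sketch below uses the OWSLib library to query a Web Map Service; the service URL and layer name are placeholders.

    # Sketch: fetch a map image from an OGC WMS with OWSLib.
    # The endpoint and layer name are placeholders, not a real GST service.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")
    print(list(wms.contents))  # layers advertised by the server

    img = wms.getmap(layers=["geology:units"],      # assumed layer name
                     srs="EPSG:4326",
                     bbox=(5.0, 47.0, 15.0, 55.0),
                     size=(800, 600),
                     format="image/png")
    with open("map.png", "wb") as f:
        f.write(img.read())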
NASA Astrophysics Data System (ADS)
Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela
2014-05-01
The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the lack of accessibility, comparability, quality, organisation and dissemination of natural hazards spatial data. In order to mitigate these limitations, an interoperable framework has been developed as part of the legally binding Implementing Rules of the EU INSPIRE Directive1*, which aim at the establishment of the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural Risk Zones - Technical Guidelines (DS) document2*, finalized and published on 10.12.2013. This framework provides means for facilitating access, integration, harmonisation and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. Firstly, the paper demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of interoperability for the various natural hazards communities. Secondly, the paper "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. Five main aspects of the interoperable framework are presented. First, the issue of a common terminology for the natural hazards domain is addressed. Second, a common data model to facilitate cross-domain data integration is described. Third, the common methodology developed to provide qualitative or quantitative assessments of natural hazards is presented. Fourth, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, is shown. Finally, the applicability of the interoperable framework for the various stakeholder groups is also presented. The paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework, and missing aspects such as multi-hazard and multi-risk. --------------- 1*INSPIRE - Infrastructure for spatial information in Europe, http://inspire.ec.europa.eu 2*http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf
An Information System for European culture collections: the way forward.
Casaregola, Serge; Vasilenko, Alexander; Romano, Paolo; Robert, Vincent; Ozerskaya, Svetlana; Kopf, Anna; Glöckner, Frank O; Smith, David
2016-01-01
Culture collections contain indispensable information about the microorganisms preserved in their repositories, such as taxonomical descriptions, origins, physiological and biochemical characteristics, bibliographic references, etc. However, information currently accessible in databases rarely adheres to common standard protocols. The resultant heterogeneity between culture collections, in terms of both content and format, notably hampers microorganism-based research and development (R&D). The optimized exploitation of these resources thus requires standardized, and simplified, access to the associated information. To this end, and in the interest of supporting R&D in the fields of agriculture, health and biotechnology, a pan-European distributed research infrastructure, MIRRI, including over 40 public culture collections and research institutes from 19 European countries, was established. A prime objective of MIRRI is to unite and provide universal access to the fragmented, and untapped, resources, information and expertise available in European public collections of microorganisms; a key component of which is to develop a dynamic Information System. For the first time, both culture collection curators as well as their users have been consulted and their feedback, concerning the needs and requirements for collection databases and data accessibility, utilised. Users primarily noted that databases were not interoperable, thus rendering a global search of multiple databases impossible. Unreliable or out-of-date and, in particular, non-homogenous, taxonomic information was also considered to be a major obstacle to searching microbial data efficiently. Moreover, complex searches are rarely possible in online databases thus limiting the extent of search queries. Curators also consider that overall harmonization-including Standard Operating Procedures, data structure, and software tools-is necessary to facilitate their work and to make high-quality data easily accessible to their users. Clearly, the needs of culture collection curators coincide with those of users on the crucial point of database interoperability. In this regard, and in order to design an appropriate Information System, important aspects on which the culture collection community should focus include: the interoperability of data sets with the ontologies to be used; setting best practice in data management, and the definition of an appropriate data standard.
Zhou, Yuan; Ancker, Jessica S; Upadhye, Mandar; McGeorge, Nicolette M; Guarrera, Theresa K; Hegde, Sudeep; Crane, Peter W; Fairbanks, Rollin J; Bisantz, Ann M; Kaushal, Rainu; Lin, Li
2013-01-01
The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No work to date, however, has examined the effect of interoperability quantitatively using discrete event simulation techniques. The objective was to estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. This study suggests that the impact of using HIT on clinical and nonclinical staff work efficiency varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
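To show what such a model looks like in miniature, here is a toy discrete event simulation in Python with SimPy; the task durations and arrival rate are invented, and the point is only that interoperability can be represented as a change in task service times.

    # Toy discrete event simulation in the spirit of the study, using SimPy.
    # Durations are invented; interoperability is modeled as a shorter
    # lab-report preparation time.
    import random
    import simpy

    PREP_LAB_REPORT_MIN = {"paper": 5.0, "interoperable_ehr": 2.0}

    def physician(env, mode, completed):
        while True:
            yield env.timeout(random.expovariate(1 / 15))  # next patient arrives
            yield env.timeout(PREP_LAB_REPORT_MIN[mode])   # prepare lab report
            completed.append(env.now)

    for mode in PREP_LAB_REPORT_MIN:
        random.seed(42)                 # same arrival stream for both modes
        env = simpy.Environment()
        done = []
        env.process(physician(env, mode, done))
        env.run(until=8 * 60)           # one 8-hour clinic day, in minutes
        print(mode, "patients handled:", len(done))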
Lu, Xiaoqi; Wang, Lei; Zhao, Jianfeng
2012-02-01
With the development of medical informatization, Picture Archiving and Communication Systems (PACS), Hospital Information Systems/Radiology Information Systems (HIS/RIS) and other medical information management systems have become popular and well developed, and interoperation between these systems has become more frequent. It is therefore inevitable that these closed systems will be opened and regionalized by means of networks, and when this happens, the security of information transmission may be the first problem to be solved. Based on the need for network security, we investigated the Digital Imaging and Communications in Medicine (DICOM) standard and the Transport Layer Security (TLS) protocol, and implemented TLS transmission of DICOM medical information with the OpenSSL and DCMTK toolkits.
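The TLS idea can be sketched with Python's standard library instead of the OpenSSL/DCMTK toolkits named in the abstract; the host and certificate path are placeholders (2762 is the registered DICOM-TLS port), and real DICOM security profiles impose further requirements on ciphers and mutual authentication.

    # Sketch: open a TLS channel of the kind DICOM-TLS uses, with Python's
    # standard library. Host and certificate path are placeholders; the
    # payload bytes are deliberately elided.
    import socket
    import ssl

    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    context.load_verify_locations("ca-cert.pem")   # trust anchor (placeholder)

    with socket.create_connection(("pacs.example.org", 2762)) as raw:
        with context.wrap_socket(raw, server_hostname="pacs.example.org") as tls:
            tls.sendall(b"...DICOM association request bytes...")  # elided
            response = tls.recv(4096)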
A Proposed Information Architecture for Telehealth System Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warren, S.; Craft, R.L.; Parks, R.C.
1999-04-07
Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.
Local, regional and national interoperability in hospital-level systems architecture.
Mykkänen, J; Korpela, M; Ripatti, S; Rannanheimo, J; Sorri, J
2007-01-01
Interoperability of applications in health care is faced with various needs by patients, health professionals, organizations and policy makers. A combination of existing and new applications is a necessity. Hospitals are in a position to drive many integration solutions, but need approaches which combine local, regional and national requirements and initiatives with open standards to support flexible processes and applications on a local hospital level. We discuss systems architecture of hospitals in relation to various processes and applications, and highlight current challenges and prospects using a service-oriented architecture approach. We also illustrate these aspects with examples from Finnish hospitals. A set of main services and elements of service-oriented architectures for health care facilities are identified, with medium-term focus which acknowledges existing systems as a core part of service-oriented solutions. The services and elements are grouped according to functional and interoperability cohesion. A transition towards service-oriented architecture in health care must acknowledge existing health information systems and promote the specification of central processes and software services locally and across organizations. Software industry best practices such as SOA must be combined with health care knowledge to respond to central challenges such as continuous change in health care. A service-oriented approach cannot entirely rely on common standards and frameworks but it must be locally adapted and complemented.
Employing Semantic Technologies for the Orchestration of Government Services
NASA Astrophysics Data System (ADS)
Sabol, Tomáš; Furdík, Karol; Mach, Marián
The main aim of eGovernment is to provide efficient, secure, inclusive services for citizens and businesses. The necessity to integrate services and information resources, to increase accessibility, and to reduce the administrative burden on citizens and enterprises - these are only a few reasons why the paradigm of eGovernment has shifted from the supply-driven approach toward connected governance, emphasizing the concept of interoperability (Archmann and Nielsen 2008). On the EU level, interoperability is explicitly addressed as one of the four main challenges, for example in the i2010 strategy (i2010 2005). The Commission's Communication (Interoperability for Pan-European eGovernment Services 2006) strongly emphasizes the necessity of interoperable eGovernment services, based on standards, open specifications, and open interfaces. The Pan-European interoperability initiatives, such as the European Interoperability Framework (2004) and IDABC, as well as many projects supported by the European Commission within the IST Program and the Competitiveness and Innovation Program (CIP), illustrate the importance of interoperability on the EU level.
NASA Astrophysics Data System (ADS)
Glaves, Helen
2015-04-01
Marine research is rapidly moving away from traditional discipline-specific science to a wider, ecosystem-level approach. This more multidisciplinary approach to ocean science requires large amounts of good-quality, interoperable data to be readily available for use in an increasing range of new and complex applications. Significant amounts of marine data and information are already available throughout the world as a result of e-infrastructures established at a regional level to manage and deliver marine data to the end user. However, each of these initiatives has been developed to address specific regional requirements and independently of those in other regions. Establishing a common framework for marine data management on a global scale necessitates interoperability across these existing data infrastructures and active collaboration between the organisations responsible for their management. The Ocean Data Interoperability Platform (ODIP) project is promoting co-ordination between a number of these existing regional e-infrastructures, including SeaDataNet and Geo-Seas in Europe, the Integrated Marine Observing System (IMOS) in Australia, the Rolling Deck to Repository (R2R) in the USA and the international IODE initiative. To demonstrate this co-ordinated approach, the ODIP project partners are currently working together to develop several prototypes to test and evaluate potential interoperability solutions for solving the incompatibilities between the individual regional marine data infrastructures. However, many of the issues being addressed by the Ocean Data Interoperability Platform are not specific to marine science. For this reason, many of the outcomes of this international collaborative effort are equally relevant and transferable to other domains.
2001-07-01
Web-based applications to improve health data systems and quality of care; innovative strategies for data collection in clinical settings; approaches...research to increase interoperability and integration of software in distributed systems; protocols and tools for data annotation and management; and...Generation National Defense and National Security Systems; Improved Health Care Systems for All Citizens
NASA Astrophysics Data System (ADS)
Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.
2009-12-01
Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can query more successfully to optimize planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. *But how best to achieve semantic interoperability to mitigate vague data queries, concepts or natural language semantics when retrieving and integrating data and information?* We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype uses semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures, and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms called local ontology. Human or machine users formulate their requests using a common ontology of metadata terms, called global ontology. A CSW mediator rewrites the user’s request into CSW requests over local CSWs using their own (local) ontologies, collects the results and sends them back to the user. To extend the system, we have recently added global maritime boundaries and are also considering nearshore ocean observing system data. Ongoing work includes adding WFS, error management, and exception handling, enabling Smart Searches, and writing full documentation. This prototype is a central research project of the new International Coastal Atlas Network (ICAN), a group of 30+ organizations from 14 nations (and growing) dedicated to seeking interoperability approaches to CWAs in support of coastal zone management and the translation of coastal science to coastal decision-making.
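The mediation step can be pictured with a hedged sketch: a global-ontology term is rewritten into a local atlas vocabulary and submitted to that atlas's CSW endpoint via OWSLib. The URL and the toy ontology mapping are placeholders, not the actual MIDA or OCA vocabularies.

    # Sketch of ontology-mediated catalogue search over CSW with OWSLib.
    # Endpoint and vocabulary entries are invented for illustration.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    GLOBAL_TO_LOCAL = {"shoreline": "coastline"}   # toy ontology mapping

    term = GLOBAL_TO_LOCAL.get("shoreline", "shoreline")
    csw = CatalogueServiceWeb("https://example.org/atlas/csw")
    query = PropertyIsLike("csw:AnyText", f"%{term}%")
    csw.getrecords2(constraints=[query], maxrecords=10)
    for rec in csw.records.values():
        print(rec.title)

A mediator repeats this rewrite-and-query step against each local atlas, then merges the result sets before returning them to the user.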
Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101
2012-07-01
interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that...IOPs need to specify; however, the mobility and actuation related interoperable messages can be used to provide stair climbing; also...interoperability can enable management of different poses or modes, one of which may be stair climbing.
Towards a Ubiquitous User Model for Profile Sharing and Reuse
de Lourdes Martinez-Villaseñor, Maria; Gonzalez-Mendoza, Miguel; Hernandez-Gress, Neil
2012-01-01
People interact with systems and applications through several devices and are willing to share information about preferences, interests and characteristics. Social networking profiles, data from advanced sensors attached to personal gadgets, and semantic web technologies such as FOAF and microformats are valuable sources of personal information that could provide a fair understanding of the user, but profile information is scattered over different user models. Some researchers in the ubiquitous user modeling community envision the need to share user model information from heterogeneous sources. In this paper, we address the syntactic and semantic heterogeneity of user models in order to enable user modeling interoperability. We present a dynamic user profile structure based on the Simple Knowledge Organization System (SKOS) to provide knowledge representation for a ubiquitous user model. We propose a two-tier matching strategy for concept schema alignment to enable user modeling interoperability. Our proposal is demonstrated in the application scenario of sharing and reusing data in order to deal with overweight and obesity. PMID:23201995
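A small sketch of a SKOS-encoded profile fragment using rdflib is shown below; the concept names and hierarchy are illustrative, not the paper's actual profile structure.

    # Sketch: a SKOS concept scheme for user-profile concepts with rdflib.
    # Concept names are invented for illustration.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, SKOS

    EX = Namespace("http://example.org/profile/")
    g = Graph()
    g.bind("skos", SKOS)

    scheme = URIRef(EX["userProfileScheme"])
    g.add((scheme, RDF.type, SKOS.ConceptScheme))

    health = URIRef(EX["healthData"])
    weight = URIRef(EX["bodyWeight"])
    g.add((health, RDF.type, SKOS.Concept))
    g.add((health, SKOS.prefLabel, Literal("Health data", lang="en")))
    g.add((health, SKOS.inScheme, scheme))
    g.add((weight, RDF.type, SKOS.Concept))
    g.add((weight, SKOS.broader, health))  # hierarchy used for schema matching
    print(g.serialize(format="turtle"))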
Space Network Interoperability Panel (SNIP) study
NASA Technical Reports Server (NTRS)
Ryan, Thomas; Lenhart, Klaus; Hara, Hideo
1991-01-01
The Space Network Interoperability Panel (SNIP) study is a tripartite study that involves the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the National Space Development Agency (NASDA) of Japan. SNIP involves an ongoing interoperability study of the Data Relay Satellite (DRS) Systems of the three organizations. The study is broken down into two parts; Phase one deals with S-band (2 GHz) interoperability and Phase two deals with Ka-band (20/30 GHz) interoperability (in addition to S-band). In 1987 the SNIP formed a Working Group to define and study operations concepts and technical subjects to assure compatibility of the international data relay systems. Since that time a number of Panel and Working Group meetings have been held to continue the study. Interoperability is of interest to the three agencies because it offers a number of potential operation and economic benefits. This paper presents the history and status of the SNIP study.
On architecting and composing engineering information services to enable smart manufacturing
Ivezic, Nenad; Srinivasan, Vijay
2016-01-01
Engineering information systems play an important role in the current era of digitization of manufacturing, which is a key component to enable smart manufacturing. Traditionally, these engineering information systems spanned the lifecycle of a product by providing interoperability of software subsystems through a combination of open and proprietary exchange of data. But research and development efforts are underway to replace this paradigm with engineering information services that can be composed dynamically to meet changing needs in the operation of smart manufacturing systems. This paper describes the opportunities and challenges in architecting such engineering information services and composing them to enable smarter manufacturing. PMID:27840595
NASA Astrophysics Data System (ADS)
Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.
2017-09-01
The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.
An Ontological Solution to Support Interoperability in the Textile Industry
NASA Astrophysics Data System (ADS)
Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo
Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.
75 FR 70966 - Transit Asset Management (TAM) Pilot Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-19
... interoperability between diverse types of information technology systems through use of open data formats and... to develop asset management plans, technical assistance, data collection and a pilot program as... condition assessment methodologies, as well as new data collection and analysis activities. $3 million has...
Patient Privacy, Consent, and Identity Management in Health Information Exchange
Hosek, Susan D.; Straus, Susan G.
2013-01-01
The Military Health System (MHS) and the Veterans Health Administration (VHA) have been among the nation's leaders in health information technology (IT), including the development of health IT systems and electronic health records that summarize patients' care from multiple providers. Health IT interoperability within MHS and across MHS partners, including VHA, is one of ten goals in the current MHS Strategic Plan. As a step toward achieving improved interoperability, the MHS is seeking to develop a research roadmap to better coordinate health IT research efforts, address IT capability gaps, and reduce programmatic risk for its enterprise projects. This article contributes to that effort by identifying gaps in research, policy, and practice involving patient privacy, consent, and identity management that need to be addressed to bring about improved quality and efficiency of care through health information exchange. Major challenges include (1) designing a meaningful patient consent procedure, (2) recording patients' consent preferences and designing procedures to implement restrictions on disclosures of protected health information, and (3) advancing knowledge regarding the best technical approaches to performing patient identity matches and how best to monitor results over time. Using a sociotechnical framework, this article suggests steps for overcoming these challenges and topics for future research. PMID:28083296
ERIC Educational Resources Information Center
Shakib, Shaun Cameron
2013-01-01
Controlled clinical terminologies are essential to realizing the benefits of electronic health record systems. However, implementing consistent and sustainable use of terminology has proven to be both intellectually and practically challenging. First, this project derives a conceptual understanding of the scope and intricacies of the challenge by…
EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is being provided in Simple Knowledge Organization System (SKOS) format. SKOS is a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable.
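For readers unfamiliar with SKOS, the sketch below shows how a small fragment of a faceted vocabulary might be expressed with the rdflib Python library; the namespace URI and term labels are invented for illustration and are not drawn from EPA's actual published taxonomy.

```python
# Minimal sketch of two related taxonomy terms as SKOS concepts with rdflib.
# The namespace and labels are illustrative, not EPA's actual vocabulary.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

EX = Namespace("http://example.org/taxonomy/")
g = Graph()
g.bind("skos", SKOS)

water = EX["WaterQuality"]
nutrients = EX["Nutrients"]

g.add((water, RDF.type, SKOS.Concept))
g.add((water, SKOS.prefLabel, Literal("Water Quality", lang="en")))
g.add((nutrients, RDF.type, SKOS.Concept))
g.add((nutrients, SKOS.prefLabel, Literal("Nutrients", lang="en")))
g.add((nutrients, SKOS.broader, water))   # Nutrients is narrower than Water Quality

print(g.serialize(format="turtle"))
```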
A Linguistic Foundation for Communicating Geo-Information in the context of BML and geoBML
2010-03-23
BML Standard. 2009 Spring Simulation Interoperability Workshop (09S-SIW-046). San Diego, CA. Rein, K., Schade, U. & Hieb, M.R. (2009). Battle... Formalizing Battle Management Language: A Grammar for Specifying Orders. 2006 Spring Simulation Interoperability Workshop (06S-SIW-068). Huntsville... Hieb, M.R. (2007). Battle Management Language: A Grammar for Specifying Reports. 2007 Spring Simulation Interoperability Workshop (07S-SIW-036
[Framework for the strengthening of health information systems in Peru].
Curioso, Walter H; Espinoza-Portilla, Elizabeth
2015-01-01
In this article, we present the essential components and most relevant policies of the conceptual framework for strengthening health information systems in Peru. The article also presents the most significant policies, actions, and strategies undertaken in the field of electronic health in Peru. Health information systems in Peru play a key role and are expected to achieve an integrated and interoperable information system. This will allow health information to be complete, efficient, of good quality, and available in a timely manner, in order to achieve a better quality of life for people and to enable meaningful modernization of public health in the context of Peru's health reform.
Framework for Informed Policy Making Using Data from National Environmental Observatories
NASA Astrophysics Data System (ADS)
Wee, B.; Taylor, J. R.; Poinsatte, J.
2012-12-01
Large-scale environmental changes pose challenges that straddle environmental, economic, and social boundaries. As we design and implement climate adaptation strategies at the Federal, state, local, and tribal levels, accessible and usable data are essential for implementing actions that are informed by the best available information. Data-intensive science has been heralded as an enabler for scientific breakthroughs powered by advanced computing capabilities and interoperable data systems. Those same capabilities can be applied to data and information systems that facilitate the transformation of data into highly processed products. At the interface of scientifically informed public policy and data-intensive science lies the potential for producers of credible, integrated, multi-scalar environmental data like the National Ecological Observatory Network (NEON) and its partners to capitalize on data and informatics interoperability initiatives that enable the integration of environmental data from across credible data sources. NSF's large-scale environmental observatories such as NEON and the Ocean Observatories Initiative (OOI) are designed to provide high-quality, long-term environmental data for research. These data are also meant to be repurposed for operational needs such as risk management, vulnerability assessments, and resource management. The proposed USDA Agricultural Research Service (ARS) Long Term Agro-ecosystem Research (LTAR) network is another example of an environmental observatory that will produce credible data for environmental and agricultural forecasting and for informing policy. To facilitate data fusion across observatories, there is a growing call for observation systems to more closely coordinate and standardize how variables are measured. Together with observation standards, cyberinfrastructure standards enable the proliferation of an ecosystem of applications that utilize diverse, high-quality, credible data. Interoperability facilitates the integration of data from multiple credible sources and enables the repurposing of data for use at different geographical scales. Metadata that captures the transformation of data into value-added products ("provenance") lends reproducibility and transparency to the entire process. This way, the datasets and model code used to create any product can be examined by other parties. This talk outlines a pathway for transforming environmental data into value-added products by various stakeholders to better inform sustainable agriculture using data from environmental observatories including NEON and LTAR.
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.
2012-12-01
To address interoperability, we first need to understand how human perspectives and worldviews influence the way people conceive of and describe geophysical phenomena. There is never a single, unambiguous description of a phenomenon - the terminology used is based on the relationship people have with it and what their interests are. So how can these perspectives be reconciled in a way that is not only clear to different people but also formally described so that information systems can interoperate? In this paper we explore conceptions of Arctic sea ice as a means of exploring these issues. We examine multiple conceptions of sea ice and related processes as fundamental components of the Earth system. Arctic sea ice is undergoing rapid and dramatic decline. This will have huge impacts on climate and biological systems as well as on shipping, exploration, human culture, and geopolitics. Local hunters, operational shipping forecasters, global climate researchers, and others have critical needs for sea ice data and information, but they conceive of and describe sea ice phenomena in very different ways. Our hypothesis is that formally representing these diverse conceptions in a suite of formal ontologies can help facilitate sharing of information across communities and enhance overall Arctic data interoperability. We present initial work to model operational, research, and Indigenous (Iñupiat and Yup'ik) concepts of sea ice phenomena and data. Our results illustrate important and surprising differences in how these communities describe and represent sea ice, and we describe our approach to resolving incongruities and inconsistencies. We begin by exploring an intriguing information artifact, the World Meteorological Organization "egg code". The egg code is a compact, information-rich way of illustrating detailed ice conditions that has been used broadly for a century. There is much agreement on construction and content encoding, but there are important regional differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg code can achieve. The egg code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up, which have profound impacts on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break-up in the regions represented. This helps expand our conceptions of sea ice while also aiding understanding across cultures and communities, and passing knowledge to younger generations. This is an early step in expanding concepts of interoperability to very different ways of knowing, to make data truly relevant and locally useful.
MENTOR: an enabler for interoperable intelligent systems
NASA Astrophysics Data System (ADS)
Sarraipa, João; Jardim-Goncalves, Ricardo; Steiger-Garcao, Adolfo
2010-07-01
A community whose knowledge organisation is based on ontologies will enable an increase in the computational intelligence of its information systems. However, due to the worldwide diversity of communities, a high number of knowledge representation elements that are not semantically coincident have appeared to represent the same segment of reality, becoming a barrier to business communications. Even if a domain community uses the same kind of technologies in its information systems, such as ontologies, this does not resolve its semantic differences. In order to solve this interoperability problem, a solution is to use a reference ontology as an intermediary in the communications between the community enterprises and the outside, while allowing the enterprises to keep their own ontology and semantics unchanged internally. This work proposes MENTOR, a methodology to support the development of a common reference ontology for a group of organisations sharing the same business domain. This methodology is based on the mediator ontology (MO) concept, which assists the semantic transformations between each enterprise's ontology and the reference one. The MO enables each organisation to keep its own terminology, glossary and ontological structures, while providing seamless communication and interaction with the others.
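The mediator-ontology idea can be illustrated with a deliberately simplified sketch: each enterprise keeps its local vocabulary, and a mapping table translates terms through a shared reference vocabulary. The enterprise names, terms, and functions below are hypothetical and are not MENTOR's actual implementation.

```python
# Illustrative sketch (not MENTOR itself) of the mediator-ontology idea:
# each enterprise keeps its own terms, and a mapping table translates them
# to and from a shared reference vocabulary.
REFERENCE_MAP = {
    "acme":   {"fabric_roll": "TextileRoll", "dye_lot": "ColourBatch"},
    "weavco": {"cloth_bolt":  "TextileRoll", "color_run": "ColourBatch"},
}

def to_reference(enterprise: str, term: str) -> str:
    """Translate an enterprise-local term into the reference term."""
    return REFERENCE_MAP[enterprise][term]

def from_reference(enterprise: str, ref_term: str) -> str:
    """Translate a reference term back into an enterprise's local vocabulary."""
    inverse = {v: k for k, v in REFERENCE_MAP[enterprise].items()}
    return inverse[ref_term]

# One enterprise's message is rewritten into the other's vocabulary:
ref = to_reference("acme", "fabric_roll")        # -> "TextileRoll"
print(from_reference("weavco", ref))             # -> "cloth_bolt"
```

Internally each enterprise stays unchanged; only the boundary translation consults the reference vocabulary, which is the design choice the MO concept captures.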
Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul
2008-01-01
With the increase in demand for high-quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multiple units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate of the integrated hospital information system is about 1.8 TByte, and the total quantity of data produced so far is about 60 TByte. Large-scale information exchange and sharing will be particularly useful for telemedicine applications.
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
Managing Complex Interoperability Solutions using Model-Driven Architecture
2011-06-01
such as Oracle or MySQL. Each data model for a specific RDBMS is a distinct PSM. Or the system may want to exchange information with other C2... reduced number of transformations, e.g., from an RDBMS physical schema to the corresponding SQL script needed to instantiate the tables in a relational... tance of models. In engineering, a model serves several purposes: 1. It presents an abstract view of a complex system or of a complex information
NASA Technical Reports Server (NTRS)
Newman, Doug; Mitchell, Andrew
2016-01-01
During the development of CSW (Catalog Service for the Web) support in the CMR (Common Metadata Repository) for the Earth Observing System Data and Information System (EOSDIS), a number of best practices came to light. Given that the ESIP (Earth Science Information Partners) Discovery Cluster is committed to interoperability and standards in earth data discovery, this seemed like a convenient moment to provide best practices for this widely used standard to the organization, in the same way we did for OpenSearch.
NASA Astrophysics Data System (ADS)
Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup
When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing standards efforts to develop interoperability frameworks with functional and non-functional requirements, and illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
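The clinical use case lends itself to a toy illustration of the underlying safety logic: pause ventilation only for the exposure window and guard the restart with a watchdog. The Python sketch below is hypothetical; it is not the prototype described in the paper, and all class and method names are invented.

```python
# Toy sketch of the ventilator/x-ray interlock: the ventilator is paused
# only long enough for the exposure, and a watchdog restarts it if the
# pause exceeds a safe limit. Names and the limit are hypothetical.
import threading

MAX_PAUSE_SECONDS = 10.0

class Ventilator:
    def __init__(self):
        self.running = True
    def pause(self):
        self.running = False
        print("ventilator paused")
    def resume(self):
        self.running = True
        print("ventilator resumed")

def take_xray_with_interlock(vent: Ventilator, expose):
    # Watchdog: resume ventilation even if the exposure path hangs or fails.
    watchdog = threading.Timer(MAX_PAUSE_SECONDS, vent.resume)
    vent.pause()
    watchdog.start()
    try:
        expose()                 # trigger the (simulated) x-ray exposure
    finally:
        watchdog.cancel()
        if not vent.running:
            vent.resume()        # prompt restart on the normal path

take_xray_with_interlock(Ventilator(), lambda: print("x-ray exposed"))
```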
NASA Astrophysics Data System (ADS)
Thomas, Paul A.; Marshall, Gillian; Faulkner, David; Kent, Philip; Page, Scott; Islip, Simon; Oldfield, James; Breckon, Toby P.; Kundegorski, Mikolaj E.; Clark, David J.; Styles, Tim
2016-05-01
Currently, most land Intelligence, Surveillance and Reconnaissance (ISR) assets (e.g. EO/IR cameras) are simply data collectors. Understanding, decision making and sensor control are performed by the human operators, involving high cognitive load. Any automation in the system has traditionally involved bespoke design of centralised systems that are highly specific for the assets/targets/environment under consideration, resulting in complex, non-flexible systems that exhibit poor interoperability. We address a concept of Autonomous Sensor Modules (ASMs) for land ISR, where these modules have the ability to make low-level decisions on their own in order to fulfil a higher-level objective, and plug in, with the minimum of preconfiguration, to a High Level Decision Making Module (HLDMM) through a middleware integration layer. The dual requisites of autonomy and interoperability create challenges around information fusion and asset management in an autonomous hierarchical system, which are addressed in this work. This paper presents the results of a demonstration system, known as Sensing for Asset Protection with Integrated Electronic Networked Technology (SAPIENT), which was shown in realistic base protection scenarios with live sensors and targets. The SAPIENT system performed sensor cueing, intelligent fusion, sensor tasking, target hand-off and compensation for compromised sensors, without human control, and enabled rapid integration of ISR assets at the time of system deployment, rather than at design-time. Potential benefits include rapid interoperability for coalition operations, situation understanding with low operator cognitive burden, and autonomous sensor management in heterogeneous sensor systems.
Myneni, Sahiti; Patel, Vimla L.; Bova, G. Steven; Wang, Jian; Ackerman, Christopher F.; Berlinicke, Cynthia A.; Chen, Steve H.; Lindvall, Mikael; Zack, Donald J.
2016-01-01
This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods of data analysis to 1) characterize specific problems faced by biomedical researchers with traditional information management practices, 2) identify intervention areas to introduce a new research information management system called Labmatrix, and finally to 3) evaluate and delineate important general collaboration (intervention) characteristics that can optimize outcomes of an implementation process in biomedical laboratories. Results emphasize the importance of end user perseverance, human-centric interoperability evaluation, and demonstration of return on investment of effort and time of laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation process of an information management system. Technology transfer experience in a complex environment such as the biomedical laboratory can be eased with the use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and hopefully to scientific productivity. PMID:26652980
Multi-model-based interactive authoring environment for creating shareable medical knowledge.
Ali, Taqdir; Hussain, Maqbool; Ali Khan, Wajahat; Afzal, Muhammad; Hussain, Jamil; Ali, Rahman; Hassan, Waseem; Jamshed, Arif; Kang, Byeong Ho; Lee, Sungyoung
2017-10-01
Technologically integrated healthcare environments can be realized if physicians are encouraged to use smart systems for the creation and sharing of knowledge used in clinical decision support systems (CDSS). While CDSSs are heading toward smart environments, they lack support for abstraction of technology-oriented knowledge from physicians. Therefore, abstraction in the form of a user-friendly and flexible authoring environment is required in order for physicians to create shareable and interoperable knowledge for CDSS workflows. Our proposed system provides a user-friendly authoring environment to create Arden Syntax MLMs (Medical Logic Modules) as shareable knowledge rules for intelligent decision-making by CDSS. Existing systems are not physician friendly and lack interoperability and shareability of knowledge. In this paper, we propose the Intelligent-Knowledge Authoring Tool (I-KAT), a knowledge authoring environment that overcomes the above-mentioned limitations. Shareability is achieved by creating a knowledge base from MLMs using Arden Syntax. Interoperability is enhanced using standard data models and terminologies. However, creation of shareable and interoperable knowledge using Arden Syntax without abstraction increases complexity, which ultimately makes it difficult for physicians to use the authoring environment. Therefore, physician friendliness is provided by abstraction at the application layer to reduce complexity. This abstraction is regulated by mappings created between legacy system concepts, which are modeled as a domain clinical model (DCM), and decision support standards such as the virtual medical record (vMR) and Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT). We represent these mappings with a semantic reconciliation model (SRM). The objective of the study is the creation of shareable and interoperable knowledge using a user-friendly and flexible I-KAT. We therefore evaluated our system using completeness and user satisfaction criteria, which we assessed through system- and user-centric evaluation processes. For the system-centric evaluation, we compared the implementation of clinical information modelling system requirements in our proposed system and in existing systems. The results suggested that 82.05% of the requirements were fully supported, 7.69% were partially supported, and 10.25% were not supported by our system. In the existing systems, 35.89% of requirements were fully supported, 28.20% were partially supported, and 35.89% were not supported. For the user-centric evaluation, the assessment criterion was ease of use. Our proposed system showed a 15-fold improvement in MLM creation time over the existing systems. Moreover, on average, the participants made only one error in MLM creation using our proposed system, but 13 errors per MLM using the existing systems. We provide a user-friendly authoring environment for the creation of shareable and interoperable knowledge for CDSS to overcome knowledge acquisition complexity. The authoring environment uses state-of-the-art decision support-related clinical standards with increased ease of use.
NASA Astrophysics Data System (ADS)
Kuo, K.
2010-12-01
As a practitioner in the field of atmospheric remote sensing, the author, like many other science users, depends on and heavily uses NASA Earth Science remote sensing data. Thus the author was asked by the NASA Earth Science Data and Information System (ESDIS) Project to assess the capabilities of the Earth Observing System Data and Information System (EOSDIS) in order to provide suggestions and recommendations for the evolution of EOSDIS on the path towards its 2015 Vision Tenets. As NASA's Earth science data system, EOSDIS provides data processing and data archiving and distribution services for EOS missions. The science operations of EOSDIS are the focus of this report, i.e., data archiving and distribution, which are performed within a distributed system of many interconnected nodes, namely the Science Investigator-led Processing Systems (SIPS) and distributed data centers. Since its inception in the early 1990s, EOSDIS has represented a democratization of data, a break from the past when data dissemination was at the discretion of project scientists. Its "open data" policy is so highly valued and well received by its user communities that it has influenced other agencies, even those of other countries, to adopt the same open policy. In the last ~10 years EOSDIS has matured to serve very well users of any given science community in which the varieties of data being used change infrequently. The unpleasant effects of interoperability barriers are now more often felt by users who try to use new data outside their existing familiar set. This paper first defines interoperability and identifies the purposes for achieving interoperability. The sources of interoperability barriers, classified by the author into software, hardware, and human categories, are examined. For a subset of issues related to software, it presents diagnoses obtained from the experience of the author and his survey of the EOSDIS data finding, ordering, retrieving, and extraction services. It also reports on an analysis of his survey regarding tools provided by EOSDIS or its user communities and intended to make routine data manipulations easier. Barriers in the hardware category are those resulting from differences in orbit configurations of the spacecraft and differences in remote sensing modality (active or passive), spectral and spatial resolutions, scanning strategies, etc. of the instruments. Such differences are best understood by considering the nature of remotely sensed observations. Human factors are further classified into institutional and individual subcategories. The former includes factors such as NASA's funding practices and the latter relates to individuals' propensity to adopt new technologies. Finally, a strategy for overcoming these barriers is proposed.
A/E/C CAD Standard, Release 4.0
2009-07-01
Insulating (Transformer) Oil System Lubrication Oil Hot Water Heating System Machine Design Appendix A Model File Level/Layer Assignment Tables A51... of the A/E/C CAD Standard are: "Uniform Drawing System" The Construction Specifications Institute 99 Canal Center Plaza, Suite 300 Alexandria, VA... FM – Facility Management GIS – Geographic Information System IAI – International Alliance for Interoperability IFC – Industry Foundation
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
..., Reliability, and Interoperability Council AGENCY: Federal Communications Commission. ACTION: Notice of public... persons that the Federal Communications Commission's (FCC) Communications Security, Reliability, and... the security, reliability, and interoperability of communications systems. On March 19, 2011, the FCC...
Reminiscing about 15 years of interoperability efforts
Van de Sompel, Herbert; Nelson, Michael L.
2015-11-01
Over the past fifteen years, our perspective on tackling information interoperability problems for web-based scholarship has evolved significantly. In this opinion piece, we look back at three efforts that we have been involved in that aptly illustrate this evolution: OAI-PMH, OAI-ORE, and Memento. Understanding that no interoperability specification is neutral, we attempt to characterize the perspectives and technical toolkits that provided the basis for these endeavors. In that regard, we consider repository-centric and web-centric interoperability perspectives, and the use of a Linked Data or a REST/HATEOAS technology stack, respectively. In addition, we lament the lack of interoperability across nodes that play a role in web-based scholarship, but end on a constructive note with some ideas regarding a possible path forward.
NASA Technical Reports Server (NTRS)
Greenberg, Paul S.
2012-01-01
Firefighters and other emergency response personnel are presented with an increasing array of technologies to improve their health and safety. This includes real-time bidirectional communication, navigation and positional information, data on physiological and metabolic functions, as well as data on their surrounding environment. The emerging challenge is to integrate these elements into a practical system, addressing such features as power, data transfer, and inter-element coordination and communication. In many respects, NASA has addressed these aspects in the context of Extra Vehicular Activity (EVA). The EVA environment shares many common attributes with that of emergency response scenarios. A similar situation exists in terms of the need for interoperability among the various system sub-elements. A brief overview is presented on the similarities and differences in these two applications, as well as the technical approach adopted by NASA in terms of system design philosophy.
E-health and healthcare enterprise information system leveraging service-oriented architecture.
Hsieh, Sung-Huai; Hsieh, Sheau-Ling; Cheng, Po-Hsun; Lai, Feipei
2012-04-01
To present the successful experiences of an integrated, collaborative, distributed, large-scale enterprise healthcare information system over a wired and wireless infrastructure in National Taiwan University Hospital (NTUH). In order to smoothly and sequentially transfer from the complex relations among the old (legacy) systems to the new-generation enterprise healthcare information system, we adopted the multitier framework based on service-oriented architecture to integrate the heterogeneous systems as well as to interoperate among many other components and multiple databases. We also present mechanisms of a logical layer reusability approach and data (message) exchange flow via Health Level 7 (HL7) middleware, DICOM standard, and the Integrating the Healthcare Enterprise workflow. The architecture and protocols of the NTUH enterprise healthcare information system, especially in the Inpatient Information System (IIS), are discussed in detail. The NTUH Inpatient Healthcare Information System is designed and deployed on service-oriented architecture middleware frameworks. The mechanisms of integration as well as interoperability among the components and the multiple databases apply the HL7 standards for data exchanges, which are embedded in XML formats, and Microsoft .NET Web services to integrate heterogeneous platforms. The preliminary performance of the current operation IIS is evaluated and analyzed to verify the efficiency and effectiveness of the designed architecture; it shows reliability and robustness in the highly demanding traffic environment of NTUH. The newly developed NTUH IIS provides an open and flexible environment not only to share medical information easily among other branch hospitals, but also to reduce the cost of maintenance. The HL7 message standard is widely adopted to cover all data exchanges in the system. All services are independent modules that enable the system to be deployed and configured to the highest degree of flexibility. Furthermore, we can conclude that the multitier Inpatient Healthcare Information System has been designed successfully and in a collaborative manner, based on the index of performance evaluations, central processing unit, and memory utilizations.
Relevance of eHealth standards for big data interoperability in radiology and beyond.
Marcheschi, Paolo
2017-06-01
The aim of this paper is to report on the implementation of radiology and related information technology standards to feed big data repositories and thereby create a solid substrate on which analysis software can operate. Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) are the major standards for radiology and medical information technology. They define formats and protocols to transmit medical images, signals, and patient data inside and outside hospital facilities. These standards can be implemented as they stand, but big data expectations are stimulating a new approach that simplifies data collection and interoperability, seeking to reduce the time to full implementation inside health organizations. The Virtual Medical Record, DICOM Structured Reporting, and HL7 Fast Healthcare Interoperability Resources (FHIR) are changing the way medical data are shared among organizations, and they will be the keys to big data interoperability. Until we find simple and comprehensive methods to store and disseminate detailed information on patients' health, we will not be able to get optimal results from the analysis of those data.
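Of the standards named above, FHIR is the most directly approachable in code, since resources are plain JSON documents. The sketch below assembles a minimal FHIR R4 Observation as a Python dict; the LOINC and UCUM codes shown are standard, while the patient reference and value are illustrative.

```python
# Minimal sketch of an HL7 FHIR (R4) Observation resource as a plain
# JSON-serializable dict. LOINC/UCUM codes are standard; the patient
# reference and value are illustrative.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8867-4",                # LOINC: heart rate
            "display": "Heart rate",
        }]
    },
    "subject": {"reference": "Patient/example"},   # illustrative reference
    "valueQuantity": {
        "value": 72,
        "unit": "beats/minute",
        "system": "http://unitsofmeasure.org",
        "code": "/min",
    },
}

print(json.dumps(observation, indent=2))
```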
2003-04-09
reduction, marijuana eradication, mobile enforcement teams, and the Organized Crime Drug Enforcement Task Force. The DEA also operates eight laboratories... Owens' Columbine Review Commission. Denver, CO: State of Colorado. Council on Foreign Relations. 2002. FBI and Law Enforcement. Washington DC... results of this research indicate high usage and perceived usefulness of the National Crime Information Center Network (NCIC Net), National Law
Personal Health Records: Is Rapid Adoption Hindering Interoperability?
Studeny, Jana; Coustasse, Alberto
2014-01-01
The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relation to PHRs and to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability. PMID:25214822
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Ingalls, John; Bengtsson, Kjell
2014-01-01
No known system is in place to allow NASA technical data interoperability throughout the whole life cycle. Life Cycle Cost (LCC) will be higher on many developing programs if action isn't taken soon to join disparate systems efficiently. Disparate technical data also increases safety risks from poorly integrated elements. NASA requires interoperability and industry standards, but breaking legacy ways is a challenge.
NASA Technical Reports Server (NTRS)
Tavenner, Leslie A. (Editor)
1991-01-01
These proceedings overview major space information system projects and lessons learned from current missions. Other topics include the science information system requirements for the 1990s, an information systems design approach for major programs, technology needs and projections, standards for space data information systems, artificial intelligence technology and applications, international interoperability, spacecraft data systems and architectures, and advanced communications. Further topics include software engineering technology and applications, multimission multidiscipline information system architectures, distributed planning and scheduling systems and operations, and computer and information systems architectures. Papers presented include prospects for scientific data analysis systems for solar-terrestrial physics in the 1990s, the Columbus data management system, data storage technologies for the future, the German aerospace research establishment, and launching artificial intelligence in NASA ground systems.
Enabling Interoperability in Heliophysical Domains
NASA Astrophysics Data System (ADS)
Bentley, Robert
2013-04-01
There are many aspects of science in the Solar System that overlap - phenomena observed in one domain can have effects in other domains. However, there are many problems related to exploiting the data in cross-disciplinary studies because of the lack of interoperability of the data and services. The CASSIS project is a Coordination Action funded under FP7 that has the objective of improving the interoperability of data and services related to Solar System science. CASSIS has been investigating how the data could be made more accessible with some relatively minor changes to the observational metadata. The project has been looking at the services that are used within the domain and determining whether they are interoperable with each other and, if not, what would be required to make them so. It has also been examining all types of metadata that are used when identifying and using observations and trying to make them more compliant with techniques and standards developed by bodies such as the International Virtual Observatory Alliance (IVOA). Many of the lessons being learnt in the study are applicable to domains beyond those directly involved in heliophysics. Adopting some simple standards related to the design of service interfaces and the metadata that are used would make it much easier to investigate interdisciplinary science topics. We will report on our findings and describe a roadmap for the future. For more information about CASSIS, please visit the project Web site at cassis-vo.eu
Military Interoperable Digital Hospital Testbed (MIDHT)
2013-01-01
users of the PACS system in terms of viewing images originating from Miners and Meyersdale are Emergency Medicine and Trauma physicians. This... conditions, over-the-counter/herbal medications, physician list, and emergency contacts. Through secure messaging with their physician, patients... et al. (1999). Impact of a patient-centered, computer-based health information/support system. American Journal of Preventive Medicine, 16(1), 1-9
What it will take to achieve the as-yet-unfulfilled promises of health information technology.
Kellermann, Arthur L; Jones, Spencer S
2013-01-01
A team of RAND Corporation researchers projected in 2005 that rapid adoption of health information technology (IT) could save the United States more than $81 billion annually. Seven years later the empirical data on the technology's impact on health care efficiency and safety are mixed, and annual health care expenditures in the United States have grown by $800 billion. In our view, the disappointing performance of health IT to date can be largely attributed to several factors: sluggish adoption of health IT systems, coupled with the choice of systems that are neither interoperable nor easy to use; and the failure of health care providers and institutions to reengineer care processes to reap the full benefits of health IT. We believe that the original promise of health IT can be met if the systems are redesigned to address these flaws by creating more-standardized systems that are easier to use, are truly interoperable, and afford patients more access to and control over their health data. Providers must do their part by reengineering care processes to take full advantage of efficiencies offered by health IT, in the context of redesigned payment models that favor value over volume.
2013-10-01
exchange (COBie), Building Information Modeling (BIM), value-added analysis, business processes, project management... equipment. The innovative aspect of Building Information Modeling (BIM) is that it creates a computable building description. The ability to use a... interoperability. In order for the building information to be interoperable, it must also conform to a common data model, or schema, that defines the class
NASA Technical Reports Server (NTRS)
Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.
2002-01-01
Interoperability and data exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public. More information is to be made available through web-enabled technologies. Programs such as NASA's Scientific and Technical Information (STI) Program Office are tasked to find more effective ways to disseminate information to the public. The NASA STI Program is an agency-wide program charged with gathering, organizing, storing, and disseminating NASA-produced information for research and public use. The program is investigating the use of a new protocol called the Open Archives Initiative (OAI) as a means to improve data interoperability and data collection. OAI promotes the use of the OAI harvesting protocol as a simple way for data sharing among repositories. The STI Program is implementing OAI in two separate initiatives. In collaboration with the Air Force, Department of Energy, and Old Dominion University, the NASA STI Program has funded research on implementing the OAI to exchange data between the three organizations. The second initiative is the deployment of OAI for the NASA technical report server (TRS) environment. The NASA TRS environment is comprised of distributed technical report servers with a centralized search interface. This paper focuses on the implementation of OAI to promote interoperability among diverse data repositories.
Interoperability prototype between hospitals and general practitioners in Switzerland.
Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar
2010-01-01
Interoperability in data exchange has the potential to improve care processes and decrease costs of the health care system. Many countries have related eHealth initiatives in preparation or already implemented. In this area, Switzerland has yet to catch up. Its health system is fragmented because of the federated nature of the cantons. It is thus more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project a pragmatic approach was selected: integrating several partners in healthcare on a regional scale in French-speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, medium-sized hospitals and general practitioners in particular were targeted in Medicoordination to implement concrete scenarios of information exchange between hospitals and general practitioners with a high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging discharge letters reveals small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high, and it is now important to show that interoperability can work in practice.
Improving Cancer-Related Outcomes with Connected Health - Action Items at a Glance
Action Item 1.1: Health IT stakeholder groups should continue to collaborate to overcome policy and technical barriers to a nationwide, interoperable health IT system. Action Item 1.2: Technical standards for information related to cancer care across the continuum should be developed, tested, disseminated, and adopted.
Operational Plan Ontology Model for Interconnection and Interoperability
NASA Astrophysics Data System (ADS)
Long, F.; Sun, Y. K.; Shi, H. Q.
2017-03-01
To address the bottleneck that operational plan data and information processing poses for assistant decision-making systems, this paper starts from an analysis of the problems of traditional representations and the technical advantages of ontology. It then defines the elements of the operational plan ontology model and determines the basis for its construction, and builds a semi-knowledge-level operational plan ontology model. Finally, it examines the expression of operational plans based on the ontology model and its use in application software. The paper thus has theoretical significance and practical value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.
International Convergence on Geoscience Cyberinfrastructure
NASA Astrophysics Data System (ADS)
Allison, M. L.; Atkinson, R.; Arctur, D. K.; Cox, S.; Jackson, I.; Nativi, S.; Wyborn, L. A.
2012-04-01
There is growing international consensus on addressing the challenges to cyber(e)-infrastructure for the geosciences. These challenges include: creating common standards and protocols; engaging the vast number of distributed data resources; establishing practices for recognition of and respect for intellectual property; developing simple data and resource discovery and access systems; building mechanisms to encourage development of web service tools and workflows for data analysis; brokering the diverse disciplinary service buses; creating sustainable business models for maintenance and evolution of information resources; and integrating the data management life-cycle into the practice of science. Efforts around the world are converging towards the de facto creation of an integrated global digital data network for the geosciences based on common standards and protocols for data discovery and access, and a shared vision of distributed, web-based, open source interoperable data access and integration. Commonalities include the use of Open Geospatial Consortium (OGC) and ISO specifications and standardized data interchange mechanisms. For multidisciplinarity, mediation, adaptation, and profiling services have been successfully introduced to leverage the geoscience standards commonly used by the different geoscience communities, introducing a brokering approach that extends the basic SOA archetype. The principal challenges are less technical than cultural, social, and organizational. Before we can make data interoperable, we must make people interoperable. These challenges are being met by increased coordination of development activities (technical, organizational, social) among leaders and practitioners in national and international efforts across the geosciences to foster commonalities across disparate networks. In doing so, we will 1) leverage and share resources and developments, 2) facilitate and enhance emerging technical and structural advances, 3) promote interoperability across scientific domains, 4) support the promulgation and institutionalization of agreed-upon standards, protocols, and practice, 5) enhance knowledge transfer not only across the community but into the domain sciences, 6) lower existing entry barriers for users and data producers, and 7) build on the existing disciplinary infrastructures, leveraging their service buses. All of these objectives are required for establishing a permanent and sustainable cyber(e)-infrastructure for the geosciences. The rationale for this approach is well articulated in the AuScope mission statement: "Many of these problems can only be solved on a national, if not global scale. No single researcher, research institution, discipline or jurisdiction can provide the solutions. We increasingly need to embrace e-Research techniques and use the internet not only to access nationally distributed datasets, instruments and compute infrastructure, but also to build online, 'virtual' communities of globally dispersed researchers." Multidisciplinary interoperability can be successfully pursued by adopting a "system of systems" or a "network of networks" philosophy. This approach aims to: (a) supplement but not supplant systems mandates and governance arrangements; (b) keep the existing capacities as autonomous as possible; (c) lower entry barriers; (d) build incrementally on existing infrastructures (information systems); and (e) incorporate heterogeneous resources by introducing distribution and mediation functionalities.
This approach has been adopted by the European INSPIRE (Infrastructure for Spatial Information in the European Community) initiative and by the international GEOSS (Global Earth Observation System of Systems) programme.
Standardized exchange of clinical documents--towards a shared care paradigm in glaucoma treatment.
Gerdsen, F; Müller, S; Jablonski, S; Prokosch, H-U
2006-01-01
The exchange of medical data from research and clinical routine across institutional borders is essential to establish an integrated healthcare platform. In this project we want to realize the standardized exchange of medical data between different healthcare institutions to implement an integrated and interoperable information system supporting clinical treatment and research of glaucoma. The central point of our concept is a standardized communication model based on the Clinical Document Architecture (CDA). Further, a communication concept between different health care institutions applying the developed document model has been defined. With our project we have been able to prove that standardized communication between an Electronic Medical Record (EMR), an Electronic Health Record (EHR) and the Erlanger Glaucoma Register (EGR) based on the established conceptual models, which rely on CDA rel.1 level 1 and SCIPHOX, could be implemented. The HL7-tool-based deduction of a suitable CDA rel.2 compliant schema showed significant differences when compared with the manually created schema. Finally fundamental requirements, which have to be implemented for an integrated health care platform, have been identified. An interoperable information system can enhance both clinical treatment and research projects. By automatically transferring screening findings from a glaucoma research project to the electronic medical record of our ophthalmology clinic, clinicians could benefit from the availability of a longitudinal patient record. The CDA as a standard for exchanging clinical documents has demonstrated its potential to enhance interoperability within a future shared care paradigm.
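For orientation, a CDA document is an XML instance in the urn:hl7-org:v3 namespace. The heavily simplified Python sketch below builds only a fragment of a document header; a conformant CDA document requires many more mandatory header elements and a structured body, and the title shown is illustrative.

```python
# Heavily simplified sketch of a CDA-style XML header fragment using the
# standard library. A real CDA document needs many more mandatory elements;
# the title is illustrative.
import xml.etree.ElementTree as ET

NS = "urn:hl7-org:v3"
ET.register_namespace("", NS)

doc = ET.Element(f"{{{NS}}}ClinicalDocument")
code = ET.SubElement(doc, f"{{{NS}}}code")
code.set("code", "11488-4")                 # LOINC: Consult note
code.set("codeSystem", "2.16.840.1.113883.6.1")   # OID for LOINC
title = ET.SubElement(doc, f"{{{NS}}}title")
title.text = "Glaucoma follow-up note"

print(ET.tostring(doc, encoding="unicode"))
```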
OAI and NASA's Scientific and Technical Information
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Rocker, JoAnne; Harrison, Terry L.
2002-01-01
The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) is an evolving protocol and philosophy regarding interoperability for digital libraries (DLs). Previously, "distributed searching" models were popular for DL interoperability. However, experience has shown distributed searching systems across large numbers of DLs to be difficult to maintain in an Internet environment. The OAI-PMH is a move away from distributed searching, focusing on the arguably simpler model of "metadata harvesting". We detail NASA's involvement in defining and testing the OAI-PMH and experience to date with adapting existing NASA distributed searching DLs (such as the NASA Technical Report Server) to use the OAI-PMH and metadata harvesting. We discuss some of the entirely new DL projects that the OAI-PMH has made possible, such as the Technical Report Interchange project. We explain the strategic importance of the OAI-PMH to the mission of NASA's Scientific and Technical Information Program.
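The metadata-harvesting model is simple enough to sketch directly. The Python loop below issues a ListRecords request and follows resumptionTokens; the verbs and parameters are defined by the OAI-PMH specification, while the base URL is a placeholder for any compliant repository.

```python
# Minimal sketch of one OAI-PMH harvesting round trip with requests.
# ListRecords, metadataPrefix=oai_dc, and resumptionToken are defined by
# the OAI-PMH spec; the base URL is a placeholder repository endpoint.
import requests
import xml.etree.ElementTree as ET

BASE = "https://example.org/oai"
NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}

params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
while True:
    root = ET.fromstring(requests.get(BASE, params=params, timeout=30).text)
    for rec in root.iterfind(".//oai:record", NS):
        ident = rec.findtext(".//oai:identifier", namespaces=NS)
        print("harvested", ident)
    # A resumptionToken means more pages remain; none ends the loop.
    token = root.findtext(".//oai:resumptionToken", namespaces=NS)
    if not token:
        break
    params = {"verb": "ListRecords", "resumptionToken": token}
```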
NASA Astrophysics Data System (ADS)
Plessel, T.; Szykman, J.; Freeman, M.
2012-12-01
EPA's Remote Sensing Information Gateway (RSIG) is a widely used free applet and web service for quickly and easily retrieving, visualizing and saving user-specified subsets of atmospheric data - by variable, geographic domain and time range. Petabytes of available data include thousands of variables from a set of NASA and NOAA satellites, aircraft, ground stations and EPA air-quality models. The RSIG applet is used by atmospheric researchers and uses the rsigserver web service to obtain data and images. The rsigserver web service is compliant with the Open Geospatial Consortium Web Coverage Service (OGC-WCS) standard to facilitate data discovery and interoperability. Since rsigserver is publicly accessible, it can be (and is) used by other applications. This presentation describes the architecture and technical implementation details of this successful system with an emphasis on achieving convenience, high-performance, data integrity and security.
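Because rsigserver follows the OGC-WCS standard, a client can be sketched with ordinary HTTP requests. In the hypothetical Python example below, the endpoint URL, coverage name, time range, and format are placeholders; only the key-value parameter convention follows OGC WCS 1.0.

```python
# Sketch of querying an OGC Web Coverage Service: GetCapabilities to
# discover coverages, then GetCoverage to subset by variable, bounding box,
# and time. Endpoint, coverage name, and format are placeholders.
import requests

ENDPOINT = "https://example.gov/rsig"     # placeholder web-service URL

caps = requests.get(ENDPOINT, params={
    "SERVICE": "WCS", "VERSION": "1.0.0", "REQUEST": "GetCapabilities",
}, timeout=60)
print(caps.text[:200])                    # inspect advertised coverages

data = requests.get(ENDPOINT, params={
    "SERVICE": "WCS", "VERSION": "1.0.0", "REQUEST": "GetCoverage",
    "COVERAGE": "ozone",                  # illustrative variable name
    "BBOX": "-90,25,-60,50",              # lon/lat subset
    "TIME": "2012-07-01/2012-07-02",
    "FORMAT": "NetCDF",                   # format support varies by server
}, timeout=300)
with open("subset.nc", "wb") as f:
    f.write(data.content)
```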
Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems
2008-09-01
divisions of the enterprise. Examples of the current I2 are: • a nightly feed of elearning information is captured through an automated and... standardized process throughout the enterprise and • the LMS has been integrated with SkillSoft, a third-party elearning software system, (http... Command (JITC) is responsible to test all programs that utilize standard interfaces to specific global nets or systems. Many times programs that
Interoperability, Scaling, and the Digital Libraries Research Agenda.
ERIC Educational Resources Information Center
Lynch, Clifford; Garcia-Molina, Hector
1996-01-01
Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Communication Interoperability Plan Implementation Report AGENCY: National Protection and Programs Directorate... Directorate/Cybersecurity and Communications/Office of Emergency Communications, has submitted the following... INFORMATION: The Office of Emergency Communications (OEC), formed under Title XVIII of the Homeland Security...
Methods and apparatus for distributed resource discovery using examples
NASA Technical Reports Server (NTRS)
Chang, Yuan-Chi (Inventor); Li, Chung-Sheng (Inventor); Smith, John Richard (Inventor); Hill, Matthew L. (Inventor); Bergman, Lawrence David (Inventor); Castelli, Vittorio (Inventor)
2005-01-01
Distributed resource discovery is an essential step for information retrieval and/or providing information services. This step is usually used for determining the location of an information or data repository which has relevant information. The most fundamental challenge is the usual lack of semantic interoperability of the requested resource. In accordance with the invention, a method is disclosed where distributed repositories achieve semantic interoperability through the exchange of examples and, optionally, classifiers. The outcome of the inventive method can be used to determine whether common labels are referring to the same semantic meaning.
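A toy version of the idea, not the patented method itself, can be sketched as follows: each repository summarizes its labels by the centroid of exchanged example feature vectors, and labels whose centroids are close are treated as semantically equivalent. All names and numbers below are invented for illustration.

```python
# Toy illustration of label reconciliation via exchanged examples: labels
# whose example centroids are close are treated as semantically equivalent.
# Feature vectors, labels, and the threshold are all made up.
def centroid(examples):
    n = len(examples)
    return tuple(sum(v[i] for v in examples) / n for i in range(len(examples[0])))

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

repo_a = {"maize": [(0.9, 0.1), (0.8, 0.2)]}
repo_b = {"corn":  [(0.85, 0.15), (0.9, 0.1)], "wheat": [(0.1, 0.9)]}

THRESHOLD = 0.2
for label_a, ex_a in repo_a.items():
    for label_b, ex_b in repo_b.items():
        d = distance(centroid(ex_a), centroid(ex_b))
        verdict = "same meaning" if d < THRESHOLD else "different"
        print(f"{label_a!r} vs {label_b!r}: {verdict} (d={d:.2f})")
```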
Watershed and Economic Data InterOperability (WEDO) ...
Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interoperability goes beyond the current practice of publishing modeling studies as reports or journal articles. Rather than summarized results, modeling studies can be published with their full complement of input data, calibration parameters and output with associated metadata for easy duplication by others. Reproducible science is possible only if researchers can find, evaluate and use complete modeling studies performed by other modelers. WEDO greatly increases transparency by making detailed data available to the scientific community. WEDO is a next-generation technology, a Web Service linked to the EPA's EnviroAtlas for discovery of modeling studies nationwide. Streams and rivers are identified using the National Hydrography Dataset network and stream IDs. Streams with modeling studies available are color coded in the EnviroAtlas. One can select streams within a watershed of interest to readily find data available via WEDO. The WEDO website is linked from the EnviroAtlas to provide a thorough review of each modeling study. WEDO currently provides modeled flow and water quality time series, designed for a broad range of watershed and economic models for nutrient trading market analysis.
Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak
2015-07-01
This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) systematic review methodology, the authors reviewed published papers between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good-practice methodology to be used by any clinical information modeler.
A Common Metadata System for Marine Data Portals
NASA Astrophysics Data System (ADS)
Wosniok, C.; Breitbach, G.; Lehfeldt, R.
2012-04-01
Processing and allocation of marine datasets depend on the nature of the data resulting from field campaigns, continuous monitoring and numerical modeling. Two research and development projects in northern Germany manage different types of marine data. Due to different data characteristics and institutional frameworks, separate data portals are required. This paper describes the integration of distributed marine data in Germany. The Marine Data Infrastructure of Germany (MDI-DE) supports public authorities in the German coastal zone with the implementation of European directives like INSPIRE or the Marine Strategy Framework Directive. This is carried out by setting up standardized web services within a network of participating coastal agencies and by installing a common data portal (http://www.mdi-de.org), which integrates distributed marine data concerning coastal engineering, coastal water protection and nature conservation in an interoperable and harmonized manner for administrative and scientific purposes as well as for the information of the general public. The Coastal Observation System for Northern and Arctic Seas (COSYNA) aims at developing and testing analysis systems for the operational synoptic description of the environmental status of the North Sea and of Arctic coastal waters. This is done by establishing a network of monitoring facilities and the provision of their data in near-real-time. In situ measurements with poles, ferry boxes, and buoys, together with remote sensing measurements, and the assimilation of these data into simulation results enable COSYNA to provide pre-operational 'products' that are beyond the present routinely applied techniques in observation and modelling. The data allocation in near-real-time requires thoroughly executed data validation, which is processed on the fly before data are passed on to the COSYNA portal (http://kofserver2.hzg.de/codm/). Both projects apply OGC standards such as the Web Map Service (WMS), Web Feature Service (WFS) and Sensor Observation Service (SOS), which ensures interoperability and extensibility. In addition, metadata, a crucial component for searching and finding information in large data infrastructures, is provided via the Catalogue Service for the Web (CS-W). MDI-DE and COSYNA rely on the metadata information system for marine metadata NOKIS, which reflects a metadata profile tailored for marine data according to the specifications of German coastal authorities. In spite of this common software base, interoperability between the two data collections requires constant alignment of the diverse data processed by the two portals. While monitoring data in the MDI-DE is currently rather campaign-based, COSYNA has to fit constantly evolving time series into metadata sets. With all data following the same metadata profile, we now reach full interoperability between the different data collections. The distributed marine information system provides options to search, find and visualise the harmonised results from continuous monitoring, field campaigns, numerical modeling and other data in one web client.
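Catalogue discovery of the kind both portals support can be sketched with OWSLib's CSW client; the endpoint URL and search term below are placeholders, not the actual MDI-DE or COSYNA services.

```python
# Sketch of discovering datasets through an OGC Catalogue Service for the
# Web (CS-W) with OWSLib. The endpoint and search term are placeholders.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/csw")   # placeholder endpoint

# Full-text-style search over the catalogue's AnyText queryable.
query = PropertyIsLike("csw:AnyText", "%water quality%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```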
Zywietz, Christoph
2004-01-01
The evolution of information technology and of telematics, and increasing efforts to establish an electronic health record, stimulate the development and introduction of new concepts in health care. However, compared to other application areas, e.g., tourism, banking, and commerce, the use of information technology in health care is still of limited success. In hospitals as well as in ambulatory medicine (general practitioner systems), computers are often used only for administrative purposes. Fully operational hospital information systems (HIS) are rare and are often island solutions. The situation is somewhat better for department information systems (DIS), e.g., where image analysis or the processing of biochemical data or of biosignals is in the clinical focus. Even before we have solved the various problems of health care data processing and management within the "conventional" care institutions, new challenges are coming up with concepts of telemedicine for assisted and non-assisted home care for patients with chronic diseases or people at high risk. The major challenges for the provision of tele-monitoring and alarm services are the improvement of communication and the interoperability of devices and care providers. A major obstacle to achieving such goals is the lack of standards for devices as well as for procedures, and the lack of databases with information on the "normal" variability of many medical parameters to be monitored by serial comparison in continuous medical care. Some of these aspects will be discussed in more detail.
Scalable and Resilient Middleware to Handle Information Exchange during Environment Crisis
NASA Astrophysics Data System (ADS)
Tao, R.; Poslad, S.; Moßgraber, J.; Middleton, S.; Hammitzsch, M.
2012-04-01
The EU FP7 TRIDEC project focuses on enabling real-time, intelligent information management for collaborative, complex, critical decision processes in earth management. A key challenge is to provide a communication infrastructure that facilitates interoperable environmental information services during environmental events and crises such as tsunamis and drilling incidents, during which increasing volumes and dimensionality of disparate information sources, both sensor-based and human-based, arise and need to be managed. Such a system needs to support: scalable, distributed messaging; asynchronous messaging; open messaging that can handle changing clients, such as new or retired automated systems and human information sources coming online or going offline; flexible data filtering; and heterogeneous access networks (e.g., GSM, WLAN and LAN). In addition, the system needs to be resilient to ICT problems, e.g., failures, degradation and overloads, during environmental events. There are several middleware choices for TRIDEC based upon a service-oriented architecture (SOA), event-driven architecture (EDA), cloud computing, and an enterprise service bus (ESB). In an SOA, everything is a service (e.g., data access, processing and exchange); clients can request services on demand or subscribe to services registered by providers; interaction is most often synchronous. In an EDA, events that represent significant changes in state can be processed simply, as streams, or in more complex ways. Cloud computing is a virtualized, interoperable and elastic resource-allocation model. An ESB, a fundamental component for enterprise messaging, supports synchronous and asynchronous message exchange models and has built-in resilience against ICT failure. Our middleware proposal is an ESB-based hybrid architecture model: an SOA extension supports more synchronous workflows; the EDA assists the ESB in handling more complex event processing; and cloud computing can be used to increase and decrease the ESB resources on demand. To reify this hybrid ESB-centric architecture, we will adopt two complementary approaches: an open-source one to improve scalability and resilience, and a commercial one for ultra-fast messaging, with a bridge between the two to support interoperability. In TRIDEC, to manage such a hybrid messaging system, overlay and underlay management techniques will be adopted. The managers (both global and local) will collect, store and update status information (e.g., CPU utilization, free space, number of clients) and balance usage, throughput, and delays to improve resilience and scalability. The expected resilience improvements include dynamic failover, self-healing, pre-emptive load balancing, and bottleneck prediction, while the expected scalability improvements include capacity estimation, an HTTP bridge, and automatic configuration and reconfiguration (e.g., adding or deleting clients and servers).
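The messaging core of such an ESB/EDA hybrid is topic-based publish/subscribe with asynchronous delivery. The following minimal Python sketch illustrates that pattern only; it is a conceptual toy, not TRIDEC's actual middleware.

    # Conceptual toy of the publish/subscribe messaging at the heart of an
    # ESB/EDA hybrid: topic-based routing with asynchronous delivery.
    import queue
    import threading

    class MiniBroker:
        def __init__(self):
            self._subscribers = {}   # topic -> list of subscriber queues
            self._lock = threading.Lock()

        def subscribe(self, topic):
            """Register a client; it consumes its queue asynchronously."""
            q = queue.Queue()
            with self._lock:
                self._subscribers.setdefault(topic, []).append(q)
            return q

        def publish(self, topic, event):
            """Deliver an event to all current subscribers of a topic."""
            with self._lock:
                targets = list(self._subscribers.get(topic, []))
            for q in targets:
                q.put(event)

    broker = MiniBroker()
    inbox = broker.subscribe("sensor/sea-level")
    broker.publish("sensor/sea-level", {"station": "buoy-07", "value_m": 1.94})
    print(inbox.get())   # {'station': 'buoy-07', 'value_m': 1.94}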
Telemedicine system interoperability architecture: concept description and architecture overview.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard Layne, II
2004-05-01
In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.
Health Information Exchange as a Complex and Adaptive Construct: Scoping Review.
Akhlaq, Ather; Sheikh, Aziz; Pagliari, Claudia
2017-01-25
To understand how the concept of Health Information Exchange (HIE) has evolved over time. Supplementary analysis of data from a systematic scoping review of definitions of HIE from 1900 to 2014, involving temporal analysis of underpinning themes. The search identified 268 unique definitions of HIE dating from 1957 onwards; 103 in scientific databases and 165 in Google. These contained consistent themes, representing the core concept of exchanging health information electronically, as well as fluid themes, reflecting the evolving policy, business, organisational and technological context of HIE (including the emergence of HIE as an organisational 'entity'). These are summarised graphically to show how the concept has evolved around the world with the passage of time. The term HIE emerged in 1957 with the establishment of Occupational HIE, evolving through the 1990s with concepts such as electronic data interchange and mobile computing technology; then from 2006-10 largely aligning with the US Government's health information technology strategy and the creation of HIEs as organisational entities, alongside the broader interoperability imperative, and continuing to evolve today as part of a broader international agenda for sustainable, information-driven health systems. The concept of HIE is an evolving and adaptive one, reflecting the ongoing quest for integrated and interoperable information to improve the efficiency and effectiveness of health systems, in a changing technological and policy environment.
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.; Witherspoon, Keith R.
2006-01-01
To date very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering for the many processes, schemas, vehicles, robots, space suits and technologies (e.g., versions), to name a few, in the highly complex Constellation initiative is imperative. The number of corporations, development personnel, system interfaces, and human interfaces will require standardization and registries on a scale not currently envisioned. It would take only one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is a registry a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation initiative and with other, unrelated space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those that are currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and Birds of a Feather (BoF) activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another standards body or not yet standardized. For activities one through three, we will provide the analysis by either discipline or technology, with rationale, identification and a brief description of requirements and precedence. For activity four, we will provide a list of current standards bodies, e.g., the IETF, and a list of potential candidates.
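To make the registry/repository distinction above concrete, here is a hedged Python sketch of a registry in that sense: it holds controlled identifiers and pointers to governed artifacts, never the artifacts themselves. All field names are hypothetical.

    # Hedged sketch of a registry: controlled identifiers plus pointers to
    # governed artifacts, not the artifacts themselves (that would be a
    # repository). Field names are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class RegistryEntry:
        identifier: str   # controlled, unique name (e.g. a satellite designation)
        category: str     # discipline or technology class
        authority: str    # organization responsible for the assignment
        reference: str    # pointer to the governed artifact, not the artifact

    class Registry:
        def __init__(self):
            self._entries = {}

        def register(self, entry):
            if entry.identifier in self._entries:
                raise ValueError(f"{entry.identifier} already assigned")
            self._entries[entry.identifier] = entry

        def resolve(self, identifier):
            return self._entries[identifier]

    reg = Registry()
    reg.register(RegistryEntry("2006-001A", "satellite", "CCSDS",
                               "https://example.org/sat/2006-001A"))
    print(reg.resolve("2006-001A").authority)   # CCSDS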
Liaw, S T; Rahimi, A; Ray, P; Taggart, J; Dennis, S; de Lusignan, S; Jalaludin, B; Yeo, A E T; Talaei-Khoei, A
2013-01-01
Effective use of routine data to support integrated chronic disease management (CDM) and population health depends on underlying data quality (DQ) and, for cross-system use of data, on semantic interoperability. An ontological approach to DQ is a potential solution, but research in this area is limited and fragmented. Objective: identify mechanisms, including ontologies, to manage DQ in integrated CDM, and determine whether improved DQ will better measure health outcomes. A realist review of English-language studies (January 2001-March 2011) which addressed data quality, used ontology-based approaches and were relevant to CDM. We screened 245 papers; we excluded 26 duplicates, 135 on abstract review and 31 on full-text review, leaving 61 papers for critical appraisal. Of the 33 papers that examined ontologies in chronic disease management, 13 defined data quality and 15 used ontologies for DQ. Most saw DQ as a multidimensional construct, the most used dimensions being completeness, accuracy, correctness, consistency and timeliness. The majority of studies reported tool design and development (80%), implementation (23%), and descriptive evaluations (15%). Ontological approaches were used to address semantic interoperability, decision support, flexibility of information management and integration/linkage, and complexity of information models. DQ lacks a consensus conceptual framework and definition. DQ and ontological research is relatively immature, with few rigorous evaluation studies published. Ontology-based applications could support automated processes to address DQ and semantic interoperability in repositories of routinely collected data to deliver integrated CDM. We advocate moving to ontology-based design of information systems to enable more reliable use of routine data to measure health mechanisms and impacts.
Infrastructure for Planetary Sciences: Universal planetary database development project
NASA Astrophysics Data System (ADS)
Kasaba, Yasumasa; Capria, M. T.; Crichton, D.; Zender, J.; Beebe, R.
The International Planetary Data Alliance (IPDA), formally established under COSPAR (with its formal start at the COSPAR 2008 meeting in Montreal), is a joint international effort to enable global access to and exchange of high-quality planetary science data, and to establish archive standards that make it easier to share data across international boundaries. In 2008-2009, thanks to the many players from several agencies and institutions, we obtained fruitful results in six projects: (1) Interoperable Planetary Data Access Protocol (PDAP) implementations [led by J. Salgado@ESA], (2) Small bodies interoperability [led by I. Shinohara@JAXA and N. Hirata@U. Aizu], (3) PDAP assessment [led by Y. Yamamoto@JAXA], (4) Architecture and standards definition [led by D. Crichton@NASA], (5) Information model and data dictionary [led by S. Hughes@NASA], and (6) Venus Express interoperability [led by N. Chanover@NMSU]. 'IPDA 2009-2010' is especially important because the NASA/PDS system reform is now being reviewed as it develops toward application at the international level. IPDA is the gateway to the establishment of the future infrastructure. We are running eight projects: (1) IPDA Assessment of PDS4 Data Standards [led by S. Hughes (NASA/JPL)], (2) IPDA Archive Guide [led by M.T. Capria (IASF/INAF) and D. Heather (ESA/PSA)], (3) IPDA Standards Identification [led by E. Rye (NASA/PDS) and G. Krishna (ISRO)], (4) Ancillary Data Standards [led by C. Acton (NASA/JPL)], (5) IPDA Registries Definition [led by D. Crichton (NASA/JPL)], (6) PDAP Specification [led by J. Salgado (ESA/PSA) and Y. Yamamoto (JAXA)], (7) Interoperability Assessment [R. Beebe (NMSU) and D. Heather (ESA/PSA)], and (8) PDAP Geographic Information System (GIS) extension [N. Hirata (Univ. Aizu) and T. Hare (USGS: thare@usgs.gov)]. This paper presents our achievements and plans as summarized at the IPDA 5th Steering Committee meeting at DLR in July 2010. We now stand at the gateway to the establishment of this infrastructure.
2007-03-01
Additionally, research shows that many over the past decade have proposed interoperability measures, notable among which have been: (1) the DoD...
ARGOS Policy Brief on Semantic Interoperability
KALRA, Dipak; MUSEN, Mark; SMITH, Barry; CEUSTERS, Werner
2016-01-01
Semantic interoperability is one of the priority themes of the ARGOS Trans-Atlantic Observatory. This topic represents a globally recognised challenge that must be addressed if electronic health records are to be shared among heterogeneous systems, and the information in them exploited to the maximum benefit of patients, professionals, health services, research, and industry. Progress in this multi-faceted challenge has been piecemeal, and valuable lessons have been learned, and approaches discovered, in Europe and in the US that can be shared and combined. Experts from both continents have met at three ARGOS workshops during 2010 and 2011 to share their understanding of these issues and of how they might be tackled collectively from both sides of the Atlantic. This policy brief summarises the problems, the reasons why they are important to tackle, and why they are so difficult. It outlines the major areas of semantic innovation that exist and are available to help address this challenge. It proposes a series of next steps that need to be championed on both sides of the Atlantic if further progress is to be made in sharing and analysing electronic health records meaningfully. Semantic interoperability requires the use of standards, not only for EHR data to be transferred and structurally mapped into a receiving repository, but also for the clinical content of the EHR to be interpreted in conformity with the original meanings intended by its authors. Wide-scale engagement with professional bodies, globally, is needed to develop these clinical information standards. Accurate and complete clinical documentation, faithful to the patient's situation, and interoperability between systems require widespread and dependable access to published and maintained collections of coherent and quality-assured semantic resources, including models such as archetypes and templates that would (1) provide clinical context, (2) be mapped to interoperability standards for EHR data, (3) be linked to well-specified multi-lingual terminology value sets, and (4) be derived from high-quality ontologies. There is a need to gain greater experience of how semantic resources should be defined, validated, and disseminated, and of how users (who increasingly will include patients) should be educated to improve the quality and consistency of EHR documentation and to make full use of it. There are urgent needs to scale up the authorship, acceptance, and adoption of clinical information standards, to leverage and harmonise the islands of standardisation optimally, to assure the quality of the artefacts produced, and to organise end-to-end governance of the development and adoption of solutions.
Reuse and Interoperability of Avionics for Space Systems
NASA Technical Reports Server (NTRS)
Hodson, Robert F.
2007-01-01
The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved towards a standard product-line approach to avionics, the space industry still lacks common standards and practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular, stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable, interoperable avionics are discussed.
Data Modeling Challenges of Advanced Interoperability.
Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka
2018-01-01
Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.
NASA Astrophysics Data System (ADS)
Schaap, D.
2015-12-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long-term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently the project has been continued as ODIP 2 for another three years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. It therefore facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions, which are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: (1) establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; (2) establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; and (3) establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE).
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Glaves, Helen
2016-04-01
Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long-term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently the project has been continued as ODIP II for another three years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. It therefore facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions, which are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and Australia for routine harvesting of cruise data for delivery via the Partnership for Observation of Global Oceans (POGO) global portal; 3. establishing common standards for a Sensor Observation Service (SOS) for selected sensors installed on vessels and in real-time monitoring systems using sensor web enablement (SWE).
ERIC Educational Resources Information Center
Watson, Jason; Ahmed, Pervaiz K.
2004-01-01
This paper briefly introduces the trends towards e-learning and highlights some examples of state-of-the-art systems, pointing out that all of these are, to date, limited in the adaptability and shareability of content and that it is necessary for industry to develop and use an interoperability standard. Uses SCORM specifications to specify the…
[Analysis of health terminologies for use as ontologies in healthcare information systems].
Romá-Ferri, Maria Teresa; Palomar, Manuel
2008-01-01
Ontologies are a resource that allows meaning to be represented computationally, thus avoiding the limitations imposed by standardized terms. The objective of this study was to establish the extent to which terminologies could be used for the design of ontologies, which could serve as an aid in resolving problems such as semantic interoperability and knowledge reusability in healthcare information systems. To determine the extent to which terminologies could be used as ontologies, six of the most important terminologies in the clinical, epidemiologic, documentation and administrative-economic contexts were analyzed. The following characteristics were verified: conceptual coverage, hierarchical structure, conceptual granularity of the categories, conceptual relations, and the language used for conceptual representation. MeSH, DeCS and UMLS were considered lightweight ontologies. The main differences among these ontologies concern conceptual specification, the types of relation, and the restrictions among the associated concepts. SNOMED and GALEN have a declarative formalism based on logical descriptions. These ontologies include explicit qualities, show greater restrictions among associated concepts and rule combinations, and were consequently considered heavyweight. Analysis of the declared representation of the terminologies shows the extent to which they could be reused as ontologies. Their degree of usability depends on whether the aim is for healthcare information systems to solve problems of semantic interoperability (lightweight ontologies) or to reuse the systems' knowledge as an aid to decision making (heavyweight ontologies) and for non-structured information retrieval, extraction, and classification.
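The lightweight/heavyweight distinction drawn above can be made concrete in code. The following Python sketch (using the rdflib library; the namespace and terms are invented, not drawn from MeSH, SNOMED, or GALEN) contrasts a simple thesaurus-style broader-than link with a logical restriction that a reasoner can act on.

    # Illustrative contrast between a "lightweight" and a "heavyweight"
    # ontology statement, using rdflib. Terms and namespace are hypothetical.
    from rdflib import Graph, Namespace, BNode
    from rdflib.namespace import SKOS, OWL, RDF, RDFS

    EX = Namespace("http://example.org/health#")
    g = Graph()

    # Lightweight: a simple hierarchical (broader-than) relation, as in a
    # thesaurus.
    g.add((EX.Pneumonia, SKOS.broader, EX.LungDisease))

    # Heavyweight: a logical restriction a reasoner can act on -- every
    # pneumonia has some anatomical site that is a lung.
    restriction = BNode()
    g.add((restriction, RDF.type, OWL.Restriction))
    g.add((restriction, OWL.onProperty, EX.hasSite))
    g.add((restriction, OWL.someValuesFrom, EX.Lung))
    g.add((EX.Pneumonia, RDFS.subClassOf, restriction))

    print(g.serialize(format="turtle"))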
Modelling and approaching pragmatic interoperability of distributed geoscience data
NASA Astrophysics Data System (ADS)
Ma, Xiaogang
2010-05-01
Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have noted that data interoperability issues arise at four levels: system, syntax, schematics and semantics, which relate respectively to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches addressing the schematic and semantic interoperability of geodata have been studied extensively in the last decade. Ontologies come in different types (e.g., top-level ontologies, domain ontologies and application ontologies) and display forms (e.g., glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even when they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g., mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies and thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared with the context-independent character of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location, intention, procedure, consequence) of local pragmatic contexts and are thus context-dependent. Eliminating these elements will inevitably lead to information loss in semantic mediation between local ontologies. Correspondingly, the understanding and effect of exchanged data in a new context may differ from those in the original context. Another problem is the dilemma of how to find a balance between the flexibility and the standardization of local ontologies, because ontologies are not fixed but continuously evolving. It is commonly realized that we cannot use a unified ontology to replace all local ontologies, because they are context-dependent and need flexibility. However, without the coordination of standards, freely developed local ontologies and databases will entail enormous mediation work between them. Finding a balance between standardization and flexibility for evolving ontologies, in a practical sense, requires negotiations (i.e., conversations, agreements and collaborations) between different local pragmatic contexts. The purpose of this work is to set up a computer-friendly model representing local pragmatic contexts (i.e., geodata sources), and to propose a practical semantic negotiation procedure for approaching pragmatic interoperability between local pragmatic contexts. Information agents, objective facts and subjective dimensions are reviewed as elements of a conceptual model for representing pragmatic contexts.
The author uses them to outline a practical semantic negotiation procedure for approaching pragmatic interoperability of distributed geodata. The proposed conceptual model and semantic negotiation procedure were encoded with Description Logic and then applied to analyze and manipulate semantic negotiations between different local ontologies within the National Mineral Resources Assessment (NMRA) project of China, which involves multi-source and multi-subject geodata sharing.
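As a hedged illustration of the mediation step this abstract describes (not the author's actual Description Logic encoding), the following Python/rdflib sketch maps two hypothetical local terms to a shared concept while retaining annotations about their local pragmatic contexts, the information that plain equivalence mapping would discard.

    # Minimal sketch of semantic mediation between two local application
    # ontologies. Namespaces and terms are hypothetical.
    from rdflib import Graph, Namespace, Literal
    from rdflib.namespace import OWL, RDFS

    SITE_A = Namespace("http://example.org/surveyA#")
    SITE_B = Namespace("http://example.org/surveyB#")
    COMMON = Namespace("http://example.org/common#")

    g = Graph()
    # Both local terms are declared equivalent to one common concept...
    g.add((SITE_A.OreBody, OWL.equivalentClass, COMMON.MineralDeposit))
    g.add((SITE_B.Deposit, OWL.equivalentClass, COMMON.MineralDeposit))
    # ...but pragmatic context that plain equivalence would discard is kept
    # as annotations, so a consumer can judge whether the data suit a new use.
    g.add((SITE_A.OreBody, RDFS.comment,
           Literal("Mapped 2010; field survey context, 1:50000 scale")))
    g.add((SITE_B.Deposit, RDFS.comment,
           Literal("Mapped 2010; lab assay context, economic-grade cutoff")))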
Space Station Information System - Concepts and international issues
NASA Technical Reports Server (NTRS)
Williams, R. B.; Pruett, David; Hall, Dana L.
1987-01-01
The Space Station Information System (SSIS) is outlined in terms of its functions and probable physical facilities. The SSIS includes flight-element systems as well as existing and planned institutional systems such as the NASA Communications System, the Tracking and Data Relay Satellite System, and the data and communications networks of the international partners. The SSIS strives to provide both a 'user-friendly' environment and a software environment that allows for software transportability and interoperability across the SSIS. International considerations are discussed, as well as project management, software commonality, data communications standards, data security, documentation commonality, transaction management, data flow cross-support, and key technologies.
Myneni, Sahiti; Patel, Vimla L; Bova, G Steven; Wang, Jian; Ackerman, Christopher F; Berlinicke, Cynthia A; Chen, Steve H; Lindvall, Mikael; Zack, Donald J
2016-04-01
This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods to (1) characterize specific problems faced by biomedical researchers with traditional information management practices, (2) identify intervention areas for introducing a new research information management system called Labmatrix, and (3) evaluate and delineate the general collaboration (intervention) characteristics that can optimize the outcomes of an implementation process in biomedical laboratories. The results emphasize the importance of end-user perseverance, human-centric interoperability evaluation, and demonstration of the return on the investment of effort and time by laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation of an information management system. The technology transfer experience in a complex environment such as the biomedical laboratory can be eased by the use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and, hopefully, to scientific productivity.
Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine
King, H. Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas
2014-01-01
Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials over one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators.
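For readers unfamiliar with such network data specifications, the sketch below shows the general shape of a fixed-layout teleoperation datagram in Python; the field layout is invented for illustration and is not the published Interoperable Telerobotics Protocol format.

    # Hypothetical fixed-layout datagram for master-state telemetry:
    # sequence number, timestamp, then x, y, z position and roll, pitch,
    # yaw orientation of the master tool tip.
    import struct

    PACKET_FORMAT = "!Id6f"   # network byte order: uint32, double, six float32

    def pack_state(seq, timestamp, pose):
        return struct.pack(PACKET_FORMAT, seq, timestamp, *pose)

    def unpack_state(datagram):
        seq, timestamp, *pose = struct.unpack(PACKET_FORMAT, datagram)
        return seq, timestamp, tuple(pose)

    datagram = pack_state(42, 1404.5, (0.10, -0.02, 0.35, 0.0, 1.57, 0.0))
    print(unpack_state(datagram))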
RF-CLASS: A Remote-sensing-based Interoperable Web service system for Flood Crop Loss Assessment
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Kang, L.
2014-12-01
Flooding is one of the worst natural disasters in the world, and it often causes significant crop loss over large agricultural areas in the United States. Two USDA agencies, the National Agricultural Statistics Service (NASS) and the Risk Management Agency (RMA), make decisions on flood statistics, crop insurance policy, and recovery management by collecting, analyzing, reporting, and utilizing flooded crop acreage and crop loss information. NASS has the mandate to report crop loss after all flood events. RMA manages crop insurance policy and uses crop loss information to guide the creation of crop insurance policies and post-event compensation. Many studies have been conducted in recent years on monitoring floods and assessing flood-related crop loss with remote sensing and geographic information technologies. The Remote-sensing-based Flood Crop Loss Assessment Service System (RF-CLASS), being developed with NASA and USDA support, aims to significantly improve post-flood agricultural decision-making support in USDA by integrating and advancing these recently developed technologies. RF-CLASS will operationally provide information to support USDA decision-making activities on collecting and archiving flood acreage and duration, recording annual crop loss due to flood, assessing the crop insurance rating areas, investigating crop policy compliance, and spot-checking crop loss claims. This presentation will discuss the remote sensing- and GIS-based methods for deriving the information needed to support decision making, the RF-CLASS cybersystem architecture, the standards and interoperability arrangements in the system, and the current and planned capabilities of the system.
Medicaid information technology architecture: an overview.
Friedman, Richard H
2006-01-01
The Medicaid Information Technology Architecture (MITA) is a roadmap and tool-kit for States to transform their Medicaid Management Information System (MMIS) into an enterprise-wide, beneficiary-centric system. MITA will enable State Medicaid agencies to align their information technology (IT) opportunities with their evolving business needs. It also addresses long-standing issues of interoperability, adaptability, and data sharing, including clinical data, across organizational boundaries by creating models based on nationally accepted technical standards. Perhaps most significantly, MITA allows State Medicaid Programs to actively participate in the DHHS Secretary's vision of a transparent health care market that utilizes electronic health records (EHRs), ePrescribing and personal health records (PHRs).
DIMP: an interoperable solution for software integration and product data exchange
NASA Astrophysics Data System (ADS)
Wang, Xi Vincent; Xu, Xun William
2012-08-01
Today, globalisation has become one of the main trends in manufacturing business, leading to a world-wide decentralisation of resources not only amongst individual departments within one company but also amongst business partners. However, despite the development and improvement of the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integration solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.
Interoperability Matter: Levels of Data Sharing, Starting from a 3d Information Modelling
NASA Astrophysics Data System (ADS)
Tommasi, C.; Achille, C.
2017-02-01
Nowadays, the adoption of BIM processes in the AEC (Architecture, Engineering and Construction) industry means being oriented towards synergistic workflows, based on informative instruments capable of realizing the virtual model of the building. The aim of this article is to address the issue of interoperability, approaching the subject through a theoretical discussion and a practical example, in order to show how these notions apply in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where difficulties arise in both the modelling and sharing phases due to the complexity of shapes and elements. Focusing on the interoperability between different software packages, the questions are: what kinds of information, and how much, can be shared? And given that this process also leads to a standardization of the modelled parts, is there a risk of accuracy loss?
Identity Management Systems in Healthcare: The Issue of Patient Identifiers
NASA Astrophysics Data System (ADS)
Soenens, Els
According to a recent recommendation of the European Commission, now is the time for Europe to enhance interoperability in eHealth. Although the interoperability of patient identifiers seems promising for patient mobility, patient empowerment and effective access to care, there is today a considerable lack of interoperability in the field of patient identification. Looking from a socio-technical rather than a merely technical point of view, one can understand that the development and implementation of an identity management system in a specific healthcare context is influenced by particular social practices, affected by socio-economic history and the political climate, and regulated by specific data protection legislation. Consequently, making patient identification in Europe more interoperable is a development that goes beyond the semantic and syntactic levels. In this paper, we give some examples of today's patient identifier systems in Europe, discuss the issue of interoperability of (unique) patient identifiers from a socio-technical point of view, and try not to ignore the 'privacy side' of the story.
Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui
2015-01-01
Introduction: The semantic interoperability of electronic healthcare record (EHR) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, which is illustrated for the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Methods: Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as the common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. Results: We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. Conclusions: We have been able to transform CEM into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems.
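To give a feel for the template-driven, OWL-based mapping idea described above, here is a deliberately toy Python/rdflib sketch; the namespaces, the template table, and the output structure are hypothetical simplifications, not the authors' actual CEM-to-openEHR implementation.

    # Toy template-driven mapping: walk classes in a source (CEM-style) OWL
    # model and emit a skeletal archetype-like node for each, guided by a
    # small mapping table. Everything here is an invented simplification.
    from rdflib import Graph, Namespace
    from rdflib.namespace import RDF, OWL, RDFS

    CEM = Namespace("http://example.org/cem#")

    # transformation "template": CEM construct -> openEHR building block
    TEMPLATE = {
        CEM.StatementModel: "OBSERVATION",
        CEM.ComponentModel: "CLUSTER",
    }

    def transform(cem_graph):
        archetypes = []
        for cls in cem_graph.subjects(RDF.type, OWL.Class):
            for cem_type in cem_graph.objects(cls, RDFS.subClassOf):
                rm_class = TEMPLATE.get(cem_type)
                if rm_class:
                    archetypes.append({
                        "rm_class": rm_class,
                        "concept": str(cls).split("#")[-1],
                    })
        return archetypes

    g = Graph()
    g.add((CEM.HeartRate, RDF.type, OWL.Class))
    g.add((CEM.HeartRate, RDFS.subClassOf, CEM.StatementModel))
    print(transform(g))   # [{'rm_class': 'OBSERVATION', 'concept': 'HeartRate'}]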
NASA Technical Reports Server (NTRS)
Gerard, Mireille (Editor); Edwards, Pamela W. (Editor)
1988-01-01
Technological and planning issues for data management, processing, and communication on Space Station Freedom are discussed in reviews and reports by U.S., European, and Japanese experts. The space-information-system strategies of NASA, ESA, and NASDA are discussed; customer needs are analyzed; and particular attention is given to communication and data systems, standards and protocols, integrated system architectures, software and automation, and plans and approaches being developed on the basis of experience from past programs. Also included are the reports from workshop sessions on design to meet customer needs, the accommodation of growth and new technologies, and system interoperability.
Interoperable Acquisition for Systems of Systems: The Challenges
2006-09-01
James D. Smith II and D. Mike Phillips. CMU/SEI-2006-TN-034. Abstract: Large, complex systems development has always been challenging, even when the…
Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences
NASA Astrophysics Data System (ADS)
Smith, P., II
2015-12-01
Data capture is an important process in the research lifecycle. Complete descriptive and representative information about the data or database is necessary during data collection, whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates that federally funded projects make their research data more openly available. Developing, implementing, and integrating metadata workflows into the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences, such as data description and representation. A lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Researchers on some EarthCube-funded projects have identified these issues as gaps. These gaps can contribute to interoperability issues in data access, discovery, and integration between domain-specific and general data repositories. Academic research libraries have expertise in providing long-term discovery and access through the use of metadata standards and the provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted repositories, the Data Seal of Approval, persistent URLs, and the linking of data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice, contributing to improved understanding of the Earth's physical and biological systems. The USGS CDI SSF can be used as a reference model to map to EarthCube-funded projects, with academic research libraries facilitating the data and information assets components of the USGS CDI SSF via institutional repositories and/or digital content management. This session will explore the USGS CDI SSF for cross-discipline collaboration considerations from a library perspective.
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. Common Architecture was a cross-thread theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: broker, requestor and provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction between the different kinds of services: data services (e.g., WMS), application services (e.g., coordinate transformation) and server-side client applications (e.g., image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging the mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth system science community.
The value of personal health record (PHR) systems.
Kaelber, David; Pan, Eric C
2008-11-06
Personal health records (PHRs) are a rapidly growing area of health information technology, despite a lack of significant value-based assessment. Here we present an assessment of the potential value of PHR systems, looking at both costs and benefits. We examine provider-tethered, payer-tethered, and third-party PHRs, as well as idealized interoperable PHRs. An analytical model was developed that considered eight PHR application and infrastructure functions. Our analysis projects the initial and annual costs and annual benefits of PHRs to the entire US over the next 10 years. This PHR analysis shows that all forms of PHRs have initial net negative value; however, at the end of 10 years, steady-state annual net value ranges from $13 billion to -$29 billion. Interoperable PHRs provide the most value, with third-party PHRs and payer-tethered PHRs also showing positive net value. Provider-tethered PHRs consistently demonstrate negative net value.
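The cost/benefit structure described above, an initial outlay followed by annual costs and benefits over a ten-year horizon, can be sketched in a few lines of Python; the parameter values below are placeholders for illustration, not the paper's actual figures.

    # Back-of-the-envelope sketch of the assessment's cost/benefit structure.
    # The numbers are hypothetical placeholders, not the study's data.
    def net_value(initial_cost, annual_cost, annual_benefit, years=10):
        """Cumulative net value per year, without discounting for simplicity."""
        running = -initial_cost
        trajectory = []
        for _ in range(years):
            running += annual_benefit - annual_cost
            trajectory.append(running)
        return trajectory

    # Hypothetical interoperable-PHR scenario (units: $ billions).
    print(net_value(initial_cost=21.0, annual_cost=4.0, annual_benefit=9.5))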
Exploring Interoperability as a Multidimensional Challenge for Effective Emergency Response
ERIC Educational Resources Information Center
Santisteban, Hiram
2010-01-01
Purpose. The purpose of this research was to further an understanding of how the federal government is addressing the challenges of interoperability for emergency response or crisis management (FEMA, 2009) by informing the development of standards through the review of current congressional law, commissions, studies, executive orders, and…
2006-09-30
coastal phenomena. OBJECTIVES: SURA is creating a SCOOP "Grid" that extends the interoperability enabled by the World Wide Web. The coastal community faces special challenges with respect to achieving a level of interoperability that can leverage emerging Grid technologies. With that in mind
Electronic health record interoperability as realized in the Turkish health information system.
Dogac, A; Yuksel, M; Avci, A; Ceyhan, B; Hülür, U; Eryilmaz, Z; Mollahaliloglu, S; Atbakan, E; Akdag, R
2011-01-01
The objective of this paper is to describe the techniques used in developing the National Health Information System of Turkey (NHIS-T), a nation-wide infrastructure for sharing electronic health records (EHRs). The UN/CEFACT Core Components Technical Specification (CCTS) methodology was applied to design the logical EHR structure and to increase the reuse of common information blocks in EHRs. The NHIS-T became operational on January 15, 2009. By June 2010, 99% of the public hospitals and 71% of the private and university hospitals were connected to the NHIS-T, with daily feeds of their patients' EHRs. Of the 72 million citizens of Turkey, electronic health records of 43 million citizens have already been created in the NHIS-T. Currently, only general practitioners can access the EHRs of their patients. In the second phase of the implementation, once the legal framework is completed, proper patient consent mechanisms will become available through the personal health record system that is under development. At that time, authorized healthcare professionals in the secondary and tertiary healthcare systems will also be able to access patients' EHRs. A number of factors affected the successful implementation of the NHIS-T. First, all stakeholders had to adopt the specified standards. Second, the UN/CEFACT CCTS approach facilitated the development and understanding of rather complex EHR schemas. Finally, the comprehensive testing of vendor-based hospital information systems for their conformance to, and interoperability with, the NHIS-T through an automated testing platform substantially accelerated the integration of vendor-based solutions with the NHIS-T.
Data interoperability software solution for emergency reaction in the Europe Union
NASA Astrophysics Data System (ADS)
Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.
2015-07-01
Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite the improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction) providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction. The contributions of this work have been validated through the design and development of a realistic cross-border prototype scenario, actively involving both emergency managers and first responders: a fire on the Netherlands-Germany border.
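The mediation idea at the heart of such an architecture can be illustrated with a small Python sketch: each stakeholder format gets a translator into one shared vocabulary, here a plain dictionary standing in for EMERGEL concepts. The formats and field names are invented for illustration.

    # Toy mediation layer: route each stakeholder format through its
    # translator into one common model. Formats and fields are hypothetical.
    def from_dutch_ems(msg):
        return {"incident": "fire", "severity": msg["ernst"],
                "location": msg["plaats"]}

    def from_german_ems(msg):
        return {"incident": "fire", "severity": msg["stufe"],
                "location": msg["ort"]}

    TRANSLATORS = {
        "nl-ems/v1": from_dutch_ems,
        "de-ems/v1": from_german_ems,
    }

    def mediate(source_format, message):
        """Translate any incoming message into the common model."""
        try:
            return TRANSLATORS[source_format](message)
        except KeyError:
            raise ValueError(f"no mediation rule for format {source_format!r}")

    print(mediate("de-ems/v1", {"stufe": 3, "ort": "Kranenburg"}))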
Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes
NASA Astrophysics Data System (ADS)
Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.
2014-12-01
With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) Developing a vocabulary matching service that links terms from different vocabularies with similar concepts. The service implements the Google Refine reconciliation service interface, so that users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms. (2) Developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies. Each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page; these resources allow for enhanced discoverability and retrieval of SAMOS data by enabling searches based on parameter. (3) Improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will populate a SPARQL endpoint and provide expressive querying over our data files. (4) Mapping local and regional vocabularies used by R2R to those used by ODIP partners (described more fully in a companion poster). (5) Making published Linked Data web-developer-friendly with a RESTful service, achieved by defining a proxy layer on top of the existing SPARQL endpoint that (a) translates HTTP requests into SPARQL queries, and (b) renders the returned results as required by the request sender using content negotiation, suffixes, and parameters.
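Item (5), the developer-friendly proxy over a SPARQL endpoint, can be sketched as follows in Python with Flask and SPARQLWrapper; the endpoint URL, route, and query shape are hypothetical stand-ins for the project's actual service.

    # Sketch of a REST-to-SPARQL proxy: an HTTP request is translated into a
    # SPARQL query and the bindings are returned as JSON. All names are
    # illustrative placeholders.
    from flask import Flask, jsonify
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "https://example.org/sparql"  # stand-in endpoint
    app = Flask(__name__)

    @app.route("/vocab/<term>")
    def lookup(term):
        # NOTE: a real service must sanitize `term` before interpolation.
        query = f"""
            PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
            SELECT ?match WHERE {{
                ?match skos:prefLabel ?label .
                FILTER(CONTAINS(LCASE(STR(?label)), LCASE("{term}")))
            }} LIMIT 20
        """
        client = SPARQLWrapper(ENDPOINT)
        client.setQuery(query)
        client.setReturnFormat(JSON)
        rows = client.query().convert()["results"]["bindings"]
        return jsonify([row["match"]["value"] for row in rows])

    if __name__ == "__main__":
        app.run(port=8080)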
Kim, Hwa Sun; Cho, Hune; Lee, In Keun
2011-06-01
We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed for ambulatory care by office-based physicians in the United States, and achieves interoperability between heterogeneous information systems by integrating various medical standard technologies. The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medicaid Services (CMS)-1500 form based on information regarding the patients and physicians' clinical activities. It supports electronic insurance claims by creating reimbursement charges. It also contains an HL7 interface engine to exchange clinical messages between heterogeneous devices. The system partially prevents physician malpractice by suggesting proper treatments according to patient diagnoses, and supports physicians by easily preparing documents for reimbursement and submitting claim documents to insurance organizations electronically, without additional effort by the user. To show the usability of the developed system, we performed an experiment that compares the time spent filling out the CMS-1500 form directly with the time required to create electronic claim data using the developed system. From the experimental results, we conclude that the system could save considerable time for physicians in preparing claim documents. The developed system might be particularly useful for those who need a reimbursement-specialized EHR system, even though the proposed system does not yet satisfy all criteria requested by the CMS and the Office of the National Coordinator for Health Information Technology (ONC); those criteria are necessary but not sufficient conditions for the implementation of EHR systems. The system will be upgraded continuously to implement the criteria and to offer more stable and transparent transmission of electronic claim data.
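As a minimal sketch of what such an HL7-to-claim step might look like, the following parses a toy HL7 v2-style message with plain string operations and pre-fills one claim field; the message contents and claim field name are illustrative, not the system's actual mapping:

    # A minimal, illustrative HL7 v2-style message (not a complete real message).
    msg = "\r".join([
        "MSH|^~\\&|CLINIC|SITE|CLAIMS|SITE|202401011200||ADT^A01|1|P|2.5",
        "PID|1||12345^^^CLINIC||DOE^JOHN||19800101|M",
    ])

    # Index segments by their identifier, then split fields on the '|' separator.
    segments = {line.split("|")[0]: line.split("|") for line in msg.split("\r")}
    family, given = segments["PID"][5].split("^")[:2]

    # Hypothetical mapping into a CMS-1500-like field (box naming assumed).
    claim = {"box2_patient_name": f"{family}, {given}"}
    print(claim)  # {'box2_patient_name': 'DOE, JOHN'}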
NASA Astrophysics Data System (ADS)
San Gil, Inigo; White, Marshall; Melendez, Eda; Vanderbilt, Kristin
The thirty-year-old United States Long Term Ecological Research Network has developed extensive metadata to document its scientific data. Standard and interoperable metadata is a core component of the data-driven analytical solutions developed by this research network. Content management systems offer an affordable solution for rapid deployment of metadata-centered information management systems. We developed a customized integrative metadata management system based on the Drupal content management system technology. Building on knowledge and experience with the Sevilleta and Luquillo Long Term Ecological Research sites, we successfully deployed the first two medium-scale customized prototypes. In this paper, we describe the vision behind our Drupal-based information management instances and list the features offered through these Drupal-based systems. We also outline the plans to expand the information services offered through these metadata-centered management systems. We conclude with the growing list of participants deploying similar instances.
An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems
ERIC Educational Resources Information Center
Moschoglou, Georgios
2013-01-01
This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…
A Framework for a Decision Support System in a Hierarchical Extended Enterprise Decision Context
NASA Astrophysics Data System (ADS)
Boza, Andrés; Ortiz, Angel; Vicens, Eduardo; Poler, Raul
Decision Support System (DSS) tools provide useful information to decision makers. In an Extended Enterprise, a new goal, changes in the current objectives, or small changes in the extended enterprise configuration require an adjustment of its decision system. A DSS in this context must be flexible and agile, enabling easy and quick adaptation to the new context. This paper proposes extending the Hierarchical Production Planning (HPP) structure to an Extended Enterprise decision-making context. In this way, a framework for DSSs in Extended Enterprise contexts is defined using components of HPP. Interoperability details have been reviewed to identify their impact on this framework. The proposed framework makes it possible to overcome some interoperability barriers, to identify and organize components for a DSS in an Extended Enterprise context, and to work towards the definition of an architecture to be used in the design process of a flexible DSS in an Extended Enterprise context which can reuse components for future Extended Enterprise configurations.
Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul
2015-01-01
Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next-generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an open-source medical record system widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application-specific API to a standards-based API. We describe efforts to design and implement a FHIR-based API for the OpenMRS platform. Lessons learned from this effort were used to define long-term plans to transition from the legacy OpenMRS API to a FHIR-based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
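A standards-based API of this kind is typically exercised over plain HTTP. The sketch below queries a FHIR R4 Patient endpoint the way an OpenMRS FHIR client might; the server URL is hypothetical, and the exact module path varies by deployment:

    import requests

    BASE = "https://demo.example.org/openmrs/ws/fhir2/R4"  # hypothetical server

    # FHIR search: GET [base]/Patient?name=... returns a Bundle of matches.
    resp = requests.get(f"{BASE}/Patient", params={"name": "Smith"}, timeout=30)
    resp.raise_for_status()
    bundle = resp.json()
    for entry in bundle.get("entry", []):
        patient = entry["resource"]
        print(patient["id"], patient.get("name", [{}])[0].get("family"))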
Do, Hyoungho
2018-01-01
Objectives: Increasing use of medical devices outside of healthcare facilities inevitably requires connectivity and interoperability between medical devices and healthcare information systems. To this end, standards have been developed and used to provide interoperability between personal health devices (PHDs) and external systems. ISO/IEEE 11073 standards and IHE PCD-01 standard messages have been used the most in the exchange of observation data from health devices. Recently, transmitting observation data using the HL7 FHIR standard has been devised under the name DoF (Devices on FHIR) and has been adopted rapidly. We compare and analyze these standards and suggest which standard works best in different device-usage environments. Methods: We generated messages/resources in each of the three standards for vital signs observed by a blood pressure monitor and a thermometer. The size, the contents, and the exchange processes of these messages were then compared and analyzed. Results: The ISO/IEEE 11073 message has the smallest data size, but it cannot carry a key piece of information: patient information. In contrast, PCD-01 messages and FHIR resources have fields for patient information. HL7 DoF supports reuse of information units known as resources, and it is relatively easy to parse DoF messages since they use the widely known XML and JSON formats. Conclusions: ISO/IEEE 11073 standards are suitable for devices with very small computing power. IHE PCD-01 and HL7 DoF messages can be used for devices that need to be connected to hospital information systems that require patient information. When information reuse is frequent, DoF is advantageous over PCD-01. PMID:29503752
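For a sense of what the FHIR/DoF option looks like on the wire, here is a minimal FHIR Observation for a thermometer reading, built as plain JSON; the values are illustrative (LOINC 8310-5 is body temperature), and a real DoF profile would add further device metadata:

    import json

    observation = {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org", "code": "8310-5"}]},
        # The patient link is the field the bare ISO/IEEE 11073 message lacks.
        "subject": {"reference": "Patient/example"},
        "valueQuantity": {"value": 36.8, "unit": "Cel",
                          "system": "http://unitsofmeasure.org", "code": "Cel"},
    }
    print(json.dumps(observation, indent=2))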
Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat
2013-08-01
Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians to problematic situations, and decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of development, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is a stable and small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable and domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes to model the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer, in the manner of a virtual health record (VHR), over the EHR contents that need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools. We also describe a case study where the tools and methodology have been employed in a CDSS to support patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes has not only proved satisfactory for achieving interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are of a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, due to their deliberate independence of the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable e.g. feeding back the EHR with data derived by abstraction mechanisms. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development.
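The virtual-health-record idea can be caricatured in a few lines: local EHR field names are mapped onto an archetype-shaped structure so the CDSS only ever sees the archetype view. All field names below are invented, and a real system would use the expressive mapping language the paper describes rather than a dict:

    # Toy illustration of archetype-based mediation between two local EHR layouts.
    archetype_mappings = {
        "hospital_a": {"systolic": "bp_sys", "diastolic": "bp_dia"},
        "hospital_b": {"systolic": "systolic_mmHg", "diastolic": "diastolic_mmHg"},
    }

    def to_vhr(source, record):
        """Instantiate a blood-pressure archetype node from a local EHR record."""
        m = archetype_mappings[source]
        return {"archetype_id": "openEHR-EHR-OBSERVATION.blood_pressure.v1",
                "systolic": record[m["systolic"]],
                "diastolic": record[m["diastolic"]]}

    print(to_vhr("hospital_a", {"bp_sys": 120, "bp_dia": 80}))
    print(to_vhr("hospital_b", {"systolic_mmHg": 118, "diastolic_mmHg": 76}))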
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields, especially Earth observation research. Due to the huge volumes of data available in the geospatial domain and the issues they introduce (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all those problems becomes an important need. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment, and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols; the advantages offered by the Grid technology, such as providing secure interoperability between distributed geospatial resources; and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as the specific geospatial complex functionality and the high-power computation and security of the Grid, high spatial model resolution and wide geographical area coverage, and flexible combination and interoperability of the geographical models. In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main function is exposed through the enviroGRIDS Portal and, consequently, through end-user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS portal is the user's single entry point into the system, and the portal presents a uniform graphical user interface style. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
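For readers unfamiliar with the OGC services named above, a WMS request is just a parameterized HTTP GET. The sketch below assembles a GetMap URL following WMS 1.3.0 conventions; the endpoint and layer name are hypothetical:

    from urllib.parse import urlencode

    base = "http://example.org/geoserver/wms"  # hypothetical WMS endpoint
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": "blacksea:catchment", "CRS": "EPSG:4326",
        "BBOX": "40.0,26.0,49.0,42.0",  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
        "WIDTH": "800", "HEIGHT": "450", "FORMAT": "image/png",
    }
    print(base + "?" + urlencode(params))

In a Grid-enabled setting such as the one described, requests like this would be issued from gLite jobs rather than directly by the portal.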
2008-08-01
…facilitate the use of existing architecture descriptions in performing interoperability measurement. Noting that "everything in the world can be expressed as … biological, botanical, and genetic research, it has also been used with great success in the fields of ecology, medicine, the social sciences, the … appropriate for at least three reasons. First, systems perform different interoperations in different scenarios (i.e., they are used differently); second …
NASA Astrophysics Data System (ADS)
Glaves, H. M.
2015-12-01
In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS, etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures, an impact assessment is being conducted to determine both the technical and financial implications of deploying them alongside existing services. The associated best practice and common standards are also being disseminated to the user community through relevant accreditation processes and related initiatives such as the Research Data Alliance and the Belmont Forum.
Lin, M.C.; Vreeman, D.J.; Huff, S.M.
2012-01-01
Objectives: We wanted to develop a method for evaluating the consistency and usefulness of LOINC code use across different institutions, and to evaluate the degree of interoperability that can be attained when using LOINC codes for laboratory data exchange. Our specific goals were to: 1) determine if any contradictory knowledge exists in LOINC; 2) determine how many LOINC codes were used in a truly interoperable fashion between systems; and 3) provide suggestions for improving the semantic interoperability of LOINC. Methods: We collected Extensional Definitions (EDs) of LOINC usage from three institutions. The version space approach was used to divide LOINC codes into small sets, which made auditing of LOINC use across the institutions feasible. We then compared pairings of LOINC codes from the three institutions for consistency and usefulness. Results: The numbers of LOINC codes evaluated were 1,917, 1,267 and 1,693, obtained from ARUP, Intermountain and Regenstrief respectively. There were 2,022, 2,030, and 2,301 version spaces among ARUP & Intermountain, Intermountain & Regenstrief and ARUP & Regenstrief respectively. Using the EDs as the gold standard, there were 104, 109 and 112 pairs containing contradictory knowledge, and there were 1,165, 765 and 1,121 semantically interoperable pairs. The interoperable pairs were classified into three levels: 1) Level I - no loss of meaning, complete information was exchanged by identical codes; 2) Level II - no loss of meaning, but processing of data was needed to make the data completely comparable; 3) Level III - some loss of meaning; for example, tests with a specific 'method' could be rolled up with tests that were 'methodless'. Conclusions: There are variations in the way LOINC is used for data exchange that result in some data not being truly interoperable across different enterprises. To improve its semantic interoperability, we need to detect and correct any contradictory knowledge within LOINC and add computable relationships that can be used for making reliable inferences about the data. The LOINC committee should also provide detailed guidance on best practices for mapping from local codes to LOINC codes and for using LOINC codes in data exchange. PMID:22306382
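The audit logic can be sketched compactly: given each institution's local-test-to-LOINC map, pairs that disagree are candidates for contradictory or non-interoperable use. The codes below are illustrative, and the real study compared full Extensional Definitions rather than name strings:

    # Toy comparison of two institutions' local-code-to-LOINC mappings.
    inst_a = {"serum sodium": "2951-2", "serum potassium": "2823-3"}
    inst_b = {"serum sodium": "2951-2", "serum potassium": "6298-4"}

    for test in sorted(set(inst_a) & set(inst_b)):
        if inst_a[test] == inst_b[test]:
            print(f"interoperable: {test} -> {inst_a[test]}")
        else:
            print(f"needs review: {test} -> {inst_a[test]} vs {inst_b[test]}")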
OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.
Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk
2018-02-23
Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. Particularly using computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.
Introduction to Architectures: HSCB Information - What It Is and How It Fits (or Doesn’t Fit)
2010-10-01
Simulation Interoperability Workshop, 01E-SIW-080 [15] Barry G. Silverman, Gnana Gharathy, Kevin O'Brien, Jason Cornwell, "Human Behavior Models for Agents … Workshop, 10F-SIW-023, September 2010. [17] Christiansen, John H., "A flexible object-based software framework for modelling complex systems with …
A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components
2005-05-01
… interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.
Towards global environmental information and data management
NASA Astrophysics Data System (ADS)
Gurney, Robert; Allison, Lee; Cesar, Roberto; Cossu, Roberto; Dietz, Volkmar; Gemeinholzer, Birgit; Koike, Toshio; Mokrane, Mustapha; Peters, Dale; Thaller-Honold, Svetlana; Treloar, Andrew; Vilotte, Jean-Pierre; Waldmann, Christoph
2014-05-01
The Belmont Forum, a coalition of national science agencies from 13 countries, is supporting an 18-month effort to implement a 'Knowledge Hub' community-building and strategy development program as a first step to coordinate and streamline international efforts on community governance, interoperability and system architectures so that environmental data and information can be exchanged internationally and across subject domains easily and efficiently. This initiative represents a first step to build collaboratively an international capacity and e-infrastructure framework to address societally relevant global environmental change challenges. The project will deliver a community-owned strategy and implementation plan, which will prioritize international funding opportunities for Belmont Forum members to build pilots and exemplars in order to accelerate delivery of end-to end global change decision support systems. In 2012, the Belmont Forum held a series of public town hall meetings, and a two-day scoping meeting of scientists and program officers, which concluded that transformative approaches and innovative technologies are needed for heterogeneous data/information to be integrated and made interoperable for researchers in disparate fields and for myriad uses across international, institutional, disciplinary, spatial and temporal boundaries. Pooling Belmont Forum members' resources to bring communities together for further integration, cooperation, and leveraging of existing initiatives and resources has the potential to develop the e-infrastructure framework necessary to solve pressing environmental problems, and to support the aims of many international data sharing initiatives. The plan is expected to serve as the foundation of future Belmont Forum calls for proposals for e-Infrastructures and Data Management. The Belmont Forum is uniquely able to align resources of major national funders to support global environmental change research on specific technical and governance challenges, and the development of focused pilot systems that could be complementary to other initiatives such as GEOSS, ICSU World Data System, and Global Framework for Climate Services (GFCS). The development of this Belmont Forum Knowledge Hub represents an extraordinary effort to bring together international leaders in interoperability, governance and other fields pertinent to decision-support systems in global environmental change research. It is also addressing related issues such as ensuring a cohort of environmental scientists who can use up-to-date computing techniques for data and information management, and investigating which legal issues need common international attention.
Coalition readiness management system preliminary interoperability experiment (CReaMS PIE)
NASA Astrophysics Data System (ADS)
Clark, Peter; Ryan, Peter; Zalcman, Lucien; Robbie, Andrew
2003-09-01
The United States Navy (USN) has initiated the Coalition Readiness Management System (CReaMS) Initiative to enhance coalition warfighting readiness through advancing development of a team interoperability training and combined mission rehearsal capability. It integrates evolving cognitive team learning principles and processes with advanced technology innovations to produce an effective and efficient team learning environment. The JOint Air Navy Networking Environment (JOANNE) forms the Australian component of CReaMS. The ultimate goal is to link Australian Defence simulation systems with the USN Battle Force Tactical Training (BFTT) system to demonstrate and achieve coalition level warfare training in a synthetic battlespace. This paper discusses the initial Preliminary Interoperability Experiment (PIE) involving USN and Australian Defence establishments.
NASA Astrophysics Data System (ADS)
Oggioni, A.; Tagliolato, P.; Schleidt, K.; Carrara, P.; Grellet, S.; Sarretta, A.
2016-02-01
The state of the art in biodiversity data management unfortunately encompasses a plethora of diverse data formats. Compared to other research fields, there is a lack of harmonization and standardization of these data. While data from traditional biodiversity collections (e.g. from museums) can be easily represented by existing standards as provided by TDWG, the growing number of field observations stemming from both VGI activities (e.g. iNaturalist) and automated systems (e.g. animal biotelemetry) would at the very least require upgrades of current formats. Moreover, from an eco-informatics perspective, the integration and use of data from different scientific fields is the norm (abiotic data, geographic information, etc.); the possibility to represent this information and biodiversity data in a homogeneous way would be an advantage for interoperability, allowing for easy integration across environmental media. We will discuss the possibility of exploiting the Open Geospatial Consortium/ISO standard Observations and Measurements (O&M) [1], a generic conceptual model developed for observation data but with strong analogies to the biodiversity-oriented OBOE ontology [2]. The applicability of OGC O&M for the provision of biodiversity occurrence data has been suggested by the INSPIRE Cross Thematic Working Group on Observations & Measurements [3], the INSPIRE Environmental Monitoring Facilities Thematic Working Group [4] and the New Zealand Environmental Information Interoperability Framework [5]. This approach, in our opinion, could be an advantage for the biodiversity community. We will provide some examples of encoding biodiversity occurrence data using the O&M standard, in addition to highlighting the advantages offered by O&M in comparison to other representation formats. [1] Cox, S. (2013). Geographic information - Observations and measurements - OGC and ISO 19156. [2] Madin, J., Bowers, S., Schildhauer, M., Krivov, S., Pennington, D., & Villa, F. (2007). An ontology for describing and synthesizing ecological observation data. Ecological Informatics, 2(3), 279-296. [3] INSPIRE_D2.9_O&M_Guidelines_v2.0rc3.pdf [4] INSPIRE_DataSpecification_EF_v3.0.pdf [5] Watkins, A. (2012) Biodiversity Interoperability through Open Geospatial Standards
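A schematic O&M-style encoding of a single occurrence record might look as follows; the element names are simplified stand-ins, since a conformant OM_Observation uses the full OGC/ISO 19156 schemas and GML geometries:

    import xml.etree.ElementTree as ET

    # Schematic O&M-style observation for a species occurrence (simplified names).
    obs = ET.Element("Observation")
    ET.SubElement(obs, "phenomenonTime").text = "2015-06-01T10:00:00Z"
    ET.SubElement(obs, "observedProperty").text = "occurrence of Larus michahellis"
    ET.SubElement(obs, "featureOfInterest").text = "POINT(9.19 45.46)"
    ET.SubElement(obs, "result").text = "present"
    print(ET.tostring(obs, encoding="unicode"))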
Requirements Development for Interoperability Simulation Capability for Law Enforcement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holter, Gregory M.
2004-05-19
The National Counterdrug Center (NCC) was initially authorized by Congress in FY 1999 appropriations to create a simulation-based counterdrug interoperability training capability. As the lead organization for Research and Analysis to support the NCC, the Pacific Northwest National Laboratory (PNNL) was responsible for developing the requirements for this interoperability simulation capability. These requirements were structured to address the hardware and software components of the system, as well as the deployment and use of the system. The original set of requirements was developed through a process of conducting a user-based survey of requirements for the simulation capability, coupled with an analysis of similar development efforts. The user-based approach ensured that existing concerns with respect to interoperability within the law enforcement community would be addressed. Law enforcement agencies within the designated pilot area of Cochise County, Arizona, were surveyed using interviews and ride-alongs during actual operations. The results of this survey were then accumulated, organized, and validated with the agencies to ensure the accuracy of the results. These requirements were then supplemented by adapting operational requirements from existing systems to ensure system reliability and operability. The NCC adopted a development approach providing incremental capability through the fielding of a phased series of progressively more capable versions of the system. This allowed for feedback from system users to be incorporated into subsequent revisions of the system requirements, and also allowed the addition of new elements as needed to adapt the system to broader geographic and geopolitical areas, including areas along the southwest and northwest U.S. borders. This paper addresses the processes used to develop and refine requirements for the NCC interoperability simulation capability, as well as the response of the law enforcement community to the use of the NCC system. The paper also addresses the applicability of such an interoperability simulation capability to a broader set of law enforcement, border protection, site/facility security, and first-responder needs.
HTML5 microdata as a semantic container for medical information exchange.
Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken
2014-01-01
Achieving interoperability between clinical electronic medical records (EMR) systems and cloud computing systems is challenging because of the lack of a universal reference method as a standard for information exchange with a secure connection. Here we describe an information exchange scheme using HTML5 microdata, where the standard semantic container is an HTML document. We embed HL7 messages describing laboratory test results in the microdata. We also annotate items in the clinical research report with the microdata. We mapped the laboratory test result data into the clinical research report using an HL7 selector specified in the microdata. This scheme can provide secure cooperation between the cloud-based service and the EMR system.
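A rough sketch of the scheme follows, generating an HTML5 fragment whose microdata annotates a lab value and carries the underlying HL7 v2 segment; the itemtype URL and property names are assumptions, not the paper's actual vocabulary:

    # Generate an HTML5 microdata fragment embedding an HL7 v2 OBX segment.
    hl7_obx = "OBX|1|NM|2951-2^Sodium^LN||140|mmol/L|136-145|N"
    html = f'''
    <div itemscope itemtype="http://example.org/schema/LabResult">
      <span itemprop="test">Sodium</span>:
      <span itemprop="value">140</span> <span itemprop="unit">mmol/L</span>
      <meta itemprop="hl7v2" content="{hl7_obx}">
    </div>
    '''
    print(html)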
National Geothermal Data System (USA): an Exemplar of Open Access to Data
NASA Astrophysics Data System (ADS)
Allison, M. Lee; Richard, Stephen; Blackman, Harold; Anderson, Arlene; Patten, Kim
2014-05-01
The National Geothermal Data System's (NGDS - www.geothermaldata.org) formal launch in April 2014 will provide open access to millions of data records, sharing relevant geoscience and, in the longer term, land use data to propel geothermal development and production. NGDS serves information from all of the U.S. Department of Energy's sponsored development and research projects and geologic data from all 50 states, using free and open source software. This interactive online system is opening new exploration opportunities and potentially shortening project development by making data easily discoverable, accessible, and interoperable. We continue to populate our prototype functional data system with multiple data nodes and nationwide data online and available to the public. Data from state geological surveys and partners includes more than 6 million records online, including 1.72 million well headers (oil and gas, water, geothermal), 670,000 well logs, and 497,000 borehole temperatures, and is growing rapidly. There are over 312 interoperable Web services and another 106 WMS (Web Map Services) registered in the system as of January 2014. Companion projects run by Southern Methodist University and the U.S. Geological Survey (USGS) are adding millions of additional data records. The DOE Geothermal Data Repository, currently hosted on OpenEI, is a system node and clearinghouse for data from hundreds of U.S. DOE-funded geothermal projects. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG). NGDS complies with the White House Executive Order of May 2013, requiring all federal agencies to make their data holdings publicly accessible online in open source, interoperable formats with common core and extensible metadata. The National Geothermal Data System is being designed, built, deployed, and populated primarily with support from the US Department of Energy, Geothermal Technologies Office. Keeping this system operational after the original implementation will require four core elements: continued serving of data and applications by providers; maintenance of system operations; a governance structure; and an effective business model. Each of these presents a number of challenges currently under consideration.
The PSML format and library for norm-conserving pseudopotential data curation and interoperability
NASA Astrophysics Data System (ADS)
García, Alberto; Verstraete, Matthieu J.; Pouillon, Yann; Junquera, Javier
2018-06-01
Norm-conserving pseudopotentials are used by a significant number of electronic-structure packages, but the practical differences among codes in the handling of the associated data hinder their interoperability and make it difficult to compare their results. At the same time, existing formats lack provenance data, which makes it difficult to track and document computational workflows. To address these problems, we first propose a file format (PSML) that maps the basic concepts of the norm-conserving pseudopotential domain in a flexible form and supports the inclusion of provenance information and other important metadata. Second, we provide a software library (libPSML) that can be used by electronic structure codes to transparently extract the information in the file and adapt it to their own data structures, or to create converters for other formats. Support for the new file format has been already implemented in several pseudopotential generator programs (including ATOM and ONCVPSP), and the library has been linked with SIESTA and ABINIT, allowing them to work with the same pseudopotential operator (with the same local part and fully non-local projectors) thus easing the comparison of their results for the structural and electronic properties, as shown for several example systems. This methodology can be easily transferred to any other package that uses norm-conserving pseudopotentials, and offers a proof-of-concept for a general approach to interoperability.
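Since PSML is an XML-based format, a consumer that does not link libPSML can still read it with a generic parser. The fragment below is a minimal reader sketch; the element and attribute names are assumptions for illustration, not the normative PSML schema:

    import xml.etree.ElementTree as ET

    doc = ET.fromstring("""
    <psml version="1.1">
      <provenance creator="ONCVPSP-3.3" date="2018-01-01"/>
      <header atomic-label="Si" z-pseudo="4.0"/>
    </psml>
    """)
    prov = doc.find("provenance")
    print(doc.get("version"), prov.get("creator"),
          doc.find("header").get("atomic-label"))

libPSML itself would be the preferred route, since it also adapts the data to each code's internal structures; the point here is only that the format stays accessible to standard tooling.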
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism.
Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi
2008-01-01
Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that is complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
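A minimal h5py sketch of the attribute-based approach: one dataset carries attributes aimed at more than one convention at once, so the same file can satisfy several standards informally. The attribute names for the Plasma State and CPO conventions are hypothetical:

    import h5py
    import numpy as np

    # Tag one dataset with attributes for several metadata conventions at once.
    with h5py.File("te_profile.h5", "w") as f:
        dset = f.create_dataset("Te", data=np.linspace(2.0, 0.1, 10))
        dset.attrs["units"] = "keV"                   # generic, netCDF-style attribute
        dset.attrs["plasma_state_name"] = "Ts"        # hypothetical Plasma State label
        dset.attrs["cpo_path"] = "coreprof/te/value"  # hypothetical CPO-style path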
NASA Astrophysics Data System (ADS)
Mitchell, A. E.; Lowe, D. R.; Murphy, K. J.; Ramapriyan, H. K.
2011-12-01
Initiated in 1990, NASA's Earth Observing System Data and Information System (EOSDIS) is currently a petabyte-scale archive of data designed to receive, process, distribute and archive several terabytes of science data per day from NASA's Earth science missions. Comprised of 12 discipline specific data centers collocated with centers of science discipline expertise, EOSDIS manages over 6800 data products from many science disciplines and sources. NASA supports global climate change research by providing scalable open application layers to the EOSDIS distributed information framework. This allows many other value-added services to access NASA's vast Earth Science Collection and allows EOSDIS to interoperate with data archives from other domestic and international organizations. EOSDIS is committed to NASA's Data Policy of full and open sharing of Earth science data. As metadata is used in all aspects of NASA's Earth science data lifecycle, EOSDIS provides a spatial and temporal metadata registry and order broker called the EOS Clearing House (ECHO) that allows efficient search and access of cross domain data and services through the Reverb Client and Application Programmer Interfaces (APIs). Another core metadata component of EOSDIS is NASA's Global Change Master Directory (GCMD) which represents more than 25,000 Earth science data set and service descriptions from all over the world, covering subject areas within the Earth and environmental sciences. With inputs from the ECHO, GCMD and Soil Moisture Active Passive (SMAP) mission metadata models, EOSDIS is developing a NASA ISO 19115 Best Practices Convention. Adoption of an international metadata standard enables a far greater level of interoperability among national and international data products. NASA recently concluded a 'Metadata Harmony Study' of EOSDIS metadata capabilities/processes of ECHO and NASA's Global Change Master Directory (GCMD), to evaluate opportunities for improved data access and use, reduce efforts by data providers and improve metadata integrity. The result was a recommendation for EOSDIS to develop a 'Common Metadata Repository (CMR)' to manage the evolution of NASA Earth Science metadata in a unified and consistent way by providing a central storage and access capability that streamlines current workflows while increasing overall data quality and anticipating future capabilities. For applications users interested in monitoring and analyzing a wide variety of natural and man-made phenomena, EOSDIS provides access to near real-time products from the MODIS, OMI, AIRS, and MLS instruments in less than 3 hours from observation. To enable interactive exploration of NASA's Earth imagery, EOSDIS is developing a set of standard services to deliver global, full-resolution satellite imagery in a highly responsive manner. EOSDIS is also playing a lead role in the development of the CEOS WGISS Integrated Catalog (CWIC), which provides search and access to holdings of participating international data providers. EOSDIS provides a platform to expose and share information on NASA Earth science tools and data via Earthdata.nasa.gov while offering a coherent and interoperable system for the NASA Earth Science Data System (ESDS) Program.
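The CMR recommended above was subsequently deployed with a public search API; a client query might look like the sketch below (the endpoint shown is the public CMR search service, but treat the exact parameters and response layout as assumptions):

    import requests

    # Search NASA's Common Metadata Repository (CMR) for matching collections.
    resp = requests.get(
        "https://cmr.earthdata.nasa.gov/search/collections.json",
        params={"keyword": "sea surface temperature", "page_size": 5},
        timeout=30,
    )
    resp.raise_for_status()
    for item in resp.json()["feed"]["entry"]:
        print(item["short_name"], "-", item["title"])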
Implementing the HL7v3 standard in Croatian primary healthcare domain.
Koncar, Miroslav
2004-01-01
The mission of HL7 Inc. is to provide standards for the exchange, management and integration of data that support clinical patient care and the management, delivery and evaluation of healthcare services. The scope of this work includes the specification of flexible, cost-effective approaches, standards, guidelines, methodologies, and related services for interoperability between healthcare information systems. In the field of medical information technologies, HL7 provides the world's most advanced information standards. Versions 1 and 2 of the HL7 standard have on the one hand solved many issues, but on the other demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology has been adopted, which is encompassed in the version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project, we decided to go directly to HL7v3. Implementing the HL7v3 standard in healthcare applications represents a challenging task. By using standardized refinement and localization methods we were able to define information models for the Croatian primary healthcare domain. The scope of our work includes clinical, financial and administrative data management, and in some cases we were compelled to introduce new HL7v3-compliant models. All of the HL7v3 transactions are digitally signed, using the W3C XML Digital Signature standard.
ERIC Educational Resources Information Center
Liu, Xiaoming; Maly, Kurt; Zubair, Mohammad; Nelson, Michael L.; Erickson, John S.; DiLauro, Tim; Choudhury, G. Sayeed; Patton, Mark; Warner, James W.; Brown, Elizabeth W.; Heery, Rachel; Carpenter, Leona; Day, Michael
2001-01-01
Includes five articles that discuss the OAI (Open Archive Initiative), an interface between data providers and service providers; information objects and digital rights management interoperability; digitizing library collections, including automated name authority control, metadata, and text searching engines; and building digital library services…
Building AN International Polar Data Coordination Network
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Yarmey, L.; Manley, W. F.; Gaylord, A. G.; Tweedie, C. E.
2013-12-01
In the spirit of the World Data Center system developed to manage data resulting from the International Geophysical Year of 1957-58, the International Polar Year 2007-2009 (IPY) resulted in significant progress towards establishing an international polar data management network. However, a sustained international network is still evolving. In this paper we argue that the fundamental building blocks for such a network exist and that the time is right to move forward. We focus on the Arctic component of such a network, with linkages to Antarctic network building activities. A review of an important set of Network building blocks is presented: i) the legacy of the IPY data and information service; ii) global data management services with a polar component (e.g. the World Data System); iii) regional systems (e.g. the Arctic Observing Viewer); iv) nationally focused programs (e.g. the Advanced Cooperative Arctic Data and Information Service, the Polar Data Catalogue, the Inuit Knowledge Centre); and v) programs focused on the local (e.g. the Exchange for Local Observations and Knowledge of the Arctic, the Geomatics and Cartographic Research Centre). We discuss current activities and results with respect to three priority areas needed to establish a strong and effective Network. First, a summary of network building activities reports on a series of productive meetings, including the Arctic Observing Summit and the Polar Data Forum, that have resulted in a core set of Network nodes and participants and a refined vision for the Network. Second, we recognize that interoperability for information sharing fundamentally relies on the creation and adoption of community-based data description standards and data delivery mechanisms. There is a broad range of interoperability frameworks and specifications available; however, these need to be adapted for polar community needs. Progress towards Network interoperability is reviewed, and a prototype distributed data system is demonstrated. We discuss remaining challenges. Lastly, establishing a sustainable Arctic Data Coordination Network (ADCN) as part of a broader polar Network will require adequate continued resources. We conclude by outlining proposed business models for the emerging ADCN and a broader polar Network.
NASA Technical Reports Server (NTRS)
Lynnes, Christopher
2016-01-01
The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk covered the program of cloud computing prototypes being undertaken for the Earth Observing System Data and Information System (EOSDIS). Also discussed were dataset interoperability recommendations ratified via the EOSDIS Standards Office, and the HDF Product Designer tool with respect to its possible applicability to data in network Common Data Form (NetCDF) version 4.
Xu Chen; Berry, Damon; Stephens, Gaye
2015-01-01
Computerised identity management is in general encountered as a low-level mechanism that enables users in a particular system or region to securely access resources. In the Electronic Health Record (EHR), the identifying information of both the healthcare professionals who access the EHR and the patients whose EHR is accessed is subject to change. Demographics services have been developed to manage federated patient and healthcare professional identities and to support challenging healthcare-specific use cases in the presence of diverse and sometimes conflicting demographic identities. Demographics services are not the only use for identities in healthcare. Nevertheless, contemporary EHR specifications limit the types of entities that can be the actor or subject of a record to health professionals and patients, thus limiting the use of two-level models in other healthcare information systems. Demographics are ubiquitous in healthcare, so for a general identity model to be usable, it should be capable of managing demographic information. In this paper, we introduce a generalised identity reference model (GIRM) based on key characteristics of five surveyed demographic models. We evaluate the GIRM by using it to express the EN13606 demographics model in an extensible way at the metadata level and show how two-level modelling can support the exchange of instances of demographic identities. This use of the GIRM to express demographics information shows its applicability to standards-compliant two-level modelling alongside heterogeneous demographics models. We advocate this approach to facilitate the interoperability of identities between two-level model-based EHR systems and show the validity and the extensibility of using the GIRM for the expression of other health-related identities.
Ryan, Amanda; Eklund, Peter
2008-01-01
Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui
2015-05-01
The semantic interoperability of electronic healthcare record (EHR) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, illustrated by the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as a common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. We have been able to transform CEMs into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems.
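For readers unfamiliar with ontology-driven model transformation, a minimal sketch of the general pattern follows, using Python's rdflib; the namespace, file name, and template mappings are invented for illustration and are not the authors' actual artifacts.

```python
# Sketch of an ontology-driven clinical-model transformation in the spirit of
# the CEM-to-openEHR approach described above. All IRIs, the input file, and
# the template mapping are illustrative assumptions.
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

CEM = Namespace("http://example.org/cem#")  # hypothetical CEM ontology namespace

g = Graph()
g.parse("cem_models.owl")  # OWL rendering of the CEM information model (assumed file)

# Transformation template: CEM base class -> openEHR reference-model type.
TEMPLATE = {
    CEM.Observation: "OBSERVATION",
    CEM.Quantity: "DV_QUANTITY",
}

def transform(graph):
    """Emit a skeletal openEHR node descriptor for each mapped CEM subclass."""
    nodes = []
    for cem_base, rm_type in TEMPLATE.items():
        for cls in graph.subjects(RDFS.subClassOf, cem_base):
            nodes.append({"source": str(cls), "rm_type": rm_type})
    return nodes

print(transform(g))
```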
Documenting Models for Interoperability and Reusability ...
Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Modelling and Software.
Creating personalised clinical pathways by semantic interoperability with electronic health records.
Wang, Hua-Qiong; Li, Jing-Song; Zhang, Yi-Fan; Suzuki, Muneou; Araki, Kenji
2013-06-01
There is a growing realisation that clinical pathways (CPs) are vital for improving the treatment quality of healthcare organisations. However, treatment personalisation is one of the main challenges when implementing CPs, and inadequate dynamic adaptability restricts their practicality. The purpose of this study is to improve the practicality of CPs using semantic interoperability between knowledge-based CPs and semantic electronic health records (EHRs). SPARQL (the SPARQL Protocol and RDF Query Language) is used to gather patient information from semantic EHRs. The gathered patient information is entered into the CP ontology, represented in the Web Ontology Language (OWL). Then, after reasoning over rules described in the Semantic Web Rule Language (SWRL) in the Jena semantic framework, we adjust the standardised CPs to meet different patients' practical needs. A CP for acute appendicitis is used as an example to illustrate how to achieve CP customisation based on the semantic interoperability between knowledge-based CPs and semantic EHRs. A personalised care plan is generated by comprehensively analysing the patient's personal allergy history and past medical history, which are stored in semantic EHRs. Additionally, by monitoring the patient's clinical information, an exception is recorded and handled during CP execution. According to the execution results of the actual example, the solutions we present are shown to be technically feasible. This study contributes towards improving the personalised clinical practicality of standardised CPs and establishes the foundation for future work on the research and development of an independent CP system.
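A minimal sketch of the SPARQL-gathering step, using Python's rdflib; the EHR file and predicate IRIs are assumptions for illustration (the study itself used the Jena framework and SWRL rules).

```python
# Minimal sketch of "gather patient data from a semantic EHR", using rdflib.
# The graph file and the ehr: predicate IRIs are invented for illustration.
from rdflib import Graph

g = Graph()
g.parse("patient_ehr.ttl", format="turtle")  # semantic EHR extract (assumed file)

ALLERGY_QUERY = """
PREFIX ehr: <http://example.org/ehr#>
SELECT ?patient ?substance WHERE {
    ?patient ehr:hasAllergy ?allergy .
    ?allergy ehr:substance ?substance .
}
"""

for patient, substance in g.query(ALLERGY_QUERY):
    # Each hit would feed the CP ontology so the standardized pathway can be
    # tailored, e.g. substituting a drug the patient is allergic to.
    print(f"{patient} is allergic to {substance}")
```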
A Reference Architecture for Space Information Management
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.
2006-01-01
We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed, and maintained than simpler models, e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.
Latest developments for the IAGOS database: Interoperability and metadata
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume
2014-05-01
In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is under continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality, and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE, and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database, and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for intercomparison. The optimal data transfer protocol is being investigated to ensure interoperability. To facilitate satellite and model validation, tools will be made available for co-location and comparison with IAGOS. We will enhance the JOIN application in order to properly display aircraft data as vertical profiles and along individual flight tracks and to allow for graphical comparison with model results that are accessible through interoperable web services, such as the daily products from the GMES/Copernicus atmospheric service.
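As an illustration of the NetCDF-CF conventions mentioned above, a short sketch using the netCDF4 Python library follows; the variable names, attribute values, and data are invented, not IAGOS's actual file specification.

```python
# Sketch of writing an observation file with CF-convention metadata using the
# netCDF4 library. Names, units, and values are illustrative assumptions.
from netCDF4 import Dataset
import numpy as np

with Dataset("iagos_flight.nc", "w") as nc:
    nc.Conventions = "CF-1.6"  # declare the CF convention version
    nc.title = "In-service aircraft ozone observations (example)"

    nc.createDimension("time", 3)
    time = nc.createVariable("time", "f8", ("time",))
    time.units = "seconds since 2014-01-01 00:00:00"
    time.standard_name = "time"

    o3 = nc.createVariable("ozone", "f4", ("time",))
    o3.standard_name = "mole_fraction_of_ozone_in_air"  # CF standard name
    o3.units = "1e-9"  # nmol/mol (ppb)

    time[:] = np.array([0.0, 4.0, 8.0])
    o3[:] = np.array([42.1, 43.0, 41.7])
```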
Sinaci, A Anil; Laleci Erturkmen, Gokce B
2013-10-01
In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and supporting framework that bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried, and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems, and implementation-dependent content models, hence facilitating semantic search, more effective reuse, and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE, and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems.
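A minimal sketch of a CDE published as a Linked Open Data resource, using rdflib; the MDR namespace, property choices, and the linked terminology IRI are illustrative assumptions rather than the framework's actual vocabulary.

```python
# Sketch of an ISO/IEC 11179-style Common Data Element exposed as a LOD
# resource, in the spirit of the federated MDR described above. Namespaces and
# the terminology link are invented for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS, SKOS

MDR = Namespace("http://example.org/mdr/")
g = Graph()

cde = URIRef(MDR["cde/systolic-blood-pressure"])
g.add((cde, RDF.type, MDR.DataElement))
g.add((cde, RDFS.label, Literal("Systolic blood pressure")))
# Semantic link to a terminology system, so other registries can align on meaning.
g.add((cde, SKOS.exactMatch, URIRef("http://purl.bioontology.org/ontology/LNC/8480-6")))

print(g.serialize(format="turtle"))
```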
Interacting with Multi-Robot Systems Using BML
2013-06-01
Pullen, U. Schade, J. Simonsen & R. Gomez-Veiga, NATO MSG-048 C-BML Final Report Summary. 2010 Fall Simulation Interoperability Workshop (10F-SIW-039) ... NATO MSG-085. 2012 Spring Simulation Interoperability Workshop (12S-SIW-045), Orlando, FL, March 2012. [3] T. Remmersmann, U. Schade, L. Khimeche, B. Grautreau & R. El Abdouni Khayari, Lessons Recognized: How to Combine BML and MSDL. 2012 Spring Simulation Interoperability Workshop (12S-SIW-012)
UAS in the NAS Flight Test Series 3 Overview
NASA Technical Reports Server (NTRS)
Murphy, James R.
2015-01-01
The UAS Integration in the NAS Project is conducting a series of flight tests to achieve the following objectives: 1) validate results previously collected during project simulations with live data; 2) evaluate TCAS II interoperability; 3) test the fully integrated system in a relevant live test environment; 4) inform the final DAA and C2 MOPS; and 5) reduce risk for Flight Test Series 4.
Development and Operations of the Astrophysics Data System
NASA Technical Reports Server (NTRS)
Murray, Stephen S.
1998-01-01
Preparations for the AAS meeting in January are progressing. We will have a talk, a poster, and a demonstration. We organized a meeting during the AAS conference to discuss bibliographic codes in order to make sure the different information providers can interoperate. Our new server should be on-line for the AAS meeting; this will improve the search speed considerably.
Consensus-Driven Development of a Terminology for Biobanking, the Duke Experience.
Ellis, Helena; Joshi, Mary-Beth; Lynn, Aenoch J; Walden, Anita
2017-04-01
Biobanking at Duke University has existed for decades and has grown over time in silos and based on specialized needs, as is true with most biomedical research centers. These silos developed informatics systems to support their own individual requirements, with no regard for semantic or syntactic interoperability. Duke undertook an initiative to implement an enterprise-wide biobanking information system to serve its many diverse biobanking entities. A significant part of this initiative was the development of a common terminology for use in the commercial software platform. Common terminology provides the foundation for interoperability across biobanks for data and information sharing. We engaged experts in research, informatics, and biobanking through a consensus-driven process to agree on 361 terms and their definitions that encompass the lifecycle of a biospecimen. Existing standards, common terms, and data elements from published articles provided a foundation on which to build the biobanking terminology; a broader set of stakeholders then provided additional input and feedback in a secondary vetting process. The resulting standardized biobanking terminology is now available for sharing with the biobanking community to serve as a foundation for other institutions who are considering a similar initiative.
ERIC Educational Resources Information Center
Data Research Associates, Inc., St. Louis, MO.
The topic of open systems as it relates to the needs of libraries to establish interoperability between dissimilar computer systems can be clarified by an understanding of the background and evolution of the issue. The International Standards Organization developed a model to link dissimilar computers, and this model has evolved into consensus…
Achieving control and interoperability through unified model-based systems and software engineering
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel
2005-01-01
Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.
Constellation's Command, Control, Communications and Information (C3I) Architecture
NASA Technical Reports Server (NTRS)
Breidenthal, Julian C.
2007-01-01
Operations concepts are highly effective for: 1) developing consensus; 2) discovering stakeholder needs, goals, and objectives; 3) defining the behavior of system components (especially emergent behaviors). An interoperability standard can provide an excellent lever to define the capabilities needed for system evolution. Two categories of architecture are needed in a program of this size: 1) generic, needed for planning, design, and construction standards; 2) specific, needed for detailed requirement allocations and interface specs. A wide variety of architectural views is needed to address stakeholder concerns, including: 1) physical; 2) information (structure, flow, evolution); 3) processes (design, manufacturing, operations); 4) performance; and 5) risk.
Accelerating Harmonization in Digital Health.
Moore, Carolyn; Werner, Laurie; BenDor, Amanda Puckett; Bailey, Mike; Khan, Nighat
2017-01-01
Digital tools play an important role in supporting front-line health workers who deliver primary care. This paper explores the current state of efforts undertaken to move away from single-purpose applications of digital health towards integrated systems and solutions that align with national strategies. Through examples from health information systems, data, and health worker training, this paper demonstrates how governments and stakeholders are working to integrate digital health services. We emphasize three factors as crucial for this integration: development and implementation of national digital health strategies; technical interoperability; and collaborative approaches to ensure that digital health has an impact at the primary care level. Consolidation of technologies will enable an integrated, scalable approach to the use of digital health to support health workers. As this edition explores a paradigm shift towards harmonization in primary healthcare systems, this paper explores complementary efforts undertaken to move away from single-purpose applications of digital health towards integrated systems and solutions that align with national strategies. It describes a paradigm shift towards integrated and interoperable systems that respond to health workers' needs in training, data, and health information; calls for the consolidation and integration of digital health tools and approaches across health areas, functions, and levels of the health system; and then considers the critical factors that must be in place to support this paradigm shift. This paper aims not only to describe steps taken to move from fractured pilots to effective systems, but also to propose a new perspective focused on consolidation and collaboration guided by national digital health strategies.
Reams, Christopher; Powell, Mallory; Edwards, Rob
2014-01-01
Purpose: This case study describes the collaboration between a state public health department, a major research university, and a health extension service funded as part of the Health Information Technology for Economic and Clinical Health (HITECH) Act to establish an interoperable health information system for disease surveillance through electronic reporting of systemic therapy data from numerous oncology practices in Kentucky. The experience of the Kentucky cancer surveillance system can help local and state entities achieve greater effectiveness in designing communication efforts to increase usage of electronic health records (EHRs) and health information exchanges (HIEs), help eligible clinicians meet these new standards in patient care, and conduct disease surveillance in a learning health system. Innovation: We document and assess the statewide efforts of early health information technology (HIT) adopters in Kentucky to facilitate the nation’s first electronic transmission of a clinical document architecture (CDA) from a physician office to a state cancer surveillance registry in November 2012. Successful transmission of the CDA not only represented a landmark for technology innovators, informaticists, and clinicians, but it also set in motion a new communication mechanism by which state and federal agencies can capture and trade vital cancer statistics in a way that is safe, secure, and timely. The corresponding impact this has on cancer surveillance and comparative effectiveness research is immense. With guidance from the Centers for Disease Control and Prevention (CDC), the Kentucky Cancer Registry (KCR), the Kentucky Health Information Exchange (KHIE), and the Kentucky Regional Extension Center (KREC) have moved one step further in transforming the interoperable health environment for improved disease surveillance. Credibility: This case study describes the efforts of established and reputable agencies, including the KCR, the state department of health, state and federal governmental agencies, and a major research university, in leveraging existing networks, infrastructure, and federally awarded funding to implement interoperable health information systems for disease surveillance. Project assessment through quasi-qualitative interviews with key stakeholders facilitated evaluation of attitudes and beliefs for continued use of the cancer surveillance model. Conclusion and Discussion: In Kentucky, the cancer reporting initiative leveraged and enhanced a solid foundation for statewide collaboration to achieve better health and improved disease surveillance through a learning health system. Leveraging the Meaningful Use (MU) program as an overarching policy and structural driver is imperative. The cancer reporting initiative in Kentucky suggests that future surveillance and reporting initiatives will require locally adaptable solutions and that there is a need for increased technical assistance in rural settings. Kentucky’s experience also indicates that stakeholders should be diligent in identifying state-level criteria that align with MU for vetting EHR vendors. PMID:25848604
The BACnet Campus Challenge - Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masica, Ken; Tom, Steve
Here, the BACnet protocol was designed to achieve interoperability among building automation vendors and to evolve over time to include new functionality as well as support new communication technologies, such as the Ethernet and IP protocols, as they became prevalent and economical in the marketplace. For large multi-building, multi-vendor campus environments, standardizing on the BACnet protocol as an implementation strategy can be a key component in meeting the challenge of an interoperable, flexible, and scalable building automation system. The interoperability of BACnet is especially important when large campuses with legacy equipment have DDC upgrades to facilities performed over different time frames and use different contractors that install equipment from different vendors under the guidance of different campus HVAC project managers. In these circumstances, BACnet can serve as a common foundation for interoperability when potential variability exists in approaches to the design-build process by numerous parties over time. Likewise, BACnet support for a range of networking protocols and technologies can be a key strategy for achieving flexible and scalable automation systems as campuses and enterprises expand networking infrastructures using standard interoperable protocols like IP and Ethernet.
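To illustrate why a standardized object/property model enables this vendor-neutral interoperability, here is a toy Python sketch; the devices and values are invented, and a real deployment would use a BACnet protocol stack rather than plain data classes.

```python
# Toy sketch of BACnet's standardized object/property model, which is what lets
# controllers from different vendors interoperate. Values are invented; a real
# system would speak the wire protocol (e.g. BACnet/IP) via a full stack.
from dataclasses import dataclass

@dataclass
class BACnetObject:
    object_identifier: tuple[str, int]  # standard property: (object type, instance)
    object_name: str                    # standard property: human-readable name
    present_value: float                # standard property: current reading/setpoint
    units: str

# Two "vendors'" devices expose the same standard properties, so a workstation
# can read either one with the same ReadProperty semantics.
vav_temp = BACnetObject(("analog-input", 1), "VAV-3 Zone Temp", 22.4, "degrees-Celsius")
boiler_sp = BACnetObject(("analog-value", 7), "Boiler Setpoint", 82.0, "degrees-Celsius")

for obj in (vav_temp, boiler_sp):
    print(obj.object_identifier, obj.object_name, obj.present_value, obj.units)
```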
Development Model for Research Infrastructures
NASA Astrophysics Data System (ADS)
Wächter, Joachim; Hammitzsch, Martin; Kerschke, Dorit; Lauterjung, Jörn
2015-04-01
Research infrastructures (RIs) are platforms integrating facilities, resources and services used by the research communities to conduct research and foster innovation. RIs include scientific equipment, e.g., sensor platforms, satellites or other instruments, but also scientific data, sample repositories or archives. E-infrastructures on the other hand provide the technological substratum and middleware to interlink distributed RI components with computing systems and communication networks. The resulting platforms provide the foundation for the design and implementation of RIs and play an increasing role in the advancement and exploitation of knowledge and technology. RIs are regarded as essential to achieve and maintain excellence in research and innovation, crucial for the European Research Area (ERA). The implementation of RIs has to be considered as a long-term, complex development process, often over a period of 10 or more years. The ongoing construction of Spatial Data Infrastructures (SDIs) provides a good example of the general complexity of infrastructure development processes, especially in system-of-systems environments. A set of directives issued by the European Commission provided a framework of guidelines for the implementation processes, addressing the relevant content and the encoding of data as well as the standards for service interfaces and the integration of these services into networks. Additionally, a time schedule for the overall construction process was specified. As a result, this process advances with strong participation of member states and responsible organisations. Today, SDIs provide the operational basis for new digital business processes in both national and local authorities. Currently, the development of integrated RIs in Earth and Environmental Sciences is characterised by the following properties: • a high number of parallel activities on European and national levels, with numerous institutes and organisations participating, and considerable differences in the maturity of individual scientific domains; • technologically and organisationally, many different RI components that have to be integrated, where individual systems are often complex, have a long history, and are on different maturity levels, e.g. in relation to the standardisation of interfaces; • a concrete implementation process consisting of independent and often parallel development activities, in many cases without a detailed architectural blueprint for the envisioned system; • funding for RI implementation that is currently provided mostly on a project basis. To increase the synergies in infrastructure development, the authors propose a specific RI Maturity Model (RIMM) that is specifically qualified for open system-of-systems environments. RIMM is based on the concepts of Capability Maturity Models for organisational development, concretely the Levels of Conceptual Interoperability Model (LCIM), specifying the technical, syntactic, semantic, pragmatic, dynamic, and conceptual layers of interoperation [1]. The model is complemented by the identification and integration of growth factors (according to the Nolan Stages Theory [2]). These factors include supply and demand factors. Supply factors comprise available resources, e.g., data, services, and IT-management capabilities, including organisations and IT personnel. Demand factors are the overall application portfolio for RIs but also the skills and requirements of scientists and communities using the infrastructure.
RIMM thus enables a balanced development process of RI and RI components by evaluating the status of the supply and demand factors in relation to specific levels of interoperability. [1] Tolk, A., Diallo, A., Turnitsa, C. (2007): Applying the Levels of Conceptual Interoperability Model in Support of Integratability, Interoperability, and Composability for System-of-Systems Engineering. Systemics, Cybernetics and Informatics, Volume 5 - Number 5. [2] Mutsaers, E.-J., van der Zee, H., and Giertz, H. (1998): The evolution of information technology. Information Management & Computer Security, Volume 6 - Issue 3.
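The LCIM layering that underpins the proposed RIMM can be summarized in a small sketch; the level names follow the cited LCIM paper [1], while the scoring function is an invented illustration of how a maturity model might grade a component.

```python
# Toy sketch of the LCIM interoperability ladder referenced above [1]. The enum
# follows the published level names; assess() is an invented illustration of
# grading an RI component against a target interoperability level.
from enum import IntEnum

class LCIM(IntEnum):
    NONE = 0        # stand-alone system
    TECHNICAL = 1   # bits flow (network connectivity)
    SYNTACTIC = 2   # common data formats
    SEMANTIC = 3    # shared meaning of exchanged data
    PRAGMATIC = 4   # shared understanding of data use/context
    DYNAMIC = 5     # shared understanding of state changes
    CONCEPTUAL = 6  # shared conceptual model

def assess(component_level: LCIM, target: LCIM) -> str:
    gap = target - component_level
    return "meets target" if gap <= 0 else f"{gap} level(s) below target"

print(assess(LCIM.SYNTACTIC, LCIM.SEMANTIC))  # -> "1 level(s) below target"
```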
Metadata mapping and reuse in caBIG.
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-02-05
This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes, and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-gram) and Dynamic algorithms are compared, and both have similar performance matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding matches are reasonable for the data models examined, suggesting that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to addressing the need to develop systems that handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
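For illustration, a minimal sketch of Dice similarity over character bigrams, the kind of lexical matching described above; the paper's exact algorithmic details may differ, and the example strings are invented.

```python
# Minimal sketch of Dice-coefficient matching over character bigrams, used to
# score a UML class-attribute name against a CDE object-property pair. This is
# the textbook set-based Dice similarity, not necessarily the paper's variant.
def bigrams(s: str) -> set[str]:
    s = s.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a: str, b: str) -> float:
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

# e.g. matching a UML attribute name to a candidate CDE name
print(round(dice("patientBirthDate", "Patient Date of Birth"), 2))
```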
Rezaeibagha, Fatemeh; Win, Khin Than; Susilo, Willy
Even though many safeguards and policies for electronic health record (EHR) security have been implemented, barriers to the privacy and security protection of EHR systems persist. This article presents the results of a systematic literature review of frequently adopted security and privacy technical features of EHR systems. Our inclusion criteria were full articles that dealt with the security and privacy of technical implementations of EHR systems published in English in peer-reviewed journals and conference proceedings between 1998 and 2013; 55 selected studies were reviewed in detail. We analysed the review results using two International Organization for Standardization (ISO) standards (29100 and 27002) in order to consolidate the study findings. Using this process, we identified 13 features that are essential to security and privacy in EHRs, including system and application access control, compliance with security requirements, interoperability, integration and sharing, consent and choice mechanisms, policies and regulation, applicability and scalability, and cryptography techniques. This review highlights the importance to EHR security and privacy requirements of technical features such as mandated access control policies, consent mechanisms for obtaining patients' consent, scalability through proper architecture and frameworks, and interoperability of health information systems.
Interoperability Gap Challenges for Learning Object Repositories & Learning Management Systems
ERIC Educational Resources Information Center
Mason, Robert T.
2011-01-01
An interoperability gap exists between Learning Management Systems (LMSs) and Learning Object Repositories (LORs). Learning Objects (LOs) and the associated Learning Object Metadata (LOM) that is stored within LORs adhere to a variety of LOM standards. A common LOM standard found in LORs is the Sharable Content Object Reference Model (SCORM)…
Crasto, Chiquito J.; Marenco, Luis N.; Liu, Nian; Morse, Thomas M.; Cheung, Kei-Hoi; Lai, Peter C.; Bahl, Gautam; Masiar, Peter; Lam, Hugo Y.K.; Lim, Ernest; Chen, Huajin; Nadkarni, Prakash; Migliore, Michele; Miller, Perry L.; Shepherd, Gordon M.
2009-01-01
This article presents the latest developments in neuroscience information dissemination through the SenseLab suite of databases: NeuronDB, CellPropDB, ORDB, OdorDB, OdorMapDB, ModelDB and BrainPharm. These databases include information related to: (i) neuronal membrane properties and neuronal models, and (ii) genetics, genomics, proteomics and imaging studies of the olfactory system. We describe here: the new features for each database, the evolution of SenseLab’s unifying database architecture and instances of SenseLab database interoperation with other neuroscience online resources. PMID:17510162
A Definitive Interoperability Test Methodology for the Malicious Activity Simulation Tool (MAST)
2013-03-01
[Acronym list fragment] Information Assurance Range; DON, Department of the Navy; DON CIO, Department of the Navy Chief Information Officer; DoS, Denial of Service; EOL, End-of-Life. [Body fragment] The tool came "as part of PMW 160's solution to the risk posed by Windows NT End-of-Life (EOL)." Second, "[it] marked the beginning of a steady and..." ... sometimes outdated, systems and programs [34]. Table 1 shows the basic implementation and EOL timeline and the OS version for both server and
Fingerprint verification on medical image reporting system.
Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah
2008-03-01
The healthcare industry is going through extensive changes through the adoption of robust, interoperable healthcare information technology by means of electronic medical records (EMRs). However, a major concern with EMRs is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the risk of patient data interception. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still providing confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates the techniques of fingerprint verification, DICOM objects, digital signatures, and digital envelopes in order to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is only by certified parties.
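A rough sketch of the sign-then-envelope pattern combined here with fingerprint verification, using Python's cryptography package; key management, the fingerprint step, and DICOM encoding are out of scope, and the payload is invented.

```python
# Sketch of signing a report and sealing it in a digital envelope: sign with the
# sender's RSA key, encrypt report + signature with a symmetric key, and wrap
# that key for the recipient with RSA-OAEP. Keys are generated inline only for
# demonstration.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.fernet import Fernet

sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

report = b"DICOM SR: unremarkable chest radiograph"  # invented payload

# Digital signature: authenticity and integrity of the report.
signature = sender_key.sign(
    report,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Digital envelope: symmetric encryption of the payload, key wrapped with RSA-OAEP.
sym_key = Fernet.generate_key()
sealed = Fernet(sym_key).encrypt(report + b"||" + signature)
wrapped_key = recipient_key.public_key().encrypt(
    sym_key,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None),
)
print(len(sealed), len(wrapped_key))  # envelope components to transmit
```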
Big issues, small systems: managing with information in medical research.
Jones, J; Preston, H
2000-08-01
The subject of this article is the design of a database system for handling files related to the work of the Molecular Genetics Department of the International Blood Group Reference Laboratory. It examines specialist information needs identified within this organization and indicates how the design of the Rhesus Information Tracking System was able to meet current needs. Rapid Application Development prototyping forms the basis of the investigation, linked to interview, questionnaire, and observation techniques, in order to establish requirements for interoperability. In particular, the place of this specialist database within the much broader information strategy of the National Blood Service is examined. This situation is analogous to management activities in broader environments, and a number of generic issues are highlighted by the research.
NASA Technical Reports Server (NTRS)
Graves, Sara J.
1994-01-01
Work on this project was focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAAC's from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS, and has contributed to definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been in generating value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders were provided.
Department of Defense Air Traffic Control and Airspace Management Systems
1989-08-08
service. The potential near-term impacts of incompatible and non-interoperable systems on the Air Force are described in terms of safety and operational effectiveness. Derogation of safety, from the standpoint of aircraft collision avoidance, is probable where service-specific systems are operating in adjacent or
Bridging Hydroinformatics Services Between HydroShare and SWATShare
NASA Astrophysics Data System (ADS)
Merwade, V.; Zhao, L.; Song, C. X.; Tarboton, D. G.; Goodall, J. L.; Stealey, M.; Rajib, A.; Morsy, M. M.; Dash, P. K.; Miles, B.; Kim, I. L.
2016-12-01
Many cyberinfrastructure systems in the hydrologic and related domains emerged in the past decade, with more being developed to address various data management and modeling needs. Although clearly beneficial to the broad user community, building interoperability across these systems is a challenging task due to various obstacles, including technological, organizational, semantic, and social issues. This work presents our experience in developing interoperability between two hydrologic cyberinfrastructure systems: SWATShare and HydroShare. HydroShare is a large-scale online system aimed at enabling the hydrologic user community to share their data, models, and analyses online for solving complex hydrologic research questions. SWATShare, on the other hand, is a focused effort to allow SWAT (Soil and Water Assessment Tool) modelers to share, execute, and analyze SWAT models using high performance computing resources. Making these two systems interoperable required common sign-in through OAuth, sharing of models through common metadata standards, and use of standard web services for implementing key import/export functionalities. As a result, users from either community can leverage the resources and services across these systems without having to manually import, export, or process their models. Overall, this use case can serve as a model for interoperability among other systems, as no one system can provide all the functionality needed to address large interdisciplinary problems.
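A minimal sketch of the kind of OAuth-protected REST exchange such interoperability relies on; the endpoint shape, resource id, and token are placeholders rather than either system's documented API.

```python
# Sketch of one system pulling a shared model resource from the other after an
# OAuth2 sign-in. URL, resource id, and token are placeholder assumptions.
import requests

TOKEN = "oauth2-access-token-obtained-at-sign-in"  # placeholder
RESOURCE_ID = "abc123"                             # placeholder model resource id

resp = requests.get(
    f"https://www.hydroshare.org/hsapi/resource/{RESOURCE_ID}/",  # assumed endpoint shape
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
with open("swat_model.zip", "wb") as f:
    f.write(resp.content)  # model archive ready for import into the other system
```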
Dandanell, G
1992-01-01
The interoperator distance between a synthetic operator Os and the deoP2O2-galK fusion was varied between 46 and 176 bp. The repression of deoP2-directed galK expression as a function of the interoperator distance (center-to-center) was measured in vivo in a single-copy system. The results show that the DeoR repressor can efficiently repress transcription at all interoperator distances tested. The degree of repression depends very little on the spacing between the operators; however, a weak periodic dependency of 8-11 bp may exist. PMID:1437558
Content-based management service for medical videos.
Mendi, Engin; Bayrak, Coskun; Cecen, Songul; Ermisoglu, Emre
2013-01-01
Development of health information technology has had a dramatic impact on improving the efficiency and quality of medical care. Developing interoperable health information systems for healthcare providers has the potential to improve the quality and equitability of patient-centered healthcare. In this article, we describe an automated content-based medical video analysis and management service that provides convenient and easy access to relevant medical video content without sequential scanning. The system facilitates effective temporal video segmentation and content-based visual information retrieval that enable a more reliable understanding of medical video content. The system is implemented as a Web- and mobile-based service and has the potential to offer a knowledge-sharing platform for efficient medical video content access.
NASA Astrophysics Data System (ADS)
Horita, Flávio E. A.; Albuquerque, João Porto de; Degrossi, Lívia C.; Mendiondo, Eduardo M.; Ueyama, Jó
2015-07-01
Effective flood risk management requires up-to-date information to ensure that correct decisions can be made. This can be provided by Wireless Sensor Networks (WSNs), which are a low-cost means of collecting updated information about rivers. Another valuable resource is Volunteered Geographic Information (VGI), a comparatively new means of improving the coverage of monitored areas, since it can supply information supplementary to the WSN and thus support decision-making in flood risk management. However, the problem remains of how to combine WSN data with VGI. In this paper, we investigate AGORA-DS, a Spatial Decision Support System (SDSS) that makes flood risk management more effective by combining these data sources, i.e. WSN with VGI. The approach is built on a conceptual model that complies with the interoperability standards laid down by the Open Geospatial Consortium (OGC), e.g. Sensor Observation Service (SOS) and Web Feature Service (WFS), and seeks to combine and present unified information in a web-based decision support tool. The approach was deployed in a real flood risk management scenario in the town of São Carlos in Brazil. The evidence obtained from this deployment confirms that interoperable standards can support the integration of data from distinct sources. It also shows that VGI can provide information about areas of the river basin that lack data because no gauging station is present, thus providing valuable support for the WSN data. It can therefore be concluded that AGORA-DS is able to combine information provided by WSN and VGI and supply useful information for supporting flood risk management.
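For illustration, a minimal sketch of pulling gauge observations from an OGC SOS endpoint via a KVP GetObservation request; the service URL and the offering/property identifiers are placeholders, while the request parameters follow SOS 2.0.

```python
# Sketch of the WSN side of the system described above: a KVP GetObservation
# request to an OGC Sensor Observation Service. URL and identifiers are
# placeholder assumptions.
import requests

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "river_gauge_offering",                       # placeholder
    "observedProperty": "urn:ogc:def:property:water_level",   # placeholder URN
    "responseFormat": "http://www.opengis.net/om/2.0",
}
resp = requests.get("https://example.org/sos", params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:200])  # O&M XML carrying the latest water-level observations
```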
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is a concept that enhances networking, intelligent data processing, and the combination of patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following model-driven procedures were considered: provision of data schemas for data exchange, automated generation of artefacts for application development, and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced, and several compatible health data platforms are listed; these provide standards-based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications, and the related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
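A toy sketch of the model-driven step described above, deriving a data-entry form specification from a clinical model definition; the model structure is a simplified stand-in for openEHR's ADL/OPT formats, not the real thing.

```python
# Toy sketch of generating a form spec from a clinical model instead of
# hand-coding it. The dict below is a simplified stand-in for an openEHR
# archetype/template; only the reference-model type names are real openEHR.
CLINICAL_MODEL = {
    "name": "blood_pressure",
    "elements": [
        {"id": "systolic", "type": "DV_QUANTITY", "units": "mm[Hg]"},
        {"id": "diastolic", "type": "DV_QUANTITY", "units": "mm[Hg]"},
        {"id": "position", "type": "DV_CODED_TEXT"},
    ],
}

def generate_form(model: dict) -> list[dict]:
    """Map each model element to a UI widget description."""
    widget_for = {"DV_QUANTITY": "number_input", "DV_CODED_TEXT": "dropdown"}
    return [
        {"field": el["id"], "widget": widget_for[el["type"]], "units": el.get("units")}
        for el in model["elements"]
    ]

for row in generate_form(CLINICAL_MODEL):
    print(row)
```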
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Mattman, C. A.; Ramirez, P. M.
2009-12-01
Experience suggests that no single search paradigm will meet all of a community's search requirements. Traditional forms-based search is still considered critical by a significant percentage of most science communities. However, text-based and facet-based search are improving the community's perception that search can be easy and that the data are available and can be located. Finally, semantic search promises ways to find data that were not conceived of when the metadata was first captured and organized. This situation suggests that successful science information systems must be able to deploy new search applications quickly, efficiently, and often for ad-hoc purposes. Federated registries allow data to be packaged or associated with their metadata and managed as simple registry objects. Standard reference models for federated registries now exist that ensure registry objects are uniquely identified at registration and that versioning, classification, and cataloging are addressed automatically. Distributed but locally governed, federated registries also provide notification of registry events and federated query, linking, and replication of registry objects. Key principles for shared ontology development in the space sciences are that the ontology remain independent of its implementation and be extensible, flexible, and scalable. The dichotomy between digital things and physical/conceptual things in the domain needs to be unified under a standard model, such as the Open Archival Information System (OAIS) Information Object. Finally, the fact must be accepted that ontology development is a difficult task that requires time, patience, and experts in both the science domain and information modeling. The Planetary Data System (PDS) has adopted this architecture for its next-generation information system, PDS 2010. The authors will report on progress, briefly describe key elements, and illustrate how the new system will be phased into operations to handle both legacy and new science data. In particular, the shared ontology is being used to drive system implementation through the generation of standards documents and software configuration files. The resulting information system will help meet the expectations of modern scientists by providing more of the information interconnectedness, correlative science, and system interoperability that they desire. (Fig. 1: Data-Driven Architecture.)
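A schematic Python sketch of the registry-object idea, showing how unique identification, versioning, and classification make facet queries straightforward to layer on top; this illustrates the concept, not the PDS 2010 implementation.

```python
# Toy sketch of a federated-registry object model: every artifact is registered
# as a uniquely identified, versioned, classified object, so facet-style
# queries become trivial. Names and facets are invented for illustration.
from dataclasses import dataclass, field
import uuid

@dataclass
class RegistryObject:
    name: str
    version: str
    classifications: set[str]  # e.g. instrument, target, product type
    guid: str = field(default_factory=lambda: str(uuid.uuid4()))  # assigned at registration

REGISTRY = [
    RegistryObject("mars_image_001", "1.0", {"camera", "mars", "raw"}),
    RegistryObject("mars_image_001", "1.1", {"camera", "mars", "calibrated"}),
    RegistryObject("venus_spectrum_042", "1.0", {"spectrometer", "venus"}),
]

def facet_query(registry, facets: set[str]):
    """Return objects whose classifications cover all requested facets."""
    return [o for o in registry if facets <= o.classifications]

for obj in facet_query(REGISTRY, {"mars", "calibrated"}):
    print(obj.guid, obj.name, obj.version)
```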
Testbeds for Assessing Critical Scenarios in Power Control Systems
NASA Astrophysics Data System (ADS)
Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem
The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European Project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems encompassing information and communication security of SCADA systems for grid teleoperation, impact of attacks on inter-operator communications in power emergency conditions, impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effect of the attacks and prototyping resilient architectures.
Nogueira, J R M; Cook, T W; Cavalini, L T
2015-01-01
Healthcare information technologies have the potential to transform nursing care. However, healthcare information systems based on conventional software architecture are not semantically interoperable and have high maintenance costs. Health informatics standards, such as controlled terminologies, have been proposed to improve healthcare information systems, but their implementation in conventional software has not been enough to overcome the current challenge. Such obstacles could be removed by adopting a multilevel model-driven approach, such as the openEHR specifications, in nursing information systems. The objective was to create an openEHR archetype model for the Functional Status concepts as published in the Nursing Outcome Indicators Catalog of the International Classification for Nursing Practice (NOIC-ICNP). Four methodological steps were followed: 1) extraction of terms from the NOIC-ICNP terminology; 2) identification of previously published openEHR archetypes; 3) assessment of the adequacy of those openEHR archetypes to represent the terms; and 4) development of new openEHR archetypes when required. The "Barthel Index" archetype was retrieved and mapped to the 68 NOIC-ICNP Functional Status terms. There were 19 exact matches between a term and the corresponding archetype node and 23 archetype nodes that matched one or more NOIC-ICNP terms. No matches were found between the archetype and 14 of the NOIC-ICNP terms, and nine archetype nodes did not match any of the NOIC-ICNP terms. The openEHR model was sufficient to represent the semantics of the Functional Status concept according to the NOIC-ICNP, but there were differences in data granularity between the terminology and the archetype, producing a significantly complex mapping that could be difficult to implement in real healthcare information systems. However, despite the technological complexity, the present study demonstrates the feasibility of mapping nursing terminologies to openEHR archetypes, which emphasizes the importance of adopting the multilevel model-driven approach for achieving semantic interoperability between healthcare information systems.
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2016-04-01
The increasingly ocean basin level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN) with each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions often using different standards, data formats, technologies etc. that make integration of marine data from these regional systems for the purposes of basin level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. The potential impact for the individual regional data infrastructures of implementing these prototype interoperability solutions is also being evaluated to determine both the technical and financial implications of their integration within existing systems. These impact assessments form part of the strategy to encourage wider adoption of the ODIP solutions and approach beyond the current scope of the project which is focussed on regional marine data systems in Europe, Australia, the USA and, more recently, Canada.
NASA Astrophysics Data System (ADS)
Li, Ni; Huai, Wenqing; Wang, Shaodan
2017-08-01
C2 (command and control) has been understood to be a critical military component in meeting an increasing demand for rapid information gathering and real-time decision-making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework is proposed that specifies a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (Coalition Battle Management Language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Unlike most WTA problem descriptions, sensors were here considered available detection resources, and the relationship constraints between weapons and sensors were also taken into account, which brings the formulation much closer to actual application. A modified differential evolution (MDE) algorithm was developed to solve this high-dimensional optimisation problem and obtained an optimal assignment plan with high efficiency. In the case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification, and the new optimisation solution solved the WTA problem efficiently and successfully.
Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya
2018-05-01
This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited processor, memory, and power resources that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that includes IHE PCD-01, HL7, and Bluetooth LE medical devices, and describes the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptations made in order to implement the standard on the ZigBee Health Care Profile and the experience of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.
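A rough sketch of an IEEE 11073-style numeric observation and its flattening for upstream exchange; the MDC code shown is an assumption for illustration, and authoritative codes live in the 11073-10101 nomenclature.

```python
# Rough sketch of a personal-health-device numeric observation as it might sit
# in the 11073 domain information model, then flattened for an upstream
# IHE PCD-01 / HL7-style upload. The MDC code below is assumed for
# illustration; consult the 11073-10101 nomenclature for authoritative values.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NumericObservation:
    mdc_code: int  # 11073-10101 nomenclature code for the metric
    mdc_name: str
    value: float
    unit: str
    timestamp: datetime

spo2 = NumericObservation(
    mdc_code=19384,  # assumed: MDC_PULS_OXIM_SAT_O2
    mdc_name="MDC_PULS_OXIM_SAT_O2",
    value=97.0,
    unit="%",
    timestamp=datetime.now(timezone.utc),
)

# Flatten to a transport-agnostic dict for the upload step.
message = {
    "code": spo2.mdc_code,
    "display": spo2.mdc_name,
    "value": spo2.value,
    "unit": spo2.unit,
    "effective": spo2.timestamp.isoformat(),
}
print(message)
```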
Interoperable cross-domain semantic and geospatial framework for automatic change detection
NASA Astrophysics Data System (ADS)
Kuo, Chiao-Ling; Hong, Jung-Hong
2016-01-01
With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of different domain data, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make such integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success as far as the improvement of efficiency and the level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and may even open a new dimension for future GIS.
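As a rough illustration of the bridge-ontology idea, the sketch below maps two hypothetical nomenclatures onto shared land-use concepts and derives a change decision from the mapping. The class names and the rule are assumptions for the example, not the paper's actual ontology.

```python
# A minimal sketch: semantic equivalences between a topographic-map
# nomenclature and land-use concepts drive a simple change-detection rule.
BRIDGE = {  # topographic class -> equivalent land-use concept (assumed)
    "building": "built-up",
    "paved road": "built-up",
    "paddy field": "agriculture",
    "forest stand": "forest",
}

def detect_change(topo_class_t0, topo_class_t1):
    c0, c1 = BRIDGE.get(topo_class_t0), BRIDGE.get(topo_class_t1)
    if c0 is None or c1 is None:
        return "unknown"            # no semantic mapping: defer to an analyst
    return "changed" if c0 != c1 else "unchanged"

print(detect_change("paddy field", "building"))   # changed
print(detect_change("paved road", "building"))    # unchanged at land-use level
```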
NASA Astrophysics Data System (ADS)
Nieland, Simon; Kleinschmit, Birgit; Förster, Michael
2015-05-01
Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.
Developing Interoperable Air Quality Community Portals
NASA Astrophysics Data System (ADS)
Falke, S. R.; Husar, R. B.; Yang, C. P.; Robinson, E. M.; Fialkowski, W. E.
2009-04-01
Web portals are intended to provide consolidated discovery, filtering and aggregation of content from multiple, distributed web sources targeted at particular user communities. This paper presents a standards-based information architectural approach to developing portals aimed at air quality community collaboration in data access and analysis. An important characteristic of the approach is to advance beyond the present stand-alone design of most portals to achieve interoperability with other portals and information sources. We show how using metadata standards, web services, RSS feeds and other Web 2.0 technologies, such as Yahoo! Pipes and del.icio.us, helps increase interoperability among portals. The approach is illustrated within the context of the GEOSS Architecture Implementation Pilot where an air quality community portal is being developed to provide a user interface between the portals and clearinghouse of the GEOSS Common Infrastructure and the air quality community catalog of metadata and data services.
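As a small illustration of the feed-aggregation pattern this abstract describes, the following sketch merges items from several RSS feeds into one list. The feed URLs are hypothetical placeholders, not services named by the paper.

```python
# Aggregate items from multiple RSS 2.0 feeds using only the standard library.
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [
    "https://example.org/airquality/alerts.rss",   # hypothetical feed
    "https://example.org/airquality/ozone.rss",    # hypothetical feed
]

def fetch_items(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        root = ET.fromstring(resp.read())
    # RSS 2.0 items live under channel/item
    for item in root.iterfind("./channel/item"):
        yield {
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            "date": item.findtext("pubDate", default=""),
        }

all_items = [it for url in FEEDS for it in fetch_items(url)]
all_items.sort(key=lambda it: it["date"], reverse=True)  # crude merge by date string
for it in all_items[:10]:
    print(it["date"], "-", it["title"])
```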
A multi-service data management platform for scientific oceanographic products
NASA Astrophysics Data System (ADS)
D'Anca, Alessandro; Conte, Laura; Nassisi, Paola; Palazzo, Cosimo; Lecci, Rita; Cretì, Sergio; Mancini, Marco; Nuzzo, Alessandra; Mirto, Maria; Mannarini, Gianandrea; Coppini, Giovanni; Fiore, Sandro; Aloisio, Giovanni
2017-02-01
An efficient, secure and interoperable data platform solution has been developed in the TESSA project to provide fast navigation and access to the data stored in the data archive, as well as standards-based metadata management support. The platform mainly targets scientific users and high-level situational sea awareness services such as decision support systems (DSS). The datasets are accessible through the following three main components: the Data Access Service (DAS), the Metadata Service and the Complex Data Analysis Module (CDAM). The DAS allows access to data stored in the archive by providing interfaces for different protocols and services for downloading, variable selection, data subsetting and map generation. The Metadata Service is the heart of the information system of the TESSA products and completes the overall infrastructure for data and metadata management. This component enables data search and discovery and addresses interoperability by exploiting widely adopted standards for geospatial data. Finally, the CDAM represents the back-end of the TESSA DSS by performing on-demand complex data analysis tasks.
Advanced radiology information system.
Kolovou, L; Vatousi, M; Lymperopoulos, D; Koukias, M
2005-01-01
The innovative features of an advanced Radiology Information System (RIS) are presented in this paper. The interoperability of the RIS with the other intra-hospital information systems it interacts with, addressing compatibility and open-architecture issues, is accomplished by two novel mechanisms [1]. The first is a message handling system applied for the exchange of information according to the Health Level Seven (HL7) protocol's specifications; it serves the transfer of medical and administrative data among the RIS applications and the data store unit. The same mechanism also allows secure, HL7-compatible interactions with the Hospital Information System (HIS). The second translates information between the formats specified by the HL7 and Digital Imaging and Communications in Medicine (DICOM) protocols, providing communication between the RIS and the Picture Archiving and Communication System (PACS). The whole structure ensures the automation of the everyday procedures that the 'medical protocol' specifies and provides its services through a friendly, easy-to-manage graphical user interface.
Maturity Model for Advancing Smart Grid Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Mark; Widergren, Steven E.; Mater, J.
2013-10-28
Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
NASA Astrophysics Data System (ADS)
Alameh, N.; Bambacus, M.; Cole, M.
2006-12-01
NASA's Earth Science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data to become a shared platform that can be leveraged by GEOSS via: a modular and extensible architecture; consensus and community-based standards (e.g. ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; mechanisms for user involvement and collaboration; and mechanisms for supporting interdisciplinary and domain-specific applications. ESG has played a key role in recent GEOSS Service Network (GSN) demos and workshops, acting not only as a service and data catalog and discovery client, but also as a portrayal and visualization client for distributed data.
Droc, Gaëtan; Larivière, Delphine; Guignon, Valentin; Yahiaoui, Nabila; This, Dominique; Garsmeur, Olivier; Dereeper, Alexis; Hamelin, Chantal; Argout, Xavier; Dufayard, Jean-François; Lengelle, Juliette; Baurens, Franc-Christophe; Cenci, Alberto; Pitollat, Bertrand; D’Hont, Angélique; Ruiz, Manuel; Rouard, Mathieu; Bocs, Stéphanie
2013-01-01
Banana is one of the world's favorite fruits and one of the most important crops for developing countries. The banana reference genome sequence (Musa acuminata) was recently released. Given the taxonomic position of Musa, the completed genomic sequence has particular comparative value for providing fresh insights into the evolution of the monocotyledons. The study of the banana genome has been enhanced by a number of tools and resources that allow harnessing of its sequence. First, we set up essential tools such as a Community Annotation System, phylogenomics resources and metabolic pathways. Then, to support post-genomic efforts, we improved existing banana systems (e.g. web front end, query builder), we integrated available Musa data into generic systems (e.g. markers and genetic maps, synteny blocks), we made other existing systems containing Musa data (e.g. transcriptomics, rice reference genome, workflow manager) interoperable with the banana hub, and finally, we generated new results from sequence analyses (e.g. SNP and polymorphism analysis). Several use cases illustrate how the Banana Genome Hub can be used to study gene families. Overall, with this collaborative effort, we discuss the importance of interoperability for data integration between existing information systems. Database URL: http://banana-genome.cirad.fr/ PMID:23707967
Promoting meaningful use of health information technology in Israel: ministry of health vision.
Gerber, Ayala; Topaz, Maxim Max
2014-01-01
The Ministry of Health (MOH) of Israel has overall responsibility for the healthcare system. In recent years the MOH has developed strong capabilities in the areas of technology assessment and prioritization of new technologies. Israel completed the transition to computerized medical records a decade ago in most care settings; however, the process in Israel was spontaneous, without government control or standards setting, so large variations arose among systems and among organizations. Currently, the main challenge is to convert the information scattered across different systems into organized, visible information and to make it available to various levels of health management. The MOH's solution is to implement a selected information system from a single vendor at all hospitals and all HMO clinics in order to achieve interoperability. The system will enable access to the patient's medical record history from any location.
Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel
2013-03-01
The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated into Health Information Systems (HIS). Several proposals have been published to date to permit CDSS to gather patient data from HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the CDSS resultant knowledge. We propose a solution for facilitating semantic interoperability for rule-based CDSS, focusing on standardized input and output documents conforming to an HL7-CDA wrapper. We define the HL7-CDA restrictions in an HL7-CDA implementation guide. Patient data and rule inference results are mapped respectively to and from the CDSS by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can have clinical and legal validity. The proposed solution is being applied in a CDSS for providing patient-specific recommendations for the care management of outpatients with diabetes mellitus. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
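The binding idea can be sketched roughly as follows: an XPath-based binding table pulls values out of a CDA-like XML document into rule-engine variables. The element names, paths, and toy rules are illustrative assumptions, not the paper's implementation guide.

```python
# A minimal sketch of binding CDA-like XML content to CDSS input variables.
import xml.etree.ElementTree as ET

cda = ET.fromstring("""
<ClinicalDocument>
  <observation code="HbA1c"><value unit="%">8.2</value></observation>
  <observation code="SBP"><value unit="mmHg">150</value></observation>
</ClinicalDocument>
""")

# "binding file": CDSS input variable -> XPath into the CDA wrapper (assumed)
binding = {
    "hba1c": ".//observation[@code='HbA1c']/value",
    "systolic_bp": ".//observation[@code='SBP']/value",
}

inputs = {var: float(cda.find(path).text) for var, path in binding.items()}

# toy rule base standing in for the CDSS inference step
recommendations = []
if inputs["hba1c"] > 7.0:
    recommendations.append("intensify glycaemic control")
if inputs["systolic_bp"] > 140:
    recommendations.append("review antihypertensive therapy")
print(recommendations)
```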
A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol
ERIC Educational Resources Information Center
Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.
2006-01-01
Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…
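For orientation, a gateway of this kind answers standard OAI-PMH requests. A minimal harvesting sketch, with a hypothetical repository URL but protocol-defined verbs and namespaces, might look like this:

```python
# Issue an OAI-PMH 2.0 ListRecords request and print record identifiers/titles.
import urllib.request, urllib.parse
import xml.etree.ElementTree as ET

BASE = "https://example.org/oai"  # hypothetical OAI-PMH endpoint
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
url = BASE + "?" + urllib.parse.urlencode(params)

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

with urllib.request.urlopen(url, timeout=10) as resp:
    root = ET.fromstring(resp.read())

for record in root.iter(OAI + "record"):
    ident = record.findtext(f".//{OAI}identifier")
    title = record.findtext(f".//{DC}title")
    print(ident, "-", title)
```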
Large scale healthcare data integration and analysis using the semantic web.
Timm, John; Renly, Sondra; Farkash, Ariel
2011-01-01
Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We applied our approach in Hypergenes, an EU-funded project, where we used our method on the essential hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.
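A minimal sketch of the general technique, expressing harmonized clinical facts as RDF with rdflib and querying them with SPARQL; the vocabulary and values are invented for the example, not the Hypergenes model.

```python
# Represent harmonized clinical data as RDF and query it with SPARQL.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/hypertension#")  # hypothetical vocabulary
g = Graph()
g.bind("ex", EX)

patient = EX["patient42"]
g.add((patient, RDF.type, EX.Patient))
g.add((patient, EX.systolicBP, Literal(150, datatype=XSD.integer)))
g.add((patient, EX.diagnosis, EX.EssentialHypertension))

# consumers from other domains can query the shared model with SPARQL
q = """
PREFIX ex: <http://example.org/hypertension#>
SELECT ?p ?bp WHERE { ?p ex:systolicBP ?bp . FILTER(?bp > 140) }
"""
for row in g.query(q):
    print(row.p, row.bp)
```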
An Architecture for Semantically Interoperable Electronic Health Records.
Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa
2017-01-01
Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; therefore, such understanding cannot simply be assumed. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. The proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.
Design and Implementation of a REST API for the Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
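For flavor, a REST endpoint of the kind this abstract describes could be sketched with Flask as below. The route, data shape, and scores are hypothetical, not the actual HWBI API.

```python
# A minimal Flask sketch of a REST endpoint serving well-being scores.
from flask import Flask, jsonify

app = Flask(__name__)

# stand-in data: well-being scores keyed by county FIPS code (assumed shape)
SCORES = {"01001": 52.3, "01003": 55.1}

@app.route("/hwbi/v1/score/<fips>", methods=["GET"])
def get_score(fips):
    if fips not in SCORES:
        return jsonify({"error": "unknown FIPS code"}), 404
    return jsonify({"fips": fips, "hwbi": SCORES[fips]})

if __name__ == "__main__":
    app.run(port=5000)
```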
NASA Astrophysics Data System (ADS)
Pittaway, Jeff; Archer, Norm
Medical interventions are often delayed or erroneous when information needed for diagnosing or prescribing is missing or unavailable. In support of increased information flows, the healthcare industry has invested substantially in standards intended to specify, routinize, and make uniform the type and format of medical information in clinical healthcare information systems such as Electronic Medical Record systems (EMRs). However, fewer than one in four Canadian physicians have adopted EMRs. Deeper analysis illustrates that physicians may perceive value in standardized EMRs when they need to exchange information in highly structured situations among like participants and like environments. However, standards present restrictive barriers to practitioners when they face equivocal situations, unforeseen contingencies, or exchange information across different environments. These barriers constitute a compelling explanation for at least part of the observed low EMR adoption rates. Our recommendations to improve the perceived value of standardized clinical information systems espouse re-conceptualizing the role of standards to embrace greater flexibility in some areas.
Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites
NASA Technical Reports Server (NTRS)
Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng
2007-01-01
This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.
Ambient assisted living healthcare frameworks, platforms, standards, and quality attributes.
Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard
2014-03-04
Ambient Assisted Living (AAL) is an emerging multi-disciplinary field aiming at exploiting information and communication technologies in personal healthcare and telehealth systems for countering the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality-of-service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey for collecting usage data and current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features ignoring many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, we found that more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems is needed to achieve more interoperable and synergetic AAL solutions.
Ambient Assisted Living Healthcare Frameworks, Platforms, Standards, and Quality Attributes
Memon, Mukhtiar; Wagner, Stefan Rahr; Pedersen, Christian Fischer; Beevi, Femina Hassan Aysha; Hansen, Finn Overgaard
2014-01-01
Ambient Assisted Living (AAL) is an emerging multi-disciplinary field aiming at exploiting information and communication technologies in personal healthcare and telehealth systems for countering the effects of a growing elderly population. AAL systems are developed for personalized, adaptive, and anticipatory requirements, necessitating high quality-of-service to achieve interoperability, usability, security, and accuracy. The aim of this paper is to provide a comprehensive review of the AAL field with a focus on healthcare frameworks, platforms, standards, and quality attributes. To achieve this, we conducted a literature survey of state-of-the-art AAL frameworks, systems and platforms to identify the essential aspects of AAL systems and investigate the critical issues from the design, technology, quality-of-service, and user experience perspectives. In addition, we conducted an email-based survey for collecting usage data and current status of contemporary AAL systems. We found that most AAL systems are confined to a limited set of features ignoring many of the essential AAL system aspects. Standards and technologies are used in a limited and isolated manner, while quality attributes are often addressed insufficiently. In conclusion, we found that more inter-organizational collaboration, user-centered studies, increased standardization efforts, and a focus on open systems is needed to achieve more interoperable and synergetic AAL solutions. PMID:24599192
Progress of Interoperability in Planetary Research for Geospatial Data Analysis
NASA Astrophysics Data System (ADS)
Hare, T. M.; Gaddis, L. R.
2015-12-01
For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTiff, GeoJpeg2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Mapping Services (simple image maps), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access for higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications and public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
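As a concrete example of the OGC services mentioned above, a WMS 1.3.0 GetMap request is just a parameterized URL. The endpoint and layer here are hypothetical placeholders; the parameter names come from the WMS specification.

```python
# Build an OGC WMS 1.3.0 GetMap request URL for a (hypothetical) planetary service.
from urllib.parse import urlencode

BASE = "https://example.org/mars/wms"  # hypothetical planetary WMS endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "mola_shade",           # assumed layer id
    "CRS": "EPSG:4326",               # planetary services often use IAU codes instead
    "BBOX": "-30,0,30,90",            # lat/lon axis order per WMS 1.3.0 for EPSG:4326
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}
print(BASE + "?" + urlencode(params))
```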
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghatikar, Girish; Mashayekh, Salman; Stadler, Michael
Distributed power systems in the U.S. and globally are evolving to provide reliable and clean energy to consumers. In California, existing regulations require significant increases in renewable generation, as well as identification of customer-side distributed energy resources (DER) controls, communication technologies, and standards for interconnection with the electric grid systems. As DER deployment expands, customer-side DER control and optimization will be critical for system flexibility and demand response (DR) participation, which improves the economic viability of DER systems. Current DER systems integration and communication challenges include leveraging the existing DER and DR technology and systems infrastructure, and enabling optimized cost, energy and carbon choices for customers to deploy interoperable grid transactions and renewable energy systems at scale. Our paper presents a cost-effective solution to these challenges by exploring communication technologies and information models for DER system integration and interoperability. This system uses open standards and optimization models for resource planning based on dynamic-pricing notifications and autonomous operations within various domains of the smart grid energy system. It identifies architectures and customer engagement strategies in dynamic DR pricing transactions to generate feedback information models for load flexibility, load profiles, and participation schedules. The models are tested at a real site in California, Fort Hunter Liggett (FHL). Furthermore, our results for FHL show that the model fits within the existing and new DR business models and networked systems for transactive energy concepts. Integrated energy systems, communication networks, and modeling tools that coordinate supply-side networks and DER will enable electric grid system operators to use DER for grid transactions in an integrated system.
ERIC Educational Resources Information Center
Hill, Linda L.; Crosier, Scott J.; Smith, Terrence R.; Goodchild, Michael; Iannella, Renato; Erickson, John S.; Reich, Vicky; Rosenthal, David S. H.
2001-01-01
Includes five articles. Topics include requirements for a content standard to describe computational models; architectures for digital rights management systems; access control for digital information objects; LOCKSS (Lots of Copies Keep Stuff Safe) that allows libraries to run Web caches for specific journals; and a Web site from the U.S.…
Electronic Health Records Data and Metadata: Challenges for Big Data in the United States.
Sweet, Lauren E; Moulaison, Heather Lea
2013-12-01
This article, written by researchers studying metadata and standards, represents a fresh perspective on the challenges of electronic health records (EHRs) and serves as a primer for big data researchers new to health-related issues. Primarily, we argue for the importance of the systematic adoption of standards in EHR data and metadata as a way of promoting big data research and benefiting patients. EHRs have the potential to include a vast amount of longitudinal health data, and metadata provides the formal structures to govern that data. In the United States, electronic medical records (EMRs) are part of the larger EHR. EHR data is submitted by a variety of clinical data providers and potentially by the patients themselves. Because data input practices are not necessarily standardized, and because of the multiplicity of current standards, basic interoperability in EHRs is hindered. Some of the issues with EHR interoperability stem from the complexities of the data they include, which can be both structured and unstructured. A number of controlled vocabularies are available to data providers. The continuity of care document standard will provide interoperability in the United States between the EMR and the larger EHR, potentially making data input by providers directly available to other providers. The data involved is nonetheless messy. In particular, the use of competing vocabularies such as the Systematized Nomenclature of Medicine-Clinical Terms, MEDCIN, and locally created vocabularies inhibits large-scale interoperability for structured portions of the records, and unstructured portions, although potentially not machine readable, remain essential. Once EMRs for patients are brought together as EHRs, the EHRs must be managed and stored. Adequate documentation should be created and maintained to assure the secure and accurate use of EHR data. There are currently a few notable international standards initiatives for EHRs. Organizations such as Health Level Seven International and Clinical Data Interchange Standards Consortium are developing and overseeing implementation of interoperability standards. Denmark and Singapore are two countries that have successfully implemented national EHR systems. Future work in electronic health information initiatives should underscore the importance of standards and reinforce interoperability of EHRs for big data research and for the sake of patients.
A portal for the ocean biogeographic information system
Zhang, Yunqing; Grassle, J. F.
2002-01-01
Since its inception in 1999 the Ocean Biogeographic Information System (OBIS) has developed into an international science program as well as a globally distributed network of biogeographic databases. An OBIS portal at Rutgers University provides the links and functional interoperability among member database systems. Protocols and standards have been established to support effective communication between the portal and these functional units. The portal provides distributed data searching, a taxonomy name service, a GIS with access to relevant environmental data, biological modeling, and education modules for mariners, students, environmental managers, and scientists. The portal will integrate Census of Marine Life field projects, national data archives, and other functional modules, and provides for network-wide analyses and modeling tools.
Text mining resources for the life sciences.
Przybyła, Piotr; Shardlow, Matthew; Aubin, Sophie; Bossy, Robert; Eckart de Castilho, Richard; Piperidis, Stelios; McNaught, John; Ananiadou, Sophia
2016-01-01
Text mining is a powerful technology for quickly distilling key information from vast quantities of biomedical literature. However, to harness this power the researcher must be well versed in the availability, suitability, adaptability, interoperability and comparative accuracy of current text mining resources. In this survey, we give an overview of the text mining resources that exist in the life sciences to help researchers, especially those employed in biocuration, to engage with text mining in their own work. We categorize the various resources under three sections: Content Discovery looks at where and how to find biomedical publications for text mining; Knowledge Encoding describes the formats used to represent the different levels of information associated with content that enable text mining, including those formats used to carry such information between processes; Tools and Services gives an overview of workflow management systems that can be used to rapidly configure and compare domain- and task-specific processes, via access to a wide range of pre-built tools. We also provide links to relevant repositories in each section to enable the reader to find resources relevant to their own area of interest. Throughout this work we give a special focus to resources that are interoperable: those that have the crucial ability to share information, enabling smooth integration and reusability. © The Author(s) 2016. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Kutsch, W. L.; Zhao, Z.; Hardisty, A.; Hellström, M.; Chin, Y.; Magagna, B.; Asmi, A.; Papale, D.; Pfeil, B.; Atkinson, M.
2017-12-01
Environmental Research Infrastructures (ENVRIs) are expected to become important pillars not only for supporting their own scientific communities, but also a) for inter-disciplinary research and b) for the European Earth Observation Program Copernicus as a contribution to the Global Earth Observation System of Systems (GEOSS) or global thematic data networks. As such, it is very important that data-related activities of the ENVRIs will be well integrated. This requires common policies, models and e-infrastructure to optimise technological implementation, define workflows, and ensure coordination, harmonisation, integration and interoperability of data, applications and other services. The key is interoperating common metadata systems (utilising a richer metadata model as the 'switchboard' for interoperation with formal syntax and declared semantics). The metadata characterises data, services, users and ICT resources (including sensors and detectors). The European Cluster Project ENVRIplus has developed a reference model (ENVRI RM) for common data infrastructure architecture to promote interoperability among ENVRIs. The presentation will provide an overview of recent progress and give examples for the integration of ENVRI data in global integration networks.
Extravehicular activity space suit interoperability.
Skoog, A I; McBarron JW 2nd; Severin, G I
1995-10-01
The European Space Agency (ESA) and the Russian Space Agency (RKA) are jointly developing a new space suit system for improved extravehicular activity (EVA) capabilities in support of the MIR Space Station Programme, the EVA Suit 2000. Recent national policy agreements between the U.S. and Russia on planned cooperation in manned space also include joint extravehicular activity (EVA). With an increased number of space suit systems and a higher operational frequency towards the end of this century, improved interoperability for both routine and emergency operations is of utmost importance. It is thus timely to report the current status of ongoing work on international EVA interoperability being conducted by the Committee on EVA Protocols and Operations of the International Academy of Astronautics, initiated in 1991. This paper summarises the current EVA interoperability issues to be harmonised and presents quantified vehicle interface requirements for the current U.S. Shuttle EMU, the Russian MIR Orlan DMA, and the new European/Russian EVA Suit 2000 extravehicular systems. Major critical/incompatible interfaces for suits/mother-craft of different combinations are discussed, and recommendations for standardisations are given.
EVA safety: Space suit system interoperability
NASA Technical Reports Server (NTRS)
Skoog, A. I.; McBarron, J. W.; Abramov, L. P.; Zvezda, A. O.
1995-01-01
The results and the recommendations of the International Academy of Astronautics (IAA) extravehicular activity (EVA) Committee work are presented. The IAA EVA protocols and operations were analyzed for harmonization procedures and for the standardization of safety-critical and operationally important interfaces. The key role of EVA and ways to improve the situation, based on the identified EVA space suit system interoperability deficiencies, were considered.
Saleh, Kutaiba; Stucke, Stephan; Uciteli, Alexandr; Faulbrück-Röhr, Sebastian; Neumann, Juliane; Tahar, Kais; Ammon, Danny; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Portheine, Frank; Herre, Heinrich; Kaeding, André; Specht, Martin
2017-01-01
With the growing strain of medical staff and complexity of patient care, the risk of medical errors increases. In this work we present the use of Fast Healthcare Interoperability Resources (FHIR) as communication standard for the integration of an ontology- and agent-based system to identify risks across medical processes in a clinical environment.
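As a minimal illustration of FHIR as the integration transport, the sketch below POSTs a FHIR Flag resource marking a detected risk. The server URL and the flag content are assumptions for the example, not the authors' system.

```python
# POST a FHIR Flag resource to a (hypothetical) FHIR server.
import json
import urllib.request

FHIR_BASE = "https://example.org/fhir"  # hypothetical FHIR endpoint

flag = {
    "resourceType": "Flag",
    "status": "active",
    "code": {"text": "Medication interaction risk detected by agent"},  # assumed
    "subject": {"reference": "Patient/123"},                            # assumed
}

req = urllib.request.Request(
    FHIR_BASE + "/Flag",
    data=json.dumps(flag).encode("utf-8"),
    headers={"Content-Type": "application/fhir+json"},
    method="POST",
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.status, resp.headers.get("Location"))
```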
SCHeMA web-based observation data information system
NASA Astrophysics Data System (ADS)
Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise
2016-04-01
It is well recognized that the need for sharing ocean data among non-specialized users is constantly increasing. Initiatives that are built upon international standards will contribute to simplifying data processing and dissemination, improve user accessibility, including through web browsers, facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) Project is developing an open and modular sensing solution for autonomous in situ high-resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled to master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access and download as well as interoperability with other projects through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach will ensure data accessibility in compliance with major European Directives and recommendations. Being modular, the system allows the plug-and-play of commercially available probes as well as new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as a SOS (Sensor Observation Service), provides standard interoperability and access to sensor observation systems through the O&M standard, as well as sensor descriptions encoded in the Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, to describe data in an already harmonized and common standard, is a prerequisite for consistency and interoperability. Therefore, the SCHeMA SOS has adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration, development of a Core Profile interface for data access via OGC standards, external services such as web services, WMS and WFS, and data download and query management) will be presented and illustrated with examples of ongoing tests in coastal and open sea settings.
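For reference, an SOS 2.0 GetObservation call in key-value-pair form is a plain parameterized URL. In this sketch the endpoint, offering, and observed property are hypothetical, while the parameter names follow the SOS standard.

```python
# Build an OGC SOS 2.0 GetObservation request URL in KVP form.
from urllib.parse import urlencode

BASE = "https://example.org/schema/sos"  # hypothetical SOS endpoint
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "trace_metal_probe_1",          # assumed offering id
    "observedProperty": "dissolved_cadmium",    # assumed property
    "temporalFilter": "om:phenomenonTime,2016-01-01/2016-01-31",
    "responseFormat": "http://www.opengis.net/om/2.0",
}
print(BASE + "?" + urlencode(params))
```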
Asan medical information system for healthcare quality improvement.
Ryu, Hyeon Jeong; Kim, Woo Sung; Lee, Jae Ho; Min, Sung Woo; Kim, Sun Ja; Lee, Yong Su; Lee, Young Ha; Nam, Sang Woo; Eo, Gi Seung; Seo, Sook Gyoung; Nam, Mi Hyun
2010-09-01
The purpose of this paper is to introduce the status of the Asan Medical Center (AMC) medical information system with respect to healthcare quality improvement. The Asan Medical Information System (AMIS) is projected to make AMC a completely electronic and digital information hospital. AMIS has played a role in improving health care quality based on the following measures: safety, effectiveness, patient-centeredness, timeliness, efficiency, privacy, and security. AMIS consists of several distinctive systems: order communication system, electronic medical record, picture archiving communication system, clinical research information system, data warehouse, enterprise resource planning, IT service management system, and disaster recovery system. The most distinctive features of AMIS are the high-alert medication recognition and management system, the integrated and severity-stratified alert system, the integrated patient monitoring system, the perioperative diabetic care monitoring and support system, and the clinical indicator management system. AMIS provides IT services for AMC, 7 affiliated hospitals and over 5,000 partner clinics, and was developed to improve healthcare services. The current challenges for AMIS are standards and interoperability. A global health IT strategy is needed to get through the current challenges and to provide new services as needed.
The unexpected high practical value of medical ontologies.
Pinciroli, Francesco; Pisanelli, Domenico M
2006-01-01
Ontology is no longer a mere research topic, but its relevance has been recognized in several practical fields. Current application areas include natural language translation, e-commerce, geographic information systems, legal information systems, and biology and medicine. It is the backbone of solid and effective applications in health care and can help to build more powerful and more interoperable medical information systems. The design and implementation of ontologies in medicine is mainly focused on the re-organization of medical terminologies. This is obviously a difficult task and requires a deep analysis of the structure and the concepts of such terminologies, in order to define domain ontologies able to provide both flexibility and consistency to medical information systems. The aim of this special issue of Computers in Biology and Medicine is to report the current evolution of research in biomedical ontologies, presenting both papers devoted to methodological issues and works with a more applicative emphasis.
Health information technology and the medical school curriculum.
Triola, Marc M; Friedman, Erica; Cimino, Christopher; Geyer, Enid M; Wiederhorn, Jo; Mainiero, Crystal
2010-12-01
Medical schools must teach core biomedical informatics competencies that address health information technology (HIT), including explaining electronic medical record systems and computerized provider order entry systems and their role in patient safety; describing the research uses and limitations of a clinical data warehouse; understanding the concepts and importance of information system interoperability; explaining the difference between biomedical informatics and HIT; and explaining the ways clinical information systems can fail. Barriers to including these topics in the curricula include lack of teachers; the perception that informatics competencies are not applicable during preclinical courses and there is no place in the clerkships to teach them; and the legal and policy issues that conflict with students' need to develop skills. However, curricular reform efforts are creating opportunities to teach these topics with new emphasis on patient safety, team-based medical practice, and evidence-based care. Overarching HIT competencies empower our students to be lifelong technology learners.
Critical issues in NASA information systems
NASA Technical Reports Server (NTRS)
1987-01-01
The National Aeronautics and Space Administration has developed a globally-distributed complex of earth resources data bases since LANDSAT 1 was launched in 1972. NASA envisages considerable growth in the number, extent, and complexity of such data bases, due to the improvements expected in its remote sensing data rates and the increasingly multidisciplinary nature of its scientific investigations. Work already has begun on information systems to support multidisciplinary research activities based on data acquired by the space station complex and other space-based and terrestrial sources. In response to a request from NASA's former Associate Administrator for Space Science and Applications, the National Research Council convened a committee in June 1985 to identify the critical issues involving information systems support to space science and applications. The committee has suggested that OSSA address four major information systems issues: centralization of management functions, interoperability, user involvement in the planning and implementation of its programs, and technology.
Bar Coding and Tracking in Pathology.
Hanna, Matthew G; Pantanowitz, Liron
2016-03-01
Bar coding and specimen tracking are intricately linked to pathology workflow and efficiency. In the pathology laboratory, bar coding facilitates many laboratory practices, including specimen tracking, automation, and quality management. Data obtained from bar coding can be used to identify, locate, standardize, and audit specimens to achieve maximal laboratory efficiency and patient safety. Variables that need to be considered when implementing and maintaining a bar coding and tracking system include assets to be labeled, bar code symbologies, hardware, software, workflow, and laboratory and information technology infrastructure as well as interoperability with the laboratory information system. This article addresses these issues, primarily focusing on surgical pathology. Copyright © 2016 Elsevier Inc. All rights reserved.
Bar Coding and Tracking in Pathology.
Hanna, Matthew G; Pantanowitz, Liron
2015-06-01
Bar coding and specimen tracking are intricately linked to pathology workflow and efficiency. In the pathology laboratory, bar coding facilitates many laboratory practices, including specimen tracking, automation, and quality management. Data obtained from bar coding can be used to identify, locate, standardize, and audit specimens to achieve maximal laboratory efficiency and patient safety. Variables that need to be considered when implementing and maintaining a bar coding and tracking system include assets to be labeled, bar code symbologies, hardware, software, workflow, and laboratory and information technology infrastructure as well as interoperability with the laboratory information system. This article addresses these issues, primarily focusing on surgical pathology. Copyright © 2015 Elsevier Inc. All rights reserved.
A security mediator for health care information.
Wiederhold, G.; Bilello, M.; Sarathy, V.; Qian, X.
1996-01-01
The TIHI (Trusted Interoperation of Healthcare Information) project addresses a security issue that arises when some information is being shared among collaborating enterprises, although not all enterprise information is sharable. It assumes that protection exists to prevent intrusion by adversaries through secure transmission and firewalls. The TIHI system design provides a gateway, owned by the enterprise security officer, to mediate queries and responses. The latter are typically transmitted via the Internet. The enterprise policy is determined by rules provided to the mediator. We show examples of typical rules. The problem and our solution, although developed in a healthcare context, are equally valid among collaborating enterprises. PMID:8947640
Metadata mapping and reuse in caBIG™
Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis
2009-01-01
Background This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG™). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG™ framework or other frameworks that use metadata repositories. Results The Dice (digram-based) and Dynamic algorithms are compared; both have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG™ framework and potentially any framework that uses a metadata repository. Conclusion This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG™. This effort contributes to facilitating the development of interoperable systems within caBIG™ as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies. PMID:19208192
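The digram-based Dice matching can be sketched in a few lines; the names below are illustrative, and the threshold-free best-match selection is a simplification of what a real mapping pipeline would do.

```python
# Dice similarity over character bigrams, used to match model element names.
def bigrams(s):
    s = s.lower().replace("_", " ")
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a, b):
    x, y = bigrams(a), bigrams(b)
    return 2 * len(x & y) / (len(x) + len(y)) if x and y else 0.0

uml_attribute = "Patient.dateOfBirth"   # hypothetical UML class-attribute
cde_candidates = ["Patient Birth Date", "Patient Gender", "Specimen Collection Date"]
best = max(cde_candidates, key=lambda c: dice(uml_attribute, c))
print(best, round(dice(uml_attribute, best), 2))
```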
Leveraging Health Information Technology to Improve Quality in Federal Healthcare.
Weigel, Fred K; Switaj, Timothy L; Hamilton, Jessica
2015-01-01
Healthcare delivery in America is extremely complex because it comprises a fragmented and nonsystematic mix of stakeholders, components, and processes. Within the US healthcare structure, the federal healthcare system is poised to lead American medicine in leveraging health information technology to improve the quality of healthcare. We posit that through developing, adopting, and refining health information technology, the federal healthcare system has the potential to transform federal healthcare quality by managing the complexities associated with healthcare delivery. Although federal mandates have spurred the widespread use of electronic health records, other beneficial technologies have yet to be adopted in federal healthcare settings. The use of health information technology is fundamental in providing the highest quality, safest healthcare possible. In addition, health information technology is valuable in achieving the Agency for Healthcare Research and Quality's implementation goals. We conducted a comprehensive literature search using the Google Scholar, PubMed, and Cochrane databases to identify an initial list of articles. Through a thorough review of the titles and abstracts, we identified 42 articles as having relevance to health information technology and quality. Through our exclusion criteria of currency of the article, citation frequency, applicability to the federal health system, and quality of research supporting conclusions, we refined the list to 11 references from which we performed our analysis. The literature shows that the use of computerized physician order entry has significantly increased accurate medication dosage and decreased medication errors. The use of clinical decision support systems has significantly increased physician adherence to guidelines, although there is little evidence that indicates any significant correlation to patient outcomes. Research shows that interoperability and usability are continuing challenges for implementation. The Veterans Administration is the only entity within the federal health system that has published research on the use of health information technology to improve quality. The federal healthcare system has existing systems in place with computerized physician order entry systems and clinical decision support systems, but these should be advanced. Particular focus and attention should be placed on data mining capabilities, integrating the electronic health record across all aspects of care, using the electronic health record to improve quality at the point of care, and developing interoperable and usable health information technology.
A Virtual Hub Brokering Approach for Integration of Historical and Modern Maps
NASA Astrophysics Data System (ADS)
Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.
2016-06-01
Geospatial data are today more and more widespread. Many different institutions, such as geographical institutes, public administrations, collaborative communities (e.g., OSM) and web companies, nowadays make available a large number of maps. Besides this cartography, projects of digitizing, georeferencing and web publication of historical maps have increasingly spread in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources, together with their restricted interoperability, limit wide utilization of the available geo-data. This paper aims to describe some actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and Historical GIS implementation for Lombardy and for the city of Parma respectively, interoperability is possible in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component, the virtual hub, based on a brokering framework, copes with the previously listed problems and allows interoperability between different data sources.
GSFC Information Systems Technology Developments Supporting the Vision for Space Exploration
NASA Technical Reports Server (NTRS)
Hughes, Peter; Dennehy, Cornelius; Mosier, Gary; Smith, Dan; Rykowski, Lisa
2004-01-01
The Vision for Space Exploration will guide NASA's future human and robotic space activities. The broad range of human and robotic missions now being planned will require the development of new system-level capabilities enabled by emerging new technologies. Goddard Space Flight Center is actively supporting the Vision for Space Exploration in a number of program management, engineering and technology areas. This paper provides a brief background on the Vision for Space Exploration and a general overview of potential key Goddard contributions. In particular, this paper focuses on describing relevant GSFC information systems capabilities in architecture development; interoperable command, control and communications; and other applied information systems technology/research activities that are applicable to support the Vision for Space Exploration goals. Current GSFC development efforts and task activities are presented together with future plans.
A Framework for Resilient Remote Monitoring
2014-08-01
…of low-level observables are available, audited, and recorded. This establishes the need for a remote monitoring framework that can integrate with…
2008-09-30
…and Accountability Act of 1996) prohibit health care providers from sharing certain information about patients, and the Posse Comitatus Act (1878)…
Vest, Joshua R; Kash, Bita A
2016-03-01
Community health information exchanges have the characteristics of a public good, and they support population health initiatives at the state and national levels. However, current policy equally incentivizes health systems to create their own information exchanges covering more narrowly defined populations. Noninteroperable electronic health records and vendors' expensive custom interfaces are hindering health information exchanges. Moreover, vendors are imposing the costs of interoperability on health systems and community health information exchanges. Health systems are creating networks of targeted physicians and facilities by funding connections to their own enterprise health information exchanges. These private networks may change referral patterns and foster more integration with outpatient providers. The United States has invested billions of dollars to encourage the adoption and implementation of the information technologies necessary for health information exchange (HIE), enabling providers to efficiently and effectively share patient information with other providers. Health care providers now have multiple options for obtaining and sharing patient information. Community HIEs facilitate information sharing for a broad group of providers within a region. Enterprise HIEs are operated by health systems and share information among affiliated hospitals and providers. We sought to identify why hospitals and health systems choose either to participate in community HIEs or to establish enterprise HIEs. We conducted semistructured interviews with 40 policymakers, community and enterprise HIE leaders, and health care executives from 19 different organizations. Our qualitative analysis used a general inductive and comparative approach to identify factors influencing participation in, and the success of, each approach to HIE. Enterprise HIEs support health systems' strategic goals through the control of an information technology network consisting of desired trading partners. Community HIEs support obtaining patient information from the broadest set of providers, but with more dispersed benefits to all participants, the community, and patients. Although not an either/or decision, community and enterprise HIEs compete for finite organizational resources like time, skilled staff, and money. Both approaches face challenges due to vendor costs and less-than-interoperable technology. Both community and enterprise HIEs support aggregating clinical data and following patients across settings. Although they can be complementary, community and enterprise HIEs nonetheless compete for providers' attention and organizational resources. Health policymakers might try to encourage the type of widespread information exchange pursued by community HIEs, but the business case for enterprise HIEs clearly is stronger. The sustainability of a community HIE, potentially a public good, may necessitate ongoing public funding and supportive regulation. © 2016 Milbank Memorial Fund.
NASA Technical Reports Server (NTRS)
Kearney, Mike
2013-01-01
The primary goal of the Consultative Committee for Space Data Systems (CCSDS) is interoperability between the communications and data systems of space agencies' vehicles, facilities, missions and programs. Of all of the technologies used in spaceflight, standardization of communications and data systems brings the most benefit to multi-agency interoperability. CCSDS started in 1982, developing standards at the lower layers of the protocol stack. The CCSDS scope has since grown to cover standards throughout the entire ISO communications stack, plus other data systems areas (architecture, archive, security, XML exchange formats, etc.).
Leveraging standards to support patient-centric interdisciplinary plans of care.
Dykes, Patricia C; DaDamio, Rebecca R; Goldsmith, Denise; Kim, Hyeon-eui; Ohashi, Kumiko; Saba, Virginia K
2011-01-01
As health care systems and providers move towards meaningful use of electronic health records, the once-distant vision of collaborative, patient-centric, interdisciplinary plans of care, generated and updated across organizations and levels of care, may soon become a reality. Effective care planning is included in the proposed Stages 2-3 Meaningful Use quality measures. To facilitate interoperability, standardization of plan-of-care messaging, content, information and terminology models is needed. This degree of standardization requires local and national coordination. The purpose of this paper is to review some existing standards that may be leveraged to support the development of interdisciplinary patient-centric plans of care. The standards are then applied to a use case to demonstrate one method for achieving patient-centric and interoperable interdisciplinary plan-of-care documentation. Our pilot work suggests that existing standards provide a foundation for adoption and implementation of patient-centric plans of care that are consistent with federal requirements.
Ontology-Based Architecture for Intelligent Transportation Systems Using a Traffic Sensor Network.
Fernandez, Susel; Hadfi, Rafik; Ito, Takayuki; Marsa-Maestre, Ivan; Velasco, Juan R
2016-08-15
Intelligent transportation systems are a set of technological solutions used to improve the performance and safety of road transportation. A crucial element for the success of these systems is the exchange of information, not only between vehicles, but also among other components in the road infrastructure through different applications. One of the most important information sources in this kind of system is sensors. Sensors can be within vehicles or part of the infrastructure, such as bridges, roads or traffic signs. Sensors can provide information related to weather conditions and the traffic situation, which is useful for improving the driving process. To facilitate the exchange of information between the different applications that use sensor data, a common framework of knowledge is needed to allow interoperability. In this paper an ontology-driven architecture to improve the driving environment through a traffic sensor network is proposed. The system performs different tasks automatically to increase driver safety and comfort using the information provided by the sensors.
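The architecture abstract above turns on one idea: applications reason over shared ontology concepts rather than over vendor-specific sensor formats. The following minimal Python sketch illustrates that idea under invented names; the concept URIs, thresholds, and advisory rules are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the paper's system): sensor readings tagged with shared
# ontology concepts so that independent ITS applications can interpret them.
ONTOLOGY = "http://example.org/its-ontology#"  # hypothetical namespace

def annotate(sensor_id: str, concept: str, value: float, unit: str) -> dict:
    """Wrap a raw reading in a concept-tagged observation."""
    return {
        "sensor": sensor_id,
        "observedProperty": ONTOLOGY + concept,  # shared vocabulary term
        "value": value,
        "unit": unit,
    }

def advise(observation: dict) -> str:
    """A consumer application reasons over concepts, not sensor makes/models."""
    prop = observation["observedProperty"]
    if prop.endswith("RoadSurfaceTemperature") and observation["value"] < 0:
        return "warning: possible ice on road"
    if prop.endswith("TrafficDensity") and observation["value"] > 80:
        return "warning: congestion ahead"
    return "no advisory"

obs = annotate("bridge-7-thermo", "RoadSurfaceTemperature", -2.5, "Cel")
print(advise(obs))  # -> warning: possible ice on road
```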
Interoperable and standard e-Health solution over Bluetooth.
Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J
2010-01-01
The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution, including the ISO/IEEE 11073 standard for the interoperability of medical devices in the patient environment and the EN 13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for subsequent transfer to the healthcare system.
A National contribution to the GEO Science and Technology roadmap: GIIDA Project
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Mazzetti, Paolo; Guzzetti, Fausto; Oggioni, Alessandro; Pirrone, Nicola; Santolieri, Rosalia; Viola, Angelo; Tartari, Gianni; Santoro, Mattia
2010-05-01
The GIIDA (Gestione Integrata e Interoperativa dei Dati Ambientali - Integrated and Interoperable Management of Environmental Data) project is an initiative of the Italian National Research Council (CNR) launched in 2008 as an inter-departmental project, aiming to design and develop a multidisciplinary e-infrastructure (cyber-infrastructure) for the management, processing, and evaluation of Earth and environmental resources - i.e. data, services, models, sensors, best practices. GIIDA has been contributing to the implementation of the GEO (Group on Earth Observations) Science and Technology (S&T) roadmap by: (a) linking relevant S&T communities to GEOSS (the GEO System of Systems); (b) ensuring that GEOSS is built on state-of-the-art science and technology. GIIDA co-ordinates the CNR's digital infrastructure development for Earth Observation resource sharing and cooperates with other national agencies and existing projects pursuing the same objective. For the CNR, GIIDA provides an interface to European and international interoperability programmes (e.g. INSPIRE and GMES). It builds a national network for dialogue and resolution of issues at varying scientific and technical levels. To achieve these goals, GIIDA introduced a set of guiding principles: • To shift from a "traditional" data-centric approach to a more advanced service-based solution for Earth System Science and Environmental information. • To shift the focus from Data to Information Spatial Infrastructures in order to support decision-making. • To be interoperable with analogous national (e.g. SINAnet, and the INSPIRE National Infrastructure) and international initiatives (e.g. INSPIRE, GMES, SEIS, and GEOSS). • To reinforce the Italian presence in European and international programmes concerning digital infrastructures, geospatial information, and the Mega-Science approach. • To apply national and international Information Technology (IT) standards for achieving multi-disciplinary interoperability in the Earth and Space Sciences (e.g. ISO, OGC, CEN, CNIPA). In keeping with GEOSS, the GIIDA infrastructure adopts a System of Systems architectural approach in order to federate the existing systems managed by a set of recognized Thematic Areas (i.e. Risks, Biodiversity, Climate Change, Air Quality, Land and Water Quality, Ocean and Marine resources, Joint Research and Public Administration infrastructures). The GIIDA system of systems will contribute to developing multidisciplinary teams studying the global Earth systems in order to address the needs coming from the GEO Societal Benefit Areas (SBAs). GIIDA issued a Call for Pilots, receiving more than 20 high-level projects which are contributing to the GIIDA system development. A nationwide environmental research infrastructure must be interconnected with analogous digital infrastructures operated by other important stakeholders, such as public users and private companies. In fact, the long-term sustainability of a "System of Systems" requires synergies between all the involved stakeholders' domains: Users, Governance, Capacity provision, and Research. Therefore, in order to increase the effectiveness of the GIIDA contribution process to a national environmental e-infrastructure, collaborations were activated with relevant actors of the other stakeholders' domains at the national level (e.g. ISPRA SINAnet).
Deep Space Network information system architecture study
NASA Technical Reports Server (NTRS)
Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.
1992-01-01
The purpose of this article is to describe an architecture for the DSN information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990's. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies--i.e., computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.
A Simple XML Producer-Consumer Protocol
NASA Technical Reports Server (NTRS)
Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)
2001-01-01
There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols, so they cannot interoperate. We require such interoperability when, for example, we wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort in deploying these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services, such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of performance data. These standard protocols and representations must support tasks such as profiling parallel applications, monitoring the status of computers and networks, and monitoring the performance of services provided by a computational grid. This paper describes a proposed protocol and data representation for the exchange of events in a distributed system. The protocol exchanges messages formatted in XML, and it can be layered atop any low-level communication protocol such as TCP or UDP. Further, we describe Java and C++ implementations of this protocol and discuss their performance. The next section provides some further background information. Section 3 describes the main communication patterns of our protocol. Section 4 describes how we represent events and related information using XML. Section 5 describes our protocol, and Section 6 discusses the performance of two implementations of the protocol. Finally, an appendix provides the XML Schema definition of our protocol and event information.
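As a rough illustration of the kind of protocol described above, the sketch below builds an event as a small XML document and frames it for transport over TCP. The element names and the length-prefix framing are invented for the example; the paper's actual schema is given in its appendix and is not reproduced here.

```python
# Illustrative sketch only: an XML-formatted performance event, layered atop
# TCP to mirror the paper's point that the protocol is transport-agnostic.
import socket
import xml.etree.ElementTree as ET

def make_event(source: str, name: str, value: str, timestamp: float) -> bytes:
    """Serialize a performance event as a small XML document."""
    event = ET.Element("event", {"source": source, "name": name})
    ET.SubElement(event, "timestamp").text = repr(timestamp)
    ET.SubElement(event, "value").text = value
    return ET.tostring(event, encoding="utf-8")

def send_event(host: str, port: int, payload: bytes) -> None:
    """Frame the XML message with a 4-byte length prefix and send over TCP."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(len(payload).to_bytes(4, "big") + payload)

msg = make_event("app-42", "cpu.utilization", "0.87", 1e9)
# send_event("consumer.example.org", 9000, msg)  # hypothetical consumer
print(msg.decode())
```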
SMART on FHIR: a standards-based, interoperable apps platform for electronic health records
Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B
2016-01-01
Objective In early 2010, Harvard Medical School and Boston Children’s Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). Methods We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Healthcare Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. Results We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Healthcare Information and Management Systems Society conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. Conclusion In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. PMID:26911829
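A minimal sketch of the FHIR REST pattern that SMART on FHIR builds on: retrieving a single Patient resource as JSON. The base URL and bearer token below are placeholders; in a real SMART app the token would be obtained through the SMART authorization (OAuth2) flow.

```python
# Hedged sketch: the same code runs against any conformant FHIR server,
# which is the "write once, run anywhere" goal described above.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical endpoint
TOKEN = "..."  # placeholder; obtained via the SMART authorization flow

resp = requests.get(
    f"{FHIR_BASE}/Patient/123",
    headers={
        "Accept": "application/fhir+json",
        "Authorization": f"Bearer {TOKEN}",
    },
)
resp.raise_for_status()
patient = resp.json()
print(patient["resourceType"], patient.get("birthDate"))
```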
Integrating reasoning and clinical archetypes using OWL ontologies and SWRL rules.
Lezcano, Leonardo; Sicilia, Miguel-Angel; Rodríguez-Solano, Carlos
2011-04-01
Semantic interoperability is essential to facilitate the computerized support for alerts, workflow management and evidence-based healthcare across heterogeneous electronic health record (EHR) systems. Clinical archetypes, which are formal definitions of specific clinical concepts defined as specializations of a generic reference (information) model, provide a mechanism to express data structures in a shared and interoperable way. However, currently available archetype languages do not provide direct support for mapping to formal ontologies and then exploiting reasoning on clinical knowledge, which are key ingredients of full semantic interoperability, as stated in the SemanticHEALTH report [1]. This paper reports on an approach to translate definitions expressed in the openEHR Archetype Definition Language (ADL) to a formal representation expressed using the Ontology Web Language (OWL). The formal representations are then integrated with rules expressed with Semantic Web Rule Language (SWRL) expressions, providing an approach to apply the SWRL rules to concrete instances of clinical data. Sharing the knowledge expressed in the form of rules is consistent with the philosophy of open sharing, encouraged by archetypes. Our approach also allows the reuse of formal knowledge, expressed through ontologies, and extends reuse to propositions of declarative knowledge, such as those encoded in clinical guidelines. This paper describes the ADL-to-OWL translation approach, describes the techniques to map archetypes to formal ontologies, and demonstrates how rules can be applied to the resulting representation. We provide examples taken from a patient safety alerting system to illustrate our approach. Copyright © 2010 Elsevier Inc. All rights reserved.
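To make the ADL-to-OWL idea concrete, here is a hedged sketch using the Python rdflib library: an archetype concept specializing a generic reference-model class is expressed as an OWL subclass, over which SWRL-style rules could then reason. The namespace, class names, and property are illustrative, not the authors' actual translator output.

```python
# Sketch of the idea (not the authors' translator): an archetype concept as
# an OWL class. URIs and names below are invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL

EHR = Namespace("http://example.org/ehr#")  # hypothetical ontology namespace

g = Graph()
g.bind("ehr", EHR)

# An ADL archetype specializing a generic OBSERVATION becomes an OWL
# subclass of the reference-model class.
g.add((EHR.BloodPressureObservation, RDF.type, OWL.Class))
g.add((EHR.BloodPressureObservation, RDFS.subClassOf, EHR.Observation))
g.add((EHR.BloodPressureObservation, RDFS.label,
       Literal("Blood pressure measurement")))

# A property constrained by the archetype (e.g., the systolic value),
# available to rules such as "systolic > 180 implies hypertension alert".
g.add((EHR.systolic, RDF.type, OWL.DatatypeProperty))
g.add((EHR.systolic, RDFS.domain, EHR.BloodPressureObservation))

print(g.serialize(format="turtle"))
```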
Hankin, Steven C.; Blower, Jon D.; Carval, Thierry; Casey, Kenneth S.; Donlon, Craig; Lauret, Olivier; Loubrieu, Thomas; Srinivasan, Ashwanth; Trinanes, Joaquin; Godøy, Øystein; Mendelssohn, Roy; Signell, Richard P.; de La Beaujardiere, Jeff; Cornillon, Peter; Blanc, Frederique; Rew, Russ; Harlan, Jack; Hall, Julie; Harrison, D.E.; Stammer, Detlef
2010-01-01
It is generally recognized that meeting society's emerging environmental science and management needs will require the marine data community to provide simpler, more effective and more interoperable access to its data. There is broad agreement, as well, that data standards are the bedrock upon which interoperability will be built. The path that would bring the marine data community to agree upon and utilize such standards, however, is often elusive. In this paper we examine the trio of standards: 1) netCDF files; 2) the Climate and Forecast (CF) metadata convention; and 3) the OPeNDAP data access protocol. These standards taken together have brought our community a high level of interoperability for "gridded" data such as model outputs, satellite products and climatological analyses, and they are gaining rapid acceptance for ocean observations. We provide an overview of the scope of the contribution that has been made. We then step back from the information technology considerations to examine the community or "social" process by which the successes were achieved, contrasting this path with the one by which the World Meteorological Organization (WMO) has advanced the Global Telecommunication System (GTS): netCDF/CF/OPeNDAP exemplifies a "bottom up" standards process, whereas GTS is "top down". Both are tales of success at achieving specific purposes, yet each is hampered by technical limitations. These limitations sometimes lead to controversy over whether alternative technological directions should be pursued. Finally, we draw general conclusions regarding the factors that affect the success of a standards development effort - the likelihood that an IT standard will meet its design goals and achieve community-wide acceptance. We believe that a higher level of thoughtful awareness by the scientists, program managers and technology experts of the vital role of standards and the merits of alternative standards processes can help us as a community to reach our interoperability goals faster.
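The practical payoff of the netCDF/CF/OPeNDAP trio can be shown in a few lines of Python with the netCDF4 library: the same client code reads a local file or a remote OPeNDAP server, and CF metadata travels with the data. The URL and variable name below are hypothetical, and the library must be built with OPeNDAP support.

```python
# Hedged sketch of interoperable "gridded" data access via OPeNDAP.
from netCDF4 import Dataset  # netCDF4-python, built with OPeNDAP support

url = "https://data.example.org/thredds/dodsC/sst_analysis.nc"  # hypothetical
ds = Dataset(url)  # with OPeNDAP, subsetting happens server-side on demand

sst = ds.variables["sst"]   # variable named per CF conventions (assumed)
print(sst.units)            # CF requires units metadata, e.g. "K"
field = sst[0, :10, :10]    # only this subset crosses the network
ds.close()
```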
A distributed component framework for science data product interoperability
NASA Technical Reports Server (NTRS)
Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.
2000-01-01
Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally data from science missions is archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.
2014-01-01
[Only fragments of this record survive extraction.] The record concerns the Galileo-GPS Time Offset (GGTO), which will be broadcast as Type 35 in the GPS CNAV message; knowledge of the GGTO makes it possible for a properly equipped receiver to use both systems. It references the U.S. Naval Observatory (USNO) [1]. Interoperability with Galileo, and perhaps someday with other Global Navigation Satellite Systems (GNSS), is to be established through transmission of the GGTO.
A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto.
Li, Shirley; Kuo, Mu-Hsing; Ryan, David
2016-01-01
A web-based system can provide access to real-time data and information. Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere.
Marco-Ruiz, Luis; Maldonado, J Alberto; Karlsen, Randi; Bellika, Johan G
2015-01-01
Clinical Decision Support Systems (CDSS) help to improve health care and reduce costs. However, the lack of knowledge management and modelling hampers their maintenance and reuse. Current EHR standards and terminologies can allow the semantic representation of the data and knowledge of CDSS systems, boosting their interoperability, reuse and maintenance. This paper presents the modelling process of respiratory conditions' symptoms and signs by a multidisciplinary team of clinicians and information architects with the help of openEHR, SNOMED and clinical information modelling tools for a CDSS. The information model of the CDSS was defined by means of an archetype, and the knowledge model was implemented by means of a SNOMED CT-based ontology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, John; Halbgewachs, Ron; Chavez, Adrian
The manner in which control systems are designed and operated in the energy sector is undergoing some of the most significant changes in its history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: (1) cyber security is more important than ever before, and (2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPsec, a widely used Internet protocol for defining Virtual Private Networks, or "tunnels," to communicate securely through untrusted public and private networks. The IPsec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single-vendor solutions may inadvertently lock utilities into proprietary and closed systems.
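A toy illustration of the burden described above: before an IPsec tunnel comes up, both endpoints must agree on every negotiation parameter. The parameter names and values below are typical IKE/ESP choices assembled for the example, not any vendor's actual configuration schema.

```python
# Toy sketch: two utilities' IPsec endpoint settings, and the mismatches
# that would prevent the tunnel from establishing. All values illustrative.
UTILITY_A = {
    "ike_version": 2,
    "encryption": "aes256",
    "integrity": "sha256",
    "dh_group": 14,
    "esp_encryption": "aes256",
    "lifetime_s": 3600,
}

UTILITY_B = {
    "ike_version": 2,
    "encryption": "aes128",   # mismatch: a different vendor default
    "integrity": "sha256",
    "dh_group": 14,
    "esp_encryption": "aes256",
    "lifetime_s": 28800,      # mismatch: a different rekey policy
}

mismatches = {k: (UTILITY_A[k], UTILITY_B[k])
              for k in UTILITY_A if UTILITY_A[k] != UTILITY_B[k]}
print(mismatches)
# {'encryption': ('aes256', 'aes128'), 'lifetime_s': (3600, 28800)}
# Multiply this negotiation across hundreds of devices and vendors and the
# pull toward single-vendor solutions noted above becomes clear.
```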
EuroGEOSS/GENESIS ``e-Habitat'' AIP-3 Use Scenario
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Dubois, G.; Santoro, M.; Peedell, S.; de Longueville, B.; Nativi, S.; Craglia, M.
2010-12-01
Natural ecosystems are in rapid decline. Major habitats are disappearing at a speed never observed before, and the current rate of species extinction is several orders of magnitude higher than the background rate from the fossil record. Protected Areas (PAs) and Protected Area Systems are designed to conserve natural and cultural resources, and to maintain biodiversity (ecosystems, species, genes) and ecosystem services. The scientific challenge of understanding how environmental and climatological factors impact ecosystems and habitats requires the use of information from different scientific domains. Thus, multidisciplinary interoperability is a crucial requirement for a framework aiming to support scientists. The Group on Earth Observations (GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS). This emerging public infrastructure is interconnecting a diverse and growing array of instruments and systems for monitoring and forecasting changes in the global environment. This “system of systems” supports multidisciplinary and cross-disciplinary scientific research. The presented GEOSS-based interoperability framework facilitates the discovery and exploitation of datasets and models from heterogeneous scientific domains and Information Technology services (data sources). The GEO Architecture and Data Committee (ADC) launched the Architecture Implementation Pilot (AIP) initiative to develop and deploy new processes and infrastructure components for the GEOSS Common Infrastructure (GCI) and the broader GEOSS architecture. The current AIP Phase 3 (AIP-3) aims to increase GEOSS capacity to support several strategic Societal Benefit Areas (SBAs), including Disaster Management, Health/Air Quality, Biodiversity, Energy, Health/Disease and Water. As to Biodiversity, the EC-funded EuroGEOSS (http://www.eurogeoss.eu) and GENESIS (http://www.genesis-fp7.eu) projects have developed a use scenario called “e-Habitat”. This scenario demonstrates how a GEOSS-based interoperability infrastructure can help decision makers assess, and possibly forecast, the irreplaceability of a given protected area, an essential indicator for assessing the criticality of the threats to which this protected area is exposed. Building on the previous AIP Phase 2 experience, the EuroGEOSS and GENESIS projects enhanced the previously trialled interoperability infrastructure with: a) a discovery broker service which underpins semantics-enabled queries, the EuroGEOSS/GENESIS Discovery Augmentation Component (DAC); b) environmental modeling components (i.e. OGC WPS instances) implementing algorithms to predict the evolution of PA ecosystems; c) a workflow engine to: i) browse semantic repositories; ii) retrieve concepts of interest; iii) search for resources (i.e. datasets and models) related to such concepts; and iv) execute WPS instances. This presentation introduces the enhanced infrastructure developed by the EuroGEOSS/GENESIS AIP-3 pilot to implement the “e-Habitat” use scenario. The presented infrastructure is accessible through the GEO Portal and is going to be used to demonstrate the “e-Habitat” model at the GEO Ministerial Meeting, Beijing, November 2010.
[HL7 standard--features, principles, and methodology].
Koncar, Miroslav
2005-01-01
The mission of HL7 Inc., a non-profit organization, is to provide standards for the exchange, management and integration of data that support clinical patient care and the management, delivery and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization effort in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology was adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was to go directly to Version 3. The target scope of work includes clinical, financial and administrative data management in the domain of healthcare processes. By using the HL7v3 standardized methodology we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is in a constant process of studying different legacy applications, laying a solid foundation for their integration into an HL7-enabled communication environment.
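As a loose illustration of the Version 3 approach, the sketch below assembles a tiny observation fragment in which every element carries RIM attributes (classCode, moodCode) and coded content. It is a simplified, non-schema-valid fragment for exposition only; the LOINC code shown (8867-4, heart rate) and its code-system OID are real.

```python
# Simplified, illustrative HL7 v3-style fragment; not a valid v3 message.
import xml.etree.ElementTree as ET

NS = "urn:hl7-org:v3"
ET.register_namespace("", NS)

def q(tag: str) -> str:
    """Qualify a tag with the HL7 v3 namespace."""
    return f"{{{NS}}}{tag}"

# RIM attributes: an observation (classCode OBS) that actually happened
# (moodCode EVN), with a LOINC-coded meaning and a physical-quantity value.
msg = ET.Element(q("observationEvent"), {"classCode": "OBS", "moodCode": "EVN"})
ET.SubElement(msg, q("code"),
              {"code": "8867-4", "codeSystem": "2.16.840.1.113883.6.1"})
ET.SubElement(msg, q("value"), {"value": "72", "unit": "/min"})

print(ET.tostring(msg, encoding="unicode"))
```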
A wearable context aware system for ubiquitous healthcare.
Kang, Dong-Oh; Lee, Hyung-Jik; Ko, Eun-Jung; Kang, Kyuchang; Lee, Jeunwoo
2006-01-01
Recent developments in information technology are leading to the advent of the era of ubiquitous healthcare, meaning healthcare services available at any time and in any place. Ubiquitous healthcare services need a wearable system for more continual measurement of a user's biological signals, which provides information about the user from wearable sensors. In this paper, we propose a wearable context-aware system for ubiquitous healthcare and a systematic process for designing ubiquitous healthcare services. Some wearable sensor systems with Zigbee communication are introduced. We develop a context-aware framework that sends information from wearable sensors to healthcare service entities, acting as middleware to solve the interoperability problem between sensor makers and healthcare service providers. We then propose a systematic design process for ubiquitous healthcare services built on the context-aware framework. In order to show the feasibility of the proposed system, application examples are given for remote monitoring and a self-check service.
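A hypothetical sketch of the middleware role described above: per-maker adapters translate vendor-specific Zigbee payloads into one uniform observation event that healthcare service entities can consume. Both wire formats here are invented.

```python
# Invented example: two sensor makers, two payload formats, one output event.
def parse_vendor_a(packet: bytes) -> dict:
    # Vendor A (hypothetical): 1 byte type, 2 bytes big-endian value (bpm * 10)
    return {"type": "heart_rate",
            "value": int.from_bytes(packet[1:3], "big") / 10}

def parse_vendor_b(packet: bytes) -> dict:
    # Vendor B (hypothetical): a tiny ASCII record such as "HR=072"
    key, val = packet.decode().split("=")
    return {"type": "heart_rate" if key == "HR" else key, "value": float(val)}

ADAPTERS = {0x01: parse_vendor_a, 0x02: parse_vendor_b}

def to_context_event(vendor_id: int, packet: bytes, user: str) -> dict:
    """Uniform event handed to healthcare service entities."""
    reading = ADAPTERS[vendor_id](packet)
    return {"user": user, **reading, "unit": "bpm"}

print(to_context_event(0x01, bytes([0x01, 0x02, 0xD0]), "patient-7"))
# {'user': 'patient-7', 'type': 'heart_rate', 'value': 72.0, 'unit': 'bpm'}
```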
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
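The interaction style this enables might look like the hedged sketch below: a REST call submits a query compound to a remote predictive-toxicology service and receives a semantic-web-friendly reply. The service URL and endpoint path are illustrative, not the normative OpenTox API.

```python
# Hedged sketch of a REST call in the OpenTox style; URL and path invented.
import requests

BASE = "https://opentox.example.org"  # hypothetical OpenTox-style service

# Submit a query structure (SMILES for nicotine) to a prediction model.
resp = requests.post(
    f"{BASE}/model/toxicity-endpoint-1",
    data={"compound": "CN1CCC[C@H]1c1cccnc1"},
    headers={"Accept": "application/rdf+xml"},  # semantic-web-friendly reply
    timeout=30,
)
resp.raise_for_status()
# An RDF response can be merged with experimental data in a client such as
# Bioclipse, because both sides share the same ontology terms.
print(resp.text[:200])
```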
Regional interoperability: making systems connect in complex disasters.
Briggs, Susan Miller
2009-08-01
Effective use of the Incident Command System (ICS) is the key to regional interoperability. Many different organizations with different command structures and missions respond to a disaster. The ICS allows different kinds of agencies (fire, police, and medical) to work together effectively in response to a disaster. Functional requirements, not titles, determine the organizational hierarchy of the ICS structure. The ICS is a modular/adaptable system for all disasters regardless of etiology and for all organizations regardless of size.
A future-proof architecture for telemedicine using loose-coupled modules and HL7 FHIR.
Gøeg, Kirstine Rosenbeck; Rasmussen, Rune Kongsgaard; Jensen, Lasse; Wollesen, Christian Møller; Larsen, Søren; Pape-Haugaard, Louise Bilenberg
2018-07-01
Most telemedicine solutions are proprietary and disease-specific, which causes a heterogeneous and silo-oriented system landscape with limited interoperability. Solving the interoperability problem would require a strong focus on data integration and standardization in telemedicine infrastructures. Our objective was to suggest a future-proof architecture that consists of small loose-coupled modules to allow flexible integration with new and existing services, and that uses international standards to allow high re-usability of modules and interoperability in the health IT landscape. We identified the core features of our future-proof architecture as the following: (1) to provide extended functionality, the system should be designed as a core with modules; database handling and implementation of security protocols are modules, to improve flexibility compared to other frameworks; (2) to ensure loosely coupled modules, the system should implement an inversion-of-control mechanism; (3) a focus on ease of implementation requires that the system use HL7 FHIR (Fast Healthcare Interoperability Resources) as the primary standard because it is based on web technologies. We evaluated the feasibility of our architecture by developing an open source implementation of the system called ORDS. ORDS is written in TypeScript and makes use of the Express framework and HL7 FHIR DSTU2. The code is distributed on GitHub. All modules have been tested unit-wise, but end-to-end testing awaits our first clinical example implementations. Our study showed that highly adaptable and yet interoperable core frameworks for telemedicine can be designed and implemented. Future work includes implementation of a clinical use case and evaluation. Copyright © 2018 Elsevier B.V. All rights reserved.
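Feature (2), inversion of control, can be sketched briefly (in Python here rather than the project's TypeScript): the core depends only on a module interface, and concrete database and security modules are injected from outside. All names are illustrative, not ORDS code.

```python
# Illustrative core-with-modules design wired through inversion of control.
class Core:
    """The core knows only the module interface, never concrete modules."""
    def __init__(self):
        self._modules = {}

    def register(self, name: str, module) -> None:
        self._modules[name] = module      # dependency injected from outside

    def handle(self, resource: dict) -> dict:
        db = self._modules["database"]    # resolved at run time
        security = self._modules["security"]
        return db.save(security.authorize(resource))

class InMemoryDb:
    def __init__(self):
        self.store = []
    def save(self, r):
        self.store.append(r)
        return r

class AllowAll:
    def authorize(self, r):
        return r                          # swap for OAuth2 etc. later

core = Core()
core.register("database", InMemoryDb())   # database handling is a module,
core.register("security", AllowAll())     # as is the security protocol
core.handle({"resourceType": "Observation", "status": "final"})  # FHIR-shaped
```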
Harmonizing clinical terminologies: driving interoperability in healthcare.
Hamm, Russell A; Knoop, Sarah E; Schwarz, Peter; Block, Aaron D; Davis, Warren L
2007-01-01
Internationally, there are countless initiatives to build National Healthcare Information Networks (NHIN) that electronically interconnect healthcare organizations by enhancing and integrating current information technology (IT) capabilities. The realization of such NHINs will enable the simple and immediate exchange of appropriate and vital clinical data among participating organizations. In order for institutions to accurately and automatically exchange information, the electronic clinical documents must make use of established clinical codes, such as those of SNOMED-CT, LOINC and ICD-9-CM. However, there does not exist one universally accepted coding scheme that encapsulates all pertinent clinical information for the purposes of patient care, clinical research and population health reporting. In this paper, we propose a combination of methods and standards that target the harmonization of clinical terminologies and encourage sustainable, interoperable infrastructure for healthcare.
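A toy illustration of terminology harmonization: a concept map keys one clinical concept to its codes in several schemes, so that documents coded in one scheme can be translated for another consumer. The two codes shown are real (SNOMED CT and ICD-9-CM codes for type 2 diabetes mellitus), but the map structure is invented for the example.

```python
# Invented concept map; real harmonization efforts maintain such mappings
# at scale across SNOMED CT, LOINC, ICD, and others.
CONCEPT_MAP = {
    "type 2 diabetes mellitus": {
        "SNOMED-CT": "44054006",
        "ICD-9-CM": "250.00",
    },
}

def to_icd9(snomed_code: str):
    """Translate a SNOMED CT code to ICD-9-CM via the shared concept."""
    for concept, codes in CONCEPT_MAP.items():
        if codes.get("SNOMED-CT") == snomed_code:
            return codes.get("ICD-9-CM")
    return None  # unknown codes stay untranslated rather than guessed

print(to_icd9("44054006"))  # -> 250.00
```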
Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
A methodology proposal for collaborative business process elaboration using a model-driven approach
NASA Astrophysics Data System (ADS)
Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé
2015-05-01
Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).
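The model-driven step can be illustrated with a deliberately simplified transformation rule: each message exchange in a collaborative-process model yields a mediation service stub in the MIS design. The model structure and names below are invented for the example, not the authors' metamodel.

```python
# Invented example of a model-to-model transformation rule.
collaboration_model = {
    "partners": ["supplier", "carrier"],
    "exchanges": [
        {"from": "supplier", "to": "carrier", "message": "shipment_order"},
        {"from": "carrier", "to": "supplier", "message": "delivery_status"},
    ],
}

def transform(model: dict) -> list:
    """Rule: one mediation service per message exchanged between partners."""
    return [
        {
            "service": f"mediate_{ex['message']}",
            "consumes": f"{ex['from']}.{ex['message']}",
            "provides": f"{ex['to']}.{ex['message']}",
        }
        for ex in model["exchanges"]
    ]

for stub in transform(collaboration_model):
    print(stub)  # each stub would become a service of the mediation IS
```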
EPA Scientific Knowledge Management Assessment and ...
A series of activities has been conducted by a core group of EPA scientists from across the Agency. The activities were initiated in 2012, with a focus on increasing the reuse and interoperability of science software at EPA. The need for increased reuse and interoperability is linked to the increased complexity of environmental assessments in the 21st century. This complexity is manifest in the form of problems that require integrated multi-disciplinary solutions. To develop these solutions (i.e., science software systems), it is necessary to integrate software developed by disparate groups representing a variety of science domains; thus, reuse and interoperability become imperative. This report briefly describes the chronology of activities conducted by the group of scientists to provide context for the primary purpose of this report, that is, to describe the proceedings and outcomes of the latest activity, a workshop entitled “Workshop on Advancing US EPA integration of environmental and information sciences”. The EPA has been lagging in digital maturity relative to the private sector and even other government agencies. This report helps begin the process of improving the Agency's use of digital technologies, especially in the areas of efficiency and transparency. This report contributes to SHC 1.61.2.
Archive interoperability in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise
2003-02-01
The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific usage of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers with interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.
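For a sense of what the VOTable standard buys in practice, here is a hedged sketch using astropy's VOTable parser, one widely used implementation of the standard; the file name stands in for a real service response.

```python
# Hedged sketch: reading a VOTable, the tabular exchange standard mentioned
# above, with astropy. The file name is a hypothetical downloaded response.
from astropy.io.votable import parse_single_table

table = parse_single_table("cone_search_result.xml")  # hypothetical file
data = table.array                      # rows as a (masked) array
print(table.fields[0].name, len(data))  # column metadata travels with data
```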
Achieving interoperability for metadata registries using comparative object modeling.
Park, Yu Rang; Kim, Ju Han
2010-01-01
Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard for the MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with indiscriminate extensions to satisfy organization-specific needs and to work around the semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter supporting both the integrated metadata object model and organization-specific MDR formats.
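A hedged sketch of the exchange idea: a data element serialized to a small XML document that a receiving registry can import without knowing the sender's internal extensions. The element names are invented, not the authors' actual XSD.

```python
# Invented example of an ISO/IEC 11179-style data-element export.
import xml.etree.ElementTree as ET

de = ET.Element("DataElement", {"registry": "example-mdr", "id": "DE-0001"})
ET.SubElement(de, "Name").text = "PatientBirthDate"
ET.SubElement(de, "Definition").text = "Date of birth of the patient."
vd = ET.SubElement(de, "ValueDomain", {"datatype": "date"})
ET.SubElement(vd, "Format").text = "YYYY-MM-DD"

# An exporter serializing to a common schema lets two registries exchange
# definitions without agreeing on each other's internal extensions.
print(ET.tostring(de, encoding="unicode"))
```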
Interoperable Archetypes With a Three Folded Terminology Governance.
Pederson, Rune; Ellingsen, Gunnar
2015-01-01
The use of openEHR archetypes increases the interoperability of clinical terminology and, in doing so, improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results on the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included under a three-folded terminology governance, to the benefit of interoperability.
CCP interoperability and system stability
NASA Astrophysics Data System (ADS)
Feng, Xiaobing; Hu, Haibo
2016-09-01
To control counterparty risk, financial regulations such as the Dodd-Frank Act are increasingly requiring standardized derivatives trades to be cleared by central counterparties (CCPs). It is anticipated that in the near term future, CCPs across the world will be linked through interoperability agreements that facilitate risk sharing but also serve as a conduit for transmitting shocks. This paper theoretically studies a networked network with CCPs that are linked through interoperability arrangements. The major finding is that the different configurations of networked network CCPs contribute to the different properties of the cascading failures.
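A toy contagion sketch (not the paper's model) shows how interoperability links can act as a conduit for shocks: when one CCP fails, linked CCPs write down their exposures to it, and a thin capital buffer turns a local failure into a cascade. Capital levels, exposures, and the loss rule are all invented.

```python
# Toy model: failure propagation over interoperability links. All numbers
# are illustrative; the paper's actual dynamics are not reproduced here.
capital = {"CCP_A": 10.0, "CCP_B": 4.0, "CCP_C": 6.0}
links = {  # exposure of row CCP to column CCP via interoperability margin
    "CCP_A": {"CCP_B": 5.0, "CCP_C": 1.0},
    "CCP_B": {"CCP_A": 3.0},
    "CCP_C": {"CCP_A": 1.0},
}

def cascade(initial_failure: str) -> set:
    failed, frontier = set(), [initial_failure]
    while frontier:
        ccp = frontier.pop()
        if ccp in failed:
            continue
        failed.add(ccp)
        for other, exposure in links.get(ccp, {}).items():
            capital[other] -= exposure      # exposure to the failed CCP is lost
            if capital[other] <= 0 and other not in failed:
                frontier.append(other)
    return failed

print(cascade("CCP_A"))
# {'CCP_A', 'CCP_B'}: B's thin buffer fails; C's larger buffer absorbs the hit.
```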
The Development of Clinical Document Standards for Semantic Interoperability in China
Yang, Peng; Pan, Feng; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping
2011-01-01
Objectives This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. Methods The Diagnostics knowledge framework, the related approaches taken from Health Level Seven (HL7), Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP), and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics could be constructed for clinical documents intended for semantic interoperability. Results Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics for the eight clinical documents were then constructed from the DGs. Conclusions This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study. PMID:22259722
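The building-block idea can be sketched as follows: reusable DG definitions are composed into different document skeletons, so any block shared by two documents is interpreted identically in both. The DG names and fields below are invented stand-ins for the paper's 5 header, 48 section, and 17 entry DGs.

```python
# Invented DG definitions composed into clinical-document skeletons.
HEADER_DGS = {"document_info": ["id", "type", "created"],
              "patient": ["name", "sex", "birth_date"]}
SECTION_DGS = {"chief_complaint": ["text"],
               "diagnosis": ["code", "code_system", "onset"]}

def build_document(doc_type: str, sections: list) -> dict:
    """Compose a document skeleton from reusable DG definitions."""
    return {
        "type": doc_type,
        "header": {dg: HEADER_DGS[dg] for dg in HEADER_DGS},
        "body": {dg: SECTION_DGS[dg] for dg in sections},
    }

admission_note = build_document("admission", ["chief_complaint", "diagnosis"])
discharge_summary = build_document("discharge", ["diagnosis"])
# Because both documents reuse the same "diagnosis" DG (same structure, same
# value sets), systems can interpret that block identically in either one.
print(admission_note["body"]["diagnosis"])
```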
Metadata to Describe Genomic Information.
Delgado, Jaime; Naro, Daniel; Llorente, Silvia; Gelpí, Josep Lluís; Royo, Romina
2018-01-01
Interoperable metadata is key for the management of genomic information. We propose a flexible approach that contributes to ISO/IEC standardization of a new format for efficient and secure compressed storage and transmission of genomic information.